Compare commits
68 Commits
v0.44.8 ... multiproce

| SHA1 |
|---|
| 5bdd52df25 |
| 3cde0b79c9 |
| e2bd262f75 |
| db26532bab |
| 7a73b9168d |
| a43bfc5a33 |
| 1d6bc4e09e |
| 3e14b718ef |
| 1ae6270561 |
| 55a601c07e |
| 7d67b81879 |
| cd02144ac3 |
| 9b247acd1c |
| 942126ea3d |
| 2b9ea11701 |
| b3d3e14ffe |
| 62ae5db9fd |
| 77a49a09a1 |
| 06c5bbfcfd |
| f3063d35be |
| e32090bf39 |
| 79dcfb38a8 |
| 7ab500740b |
| 911bd30d28 |
| 282857eae0 |
| d8c2f99c06 |
| 16d3f74366 |
| 5fc28139ea |
| b7b6876688 |
| 235dea329c |
| 5afdf6fc20 |
| 385059e973 |
| 62aed02070 |
| 6843b8661d |
| 9da747ea9d |
| 22964afc69 |
| 3bc53fd92b |
| bd31120569 |
| 6af124e4d3 |
| b3b1d8f193 |
| 785580115b |
| b4bd04c146 |
| e88c6b8a59 |
| 74868238f3 |
| 61a300250d |
| d8dbc0866f |
| 586d96ae74 |
| 81032a5745 |
| c2d726beaf |
| 3bafdf7bfd |
| edcc7ea34f |
| 6261a7b5c9 |
| 881832c92d |
| 47d4dc7ef0 |
| 10ce81bf98 |
| 98b3d9f81e |
| 81cbb7dcc4 |
| 9517876bd0 |
| 231d132792 |
| 9ada5dfea4 |
| 476c94407f |
| 458da0e9b2 |
| 66673012ac |
| 46f8b6dc5a |
| ee81e69ece |
| 3927f05267 |
| a010ab5a29 |
| c49bebd412 |
@@ -257,7 +257,9 @@
      "avatar_url": "https://avatars.githubusercontent.com/u/21261491?v=4",
      "profile": "https://github.com/oPromessa",
      "contributions": [
        "bug"
        "bug",
        "ideas",
        "test"
      ]
    },
    {
@@ -315,6 +317,15 @@
      "code",
      "bug"
    ]
  },
  {
    "login": "xwu64",
    "name": "Xiaoliang Wu",
    "avatar_url": "https://avatars.githubusercontent.com/u/10580396?v=4",
    "profile": "https://github.com/xwu64",
    "contributions": [
      "code"
    ]
  }
],
"contributorsPerLine": 7,
CHANGELOG.md (111)

@@ -4,6 +4,117 @@ All notable changes to this project will be documented in this file. Dates are d

Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).

#### [v0.45.8](https://github.com/RhetTbull/osxphotos/compare/v0.45.6...v0.45.8)

> 5 February 2022

- Fixed exiftool to ignore unsupported file types, #615 [`1ae6270`](https://github.com/RhetTbull/osxphotos/commit/1ae627056113fc4655f1b24cfbbdf0efc04489e7)
- Updated tests [`55a601c`](https://github.com/RhetTbull/osxphotos/commit/55a601c07ea1384623c55d5c1d26b568df5d7823)
- Additional fix for #615 [`1d6bc4e`](https://github.com/RhetTbull/osxphotos/commit/1d6bc4e09e3c2359a21f842fadd781920606812e)

#### [v0.45.6](https://github.com/RhetTbull/osxphotos/compare/v0.45.5...v0.45.6)

> 5 February 2022

- Fix for unicode in query strings, #618 [`9b247ac`](https://github.com/RhetTbull/osxphotos/commit/9b247acd1cc4b2def59fdd18a6fb3c8eb9914f11)
- Fix for --name searching only original_filename on Photos 5+, #594 [`cd02144`](https://github.com/RhetTbull/osxphotos/commit/cd02144ac33cc1c13a20358133971c84d35b8a57)

#### [v0.45.5](https://github.com/RhetTbull/osxphotos/compare/v0.45.4...v0.45.5)

> 5 February 2022

- Fix for #561, no really, I mean it this time [`b3d3e14`](https://github.com/RhetTbull/osxphotos/commit/b3d3e14ffe41fbb22edb614b24f3985f379766a2)
- Updated docs [skip ci] [`2b9ea11`](https://github.com/RhetTbull/osxphotos/commit/2b9ea11701799af9a661a8e2af70fca97235f487)
- Updated tests for #561 [skip ci] [`77a49a0`](https://github.com/RhetTbull/osxphotos/commit/77a49a09a1bee74113a7114c543fbc25fa410ffc)

#### [v0.45.4](https://github.com/RhetTbull/osxphotos/compare/v0.45.3...v0.45.4)

> 3 February 2022

- docs: add oPromessa as a contributor for ideas, test [`#611`](https://github.com/RhetTbull/osxphotos/pull/611)
- Fix for filenames with special characters, #561, #618 [`f3063d3`](https://github.com/RhetTbull/osxphotos/commit/f3063d35be3c96342d83dbd87ddd614a2001bff4)
- Updated docs [skip ci] [`06c5bbf`](https://github.com/RhetTbull/osxphotos/commit/06c5bbfcfdf591a4a5d43f1456adaa27385fe01a)
- Added progress counter, #601 [`7ab5007`](https://github.com/RhetTbull/osxphotos/commit/7ab500740b28594dcd778140e10991f839220e9d)
- Updated known issues [skip ci] [`e32090b`](https://github.com/RhetTbull/osxphotos/commit/e32090bf39cb786171b49443f878ffdbab774420)

#### [v0.45.3](https://github.com/RhetTbull/osxphotos/compare/v0.45.2...v0.45.3)

> 29 January 2022

- Added --timestamp option for --verbose, #600 [`d8c2f99`](https://github.com/RhetTbull/osxphotos/commit/d8c2f99c06bc6f72bf2cb1a13c5765824fe3cbba)
- Updated docs [skip ci] [`5fc2813`](https://github.com/RhetTbull/osxphotos/commit/5fc28139ea0374bc3e228c0432b8a41ada430389)
- Updated formatting for elapsed time, #604 [`16d3f74`](https://github.com/RhetTbull/osxphotos/commit/16d3f743664396d43b3b3028a5e7a919ec56d9e1)

#### [v0.45.2](https://github.com/RhetTbull/osxphotos/compare/v0.45.0...v0.45.2)

> 29 January 2022

- Implemented #605, refactor out export2 [`235dea3`](https://github.com/RhetTbull/osxphotos/commit/235dea329c98ab8fa61565c09a1b4a83e5d99043)
- Fix for #564, --preview with --download-missing [`5afdf6f`](https://github.com/RhetTbull/osxphotos/commit/5afdf6fc20a3cb6eb2b0217d8b3be20295eb7ba4)

#### [v0.45.0](https://github.com/RhetTbull/osxphotos/compare/v0.44.13...v0.45.0)

> 28 January 2022

- Performance improvements and refactoring, #462, partial for #591 [`22964af`](https://github.com/RhetTbull/osxphotos/commit/22964afc6988166218413125d7a62348bb858a83)
- Refactored photoexporter for performance, #591 [`6843b86`](https://github.com/RhetTbull/osxphotos/commit/6843b8661d41d42368794c77304fc07194e7af18)
- Performance improvements, partial for #591 [`3bc53fd`](https://github.com/RhetTbull/osxphotos/commit/3bc53fd92b3222c6959e7aa12310811db41b83fe)

#### [v0.44.13](https://github.com/RhetTbull/osxphotos/compare/v0.44.12...v0.44.13)

> 24 January 2022

- Removed exportdb requirement from PhotoTemplate [`6af124e`](https://github.com/RhetTbull/osxphotos/commit/6af124e4d3a0e26c48f435452920020cd42afa1c)
- Version bump [`bd31120`](https://github.com/RhetTbull/osxphotos/commit/bd3112056920806f565be2c0c12caf4f2aff5231)

#### [v0.44.12](https://github.com/RhetTbull/osxphotos/compare/v0.44.11...v0.44.12)

> 23 January 2022

- Added query options to repl, #597 [`7855801`](https://github.com/RhetTbull/osxphotos/commit/785580115b29f5ccb895de22be1243f56dbb43dc)
- Added run command, #598 [`b4bd04c`](https://github.com/RhetTbull/osxphotos/commit/b4bd04c1461d0b427937f541403305bc979bcf4f)
- Bug fix for get_photos_library_version [`e88c6b8`](https://github.com/RhetTbull/osxphotos/commit/e88c6b8a59dfd947f6cf3c7eac9c92519ab781a3)

#### [v0.44.11](https://github.com/RhetTbull/osxphotos/compare/v0.44.10...v0.44.11)

> 23 January 2022

- creat unit test for __all__ [`#599`](https://github.com/RhetTbull/osxphotos/pull/599)
- Performance improvements, added --profile [`7486823`](https://github.com/RhetTbull/osxphotos/commit/74868238f3b1ee18feb744f137f5c14ef8e36ffc)

#### [v0.44.10](https://github.com/RhetTbull/osxphotos/compare/v0.44.9...v0.44.10)

> 22 January 2022

- Create __all__ for all python files [`#589`](https://github.com/RhetTbull/osxphotos/pull/589)
- Create __all__ for the file cli.py [`#587`](https://github.com/RhetTbull/osxphotos/pull/587)
- docs: add xwu64 as a contributor for code [`#585`](https://github.com/RhetTbull/osxphotos/pull/585)
- add __all__ to files "adjustmentsinfo.py" and "albuminfo.py" [`#584`](https://github.com/RhetTbull/osxphotos/pull/584)
- More refactoring of export code, #462 [`6261a7b`](https://github.com/RhetTbull/osxphotos/commit/6261a7b5c96ac43aece66b72b9e27a90854accfa)
- Added ExportOptions to photoexporter.py, #462 [`9517876`](https://github.com/RhetTbull/osxphotos/commit/9517876bd06572238648a6362a309063b86007e7)
- Blackified files [`3bafdf7`](https://github.com/RhetTbull/osxphotos/commit/3bafdf7bfd5f7992b2e0c12496c55e7be1f57455)
- More refactoring of export code, #462 [`c2d726b`](https://github.com/RhetTbull/osxphotos/commit/c2d726beafabe76cf4d5fb3213447c900129b8c0)
- Refactored photoexporter sidecar writing, #462 [`458da0e`](https://github.com/RhetTbull/osxphotos/commit/458da0e9b2b82a78cec30191c5bf1ee2ed993acf)

#### [v0.44.9](https://github.com/RhetTbull/osxphotos/compare/v0.44.8...v0.44.9)

> 9 January 2022

- Added diff command [`3927f05`](https://github.com/RhetTbull/osxphotos/commit/3927f052670b2a1c31cced1f8278a0ffe519a3eb)
- Added uuid command [`a010ab5`](https://github.com/RhetTbull/osxphotos/commit/a010ab5a299470782b938e689a7ddc336513065e)

#### [v0.44.8](https://github.com/RhetTbull/osxphotos/compare/v0.44.7...v0.44.8)

> 9 January 2022

- docs: add ahti123 as a contributor for code, bug [`#578`](https://github.com/RhetTbull/osxphotos/pull/578)
- changing photos_5 version constant to satisfy version 5001 [`#577`](https://github.com/RhetTbull/osxphotos/pull/577)
- Added grep command to CLI [`4dd838b`](https://github.com/RhetTbull/osxphotos/commit/4dd838b8bcb639eba3df9cb60a7cd28f45b22833)
- Added test for #576 [`92fced7`](https://github.com/RhetTbull/osxphotos/commit/92fced75da38f1c47be8d3d9d4ee22463ad029b9)
- Added sqlgrep [`53c701c`](https://github.com/RhetTbull/osxphotos/commit/53c701cc0ebd38db255c1ce694391b38dbb5fe01)
- Fix for #575, database version 5001 [`5a8105f`](https://github.com/RhetTbull/osxphotos/commit/5a8105f5a02080368ad22717c064afcb0748f646)
- Updated docs [skip ci] [`64a0760`](https://github.com/RhetTbull/osxphotos/commit/64a0760a47205a452e015a860f39f45bba67164a)

#### [v0.44.7](https://github.com/RhetTbull/osxphotos/compare/v0.44.6...v0.44.7)

> 8 January 2022
MANIFEST.in (11)

@@ -1,6 +1,7 @@
include README.md
include README.rst
include osxphotos/templates/*
include osxphotos/*.json
include osxphotos/*.md
include osxphotos/phototemplate.tx
include osxphotos/phototemplate.md
include osxphotos/queries/*
include osxphotos/queries/*
include osxphotos/templates/*
include README.md
include README.rst
README.md (147)

@@ -5,7 +5,7 @@

[](https://pepy.tech/project/osxphotos)
<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section -->
[](#contributors)
[](#contributors)
<!-- ALL-CONTRIBUTORS-BADGE:END -->

OSXPhotos provides the ability to interact with and query Apple's Photos.app library on macOS. You can query the Photos library database — for example, file name, file path, and metadata such as keywords/tags, persons/faces, albums, etc. You can also easily export both the original and edited photos.

@@ -38,6 +38,7 @@ OSXPhotos provides the ability to interact with and query Apple's Photos.app lib
+ [Raw Photos](#raw-photos)
+ [Template System](#template-system)
+ [ExifTool](#exiftoolExifTool)
+ [PhotoExporter](#photoexporter)
+ [Text Detection](#textdetection)
+ [Utility Functions](#utility-functions)
* [Examples](#examples)
@@ -142,6 +143,7 @@ Options:
Commands:
  about      Print information about osxphotos including license.
  albums     Print out albums found in the Photos library.
  diff       Compare two Photos databases and print out differences
  dump       Print list of all photos & associated info from the Photos...
  export     Export photos from the Photos database.
  help       Print help; for help on commands: help <command>.
@@ -154,8 +156,10 @@ Commands:
  places     Print out places found in the Photos library.
  query      Query the Photos database using 1 or more search options; if...
  repl       Run interactive osxphotos REPL shell (useful for debugging,...
  snap       Create snapshot of Photos database to use with diff command
  tutorial   Display osxphotos tutorial.
  uninstall  Uninstall Python packages from the osxphotos environment
  uuid       Print out unique IDs (UUID) of photos selected in Photos

To get help on a specific command, use `osxphotos help <command_name>`
@@ -597,6 +601,7 @@ Options:
                                  library, 2. system library, 3.
                                  ~/Pictures/Photos Library.photoslibrary
  -V, --verbose                   Print verbose output.
  --timestamp                     Add time stamp to verbose output
  --keyword KEYWORD               Search for photos with keyword KEYWORD. If
                                  more than one keyword, treated as "OR", e.g.
                                  find photos matching any keyword
@@ -1175,6 +1180,9 @@ Options:
  --save-config <config file path>
                                  Save options to file for use with --load-
                                  config. File format is TOML.
  -M, --multiprocess NUMBER_OF_PROCESSES
                                  Run export in parallel using
                                  NUMBER_OF_PROCESSES processes. [x>=1]
  --help                          Show this message and exit.

** Export **
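The new `-M/--multiprocess` option runs the export across several worker processes. As a minimal sketch (not taken from the README), the same invocation could be scripted from Python with `subprocess`; the destination directory and process count below are placeholder values, and only the flags themselves come from the help text above.

```python
# Drive the osxphotos CLI from Python; destination and process count are hypothetical.
import subprocess

cmd = [
    "osxphotos", "export", "/tmp/photo_export",  # placeholder destination directory
    "-V",                     # verbose output
    "--timestamp",            # add a time stamp to each verbose line
    "--multiprocess", "4",    # run the export in 4 parallel processes ([x>=1])
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.returncode)
print(result.stdout[-2000:])  # tail of the verbose log
```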
@@ -1720,7 +1728,7 @@ Substitution Description
{lf}                  A line feed: '\n', alias for {newline}
{cr}                  A carriage return: '\r'
{crlf}                a carriage return + line feed: '\r\n'
{osxphotos_version}   The osxphotos version, e.g. '0.44.8'
{osxphotos_version}   The osxphotos version, e.g. '0.45.8'
{osxphotos_cmd_line}  The full command line used to run osxphotos

The following substitutions may result in multiple values. Thus if specified for
@@ -2763,25 +2771,27 @@ Returns a JSON representation of all photo info.
Returns a dictionary representation of all photo info.

#### `export()`
`export(dest, filename=None, edited=False, live_photo=False, export_as_hardlink=False, overwrite=False, increment=True, sidecar_json=False, sidecar_exiftool=False, sidecar_xmp=False, use_photos_export=False, timeout=120, exiftool=False, use_albums_as_keywords=False, use_persons_as_keywords=False)`
`export(dest, filename=None, edited=False, live_photo=False, export_as_hardlink=False, overwrite=False, increment=True, sidecar_json=False, sidecar_exiftool=False, sidecar_xmp=False, download_missing=False, use_photos_export=False, use_photokit=True, timeout=120, exiftool=False, use_albums_as_keywords=False, use_persons_as_keywords=False)`

Export photo from the Photos library to another destination on disk.
- dest: must be valid destination path as str (or exception raised).
- filename (optional): name of picture as str; if not provided, will use current filename. **NOTE**: if provided, user must ensure file extension (suffix) is correct. For example, if photo is .CR2 file, edited image may be .jpeg. If you provide an extension different than what the actual file is, export will print a warning but will happily export the photo using the incorrect file extension. e.g. to get the extension of the edited photo, look at [PhotoInfo.path_edited](#path_edited).
- edited: boolean; if True (default=False), will export the edited version of the photo (or raise exception if no edited version)
- export_as_hardlink: boolean; if True (default=False), will hardlink files instead of copying them
- overwrite: boolean; if True (default=False), will overwrite files if they already exist
- live_photo: boolean; if True (default=False), will also export the associated .mov for live photos; exported live photo will be named filename.mov
- increment: boolean; if True (default=True), will increment file name until a non-existent name is found
- sidecar_json: (boolean, default = False); if True will also write a json sidecar with metadata in format readable by exiftool; sidecar filename will be dest/filename.json where filename is the stem of the photo name
- sidecar_json: (boolean, default = False); if True will also write a json sidecar with metadata in format readable by exiftool; sidecar filename will be dest/filename.json where filename is the stem of the photo name; resulting json file will include tag group names (e.g. `exiftool -G -j`)
- sidecar_exiftool: (boolean, default = False); if True will also write a json sidecar with metadata in format readable by exiftool; sidecar filename will be dest/filename.json where filename is the stem of the photo name; resulting json file will not include tag group names (e.g. `exiftool -j`)
- sidecar_xmp: (boolean, default = False); if True will also write a XMP sidecar with metadata; sidecar filename will be dest/filename.xmp where filename is the stem of the photo name
- use_photos_export: boolean; (default=False), if True will attempt to export photo via applescript interaction with Photos; useful for forcing download of missing photos. This only works if the Photos library being used is the default library (last opened by Photos) as applescript will directly interact with whichever library Photos is currently using.
- edited: bool; if True (default=False), will export the edited version of the photo (or raise exception if no edited version)
- export_as_hardlink: bool; if True (default=False), will hardlink files instead of copying them
- overwrite: bool; if True (default=False), will overwrite files if they already exist
- live_photo: bool; if True (default=False), will also export the associated .mov for live photos; exported live photo will be named filename.mov
- increment: bool; if True (default=True), will increment file name until a non-existent name is found
- sidecar_json: (bool, default = False); if True will also write a json sidecar with metadata in format readable by exiftool; sidecar filename will be dest/filename.json where filename is the stem of the photo name
- sidecar_json: (bool, default = False); if True will also write a json sidecar with metadata in format readable by exiftool; sidecar filename will be dest/filename.json where filename is the stem of the photo name; resulting json file will include tag group names (e.g. `exiftool -G -j`)
- sidecar_exiftool: (bool, default = False); if True will also write a json sidecar with metadata in format readable by exiftool; sidecar filename will be dest/filename.json where filename is the stem of the photo name; resulting json file will not include tag group names (e.g. `exiftool -j`)
- sidecar_xmp: (bool, default = False); if True will also write a XMP sidecar with metadata; sidecar filename will be dest/filename.xmp where filename is the stem of the photo name
- use_photos_export: (bool, default=False); if True will attempt to export photo via AppleScript or PhotoKit interaction with Photos
- download_missing: (bool, default=False); if True will attempt to export photo via AppleScript or PhotoKit interaction with Photos if missing
- use_photokit: (bool, default=True); if True will attempt to export photo via photokit instead of AppleScript when used with use_photos_export or download_missing
- timeout: (int, default=120) timeout in seconds used with use_photos_export
- exiftool: (boolean, default = False) if True, will use [exiftool](https://exiftool.org/) to write metadata directly to the exported photo; exiftool must be installed and in the system path
- use_albums_as_keywords: (boolean, default = False); if True, will use album names as keywords when exporting metadata with exiftool or sidecar
- use_persons_as_keywords: (boolean, default = False); if True, will use person names as keywords when exporting metadata with exiftool or sidecar
- exiftool: (bool, default = False) if True, will use [exiftool](https://exiftool.org/) to write metadata directly to the exported photo; exiftool must be installed and in the system path
- use_albums_as_keywords: (bool, default = False); if True, will use album names as keywords when exporting metadata with exiftool or sidecar
- use_persons_as_keywords: (bool, default = False); if True, will use person names as keywords when exporting metadata with exiftool or sidecar

Returns: list of paths to exported files. More than one file could be exported, for example if live_photo=True, both the original image and the associated .mov file will be exported
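A minimal sketch of the updated `PhotoInfo.export()` signature documented above, using the new `download_missing` and `use_photokit` parameters. It assumes osxphotos is installed and a default Photos library exists; the destination directory is a placeholder and error handling is omitted.

```python
import os
import osxphotos

photosdb = osxphotos.PhotosDB()
dest = os.path.expanduser("~/Desktop/osxphotos_export")  # placeholder destination
os.makedirs(dest, exist_ok=True)

for photo in photosdb.photos()[:5]:  # first few photos only, as an illustration
    exported = photo.export(
        dest,
        edited=False,
        sidecar_xmp=True,        # also write an XMP sidecar
        download_missing=True,   # fetch the original via AppleScript/PhotoKit if missing
        use_photokit=True,       # prefer PhotoKit over AppleScript, per the docs above
    )
    print(photo.original_filename, "->", exported)
```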
@@ -3622,7 +3632,7 @@ The following template field substitutions are availabe for use the templating s
|{lf}|A line feed: '\n', alias for {newline}|
|{cr}|A carriage return: '\r'|
|{crlf}|a carriage return + line feed: '\r\n'|
|{osxphotos_version}|The osxphotos version, e.g. '0.44.8'|
|{osxphotos_version}|The osxphotos version, e.g. '0.45.8'|
|{osxphotos_cmd_line}|The full command line used to run osxphotos|
|{album}|Album(s) photo is contained in|
|{folder_album}|Folder path + album photo is contained in. e.g. 'Folder/Subfolder/Album' or just 'Album' if no enclosing folder|
@@ -3706,6 +3716,105 @@ osxphotos.exiftool also provides an `ExifToolCaching` class which caches all met

`ExifTool()` runs `exiftool` as a subprocess using the `-stay_open True` flag to keep the process running in the background. The subprocess will be cleaned up when your main script terminates. `ExifTool()` uses a singleton pattern to ensure that only one instance of `exiftool` is created. Multiple instances of `ExifTool()` will all use the same `exiftool` subprocess.

### <a name="photoexporter">PhotoExporter</a>

[PhotoInfo.export()](#photoinfo) provides a simple method to export a photo. This method actually calls `PhotoExporter.export()` to do the export. `PhotoExporter` provides many more options to configure the export and report results, and this is what the osxphotos command line export tool uses.

#### `export(dest, filename=None, options: Optional[ExportOptions]=None) -> ExportResults`

Export a photo.

Args:

- dest: must be valid destination path or exception raised
- filename: (optional): name of exported picture; if not provided, will use current filename
- options (ExportOptions): optional ExportOptions instance

Returns: ExportResults instance

*Note*: to use dry run mode, you must set options.dry_run=True and also pass an in-memory version of export_db and a no-op fileutil (e.g. ExportDBInMemory and FileUtilNoOp) in options.export_db and options.fileutil respectively.
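A sketch of the dry-run setup described in the note above: `options.dry_run=True` plus an in-memory export database and the no-op file utility. The `ExportDBInMemory` constructor argument and the `PhotoExporter(photo)` construction are assumptions not spelled out in this diff; check the osxphotos source for the exact signatures.

```python
import osxphotos
from osxphotos import ExportDBInMemory, ExportOptions, FileUtilNoOp, PhotoExporter

photosdb = osxphotos.PhotosDB()
photo = photosdb.photos()[0]

options = ExportOptions(
    dry_run=True,
    # constructor argument assumed to be the export database path (".osxphotos_export.db")
    export_db=ExportDBInMemory("/tmp/export_dest/.osxphotos_export.db"),
    fileutil=FileUtilNoOp,   # no-op file utility class, per the note above
    update=True,
)
# assuming PhotoExporter is constructed from a PhotoInfo, as implied by PhotoInfo.export()
results = PhotoExporter(photo).export("/tmp/export_dest", options=options)
print("would export:", results.exported)
```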
#### `ExportOptions`

Options class for exporting photos with `export`

Attributes:

- convert_to_jpeg (bool): if True, converts non-jpeg images to jpeg
- description_template (str): optional template string that will be rendered for use as photo description
- download_missing: (bool, default=False): if True will attempt to export photo via applescript interaction with Photos if missing (see also use_photokit, use_photos_export)
- dry_run: (bool, default=False): set to True to run in "dry run" mode
- edited: (bool, default=False): if True will export the edited version of the photo otherwise exports the original version
- exiftool_flags (list of str): optional list of flags to pass to exiftool when using exiftool option, e.g ["-m", "-F"]
- exiftool: (bool, default = False): if True, will use exiftool to write metadata to export file
- export_as_hardlink: (bool, default=False): if True, will hardlink files instead of copying them
- export_db: (ExportDB_ABC): instance of a class that conforms to ExportDB_ABC with methods for getting/setting data related to exported files to compare update state
- fileutil: (FileUtilABC): class that conforms to FileUtilABC with various file utilities
- ignore_date_modified (bool): for use with sidecar and exiftool; if True, sets EXIF:ModifyDate to EXIF:DateTimeOriginal even if date_modified is set
- ignore_signature (bool, default=False): ignore file signature when used with update (look only at filename)
- increment (bool, default=True): if True, will increment file name until a non-existent name is found; if overwrite=False and increment=False, export will fail if destination file already exists
- jpeg_ext (str): if set, will use this value for extension on jpegs converted to jpeg with convert_to_jpeg; if not set, uses jpeg; do not include the leading "."
- jpeg_quality (float in range 0.0 <= jpeg_quality <= 1.0): a value of 1.0 specifies use best quality, a value of 0.0 specifies use maximum compression.
- keyword_template (list of str): list of template strings that will be rendered and used as keywords
- live_photo (bool, default=False): if True, will also export the associated .mov for live photos
- location (bool): if True, include location in exported metadata
- merge_exif_keywords (bool): if True, merge keywords found in file's exif data (requires exiftool)
- merge_exif_persons (bool): if True, merge persons found in file's exif data (requires exiftool)
- overwrite (bool, default=False): if True will overwrite files if they already exist
- persons (bool): if True, include persons in exported metadata
- preview_suffix (str): optional string to append to end of filename for preview images
- preview (bool): if True, also exports preview image
- raw_photo (bool, default=False): if True, will also export the associated RAW photo
- render_options (RenderOptions): optional osxphotos.phototemplate.RenderOptions instance to specify options for rendering templates
- replace_keywords (bool): if True, keyword_template replaces any keywords, otherwise it's additive
- sidecar_drop_ext (bool, default=False): if True, drops the photo's extension from sidecar filename (e.g. 'IMG_1234.json' instead of 'IMG_1234.JPG.json')
- sidecar: bit field (int): set to one or more of SIDECAR_XMP, SIDECAR_JSON, SIDECAR_EXIFTOOL
  - SIDECAR_JSON: if set will write a json sidecar with data in format readable by exiftool; sidecar filename will be dest/filename.json; includes exiftool tag group names (e.g. `exiftool -G -j`)
  - SIDECAR_EXIFTOOL: if set will write a json sidecar with data in format readable by exiftool; sidecar filename will be dest/filename.json; does not include exiftool tag group names (e.g. `exiftool -j`)
  - SIDECAR_XMP: if set will write an XMP sidecar with IPTC data; sidecar filename will be dest/filename.xmp
- strip (bool): if True, strip whitespace from rendered templates
- timeout (int, default=120): timeout in seconds used with use_photos_export
- touch_file (bool, default=False): if True, sets file's modification time to the photo date
- update (bool, default=False): if True export will run in update mode, that is, it will not export the photo if the current version already exists in the destination
- use_albums_as_keywords (bool, default = False): if True, will include album names in keywords when exporting metadata with exiftool or sidecar
- use_persons_as_keywords (bool, default = False): if True, will include person names in keywords when exporting metadata with exiftool or sidecar
- use_photos_export (bool, default=False): if True will attempt to export photo via applescript interaction with Photos even if not missing (see also use_photokit, download_missing)
- use_photokit (bool, default=False): if True, will use photokit to export photos when use_photos_export is True
- verbose (Callable): optional callable function to use for printing verbose text during processing; if None (default), does not print output.
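An illustrative use of a few of the `ExportOptions` attributes listed above with `PhotoExporter.export()`. The `PhotoExporter(photo)` construction is assumed from the `PhotoInfo.export()` description, and the destination path is a placeholder; the SIDECAR_* flags can also be OR'd into `sidecar`, but their import location is not shown in this diff, so they are left out here.

```python
import osxphotos
from osxphotos import ExportOptions, PhotoExporter

photosdb = osxphotos.PhotosDB()
photo = photosdb.photos()[0]

options = ExportOptions(
    convert_to_jpeg=True,   # convert non-jpeg originals to jpeg
    jpeg_quality=0.9,       # 1.0 = best quality, 0.0 = maximum compression
    update=True,            # only export photos not already in the destination
    preview=True,           # also export the preview image
    touch_file=True,        # set the exported file's modification time to the photo date
    verbose=print,          # any callable; None (the default) prints nothing
)
results = PhotoExporter(photo).export("/tmp/export_dest", options=options)
print(len(results.exported), "exported,", len(results.skipped), "skipped")
```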
#### `ExportResults`

`PhotoExporter().export()` returns an instance of this class.

`ExportResults` has the following properties:

- exported: list of all exported files (a single call to export could export more than one file, e.g. original file, preview, live video, raw, etc.)
- new: list of new files exported when used with update=True
- updated: list of updated files when used with update=True
- skipped: list of skipped files when used with update=True
- exif_updated: list of updated files when used with update=True and exiftool
- touched: list of files touched during export (e.g. file date/time updated with touch_file=True)
- to_touch: reserved for internal use of export
- converted_to_jpeg: list of files converted to jpeg when convert_to_jpeg=True
- sidecar_json_written: list of JSON sidecars written
- sidecar_json_skipped: list of JSON sidecars skipped when update=True
- sidecar_exiftool_written: list of exiftool sidecars written
- sidecar_exiftool_skipped: list of exiftool sidecars skipped when update=True
- sidecar_xmp_written: list of XMP sidecars written
- sidecar_xmp_skipped: list of XMP sidecars skipped when update=True
- missing: list of missing files
- error: list of tuples containing (filename, error) if error generated during export
- exiftool_warning: list of warnings generated by exiftool during export
- exiftool_error: list of errors generated by exiftool during export
- xattr_written: list of files with extended attributes written during export
- xattr_skipped: list of files where extended attributes were skipped when update=True
- deleted_files: reserved for use by osxphotos CLI
- deleted_directories: reserved for use by osxphotos CLI
- exported_album: reserved for use by osxphotos CLI
- skipped_album: reserved for use by osxphotos CLI
- missing_album: reserved for use by osxphotos CLI
### <a name="textdetection">Text Detection</a>

The [PhotoInfo.detected_text()](#detected_text_method) method and the `{detected_text}` template will perform text detection on the photos in your library. Text detection is a slow process so to avoid unnecessary re-processing of photos, osxphotos will cache the results of the text detection process as an extended attribute on the photo image file. Extended attributes do not modify the actual file. The extended attribute is named `osxphotos.metadata:detected_text` and can be viewed using the built-in [xattr](https://ss64.com/osx/xattr.html) command or my [osxmetadata](https://github.com/RhetTbull/osxmetadata) tool. If you want to remove the cached attribute, you can do so with osxmetadata as follows:
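As a stand-in sketch only (this is not the osxmetadata command referenced above), the cached attribute can also be inspected or removed with the third-party `xattr` package; the path below is a placeholder and the exact return types of the `xattr` functions may differ slightly by platform.

```python
import xattr

path = "/path/to/exported/photo.jpeg"          # placeholder path
attr = "osxphotos.metadata:detected_text"

if attr in xattr.listxattr(path):
    print("cached detected text:", xattr.getxattr(path, attr)[:80], "...")
    xattr.removexattr(path, attr)              # drop the cache; the image data is untouched
```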
@@ -3842,7 +3951,7 @@ Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/d
|
||||
<td align="center"><a href="https://github.com/mkirkland4874"><img src="https://avatars.githubusercontent.com/u/36466711?v=4?s=75" width="75px;" alt=""/><br /><sub><b>mkirkland4874</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3Amkirkland4874" title="Bug reports">🐛</a> <a href="#example-mkirkland4874" title="Examples">💡</a></td>
|
||||
<td align="center"><a href="https://github.com/jcommisso07"><img src="https://avatars.githubusercontent.com/u/3111054?v=4?s=75" width="75px;" alt=""/><br /><sub><b>Joseph Commisso</b></sub></a><br /><a href="#data-jcommisso07" title="Data">🔣</a></td>
|
||||
<td align="center"><a href="https://github.com/dssinger"><img src="https://avatars.githubusercontent.com/u/1817903?v=4?s=75" width="75px;" alt=""/><br /><sub><b>David Singer</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3Adssinger" title="Bug reports">🐛</a></td>
|
||||
<td align="center"><a href="https://github.com/oPromessa"><img src="https://avatars.githubusercontent.com/u/21261491?v=4?s=75" width="75px;" alt=""/><br /><sub><b>oPromessa</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3AoPromessa" title="Bug reports">🐛</a></td>
|
||||
<td align="center"><a href="https://github.com/oPromessa"><img src="https://avatars.githubusercontent.com/u/21261491?v=4?s=75" width="75px;" alt=""/><br /><sub><b>oPromessa</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3AoPromessa" title="Bug reports">🐛</a> <a href="#ideas-oPromessa" title="Ideas, Planning, & Feedback">🤔</a> <a href="https://github.com/RhetTbull/osxphotos/commits?author=oPromessa" title="Tests">⚠️</a></td>
|
||||
<td align="center"><a href="http://spencerchang.me"><img src="https://avatars.githubusercontent.com/u/14796580?v=4?s=75" width="75px;" alt=""/><br /><sub><b>Spencer Chang</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3Aspencerc99" title="Bug reports">🐛</a></td>
|
||||
</tr>
|
||||
<tr>
|
||||
@@ -3851,6 +3960,7 @@ Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/d
|
||||
<td align="center"><a href="https://hyfen.net"><img src="https://avatars.githubusercontent.com/u/6291?v=4?s=75" width="75px;" alt=""/><br /><sub><b>Andrew Louis</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/commits?author=hyfen" title="Documentation">📖</a> <a href="https://github.com/RhetTbull/osxphotos/commits?author=hyfen" title="Code">💻</a></td>
|
||||
<td align="center"><a href="https://github.com/neebah"><img src="https://avatars.githubusercontent.com/u/71442026?v=4?s=75" width="75px;" alt=""/><br /><sub><b>neebah</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3Aneebah" title="Bug reports">🐛</a></td>
|
||||
<td align="center"><a href="https://github.com/ahti123"><img src="https://avatars.githubusercontent.com/u/22232632?v=4?s=75" width="75px;" alt=""/><br /><sub><b>Ahti Liin</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/commits?author=ahti123" title="Code">💻</a> <a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3Aahti123" title="Bug reports">🐛</a></td>
|
||||
<td align="center"><a href="https://github.com/xwu64"><img src="https://avatars.githubusercontent.com/u/10580396?v=4?s=75" width="75px;" alt=""/><br /><sub><b>Xiaoliang Wu</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/commits?author=xwu64" title="Code">💻</a></td>
|
||||
</tr>
|
||||
</table>
|
||||
|
||||
@@ -3867,7 +3977,6 @@ My goal is make osxphotos as reliable and comprehensive as possible. The test s

- Audio-only files are not handled. It is possible to store audio-only files in Photos. osxphotos currently only handles images and videos. See [Issue #436](https://github.com/RhetTbull/osxphotos/issues/436)
- Face coordinates (mouth, left eye, right eye) may not be correct for images where the head is tilted. See [Issue #196](https://github.com/RhetTbull/osxphotos/issues/196).
- Raw images imported to Photos with an associated jpeg preview are not handled correctly by osxphotos. osxphotos query and export will operate on the jpeg preview instead of the raw image as will `PhotoInfo.path`. If the user selects "Use RAW as original" in Photos, the raw image will be exported or operated on but the jpeg will be ignored. See [Issue #101](https://github.com/RhetTbull/osxphotos/issues/101). Note: a beta version of the fix for this bug is implemented in the current version of osxphotos.
- The `--download-missing` option for `osxphotos export` does not work correctly with burst images. It will download the primary image but not the other burst images. See [Issue #75](https://github.com/RhetTbull/osxphotos/issues/75).

## Implementation Notes
@@ -1,4 +1,4 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
config: fff79f4920939baa44eddc90423972ec
config: bf43bf49b725c31ce72a8823e4f8012b
tags: 645f666f9bcd5a90fca523b33c5a78b7
docs/_static/documentation_options.js (2) vendored

@@ -1,6 +1,6 @@
var DOCUMENTATION_OPTIONS = {
    URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'),
    VERSION: '0.44.8',
    VERSION: '0.45.8',
    LANGUAGE: 'None',
    COLLAPSE_INDEX: false,
    BUILDER: 'html',
@@ -6,7 +6,7 @@
|
||||
<meta charset="utf-8" />
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0" /><meta name="generator" content="Docutils 0.17.1: http://docutils.sourceforge.net/" />
|
||||
|
||||
<title>osxphotos command line interface (CLI) — osxphotos 0.44.8 documentation</title>
|
||||
<title>osxphotos command line interface (CLI) — osxphotos 0.45.8 documentation</title>
|
||||
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
|
||||
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
|
||||
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>
|
||||
|
||||
@@ -5,7 +5,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8" />
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
|
||||
<title>Index — osxphotos 0.44.8 documentation</title>
|
||||
<title>Index — osxphotos 0.45.8 documentation</title>
|
||||
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
|
||||
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
|
||||
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
<meta charset="utf-8" />
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0" /><meta name="generator" content="Docutils 0.17.1: http://docutils.sourceforge.net/" />
|
||||
|
||||
<title>Welcome to osxphotos’s documentation! — osxphotos 0.44.8 documentation</title>
|
||||
<title>Welcome to osxphotos’s documentation! — osxphotos 0.45.8 documentation</title>
|
||||
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
|
||||
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
|
||||
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
<meta charset="utf-8" />
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0" /><meta name="generator" content="Docutils 0.17.1: http://docutils.sourceforge.net/" />
|
||||
|
||||
<title>osxphotos — osxphotos 0.44.8 documentation</title>
|
||||
<title>osxphotos — osxphotos 0.45.8 documentation</title>
|
||||
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
|
||||
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
|
||||
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
<meta charset="utf-8" />
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0" /><meta name="generator" content="Docutils 0.17.1: http://docutils.sourceforge.net/" />
|
||||
|
||||
<title>osxphotos package — osxphotos 0.44.8 documentation</title>
|
||||
<title>osxphotos package — osxphotos 0.45.8 documentation</title>
|
||||
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
|
||||
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
|
||||
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>
|
||||
|
||||
@@ -5,7 +5,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8" />
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
|
||||
<title>Search — osxphotos 0.44.8 documentation</title>
|
||||
<title>Search — osxphotos 0.45.8 documentation</title>
|
||||
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
|
||||
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
|
||||
|
||||
|
||||
@@ -14,6 +14,7 @@ datas = [
    ("osxphotos/phototemplate.tx", "osxphotos"),
    ("osxphotos/phototemplate.md", "osxphotos"),
    ("osxphotos/tutorial.md", "osxphotos"),
    ("osxphotos/exiftool_filetypes.json", "osxphotos"),
]
package_imports = [["photoscript", ["photoscript.applescript"]]]
for package, files in package_imports:
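The loop body of the PyInstaller spec is not shown in this hunk, so the collection logic below is a hedged sketch of how `package_imports` is typically turned into extra `datas` entries, not the project's actual spec code; the file lookup via `importlib` is an assumption.

```python
import importlib
import os

datas = [
    ("osxphotos/phototemplate.tx", "osxphotos"),
    ("osxphotos/phototemplate.md", "osxphotos"),
    ("osxphotos/tutorial.md", "osxphotos"),
    ("osxphotos/exiftool_filetypes.json", "osxphotos"),  # new data file in this release
]
package_imports = [["photoscript", ["photoscript.applescript"]]]

for package, files in package_imports:
    # locate the installed package and bundle each named data file alongside it
    pkg_dir = os.path.dirname(importlib.import_module(package).__file__)
    datas += [(os.path.join(pkg_dir, f), package) for f in files]
```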
@@ -1,13 +1,45 @@
from ._constants import AlbumSortOrder
from ._version import __version__
from .exiftool import ExifTool
from .photoexporter import ExportResults, PhotoExporter
from .export_db import ExportDB, ExportDBInMemory, ExportDBNoOp
from .fileutil import FileUtil, FileUtilNoOp
from .momentinfo import MomentInfo
from .personinfo import PersonInfo
from .photoexporter import ExportOptions, ExportResults, PhotoExporter
from .photoinfo import PhotoInfo
from .photosdb import PhotosDB
from .photosdb._photosdb_process_comments import CommentInfo, LikeInfo
from .phototemplate import PhotoTemplate
from .placeinfo import PlaceInfo
from .queryoptions import QueryOptions
from .scoreinfo import ScoreInfo
from .searchinfo import SearchInfo
from .utils import _debug, _get_logger, _set_debug

# TODO: Add test for imageTimeZoneOffsetSeconds = None
# TODO: Add special albums and magic albums
__all__ = [
    "__version__",
    "_debug",
    "_get_logger",
    "_set_debug",
    "AlbumSortOrder",
    "CommentInfo",
    "ExifTool",
    "ExportDB",
    "ExportDBInMemory",
    "ExportDBNoOp",
    "ExportOptions",
    "ExportResults",
    "FileUtil",
    "FileUtilNoOp",
    "LikeInfo",
    "MomentInfo",
    "PersonInfo",
    "PhotoExporter",
    "PhotoInfo",
    "PhotosDB",
    "PhotoTemplate",
    "PlaceInfo",
    "QueryOptions",
    "ScoreInfo",
    "SearchInfo",
]
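A minimal sketch of the package-level API re-exported by the `__all__` list above; everything listed there can be imported directly from `osxphotos`. It assumes osxphotos is installed and a Photos library exists on the Mac running it.

```python
from osxphotos import PhotosDB

photosdb = PhotosDB()        # opens the last-used Photos library by default
photos = photosdb.photos()   # list of PhotoInfo objects
print(f"{len(photos)} photos, Photos database version {photosdb.db_version}")

for photo in photos[:3]:
    # original_filename and keywords are documented PhotoInfo attributes,
    # shown here only as an illustration
    print(photo.uuid, photo.original_filename, photo.keywords)
```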
@@ -42,7 +42,10 @@ _PHOTOS_5_VERSION = "5000" # I've seen both 5001 and 6000. 6000 is most common
|
||||
# Ranges for model version by Photos version
|
||||
_PHOTOS_5_MODEL_VERSION = [13000, 13999]
|
||||
_PHOTOS_6_MODEL_VERSION = [14000, 14999]
|
||||
_PHOTOS_7_MODEL_VERSION = [15000, 15999] # Monterey developer preview is 15134
|
||||
_PHOTOS_7_MODEL_VERSION = [
|
||||
15000,
|
||||
15999,
|
||||
] # Monterey developer preview is 15134, 12.1 is 15331
|
||||
|
||||
# some table names differ between Photos 5 and Photos 6
|
||||
_DB_TABLE_NAMES = {
|
||||
@@ -98,6 +101,8 @@ _TESTED_OS_VERSIONS = [
|
||||
("11", "4"),
|
||||
("11", "5"),
|
||||
("11", "6"),
|
||||
("12", "0"),
|
||||
("12", "1"),
|
||||
]
|
||||
|
||||
# Photos 5 has persons who are empty string if unidentified face
|
||||
@@ -209,7 +214,8 @@ SEARCH_CATEGORY_PHOTO_NAME = 2056
|
||||
|
||||
|
||||
# Max filename length on MacOS
|
||||
MAX_FILENAME_LEN = 255
|
||||
# subtract 6 chars for the lock file extension in form: ".filename.lock"
|
||||
MAX_FILENAME_LEN = 255 - 6
|
||||
|
||||
# Max directory name length on MacOS
|
||||
MAX_DIRNAME_LEN = 255
|
||||
@@ -258,7 +264,7 @@ EXTENDED_ATTRIBUTE_NAMES_QUOTED = [f"'{x}'" for x in EXTENDED_ATTRIBUTE_NAMES]
|
||||
OSXPHOTOS_EXPORT_DB = ".osxphotos_export.db"
|
||||
|
||||
# bit flags for burst images ("burstPickType")
|
||||
BURST_PICK_TYPE_NONE = 0b0 # 0: sometimes used for single images with a burst UUID
|
||||
BURST_PICK_TYPE_NONE = 0b0 # 0: sometimes used for single images with a burst UUID
|
||||
BURST_NOT_SELECTED = 0b10 # 2: burst image is not selected
|
||||
BURST_DEFAULT_PICK = 0b100 # 4: burst image is the one Photos picked to be key image before any selections made
|
||||
BURST_SELECTED = 0b1000 # 8: burst image is selected
|
||||
@@ -300,3 +306,21 @@ class AlbumSortOrder(Enum):
|
||||
|
||||
|
||||
TEXT_DETECTION_CONFIDENCE_THRESHOLD = 0.75
|
||||
|
||||
# stat sort order for cProfile: https://docs.python.org/3/library/profile.html#pstats.Stats.sort_stats
|
||||
PROFILE_SORT_KEYS = [
|
||||
"calls",
|
||||
"cumulative",
|
||||
"cumtime",
|
||||
"file",
|
||||
"filename",
|
||||
"module",
|
||||
"ncalls",
|
||||
"pcalls",
|
||||
"line",
|
||||
"name",
|
||||
"nfl",
|
||||
"stdname",
|
||||
"time",
|
||||
"tottime",
|
||||
]
|
||||
|
||||
@@ -1,3 +1,3 @@
""" version info """

__version__ = "0.44.8"
__version__ = "0.45.8"
@@ -16,6 +16,8 @@ import zlib
|
||||
|
||||
from .datetime_utils import datetime_naive_to_utc
|
||||
|
||||
__all__ = ["AdjustmentsDecodeError", "AdjustmentsInfo"]
|
||||
|
||||
|
||||
class AdjustmentsDecodeError(Exception):
|
||||
"""Could not decode adjustments plist file"""
|
||||
@@ -73,37 +75,37 @@ class AdjustmentsInfo:
|
||||
|
||||
@property
|
||||
def plist(self):
|
||||
"""The actual adjustments plist content as a dict """
|
||||
"""The actual adjustments plist content as a dict"""
|
||||
return self._plist
|
||||
|
||||
@property
|
||||
def data(self):
|
||||
"""The raw adjustments data as a binary blob """
|
||||
"""The raw adjustments data as a binary blob"""
|
||||
return self._data
|
||||
|
||||
@property
|
||||
def editor(self):
|
||||
"""The editor bundle ID for app/plug-in which made the adjustments """
|
||||
"""The editor bundle ID for app/plug-in which made the adjustments"""
|
||||
return self._editor_bundle_id
|
||||
|
||||
@property
|
||||
def format_id(self):
|
||||
"""The value of the adjustmentFormatIdentifier field in the plist """
|
||||
"""The value of the adjustmentFormatIdentifier field in the plist"""
|
||||
return self._format_identifier
|
||||
|
||||
@property
|
||||
def base_version(self):
|
||||
"""Value of adjustmentBaseVersion field """
|
||||
"""Value of adjustmentBaseVersion field"""
|
||||
return self._base_version
|
||||
|
||||
@property
|
||||
def format_version(self):
|
||||
"""The value of the adjustmentFormatVersion in the plist """
|
||||
"""The value of the adjustmentFormatVersion in the plist"""
|
||||
return self._format_version
|
||||
|
||||
@property
|
||||
def timestamp(self):
|
||||
"""The time stamp of the adjustment as timezone aware datetime.datetime object or None if no timestamp """
|
||||
"""The time stamp of the adjustment as timezone aware datetime.datetime object or None if no timestamp"""
|
||||
return self._timestamp
|
||||
|
||||
@property
|
||||
|
||||
@@ -24,6 +24,15 @@ from ._constants import (
from .datetime_utils import get_local_tz
from .query_builder import get_query

__all__ = [
    "sort_list_by_keys",
    "AlbumInfoBaseClass",
    "AlbumInfo",
    "ImportInfo",
    "ProjectInfo",
    "FolderInfo",
]


def sort_list_by_keys(values, sort_keys):
    """Sorts list values by a second list sort_keys
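A hedged sketch of what a helper with the `sort_list_by_keys` docstring above typically does: pair each value with its key, sort by key, drop the keys. This is an assumption for illustration, not the function's actual body, which is not shown in this hunk.

```python
def sort_list_by_keys(values, sort_keys):
    """Sorts list values by a second list sort_keys."""
    # sort (key, value) pairs by key, then keep only the values
    return [value for _, value in sorted(zip(sort_keys, values), key=lambda pair: pair[0])]


print(sort_list_by_keys(["b.jpg", "a.jpg", "c.jpg"], [2, 1, 3]))
# -> ['a.jpg', 'b.jpg', 'c.jpg']
```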
osxphotos/cli.py (1723)

@@ -22,6 +22,17 @@ from .phototemplate import (
    get_template_help,
)

__all__ = [
    "ExportCommand",
    "template_help",
    "tutorial_help",
    "rich_text",
    "strip_md_header_and_links",
    "strip_md_links",
    "strip_html_comments",
    "get_tutorial_text",
]


# TODO: The following help text could probably be done as mako template
class ExportCommand(click.Command):
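A small sketch of what the new `__all__` in `osxphotos/cli.py` controls: wildcard imports and introspection now see only the names listed above, which is what the "Create __all__" pull requests in this release series were about. The printed names are whatever subset of `__all__` is actually defined in the module.

```python
import osxphotos.cli as cli

public_names = [name for name in dir(cli) if name in cli.__all__]
print(public_names)  # e.g. ['ExportCommand', 'get_tutorial_text', 'rich_text', ...]
```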
@@ -1,9 +1,16 @@
|
||||
""" ConfigOptions class to load/save config settings for osxphotos CLI """
|
||||
import toml
|
||||
|
||||
__all__ = [
|
||||
"ConfigOptionsException",
|
||||
"ConfigOptionsInvalidError",
|
||||
"ConfigOptionsLoadError",
|
||||
"ConfigOptions",
|
||||
]
|
||||
|
||||
|
||||
class ConfigOptionsException(Exception):
|
||||
""" Invalid combination of options. """
|
||||
"""Invalid combination of options."""
|
||||
|
||||
def __init__(self, message):
|
||||
self.message = message
|
||||
@@ -19,10 +26,10 @@ class ConfigOptionsLoadError(ConfigOptionsException):
|
||||
|
||||
|
||||
class ConfigOptions:
|
||||
""" data class to store and load options for osxphotos commands """
|
||||
"""data class to store and load options for osxphotos commands"""
|
||||
|
||||
def __init__(self, name, attrs, ignore=None):
|
||||
""" init ConfigOptions class
|
||||
"""init ConfigOptions class
|
||||
|
||||
Args:
|
||||
name: name for these options, will be used for section heading in TOML file when saving/loading from file
|
||||
@@ -53,21 +60,21 @@ class ConfigOptions:
|
||||
raise KeyError(f"Missing argument: {attr}")
|
||||
|
||||
def validate(self, exclusive=None, inclusive=None, dependent=None, cli=False):
|
||||
""" validate combinations of otions
|
||||
|
||||
"""validate combinations of otions
|
||||
|
||||
Args:
|
||||
exclusive: list of tuples in form [("option_1", "option_2")...] which are exclusive;
|
||||
ie. either option_1 can be set or option_2 but not both;
|
||||
inclusive: list of tuples in form [("option_1", "option_2")...] which are inclusive;
|
||||
exclusive: list of tuples in form [("option_1", "option_2")...] which are exclusive;
|
||||
ie. either option_1 can be set or option_2 but not both;
|
||||
inclusive: list of tuples in form [("option_1", "option_2")...] which are inclusive;
|
||||
ie. if either option_1 or option_2 is set, the other must be set
|
||||
dependent: list of tuples in form [("option_1", ("option_2", "option_3"))...]
|
||||
dependent: list of tuples in form [("option_1", ("option_2", "option_3"))...]
|
||||
where if option_1 is set, then at least one of the options in the second tuple must also be set
|
||||
cli: bool, set to True if called to validate CLI options;
|
||||
cli: bool, set to True if called to validate CLI options;
|
||||
will prepend '--' to option names in InvalidOptions.message and change _ to - in option names
|
||||
|
||||
|
||||
Returns:
|
||||
True if all options valid
|
||||
|
||||
|
||||
Raises:
|
||||
InvalidOption if any combination of options is invalid
|
||||
InvalidOption.message will be descriptive message of invalid options
|
||||
@@ -121,7 +128,7 @@ class ConfigOptions:
|
||||
return True
|
||||
|
||||
def write_to_file(self, filename):
|
||||
""" Write self to TOML file
|
||||
"""Write self to TOML file
|
||||
|
||||
Args:
|
||||
filename: full path to TOML file to write; filename will be overwritten if it exists
|
||||
@@ -141,7 +148,7 @@ class ConfigOptions:
|
||||
toml.dump({self._name: data}, fd)
|
||||
|
||||
def load_from_file(self, filename, override=False):
|
||||
""" Load options from a TOML file.
|
||||
"""Load options from a TOML file.
|
||||
|
||||
Args:
|
||||
filename: full path to TOML file
|
||||
|
||||
@@ -2,69 +2,71 @@
|
||||
|
||||
import datetime
|
||||
|
||||
__all__ = ["DateTimeFormatter"]
|
||||
|
||||
|
||||
class DateTimeFormatter:
|
||||
""" provides property access to formatted datetime.datetime strftime values """
|
||||
"""provides property access to formatted datetime.datetime strftime values"""
|
||||
|
||||
def __init__(self, dt: datetime.datetime):
|
||||
self.dt = dt
|
||||
|
||||
@property
|
||||
def date(self):
|
||||
""" ISO date in form 2020-03-22 """
|
||||
"""ISO date in form 2020-03-22"""
|
||||
return self.dt.date().isoformat()
|
||||
|
||||
@property
|
||||
def year(self):
|
||||
""" 4 digit year """
|
||||
"""4 digit year"""
|
||||
return f"{self.dt.year}"
|
||||
|
||||
@property
|
||||
def yy(self):
|
||||
""" 2 digit year """
|
||||
"""2 digit year"""
|
||||
return f"{self.dt.strftime('%y')}"
|
||||
|
||||
@property
|
||||
def mm(self):
|
||||
""" 2 digit month """
|
||||
"""2 digit month"""
|
||||
return f"{self.dt.strftime('%m')}"
|
||||
|
||||
@property
|
||||
def month(self):
|
||||
""" Month as locale's full name """
|
||||
"""Month as locale's full name"""
|
||||
return f"{self.dt.strftime('%B')}"
|
||||
|
||||
@property
|
||||
def mon(self):
|
||||
""" Month as locale's abbreviated name """
|
||||
"""Month as locale's abbreviated name"""
|
||||
return f"{self.dt.strftime('%b')}"
|
||||
|
||||
@property
|
||||
def dd(self):
|
||||
""" 2-digit day of the month """
|
||||
"""2-digit day of the month"""
|
||||
return f"{self.dt.strftime('%d')}"
|
||||
|
||||
@property
|
||||
def dow(self):
|
||||
""" Day of week as locale's name """
|
||||
"""Day of week as locale's name"""
|
||||
return f"{self.dt.strftime('%A')}"
|
||||
|
||||
@property
|
||||
def doy(self):
|
||||
""" Julian day of year starting from 001 """
|
||||
"""Julian day of year starting from 001"""
|
||||
return f"{self.dt.strftime('%j')}"
|
||||
|
||||
@property
|
||||
def hour(self):
|
||||
""" 2-digit hour """
|
||||
"""2-digit hour"""
|
||||
return f"{self.dt.strftime('%H')}"
|
||||
|
||||
@property
|
||||
def min(self):
|
||||
""" 2-digit minute """
|
||||
"""2-digit minute"""
|
||||
return f"{self.dt.strftime('%M')}"
|
||||
|
||||
@property
|
||||
def sec(self):
|
||||
""" 2-digit second """
|
||||
"""2-digit second"""
|
||||
return f"{self.dt.strftime('%S')}"
|
||||
|
||||
@@ -2,13 +2,23 @@

import datetime

__all__ = [
"get_local_tz",
"datetime_has_tz",
"datetime_tz_to_utc",
"datetime_remove_tz",
"datetime_naive_to_utc",
"datetime_naive_to_local",
"datetime_utc_to_local",
]

def get_local_tz(dt):
""" Return local timezone as datetime.timezone tzinfo for dt
"""Return local timezone as datetime.timezone tzinfo for dt

Args:
dt: datetime.datetime

Returns:
local timezone for dt as datetime.timezone

@@ -22,14 +32,14 @@ def get_local_tz(dt):

def datetime_has_tz(dt):
""" Return True if datetime dt has tzinfo else False
"""Return True if datetime dt has tzinfo else False

Args:
dt: datetime.datetime

Returns:
True if dt is timezone aware, else False

Raises:
TypeError if dt is not a datetime.datetime object
"""

@@ -41,15 +51,15 @@ def datetime_has_tz(dt):

def datetime_tz_to_utc(dt):
""" Convert datetime.datetime object with timezone to UTC timezone
"""Convert datetime.datetime object with timezone to UTC timezone

Args:
dt: datetime.datetime object

Returns:
datetime.datetime in UTC timezone

Raises:

Raises:
TypeError if dt is not datetime.datetime object
ValueError if dt does not have timeone information
"""

@@ -64,14 +74,14 @@ def datetime_tz_to_utc(dt):

def datetime_remove_tz(dt):
""" Remove timezone from a datetime.datetime object
"""Remove timezone from a datetime.datetime object

Args:
dt: datetime.datetime object with tzinfo

Returns:
dt without any timezone info (naive datetime object)

dt without any timezone info (naive datetime object)

Raises:
TypeError if dt is not a datetime.datetime object
"""

@@ -83,15 +93,15 @@ def datetime_remove_tz(dt):

def datetime_naive_to_utc(dt):
""" Convert naive (timezone unaware) datetime.datetime
"""Convert naive (timezone unaware) datetime.datetime
to aware timezone in UTC timezone

Args:
dt: datetime.datetime without timezone

Returns:
datetime.datetime with UTC timezone

Raises:
TypeError if dt is not a datetime.datetime object
ValueError if dt is not a naive/timezone unaware object

@@ -111,15 +121,15 @@ def datetime_naive_to_utc(dt):

def datetime_naive_to_local(dt):
""" Convert naive (timezone unaware) datetime.datetime
"""Convert naive (timezone unaware) datetime.datetime
to aware timezone in local timezone

Args:
dt: datetime.datetime without timezone

Returns:
datetime.datetime with local timezone

Raises:
TypeError if dt is not a datetime.datetime object
ValueError if dt is not a naive/timezone unaware object

@@ -139,7 +149,7 @@ def datetime_naive_to_local(dt):

def datetime_utc_to_local(dt):
""" Convert datetime.datetime object in UTC timezone to local timezone
"""Convert datetime.datetime object in UTC timezone to local timezone

Args:
dt: datetime.datetime object
@@ -2,6 +2,8 @@

from dataclasses import dataclass

__all__ = ["ExifInfo"]

@dataclass(frozen=True)
class ExifInfo:
@@ -11,12 +11,23 @@ import html
import json
import logging
import os
import pathlib
import re
import shutil
import subprocess
from abc import ABC, abstractmethod
from functools import lru_cache  # pylint: disable=syntax-error

__all__ = [
"escape_str",
"exiftool_can_write",
"ExifTool",
"ExifToolCaching",
"get_exiftool_path",
"terminate_exiftool",
"unescape_str",
]

# exiftool -stay_open commands outputs this EOF marker after command is run
EXIFTOOL_STAYOPEN_EOF = "{ready}"
EXIFTOOL_STAYOPEN_EOF_LEN = len(EXIFTOOL_STAYOPEN_EOF)

@@ -24,6 +35,24 @@ EXIFTOOL_STAYOPEN_EOF_LEN = len(EXIFTOOL_STAYOPEN_EOF)
# list of exiftool processes to cleanup when exiting or when terminate is called
EXIFTOOL_PROCESSES = []

# exiftool supported file types, created by utils/exiftool_supported_types.py
EXIFTOOL_FILETYPES_JSON = "exiftool_filetypes.json"
with (pathlib.Path(__file__).parent / EXIFTOOL_FILETYPES_JSON).open("r") as f:
EXIFTOOL_SUPPORTED_FILETYPES = json.load(f)

def exiftool_can_write(suffix: str) -> bool:
"""Return True if exiftool supports writing to a file with the given suffix, otherwise False"""
if not suffix:
return False
suffix = suffix.lower()
if suffix[0] == ".":
suffix = suffix[1:]
return (
suffix in EXIFTOOL_SUPPORTED_FILETYPES
and EXIFTOOL_SUPPORTED_FILETYPES[suffix]["write"]
)

def escape_str(s):
"""escape string for use with exiftool -E"""
osxphotos/exiftool_filetypes.json (new file, 4976 lines)
@@ -10,11 +10,17 @@ import sys
from abc import ABC, abstractmethod
from io import StringIO
from sqlite3 import Error
from typing import Union

from ._constants import OSXPHOTOS_EXPORT_DB
from ._version import __version__
from .utils import normalize_fs_path

__all__ = ["ExportDB_ABC", "ExportDBNoOp", "ExportDB", "ExportDBInMemory"]

OSXPHOTOS_EXPORTDB_VERSION = "4.3"
OSXPHOTOS_EXPORTDB_VERSION_MIGRATE_FILEPATH = "4.3"

OSXPHOTOS_EXPORTDB_VERSION = "4.0"
OSXPHOTOS_ABOUT_STRING = f"Created by osxphotos version {__version__} (https://github.com/RhetTbull/osxphotos) on {datetime.datetime.now()}"

@@ -102,15 +108,18 @@ class ExportDB_ABC(ABC):
self,
filename,
uuid,
orig_stat,
exif_stat,
converted_stat,
edited_stat,
info_json,
exif_json,
orig_stat=None,
exif_stat=None,
converted_stat=None,
edited_stat=None,
info_json=None,
exif_json=None,
):
pass

@abstractmethod
def get_connection(self):
pass

class ExportDBNoOp(ExportDB_ABC):
"""An ExportDB with NoOp methods"""

@@ -181,15 +190,17 @@ class ExportDBNoOp(ExportDB_ABC):
self,
filename,
uuid,
orig_stat,
exif_stat,
converted_stat,
edited_stat,
info_json,
exif_json,
orig_stat=None,
exif_stat=None,
converted_stat=None,
edited_stat=None,
info_json=None,
exif_json=None,
):
pass

def get_connection(self):
pass

class ExportDB(ExportDB_ABC):
"""Interface to sqlite3 database used to store state information for osxphotos export command"""

@@ -209,12 +220,13 @@ class ExportDB(ExportDB_ABC):
"""query database for filename and return UUID
returns None if filename not found in database
"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
conn = self._conn
filepath_normalized = self._normalize_filepath_relative(filename)
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(
"SELECT uuid FROM files WHERE filepath_normalized = ?", (filename,)
"SELECT uuid FROM files WHERE filepath_normalized = ?",
(filepath_normalized,),
)
results = c.fetchone()
uuid = results[0] if results else None

@@ -226,8 +238,8 @@ class ExportDB(ExportDB_ABC):
def set_uuid_for_file(self, filename, uuid):
"""set UUID of filename to uuid in the database"""
filename = str(pathlib.Path(filename).relative_to(self._path))
filename_normalized = filename.lower()
conn = self._conn
filename_normalized = self._normalize_filepath(filename)
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -243,11 +255,11 @@ class ExportDB(ExportDB_ABC):
"""set stat info for filename
filename: filename to set the stat info for
stat: a tuple of length 3: mode, size, mtime"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
if len(stats) != 3:
raise ValueError(f"expected 3 elements for stat, got {len(stats)}")

conn = self._conn
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -264,8 +276,8 @@ class ExportDB(ExportDB_ABC):
"""get stat info for filename
returns: tuple of (mode, size, mtime)
"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
conn = self._conn
filename = self._normalize_filepath_relative(filename)
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -300,11 +312,11 @@ class ExportDB(ExportDB_ABC):
"""set stat info for filename (after exiftool has updated it)
filename: filename to set the stat info for
stat: a tuple of length 3: mode, size, mtime"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
if len(stats) != 3:
raise ValueError(f"expected 3 elements for stat, got {len(stats)}")

conn = self._conn
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -321,8 +333,8 @@ class ExportDB(ExportDB_ABC):
"""get stat info for filename (after exiftool has updated it)
returns: tuple of (mode, size, mtime)
"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
conn = self._conn
filename = self._normalize_filepath_relative(filename)
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -355,7 +367,7 @@ class ExportDB(ExportDB_ABC):

def get_info_for_uuid(self, uuid):
"""returns the info JSON struct for a UUID"""
conn = self._conn
conn = self.get_connection()
try:
c = conn.cursor()
c.execute("SELECT json_info FROM info WHERE uuid = ?", (uuid,))

@@ -369,7 +381,7 @@ class ExportDB(ExportDB_ABC):

def set_info_for_uuid(self, uuid, info):
"""sets the info JSON struct for a UUID"""
conn = self._conn
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -382,8 +394,8 @@ class ExportDB(ExportDB_ABC):

def get_exifdata_for_file(self, filename):
"""returns the exifdata JSON struct for a file"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
conn = self._conn
filename = self._normalize_filepath_relative(filename)
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -400,8 +412,8 @@ class ExportDB(ExportDB_ABC):

def set_exifdata_for_file(self, filename, exifdata):
"""sets the exifdata JSON struct for a file"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
conn = self._conn
filename = self._normalize_filepath_relative(filename)
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -414,8 +426,8 @@ class ExportDB(ExportDB_ABC):

def get_sidecar_for_file(self, filename):
"""returns the sidecar data and signature for a file"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
conn = self._conn
filename = self._normalize_filepath_relative(filename)
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -442,8 +454,8 @@ class ExportDB(ExportDB_ABC):

def set_sidecar_for_file(self, filename, sidecar_data, sidecar_sig):
"""sets the sidecar data and signature for a file"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
conn = self._conn
filename = self._normalize_filepath_relative(filename)
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -456,7 +468,7 @@ class ExportDB(ExportDB_ABC):

def get_previous_uuids(self):
"""returns list of UUIDs of previously exported photos found in export database"""
conn = self._conn
conn = self.get_connection()
previous_uuids = []
try:
c = conn.cursor()

@@ -469,7 +481,7 @@ class ExportDB(ExportDB_ABC):

def get_detected_text_for_uuid(self, uuid):
"""Get the detected_text for a uuid"""
conn = self._conn
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -486,7 +498,7 @@ class ExportDB(ExportDB_ABC):

def set_detected_text_for_uuid(self, uuid, text_json):
"""Set the detected text for uuid"""
conn = self._conn
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -504,52 +516,65 @@ class ExportDB(ExportDB_ABC):
self,
filename,
uuid,
orig_stat,
exif_stat,
converted_stat,
edited_stat,
info_json,
exif_json,
orig_stat=None,
exif_stat=None,
converted_stat=None,
edited_stat=None,
info_json=None,
exif_json=None,
):
"""sets all the data for file and uuid at once"""
"""sets all the data for file and uuid at once; if any value is None, does not set it"""
filename = str(pathlib.Path(filename).relative_to(self._path))
filename_normalized = filename.lower()
conn = self._conn
filename_normalized = self._normalize_filepath(filename)
conn = self.get_connection()
try:
c = conn.cursor()
# update files table (if needed);
# this statement works around fact that there was no unique constraint on files.filepath_normalized
c.execute(
"INSERT OR REPLACE INTO files(filepath, filepath_normalized, uuid) VALUES (?, ?, ?);",
"""INSERT OR IGNORE INTO files(filepath, filepath_normalized, uuid) VALUES (?, ?, ?);""",
(filename, filename_normalized, uuid),
)

c.execute(
"UPDATE files "
+ "SET orig_mode = ?, orig_size = ?, orig_mtime = ? "
+ "WHERE filepath_normalized = ?;",
(*orig_stat, filename_normalized),
)
c.execute(
"UPDATE files "
+ "SET exif_mode = ?, exif_size = ?, exif_mtime = ? "
+ "WHERE filepath_normalized = ?;",
(*exif_stat, filename_normalized),
)
c.execute(
"INSERT OR REPLACE INTO converted(filepath_normalized, mode, size, mtime) VALUES (?, ?, ?, ?);",
(filename_normalized, *converted_stat),
)
c.execute(
"INSERT OR REPLACE INTO edited(filepath_normalized, mode, size, mtime) VALUES (?, ?, ?, ?);",
(filename_normalized, *edited_stat),
)
c.execute(
"INSERT OR REPLACE INTO info(uuid, json_info) VALUES (?, ?);",
(uuid, info_json),
)
c.execute(
"INSERT OR REPLACE INTO exifdata(filepath_normalized, json_exifdata) VALUES (?, ?);",
(filename_normalized, exif_json),
)
if orig_stat is not None:
c.execute(
"UPDATE files "
+ "SET orig_mode = ?, orig_size = ?, orig_mtime = ? "
+ "WHERE filepath_normalized = ?;",
(*orig_stat, filename_normalized),
)

if exif_stat is not None:
c.execute(
"UPDATE files "
+ "SET exif_mode = ?, exif_size = ?, exif_mtime = ? "
+ "WHERE filepath_normalized = ?;",
(*exif_stat, filename_normalized),
)

if converted_stat is not None:
c.execute(
"INSERT OR REPLACE INTO converted(filepath_normalized, mode, size, mtime) VALUES (?, ?, ?, ?);",
(filename_normalized, *converted_stat),
)

if edited_stat is not None:
c.execute(
"INSERT OR REPLACE INTO edited(filepath_normalized, mode, size, mtime) VALUES (?, ?, ?, ?);",
(filename_normalized, *edited_stat),
)

if info_json is not None:
c.execute(
"INSERT OR REPLACE INTO info(uuid, json_info) VALUES (?, ?);",
(uuid, info_json),
)

if exif_json is not None:
c.execute(
"INSERT OR REPLACE INTO exifdata(filepath_normalized, json_exifdata) VALUES (?, ?);",
(filename_normalized, exif_json),
)
conn.commit()
except Error as e:
logging.warning(e)

@@ -557,16 +582,23 @@ class ExportDB(ExportDB_ABC):
def close(self):
"""close the database connection"""
try:
self._conn.close()
if self._conn:
self._conn.close()
self._conn = None
except Error as e:
logging.warning(e)

def get_connection(self):
if self._conn is None:
self._conn = self._open_export_db(self._dbfile)
return self._conn

def _set_stat_for_file(self, table, filename, stats):
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
if len(stats) != 3:
raise ValueError(f"expected 3 elements for stat, got {len(stats)}")

conn = self._conn
conn = self.get_connection()
c = conn.cursor()
c.execute(
f"INSERT OR REPLACE INTO {table}(filepath_normalized, mode, size, mtime) VALUES (?, ?, ?, ?);",

@@ -575,8 +607,8 @@ class ExportDB(ExportDB_ABC):
conn.commit()

def _get_stat_for_file(self, table, filename):
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
conn = self._conn
filename = self._normalize_filepath_relative(filename)
conn = self.get_connection()
c = conn.cursor()
c.execute(
f"SELECT mode, size, mtime FROM {table} WHERE filepath_normalized = ?",

@@ -611,10 +643,20 @@ class ExportDB(ExportDB_ABC):
version_info = self._get_database_version(conn)
if version_info[1] < OSXPHOTOS_EXPORTDB_VERSION:
self._create_db_tables(conn)
if version_info[1] < OSXPHOTOS_EXPORTDB_VERSION_MIGRATE_FILEPATH:
self._migrate_normalized_filepath(conn)
self.was_upgraded = (version_info[1], OSXPHOTOS_EXPORTDB_VERSION)
else:
self.was_upgraded = ()
self.version = OSXPHOTOS_EXPORTDB_VERSION

# turn on performance optimizations
c = conn.cursor()
c.execute("PRAGMA journal_mode=WAL;")
c.execute("PRAGMA synchronous=NORMAL;")
c.execute("PRAGMA cache_size=-100000;")
c.execute("PRAGMA temp_store=MEMORY;")

return conn

def _get_db_connection(self, dbfile):

@@ -660,6 +702,22 @@ class ExportDB(ExportDB_ABC):
exif_size INTEGER,
exif_mtime REAL
); """,
"sql_files_table_migrate": """ CREATE TABLE IF NOT EXISTS files_migrate (
id INTEGER PRIMARY KEY,
filepath TEXT NOT NULL,
filepath_normalized TEXT NOT NULL,
uuid TEXT,
orig_mode INTEGER,
orig_size INTEGER,
orig_mtime REAL,
exif_mode INTEGER,
exif_size INTEGER,
exif_mtime REAL,
UNIQUE(filepath_normalized)
); """,
"sql_files_migrate": """ INSERT INTO files_migrate SELECT * FROM files;""",
"sql_files_drop_tables": """ DROP TABLE files;""",
"sql_files_alter": """ ALTER TABLE files_migrate RENAME TO files;""",
"sql_runs_table": """ CREATE TABLE IF NOT EXISTS runs (
id INTEGER PRIMARY KEY,
datetime TEXT,

@@ -739,7 +797,7 @@ class ExportDB(ExportDB_ABC):
cmd = sys.argv[0]
args = " ".join(sys.argv[1:]) if len(sys.argv) > 1 else ""
cwd = os.getcwd()
conn = self._conn
conn = self.get_connection()
try:
c = conn.cursor()
c.execute(

@@ -751,6 +809,32 @@ class ExportDB(ExportDB_ABC):
except Error as e:
logging.warning(e)

def _normalize_filepath(self, filepath: Union[str, pathlib.Path]) -> str:
"""normalize filepath for unicode, lower case"""
return normalize_fs_path(str(filepath)).lower()

def _normalize_filepath_relative(self, filepath: Union[str, pathlib.Path]) -> str:
"""normalize filepath for unicode, relative path (to export dir), lower case"""
filepath = str(pathlib.Path(filepath).relative_to(self._path))
return normalize_fs_path(str(filepath)).lower()

def _migrate_normalized_filepath(self, conn):
"""Fix all filepath_normalized columns for unicode normalization"""
# Prior to database version 4.3, filepath_normalized was not normalized for unicode
c = conn.cursor()
for table in ["converted", "edited", "exifdata", "files", "sidecar"]:
old_values = c.execute(
f"SELECT filepath_normalized, id FROM {table}"
).fetchall()
new_values = [
(self._normalize_filepath(filepath_normalized), id_)
for filepath_normalized, id_ in old_values
]
c.executemany(
f"UPDATE {table} SET filepath_normalized=? WHERE id=?", new_values
)
conn.commit()

class ExportDBInMemory(ExportDB):
"""In memory version of ExportDB
@@ -11,9 +11,11 @@ import Foundation

from .imageconverter import ImageConverter

__all__ = ["FileUtilABC", "FileUtilMacOS", "FileUtil", "FileUtilNoOp"]

class FileUtilABC(ABC):
""" Abstract base class for FileUtil """
"""Abstract base class for FileUtil"""

@classmethod
@abstractmethod

@@ -67,14 +69,14 @@ class FileUtilABC(ABC):

class FileUtilMacOS(FileUtilABC):
""" Various file utilities """
"""Various file utilities"""

@classmethod
def hardlink(cls, src, dest):
""" Hardlinks a file from src path to dest path
src: source path as string
dest: destination path as string
Raises exception if linking fails or either path is None """
"""Hardlinks a file from src path to dest path
src: source path as string
dest: destination path as string
Raises exception if linking fails or either path is None"""

if src is None or dest is None:
raise ValueError("src and dest must not be None", src, dest)

@@ -90,17 +92,17 @@ class FileUtilMacOS(FileUtilABC):

@classmethod
def copy(cls, src, dest):
""" Copies a file from src path to dest path

"""Copies a file from src path to dest path

Args:
src: source path as string; must be a valid file path
dest: destination path as string
dest may be either directory or file; in either case, src file must not exist in dest
Note: src and dest may be either a string or a pathlib.Path object

Returns:
True if copy succeeded

Raises:
OSError if copy fails
TypeError if either path is None

@@ -124,7 +126,7 @@ class FileUtilMacOS(FileUtilABC):

@classmethod
def unlink(cls, filepath):
""" unlink filepath; if it's pathlib.Path, use Path.unlink, otherwise use os.unlink """
"""unlink filepath; if it's pathlib.Path, use Path.unlink, otherwise use os.unlink"""
if isinstance(filepath, pathlib.Path):
filepath.unlink()
else:

@@ -132,7 +134,7 @@ class FileUtilMacOS(FileUtilABC):

@classmethod
def rmdir(cls, dirpath):
""" remove directory filepath; dirpath must be empty """
"""remove directory filepath; dirpath must be empty"""
if isinstance(dirpath, pathlib.Path):
dirpath.rmdir()
else:

@@ -140,7 +142,7 @@ class FileUtilMacOS(FileUtilABC):

@classmethod
def utime(cls, path, times):
""" Set the access and modified time of path. """
"""Set the access and modified time of path."""
os.utime(path, times)

@classmethod

@@ -152,7 +154,7 @@ class FileUtilMacOS(FileUtilABC):
mtime1 -- optional, pass alternate file modification timestamp for f1; will be converted to int

Return value:
True if the file signatures as returned by stat are the same, False otherwise.
True if the file signatures as returned by stat are the same, False otherwise.
Does not do a byte-by-byte comparison.
"""

@@ -179,27 +181,26 @@ class FileUtilMacOS(FileUtilABC):
return False

s1 = cls._sig(os.stat(f1))

if s1[0] != stat.S_IFREG or s2[0] != stat.S_IFREG:
return False
return s1 == s2

@classmethod
def file_sig(cls, f1):
""" return os.stat signature for file f1 """
"""return os.stat signature for file f1"""
return cls._sig(os.stat(f1))

@classmethod
def convert_to_jpeg(cls, src_file, dest_file, compression_quality=1.0):
""" converts image file src_file to jpeg format as dest_file
"""converts image file src_file to jpeg format as dest_file

Args:
src_file: image file to convert
dest_file: destination path to write converted file to
compression quality: JPEG compression quality in range 0.0 <= compression_quality <= 1.0; default 1.0 (best quality)

Returns:
True if success, otherwise False
Args:
src_file: image file to convert
dest_file: destination path to write converted file to
compression quality: JPEG compression quality in range 0.0 <= compression_quality <= 1.0; default 1.0 (best quality)

Returns:
True if success, otherwise False
"""
converter = ImageConverter()
return converter.write_jpeg(

@@ -208,40 +209,40 @@ class FileUtilMacOS(FileUtilABC):

@classmethod
def rename(cls, src, dest):
""" Copy src to dest
"""Copy src to dest

Args:
src: path to source file
dest: path to destination file

Returns:
Name of renamed file (dest)

"""
os.rename(str(src), str(dest))
return dest

@staticmethod
def _sig(st):
""" return tuple of (mode, size, mtime) of file based on os.stat
Args:
st: os.stat signature
"""return tuple of (mode, size, mtime) of file based on os.stat
Args:
st: os.stat signature
"""
# use int(st.st_mtime) because ditto does not copy fractional portion of mtime
return (stat.S_IFMT(st.st_mode), st.st_size, int(st.st_mtime))

class FileUtil(FileUtilMacOS):
""" Various file utilities """
"""Various file utilities"""

pass

class FileUtilNoOp(FileUtil):
""" No-Op implementation of FileUtil for testing / dry-run mode
all methods with exception of cmp, cmp_file_sig and file_cmp are no-op
cmp and cmp_file_sig functions as FileUtil methods do
file_cmp returns mock data
"""No-Op implementation of FileUtil for testing / dry-run mode
all methods with exception of cmp, cmp_file_sig and file_cmp are no-op
cmp and cmp_file_sig functions as FileUtil methods do
file_cmp returns mock data
"""

@staticmethod
@@ -15,6 +15,8 @@ from Foundation import NSDictionary
# needed to capture system-level stderr
from wurlitzer import pipes

__all__ = ["ImageConversionError", "ImageConverter"]

class ImageConversionError(Exception):
"""Base class for exceptions in this module."""
@@ -1,3 +1,4 @@
__all__ = ["MomentInfo"]
"""MomentInfo class with details about photo moments."""
@@ -1,9 +1,20 @@
""" utility functions for validating/sanitizing path components """

import re

import pathvalidate

from ._constants import MAX_DIRNAME_LEN, MAX_FILENAME_LEN

__all__ = [
"is_valid_filepath",
"sanitize_dirname",
"sanitize_filename",
"sanitize_filepath",
"sanitize_filestem_with_count",
"sanitize_pathpart",
]

def sanitize_filepath(filepath):
"""sanitize a filepath"""

@@ -45,6 +56,26 @@ def sanitize_filename(filename, replacement=":"):
return filename

def sanitize_filestem_with_count(file_stem: str, file_suffix: str) -> str:
"""Sanitize a filestem that may end in (1), (2), etc. to ensure it + file_suffix doesn't exceed MAX_FILENAME_LEN"""
filename_len = len(file_stem) + len(file_suffix)
if filename_len <= MAX_FILENAME_LEN:
return file_stem

drop = filename_len - MAX_FILENAME_LEN
match = re.match(r"(.*)(\(\d+\))$", file_stem)
if not match:
# filename doesn't end in (1), (2), etc.
# truncate filename to MAX_FILENAME_LEN
return file_stem[:-drop]

# filename ends in (1), (2), etc.
file_stem = match.group(1)
file_count = match.group(2)
file_stem = file_stem[:-drop]
return f"{file_stem}{file_count}"

def sanitize_dirname(dirname, replacement=":"):
"""replace any illegal characters in a directory name and truncate directory name if needed
@@ -6,6 +6,8 @@ import math

from collections import namedtuple

__all__ = ["PersonInfo", "FaceInfo", "rotate_image_point"]

MWG_RS_Area = namedtuple("MWG_RS_Area", ["x", "y", "h", "w"])
MPRI_Reg_Rect = namedtuple("MPRI_Reg_Rect", ["x", "y", "h", "w"])

@@ -51,7 +53,7 @@ class PersonInfo:

@property
def photos(self):
""" Returns list of PhotoInfo objects associated with this person """
"""Returns list of PhotoInfo objects associated with this person"""
return self._db.photos_by_uuid(self._db._dbfaces_pk[self._pk])

@property

@@ -71,7 +73,7 @@ class PersonInfo:
return []

def asdict(self):
""" Returns dictionary representation of class instance """
"""Returns dictionary representation of class instance"""
keyphoto = self.keyphoto.uuid if self.keyphoto is not None else None
return {
"uuid": self.uuid,

@@ -83,7 +85,7 @@ class PersonInfo:
}

def json(self):
""" Returns JSON representation of class instance """
"""Returns JSON representation of class instance"""
return json.dumps(self.asdict())

def __str__(self):

@@ -201,7 +203,7 @@ class FaceInfo:

@property
def person_info(self):
""" PersonInfo instance for person associated with this face """
"""PersonInfo instance for person associated with this face"""
try:
return self._person
except AttributeError:

@@ -210,7 +212,7 @@ class FaceInfo:

@property
def photo(self):
""" PhotoInfo instance associated with this face """
"""PhotoInfo instance associated with this face"""
try:
return self._photo
except AttributeError:

@@ -292,7 +294,7 @@ class FaceInfo:
return [(x0, y0), (x1, y1)]

def roll_pitch_yaw(self):
""" Roll, pitch, yaw of face in radians as tuple """
"""Roll, pitch, yaw of face in radians as tuple"""
info = self._info
roll = 0 if info["roll"] is None else info["roll"]
pitch = 0 if info["pitch"] is None else info["pitch"]

@@ -302,19 +304,19 @@ class FaceInfo:

@property
def roll(self):
""" Return roll angle in radians of the face region """
"""Return roll angle in radians of the face region"""
roll, _, _ = self.roll_pitch_yaw()
return roll

@property
def pitch(self):
""" Return pitch angle in radians of the face region """
"""Return pitch angle in radians of the face region"""
_, pitch, _ = self.roll_pitch_yaw()
return pitch

@property
def yaw(self):
""" Return yaw angle in radians of the face region """
"""Return yaw angle in radians of the face region"""
_, _, yaw = self.roll_pitch_yaw()
return yaw

@@ -402,7 +404,7 @@ class FaceInfo:
return (int(xr), int(yr))

def asdict(self):
""" Returns dict representation of class instance """
"""Returns dict representation of class instance"""
roll, pitch, yaw = self.roll_pitch_yaw()
return {
"_pk": self._pk,

@@ -451,7 +453,7 @@ class FaceInfo:
}

def json(self):
""" Return JSON representation of FaceInfo instance """
"""Return JSON representation of FaceInfo instance"""
return json.dumps(self.asdict())

def __str__(self):
||||
@@ -35,6 +35,9 @@ from ._constants import (
|
||||
BURST_KEY,
|
||||
BURST_NOT_SELECTED,
|
||||
BURST_SELECTED,
|
||||
SIDECAR_EXIFTOOL,
|
||||
SIDECAR_JSON,
|
||||
SIDECAR_XMP,
|
||||
TEXT_DETECTION_CONFIDENCE_THRESHOLD,
|
||||
)
|
||||
from .adjustmentsinfo import AdjustmentsInfo
|
||||
@@ -43,7 +46,7 @@ from .exifinfo import ExifInfo
|
||||
from .exiftool import ExifToolCaching, get_exiftool_path
|
||||
from .momentinfo import MomentInfo
|
||||
from .personinfo import FaceInfo, PersonInfo
|
||||
from .photoexporter import PhotoExporter
|
||||
from .photoexporter import ExportOptions, PhotoExporter
|
||||
from .phototemplate import PhotoTemplate, RenderOptions
|
||||
from .placeinfo import PlaceInfo4, PlaceInfo5
|
||||
from .query_builder import get_query
|
||||
@@ -51,7 +54,9 @@ from .scoreinfo import ScoreInfo
|
||||
from .searchinfo import SearchInfo
|
||||
from .text_detection import detect_text
|
||||
from .uti import get_preferred_uti_extension, get_uti_for_extension
|
||||
from .utils import _debug, _get_resource_loc, findfiles
|
||||
from .utils import _debug, _get_resource_loc, list_directory
|
||||
|
||||
__all__ = ["PhotoInfo", "PhotoInfoNone"]
|
||||
|
||||
|
||||
class PhotoInfo:
|
||||
@@ -364,7 +369,7 @@ class PhotoInfo:
|
||||
# In Photos 5, raw is in same folder as original but with _4.ext
|
||||
# Unless "Copy Items to the Photos Library" is not checked
|
||||
# then RAW image is not renamed but has same name is jpeg buth with raw extension
|
||||
# Current implementation uses findfiles to find images with the correct raw UTI extension
|
||||
# Current implementation finds images with the correct raw UTI extension
|
||||
# in same folder as the original and with same stem as original in form: original_stem*.raw_ext
|
||||
# TODO: I don't like this -- would prefer a more deterministic approach but until I have more
|
||||
# data on how Photos stores and retrieves RAW images, this seems to be working
|
||||
@@ -400,8 +405,7 @@ class PhotoInfo:
|
||||
# raw files have same name as original but with _4.raw_ext appended
|
||||
# I believe the _4 maps to PHAssetResourceTypeAlternatePhoto = 4
|
||||
# see: https://developer.apple.com/documentation/photokit/phassetresourcetype/phassetresourcetypealternatephoto?language=objc
|
||||
glob_str = f"{filestem}_4*"
|
||||
raw_file = findfiles(glob_str, filepath)
|
||||
raw_file = list_directory(filepath, startswith=f"{filestem}_4")
|
||||
if not raw_file:
|
||||
photopath = None
|
||||
else:
|
||||
@@ -1488,28 +1492,48 @@ class PhotoInfo:
|
||||
"""
|
||||
|
||||
exporter = PhotoExporter(self)
|
||||
return exporter.export(
|
||||
dest=dest,
|
||||
filename=filename,
|
||||
sidecar = 0
|
||||
if sidecar_json:
|
||||
sidecar |= SIDECAR_JSON
|
||||
if sidecar_exiftool:
|
||||
sidecar |= SIDECAR_EXIFTOOL
|
||||
if sidecar_xmp:
|
||||
sidecar |= SIDECAR_XMP
|
||||
|
||||
if not filename:
|
||||
if not edited:
|
||||
filename = self.original_filename
|
||||
else:
|
||||
original_name = pathlib.Path(self.original_filename)
|
||||
if self.path_edited:
|
||||
ext = pathlib.Path(self.path_edited).suffix
|
||||
else:
|
||||
uti = self.uti_edited if edited and self.uti_edited else self.uti
|
||||
ext = get_preferred_uti_extension(uti)
|
||||
ext = "." + ext
|
||||
filename = original_name.stem + "_edited" + ext
|
||||
|
||||
options = ExportOptions(
|
||||
description_template=description_template,
|
||||
edited=edited,
|
||||
live_photo=live_photo,
|
||||
raw_photo=raw_photo,
|
||||
export_as_hardlink=export_as_hardlink,
|
||||
overwrite=overwrite,
|
||||
increment=increment,
|
||||
sidecar_json=sidecar_json,
|
||||
sidecar_exiftool=sidecar_exiftool,
|
||||
sidecar_xmp=sidecar_xmp,
|
||||
use_photos_export=use_photos_export,
|
||||
timeout=timeout,
|
||||
exiftool=exiftool,
|
||||
export_as_hardlink=export_as_hardlink,
|
||||
increment=increment,
|
||||
keyword_template=keyword_template,
|
||||
live_photo=live_photo,
|
||||
overwrite=overwrite,
|
||||
raw_photo=raw_photo,
|
||||
render_options=render_options,
|
||||
sidecar=sidecar,
|
||||
timeout=timeout,
|
||||
use_albums_as_keywords=use_albums_as_keywords,
|
||||
use_persons_as_keywords=use_persons_as_keywords,
|
||||
keyword_template=keyword_template,
|
||||
description_template=description_template,
|
||||
render_options=render_options,
|
||||
use_photos_export=use_photos_export,
|
||||
)
|
||||
|
||||
results = exporter.export(dest, filename=filename, options=options)
|
||||
return results.exported
|
||||
|
||||
def _get_album_uuids(self, project=False):
|
||||
"""Return list of album UUIDs this photo is found in
|
||||
|
||||
|
||||
@@ -36,6 +36,28 @@ from .fileutil import FileUtil
from .uti import get_preferred_uti_extension
from .utils import _get_os_version, increment_filename

__all__ = [
"NSURL_to_path",
"path_to_NSURL",
"check_photokit_authorization",
"request_photokit_authorization",
"PhotoKitError",
"PhotoKitFetchFailed",
"PhotoKitAuthError",
"PhotoKitExportError",
"PhotoKitMediaTypeError",
"ImageData",
"AVAssetData",
"PHAssetResourceData",
"PhotoKitNotificationDelegate",
"PhotoAsset",
"SlowMoVideoExporter",
"VideoAsset",
"LivePhotoRequest",
"LivePhotoAsset",
"PhotoLibrary",
]

# NOTE: This requires user have granted access to the terminal (e.g. Terminal.app or iTerm)
# to access Photos. This should happen automatically the first time it's called. I've
# not figured out how to get the call to requestAuthorization_ to actually work in the case
@@ -8,6 +8,8 @@ from more_itertools import chunked
from .photoinfo import PhotoInfo
from .utils import noop

__all__ = ["PhotosAlbum"]

class PhotosAlbum:
def __init__(self, name: str, verbose: Optional[callable] = None):
@@ -10,9 +10,9 @@ from ..utils import _open_sql_file, normalize_unicode

def _process_comments(self):
""" load the comments and likes data from the database
this is a PhotosDB method that should be imported in
the PhotosDB class definition in photosdb.py
"""load the comments and likes data from the database
this is a PhotosDB method that should be imported in
the PhotosDB class definition in photosdb.py
"""
self._db_hashed_person_id = {}
self._db_comments_uuid = {}

@@ -24,7 +24,7 @@ def _process_comments(self):

@dataclass
class CommentInfo:
""" Class for shared photo comments """
"""Class for shared photo comments"""

datetime: datetime.datetime
user: str

@@ -37,7 +37,7 @@ class CommentInfo:

@dataclass
class LikeInfo:
""" Class for shared photo likes """
"""Class for shared photo likes"""

datetime: datetime.datetime
user: str

@@ -50,16 +50,16 @@ class LikeInfo:
# The following methods do not get imported into PhotosDB
# but will get called by _process_comments
def _process_comments_4(photosdb):
""" process comments and likes info for Photos <= 4
photosdb: PhotosDB instance """
"""process comments and likes info for Photos <= 4
photosdb: PhotosDB instance"""
raise NotImplementedError(
f"Not implemented for database version {photosdb._db_version}."
)

def _process_comments_5(photosdb):
""" process comments and likes info for Photos >= 5
photosdb: PhotosDB instance """
"""process comments and likes info for Photos >= 5
photosdb: PhotosDB instance"""

db = photosdb._tmp_db
@@ -7,10 +7,11 @@ from .._constants import _DB_TABLE_NAMES, _PHOTOS_4_VERSION
from ..utils import _db_is_locked, _debug, _open_sql_file
from .photosdb_utils import get_db_version

def _process_exifinfo(self):
""" load the exif data from the database
this is a PhotosDB method that should be imported in
the PhotosDB class definition in photosdb.py
"""load the exif data from the database
this is a PhotosDB method that should be imported in
the PhotosDB class definition in photosdb.py
"""
if self._db_version <= _PHOTOS_4_VERSION:
_process_exifinfo_4(self)

@@ -23,20 +24,20 @@ def _process_exifinfo(self):

def _process_exifinfo_4(photosdb):
""" process exif info for Photos <= 4
photosdb: PhotosDB instance """
"""process exif info for Photos <= 4
photosdb: PhotosDB instance"""
photosdb._db_exifinfo_uuid = {}
raise NotImplementedError(f"search info not implemented for this database version")

def _process_exifinfo_5(photosdb):
""" process exif info for Photos >= 5
photosdb: PhotosDB instance """
"""process exif info for Photos >= 5
photosdb: PhotosDB instance"""

db = photosdb._tmp_db

asset_table = _DB_TABLE_NAMES[photosdb._photos_ver]["ASSET"]

(conn, cursor) = _open_sql_file(db)

result = conn.execute(
@@ -22,8 +22,7 @@ from .photosdb_utils import get_db_version

def _process_faceinfo(self):
""" Process face information
"""
"""Process face information"""

self._db_faceinfo_pk = {}
self._db_faceinfo_uuid = {}

@@ -36,7 +35,7 @@ def _process_faceinfo(self):

def _process_faceinfo_4(photosdb):
""" Process face information for Photos 4 databases
"""Process face information for Photos 4 databases

Args:
photosdb: an OSXPhotosDB instance

@@ -172,7 +171,7 @@ def _process_faceinfo_4(photosdb):

def _process_faceinfo_5(photosdb):
""" Process face information for Photos 5 databases
"""Process face information for Photos 5 databases

Args:
photosdb: an OSXPhotosDB instance
@@ -22,8 +22,8 @@ from .photosdb_utils import get_db_version

def _process_scoreinfo(self):
""" Process computed photo scores
Note: Only works on Photos version == 5.0
"""Process computed photo scores
Note: Only works on Photos version == 5.0
"""

# _db_scoreinfo_uuid is dict in form {uuid: {score values}}

@@ -38,7 +38,7 @@ def _process_scoreinfo(self):

def _process_scoreinfo_5(photosdb):
""" Process computed photo scores for Photos 5 databases
"""Process computed photo scores for Photos 5 databases

Args:
photosdb: an OSXPhotosDB instance

@@ -147,4 +147,4 @@ def _process_scoreinfo_5(photosdb):
scores["well_timed_shot"] = row[27]
photosdb._db_scoreinfo_uuid[uuid] = scores

conn.close()
conn.close()
@@ -35,10 +35,10 @@ from ..utils import _db_is_locked, _debug, _open_sql_file, normalize_unicode

def _process_searchinfo(self):
""" load machine learning/search term label info from a Photos library
db_connection: a connection to the SQLite database file containing the
search terms. In Photos 5, this is called psi.sqlite
Note: Only works on Photos version == 5.0 """
"""load machine learning/search term label info from a Photos library
db_connection: a connection to the SQLite database file containing the
search terms. In Photos 5, this is called psi.sqlite
Note: Only works on Photos version == 5.0"""

# _db_searchinfo_uuid is dict in form {uuid : [list of associated search info records]
self._db_searchinfo_uuid = _db_searchinfo_uuid = {}

@@ -155,7 +155,7 @@ def _process_searchinfo(self):

@property
def labels(self):
""" return list of all search info labels found in the library """
"""return list of all search info labels found in the library"""
if self._db_version <= _PHOTOS_4_VERSION:
logging.warning(f"SearchInfo not implemented for this library version")
return []

@@ -165,7 +165,7 @@ def labels(self):

@property
def labels_normalized(self):
""" return list of all normalized search info labels found in the library """
"""return list of all normalized search info labels found in the library"""
if self._db_version <= _PHOTOS_4_VERSION:
logging.warning(f"SearchInfo not implemented for this library version")
return []

@@ -175,7 +175,7 @@ def labels_normalized(self):

@property
def labels_as_dict(self):
""" return labels as dict of label: count in reverse sorted order (descending) """
"""return labels as dict of label: count in reverse sorted order (descending)"""
if self._db_version <= _PHOTOS_4_VERSION:
logging.warning(f"SearchInfo not implemented for this library version")
return dict()

@@ -187,7 +187,7 @@ def labels_as_dict(self):

@property
def labels_normalized_as_dict(self):
""" return normalized labels as dict of label: count in reverse sorted order (descending) """
"""return normalized labels as dict of label: count in reverse sorted order (descending)"""
if self._db_version <= _PHOTOS_4_VERSION:
logging.warning(f"SearchInfo not implemented for this library version")
return dict()

@@ -201,8 +201,8 @@ def labels_normalized_as_dict(self):

@lru_cache(maxsize=128)
def ints_to_uuid(uuid_0, uuid_1):
""" convert two signed ints into a UUID strings
uuid_0, uuid_1: the two int components of an RFC 4122 UUID """
"""convert two signed ints into a UUID strings
uuid_0, uuid_1: the two int components of an RFC 4122 UUID"""

# assumes uuid imported as uuidlib (to avoid namespace conflict with other uses of uuid)
@@ -39,6 +39,7 @@ from .._constants import (
|
||||
_PHOTOS_5_PROJECT_ALBUM_KIND,
|
||||
_PHOTOS_5_ROOT_FOLDER_KIND,
|
||||
_PHOTOS_5_SHARED_ALBUM_KIND,
|
||||
_PHOTOS_5_VERSION,
|
||||
_TESTED_OS_VERSIONS,
|
||||
_UNKNOWN_PERSON,
|
||||
BURST_KEY,
|
||||
@@ -66,6 +67,8 @@ from ..utils import (
|
||||
)
|
||||
from .photosdb_utils import get_db_model_version, get_db_version
|
||||
|
||||
__all__ = ["PhotosDB"]
|
||||
|
||||
# TODO: Add test for imageTimeZoneOffsetSeconds = None
|
||||
# TODO: Add test for __str__
|
||||
# TODO: Add special albums and magic albums
|
||||
@@ -657,14 +660,18 @@ class PhotosDB:
|
||||
|
||||
for person in c:
|
||||
pk = person[0]
|
||||
fullname = person[2] if person[2] is not None else _UNKNOWN_PERSON
|
||||
fullname = (
|
||||
normalize_unicode(person[2])
|
||||
if person[2] is not None
|
||||
else _UNKNOWN_PERSON
|
||||
)
|
||||
self._dbpersons_pk[pk] = {
|
||||
"pk": pk,
|
||||
"uuid": person[1],
|
||||
"fullname": fullname,
|
||||
"facecount": person[3],
|
||||
"keyface": person[5],
|
||||
"displayname": person[4],
|
||||
"displayname": normalize_unicode(person[4]),
|
||||
"photo_uuid": None,
|
||||
"keyface_uuid": None,
|
||||
}
|
||||
@@ -731,13 +738,6 @@ class PhotosDB:
|
||||
except KeyError:
|
||||
self._dbfaces_pk[pk] = [uuid]
|
||||
|
||||
if _debug():
|
||||
logging.debug(f"Finished walking through persons")
|
||||
logging.debug(pformat(self._dbpersons_pk))
|
||||
logging.debug(pformat(self._dbpersons_fullname))
|
||||
logging.debug(pformat(self._dbfaces_pk))
|
||||
logging.debug(pformat(self._dbfaces_uuid))
|
||||
|
||||
# Get info on albums
|
||||
verbose("Processing albums.")
|
||||
c.execute(
|
||||
@@ -874,14 +874,6 @@ class PhotosDB:
|
||||
else:
|
||||
self._dbalbum_folders[album] = {}
|
||||
|
||||
if _debug():
|
||||
logging.debug(f"Finished walking through albums")
|
||||
logging.debug(pformat(self._dbalbums_album))
|
||||
logging.debug(pformat(self._dbalbums_uuid))
|
||||
logging.debug(pformat(self._dbalbum_details))
|
||||
logging.debug(pformat(self._dbalbum_folders))
|
||||
logging.debug(pformat(self._dbfolder_details))
|
||||
|
||||
# Get info on keywords
|
||||
verbose("Processing keywords.")
|
||||
c.execute(
|
||||
@@ -897,13 +889,16 @@ class PhotosDB:
|
||||
RKMaster.uuid = RKVersion.masterUuid
|
||||
"""
|
||||
)
|
||||
for keyword in c:
|
||||
if not keyword[1] in self._dbkeywords_uuid:
|
||||
self._dbkeywords_uuid[keyword[1]] = []
|
||||
if not keyword[0] in self._dbkeywords_keyword:
|
||||
self._dbkeywords_keyword[keyword[0]] = []
|
||||
self._dbkeywords_uuid[keyword[1]].append(keyword[0])
|
||||
self._dbkeywords_keyword[keyword[0]].append(keyword[1])
|
||||
for keyword_title, keyword_uuid, _ in c:
|
||||
keyword_title = normalize_unicode(keyword_title)
|
||||
try:
|
||||
self._dbkeywords_uuid[keyword_uuid].append(keyword_title)
|
||||
except KeyError:
|
||||
self._dbkeywords_uuid[keyword_uuid] = [keyword_title]
|
||||
try:
|
||||
self._dbkeywords_keyword[keyword_title].append(keyword_uuid)
|
||||
except KeyError:
|
||||
self._dbkeywords_keyword[keyword_title] = [keyword_uuid]
|
||||
|
||||
# Get info on disk volumes
|
||||
c.execute("select RKVolume.modelId, RKVolume.name from RKVolume")
|
||||
@@ -1025,13 +1020,11 @@ class PhotosDB:
|
||||
|
||||
for row in c:
|
||||
uuid = row[0]
|
||||
if _debug():
|
||||
logging.debug(f"uuid = '{uuid}, master = '{row[2]}")
|
||||
self._dbphotos[uuid] = {}
|
||||
self._dbphotos[uuid]["_uuid"] = uuid # stored here for easier debugging
|
||||
self._dbphotos[uuid]["modelID"] = row[1]
|
||||
self._dbphotos[uuid]["masterUuid"] = row[2]
|
||||
self._dbphotos[uuid]["filename"] = row[3]
|
||||
self._dbphotos[uuid]["filename"] = normalize_unicode(row[3])
|
||||
|
||||
# There are sometimes negative values for lastmodifieddate in the database
|
||||
# I don't know what these mean but they will raise exception in datetime if
|
||||
@@ -1270,13 +1263,13 @@ class PhotosDB:
|
||||
info["volumeId"] = row[1]
|
||||
info["imagePath"] = row[2]
|
||||
info["isMissing"] = row[3]
|
||||
info["originalFilename"] = row[4]
|
||||
info["originalFilename"] = normalize_unicode(row[4])
|
||||
info["UTI"] = row[5]
|
||||
info["modelID"] = row[6]
|
||||
info["fileSize"] = row[7]
|
||||
info["isTrulyRAW"] = row[8]
|
||||
info["alternateMasterUuid"] = row[9]
|
||||
info["filename"] = row[10]
|
||||
info["filename"] = normalize_unicode(row[10])
|
||||
self._dbphotos_master[uuid] = info
|
||||
|
||||
# get details needed to find path of the edited photos
|
||||
@@ -1548,39 +1541,6 @@ class PhotosDB:
|
||||
|
||||
# done processing, dump debug data if requested
|
||||
verbose("Done processing details from Photos library.")
|
||||
if _debug():
|
||||
logging.debug("Faces (_dbfaces_uuid):")
|
||||
logging.debug(pformat(self._dbfaces_uuid))
|
||||
|
||||
logging.debug("Persons (_dbpersons_pk):")
|
||||
logging.debug(pformat(self._dbpersons_pk))
|
||||
|
||||
logging.debug("Keywords by uuid (_dbkeywords_uuid):")
|
||||
logging.debug(pformat(self._dbkeywords_uuid))
|
||||
|
||||
logging.debug("Keywords by keyword (_dbkeywords_keywords):")
|
||||
logging.debug(pformat(self._dbkeywords_keyword))
|
||||
|
||||
logging.debug("Albums by uuid (_dbalbums_uuid):")
|
||||
logging.debug(pformat(self._dbalbums_uuid))
|
||||
|
||||
logging.debug("Albums by album (_dbalbums_albums):")
|
||||
logging.debug(pformat(self._dbalbums_album))
|
||||
|
||||
logging.debug("Album details (_dbalbum_details):")
|
||||
logging.debug(pformat(self._dbalbum_details))
|
||||
|
||||
logging.debug("Album titles (_dbalbum_titles):")
|
||||
logging.debug(pformat(self._dbalbum_titles))
|
||||
|
||||
logging.debug("Volumes (_dbvolumes):")
|
||||
logging.debug(pformat(self._dbvolumes))
|
||||
|
||||
logging.debug("Photos (_dbphotos):")
|
||||
logging.debug(pformat(self._dbphotos))
|
||||
|
||||
logging.debug("Burst Photos (dbphotos_burst:")
|
||||
logging.debug(pformat(self._dbphotos_burst))
|
||||
|
||||
def _build_album_folder_hierarchy_4(self, uuid, folders=None):
|
||||
"""recursively build folder/album hierarchy
|
||||
@@ -1671,7 +1631,7 @@ class PhotosDB:
|
||||
for person in c:
|
||||
pk = person[0]
|
||||
fullname = (
|
||||
person[2]
|
||||
normalize_unicode(person[2])
|
||||
if (person[2] != "" and person[2] is not None)
|
||||
else _UNKNOWN_PERSON
|
||||
)
|
||||
@@ -1681,7 +1641,7 @@ class PhotosDB:
|
||||
"fullname": fullname,
|
||||
"facecount": person[3],
|
||||
"keyface": person[4],
|
||||
"displayname": person[5],
|
||||
"displayname": normalize_unicode(person[5]),
|
||||
"photo_uuid": None,
|
||||
"keyface_uuid": None,
|
||||
}
|
||||
@@ -1745,13 +1705,6 @@ class PhotosDB:
|
||||
except KeyError:
|
||||
self._dbfaces_pk[pk] = [uuid]
|
||||
|
||||
if _debug():
|
||||
logging.debug(f"Finished walking through persons")
|
||||
logging.debug(pformat(self._dbpersons_pk))
|
||||
logging.debug(pformat(self._dbpersons_fullname))
|
||||
logging.debug(pformat(self._dbfaces_pk))
|
||||
logging.debug(pformat(self._dbfaces_uuid))
|
||||
|
||||
# get details about albums
|
||||
verbose("Processing albums.")
|
||||
c.execute(
|
||||
@@ -1868,13 +1821,6 @@ class PhotosDB:
|
||||
# shared albums can't be in folders
|
||||
self._dbalbum_folders[album] = []
|
||||
|
||||
if _debug():
|
||||
logging.debug(f"Finished walking through albums")
|
||||
logging.debug(pformat(self._dbalbums_album))
|
||||
logging.debug(pformat(self._dbalbums_uuid))
|
||||
logging.debug(pformat(self._dbalbum_details))
|
||||
logging.debug(pformat(self._dbalbum_folders))
|
||||
|
||||
# get details on keywords
|
||||
verbose("Processing keywords.")
|
||||
c.execute(
|
||||
@@ -1884,29 +1830,22 @@ class PhotosDB:
|
||||
JOIN Z_1KEYWORDS ON Z_1KEYWORDS.Z_1ASSETATTRIBUTES = ZADDITIONALASSETATTRIBUTES.Z_PK
|
||||
JOIN ZKEYWORD ON ZKEYWORD.Z_PK = {keyword_join} """
|
||||
)
|
||||
for keyword in c:
|
||||
keyword_title = normalize_unicode(keyword[0])
|
||||
if not keyword[1] in self._dbkeywords_uuid:
|
||||
self._dbkeywords_uuid[keyword[1]] = []
|
||||
if not keyword_title in self._dbkeywords_keyword:
|
||||
self._dbkeywords_keyword[keyword_title] = []
|
||||
self._dbkeywords_uuid[keyword[1]].append(keyword[0])
|
||||
self._dbkeywords_keyword[keyword_title].append(keyword[1])
|
||||
|
||||
if _debug():
|
||||
logging.debug(f"Finished walking through keywords")
|
||||
logging.debug(pformat(self._dbkeywords_keyword))
|
||||
logging.debug(pformat(self._dbkeywords_uuid))
|
||||
for keyword_title, keyword_uuid in c:
|
||||
keyword_title = normalize_unicode(keyword_title)
|
||||
try:
|
||||
self._dbkeywords_uuid[keyword_uuid].append(keyword_title)
|
||||
except KeyError:
|
||||
self._dbkeywords_uuid[keyword_uuid] = [keyword_title]
|
||||
try:
|
||||
self._dbkeywords_keyword[keyword_title].append(keyword_uuid)
|
||||
except KeyError:
|
||||
self._dbkeywords_keyword[keyword_title] = [keyword_uuid]
|
||||
|
||||
# get details on disk volumes
|
||||
c.execute("SELECT ZUUID, ZNAME from ZFILESYSTEMVOLUME")
|
||||
for vol in c:
|
||||
self._dbvolumes[vol[0]] = vol[1]
|
||||
|
||||
if _debug():
|
||||
logging.debug(f"Finished walking through volumes")
|
||||
logging.debug(self._dbvolumes)
|
||||
|
||||
# get details about photos
|
||||
verbose("Processing photo details.")
|
||||
c.execute(
|
||||
@@ -2040,8 +1979,8 @@ class PhotosDB:
|
||||
|
||||
info["hidden"] = row[9]
|
||||
info["favorite"] = row[10]
|
||||
info["originalFilename"] = row[3]
|
||||
info["filename"] = row[12]
|
||||
info["originalFilename"] = normalize_unicode(row[3])
|
||||
info["filename"] = normalize_unicode(row[12])
|
||||
info["directory"] = row[11]
|
||||
|
||||
# set latitude and longitude
|
||||
@@ -2519,48 +2458,6 @@ class PhotosDB:
|
||||
|
||||
# done processing, dump debug data if requested
|
||||
verbose("Done processing details from Photos library.")
|
||||
if _debug():
|
||||
logging.debug("Faces (_dbfaces_uuid):")
|
||||
logging.debug(pformat(self._dbfaces_uuid))
|
||||
|
||||
logging.debug("Persons (_dbpersons_pk):")
|
||||
logging.debug(pformat(self._dbpersons_pk))
|
||||
|
||||
logging.debug("Keywords by uuid (_dbkeywords_uuid):")
|
||||
logging.debug(pformat(self._dbkeywords_uuid))
|
||||
|
||||
logging.debug("Keywords by keyword (_dbkeywords_keywords):")
|
||||
logging.debug(pformat(self._dbkeywords_keyword))
|
||||
|
||||
logging.debug("Albums by uuid (_dbalbums_uuid):")
|
||||
logging.debug(pformat(self._dbalbums_uuid))
|
||||
|
||||
logging.debug("Albums by album (_dbalbums_albums):")
|
||||
logging.debug(pformat(self._dbalbums_album))
|
||||
|
||||
logging.debug("Album details (_dbalbum_details):")
|
||||
logging.debug(pformat(self._dbalbum_details))
|
||||
|
||||
logging.debug("Album titles (_dbalbum_titles):")
|
||||
logging.debug(pformat(self._dbalbum_titles))
|
||||
|
||||
logging.debug("Album folders (_dbalbum_folders):")
|
||||
logging.debug(pformat(self._dbalbum_folders))
|
||||
|
||||
logging.debug("Album parent folders (_dbalbum_parent_folders):")
|
||||
logging.debug(pformat(self._dbalbum_parent_folders))
|
||||
|
||||
logging.debug("Albums pk (_dbalbums_pk):")
|
||||
logging.debug(pformat(self._dbalbums_pk))
|
||||
|
||||
logging.debug("Volumes (_dbvolumes):")
|
||||
logging.debug(pformat(self._dbvolumes))
|
||||
|
||||
logging.debug("Photos (_dbphotos):")
|
||||
logging.debug(pformat(self._dbphotos))
|
||||
|
||||
logging.debug("Burst Photos (dbphotos_burst:")
|
||||
logging.debug(pformat(self._dbphotos_burst))
|
||||
|
||||
def _process_moments(self):
|
||||
"""Process data from ZMOMENT table"""
|
||||
@@ -2621,8 +2518,8 @@ class PhotosDB:
|
||||
moment_info["modificationDate"] = row[6]
|
||||
moment_info["representativeDate"] = row[7]
|
||||
moment_info["startDate"] = row[8]
|
||||
moment_info["subtitle"] = row[9]
|
||||
moment_info["title"] = row[10]
|
||||
moment_info["subtitle"] = normalize_unicode(row[9])
|
||||
moment_info["title"] = normalize_unicode(row[10])
|
||||
moment_info["uuid"] = row[11]
|
||||
|
||||
# if both lat/lon == -180, then it means location undefined
|
||||
@@ -3025,6 +2922,7 @@ class PhotosDB:
if keywords:
    keyword_set = set()
    for keyword in keywords:
        keyword = normalize_unicode(keyword)
        if keyword in self._dbkeywords_keyword:
            keyword_set.update(self._dbkeywords_keyword[keyword])
    photos_sets.append(keyword_set)
@@ -3032,6 +2930,7 @@ class PhotosDB:
if persons:
    person_set = set()
    for person in persons:
        person = normalize_unicode(person)
        if person in self._dbpersons_fullname:
            for pk in self._dbpersons_fullname[person]:
                try:
@@ -3074,8 +2973,6 @@ class PhotosDB:
):
    info = PhotoInfo(db=self, uuid=p, info=self._dbphotos[p])
    photoinfo.append(info)
if _debug:
    logging.debug(f"photoinfo: {pformat(photoinfo)}")

return photoinfo
@@ -3412,23 +3309,35 @@ class PhotosDB:
# case-insensitive
for n in name:
    n = n.lower()
    photo_list.extend(
        [
            p
            for p in photos
            if n in p.filename.lower()
            or n in p.original_filename.lower()
        ]
    )
    if self._db_version >= _PHOTOS_5_VERSION:
        # search only original_filename (#594)
        photo_list.extend(
            [p for p in photos if n in p.original_filename.lower()]
        )
    else:
        photo_list.extend(
            [
                p
                for p in photos
                if n in p.filename.lower()
                or n in p.original_filename.lower()
            ]
        )
else:
    for n in name:
        photo_list.extend(
            [
                p
                for p in photos
                if n in p.filename or n in p.original_filename
            ]
        )
        if self._db_version >= _PHOTOS_5_VERSION:
            # search only original_filename (#594)
            photo_list.extend(
                [p for p in photos if n in p.original_filename]
            )
        else:
            photo_list.extend(
                [p for p in photos if n in p.filename or n in p.original_filename]
            )
photos = photo_list

if options.min_size:
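The hunk above is the fix for #594: on Photos 5+ the --name query now matches only the original filename, presumably because the library's current filename on Photos 5+ is rarely the name the user knows the photo by. A minimal stand-alone sketch of that filtering logic; the PhotoStub class and sample data are hypothetical stand-ins, not osxphotos objects:

from dataclasses import dataclass
from typing import List

@dataclass
class PhotoStub:
    filename: str            # current filename in the library
    original_filename: str   # filename at import time

def match_name(photos: List[PhotoStub], names: List[str], photos5: bool) -> List[PhotoStub]:
    """Case-insensitive name match; Photos 5+ searches only original_filename."""
    matched = []
    for n in (n.lower() for n in names):
        if photos5:
            matched.extend(p for p in photos if n in p.original_filename.lower())
        else:
            matched.extend(
                p for p in photos
                if n in p.filename.lower() or n in p.original_filename.lower()
            )
    return matched

photos = [PhotoStub("1A2B3C4D.jpeg", "IMG_1234.jpeg")]
assert match_name(photos, ["img_1234"], photos5=True)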
@@ -16,6 +16,14 @@ from .._constants import (
)
from ..utils import _open_sql_file

__all__ = [
    "get_db_version",
    "get_model_version",
    "get_db_model_version",
    "UnknownLibraryVersion",
    "get_photos_library_version",
]


def get_db_version(db_file):
    """Gets the Photos DB version from LiGlobals table

@@ -105,9 +113,8 @@ def get_photos_library_version(library_path):
    return 3
if db_ver == int(_PHOTOS_4_VERSION):
    return 4
if db_ver != int(_PHOTOS_5_VERSION):
    raise UnknownLibraryVersion(f"db_ver = {db_ver}")

# assume it's a Photos 5+ library, get the model version to determine which version
model_ver = get_model_version(str(library_path / "database" / "Photos.sqlite"))
model_ver = int(model_ver)
if _PHOTOS_5_MODEL_VERSION[0] <= model_ver <= _PHOTOS_5_MODEL_VERSION[1]:
@@ -17,11 +17,19 @@ from ._constants import _UNKNOWN_PERSON, TEXT_DETECTION_CONFIDENCE_THRESHOLD
from ._version import __version__
from .datetime_formatter import DateTimeFormatter
from .exiftool import ExifToolCaching
from .export_db import ExportDB_ABC, ExportDBInMemory
from .path_utils import sanitize_dirname, sanitize_filename, sanitize_pathpart
from .text_detection import detect_text
from .utils import expand_and_validate_filepath, load_function

__all__ = [
    "RenderOptions",
    "PhotoTemplateParser",
    "PhotoTemplate",
    "parse_default_kv",
    "get_template_help",
    "format_str_value",
]

# TODO: a lot of values are passed from function to function like path_sep--make these all class properties

# ensure locale set to user's locale

@@ -291,7 +299,6 @@ class RenderOptions:
dest_path: set to the destination path of the photo (for use by {function} template), only valid with --filename
filepath: set to value for filepath of the exported photo if you want to evaluate {filepath} template
quote: quote path templates for execution in the shell
exportdb: ExportDB object
"""

none_str: str = "_"

@@ -306,7 +313,6 @@ class RenderOptions:
dest_path: Optional[str] = None
filepath: Optional[str] = None
quote: bool = False
exportdb: Optional[ExportDB_ABC] = None


class PhotoTemplateParser:
@@ -375,9 +381,6 @@ class PhotoTemplate:
self.filepath = options.filepath
self.quote = options.quote
self.dest_path = options.dest_path
self.exportdb = options.exportdb or ExportDBInMemory(
    None, self.export_dir or "."
)

def render(
    self,

@@ -411,7 +414,6 @@ class PhotoTemplate:
self.filepath = options.filepath
self.quote = options.quote
self.dest_path = options.dest_path
self.exportdb = options.exportdb or self.exportdb

try:
    model = self.parser.parse(template)

@@ -1207,7 +1209,7 @@ class PhotoTemplate:
else:
    values = list(obj)
elif field == "detected_text":
    values = _get_detected_text(self.photo, self.exportdb, confidence=subfield)
    values = _get_detected_text(self.photo, confidence=subfield)
else:
    raise ValueError(f"Unhandled template value: {field}")

@@ -1450,7 +1452,7 @@ def _get_album_by_path(photo, folder_album_path):
    return None


def _get_detected_text(photo, exportdb, confidence=TEXT_DETECTION_CONFIDENCE_THRESHOLD):
def _get_detected_text(photo, confidence=TEXT_DETECTION_CONFIDENCE_THRESHOLD):
    """Returns the detected text for a photo
    {detected_text} uses this instead of PhotoInfo.detected_text() to cache the text for all confidence values
    """

@@ -1466,5 +1468,4 @@ def _get_detected_text(photo, exportdb, confidence=TEXT_DETECTION_CONFIDENCE_THR
    # _detected_text caches the text detection results in an extended attribute
    # so the first time this gets called is slow but repeated accesses are fast
    detected_text = photo._detected_text()
    exportdb.set_detected_text_for_uuid(photo.uuid, json.dumps(detected_text))
    return [text for text, conf in detected_text if conf >= confidence]
@@ -14,6 +14,16 @@ from bpylist import archiver
from ._constants import UNICODE_FORMAT
from .utils import normalize_unicode

__all__ = [
    "PLRevGeoLocationInfo",
    "PLRevGeoMapItem",
    "PLRevGeoMapItemAdditionalPlaceInfo",
    "CNPostalAddress",
    "PlaceInfo",
    "PlaceInfo4",
    "PlaceInfo5",
]

# postal address information, returned by PlaceInfo.address
PostalAddress = namedtuple(
    "PostalAddress",

@@ -65,7 +75,7 @@ PlaceNames = namedtuple(
# in ZADDITIONALASSETATTRIBUTES.ZREVERSELOCATIONDATA
# These classes are used by bpylist.archiver to unarchive the serialized objects
class PLRevGeoLocationInfo:
    """ The top level reverse geolocation object """
    """The top level reverse geolocation object"""

    def __init__(
        self,
@@ -147,7 +157,7 @@ class PLRevGeoLocationInfo:


class PLRevGeoMapItem:
    """ Stores the list of place names, organized by area """
    """Stores the list of place names, organized by area"""

    def __init__(self, sortedPlaceInfos, finalPlaceInfos):
        self.sortedPlaceInfos = sortedPlaceInfos

@@ -182,7 +192,7 @@ class PLRevGeoMapItem:


class PLRevGeoMapItemAdditionalPlaceInfo:
    """ Additional info about individual places """
    """Additional info about individual places"""

    def __init__(self, area, name, placeType, dominantOrderType):
        self.area = area

@@ -221,7 +231,7 @@ class PLRevGeoMapItemAdditionalPlaceInfo:


class CNPostalAddress:
    """ postal address for the reverse geolocation info """
    """postal address for the reverse geolocation info"""

    def __init__(
        self,

@@ -354,17 +364,17 @@ class PlaceInfo(ABC):


class PlaceInfo4(PlaceInfo):
    """ Reverse geolocation place info for a photo (Photos <= 4) """
    """Reverse geolocation place info for a photo (Photos <= 4)"""

    def __init__(self, place_names, country_code):
        """ place_names: list of place name tuples in ascending order by area
        tuple fields are: modelID, place name, place type, area, e.g.
        [(5, "St James's Park", 45, 0),
        (4, 'Westminster', 16, 22097376),
        (3, 'London', 4, 1596146816),
        (2, 'England', 2, 180406091776),
        (1, 'United Kingdom', 1, 414681432064)]
        country_code: two letter country code for the country
        """place_names: list of place name tuples in ascending order by area
        tuple fields are: modelID, place name, place type, area, e.g.
        [(5, "St James's Park", 45, 0),
        (4, 'Westminster', 16, 22097376),
        (3, 'London', 4, 1596146816),
        (2, 'England', 2, 180406091776),
        (1, 'United Kingdom', 1, 414681432064)]
        country_code: two letter country code for the country
        """
        self._place_names = place_names
        self._country_code = country_code
@@ -404,7 +414,7 @@ class PlaceInfo4(PlaceInfo):
)

def _process_place_info(self):
    """ Process place_names to set self._name and self._names """
    """Process place_names to set self._name and self._names"""
    places = self._place_names

    # build a dictionary where key is placetype

@@ -500,38 +510,38 @@ class PlaceInfo4(PlaceInfo):


class PlaceInfo5(PlaceInfo):
    """ Reverse geolocation place info for a photo (Photos >= 5) """
    """Reverse geolocation place info for a photo (Photos >= 5)"""

    def __init__(self, revgeoloc_bplist):
        """ revgeoloc_bplist: a binary plist blob containing
        a serialized PLRevGeoLocationInfo object """
        """revgeoloc_bplist: a binary plist blob containing
        a serialized PLRevGeoLocationInfo object"""
        self._bplist = revgeoloc_bplist
        self._plrevgeoloc = archiver.unarchive(revgeoloc_bplist)
        self._process_place_info()

    @property
    def address_str(self):
        """ returns the postal address as a string """
        """returns the postal address as a string"""
        return self._plrevgeoloc.addressString

    @property
    def country_code(self):
        """ returns the country code """
        """returns the country code"""
        return self._plrevgeoloc.countryCode

    @property
    def ishome(self):
        """ returns True if place is user's home address """
        """returns True if place is user's home address"""
        return self._plrevgeoloc.isHome

    @property
    def name(self):
        """ returns local place name """
        """returns local place name"""
        return self._name

    @property
    def names(self):
        """ returns PlaceNames tuple with detailed reverse geolocation place names """
        """returns PlaceNames tuple with detailed reverse geolocation place names"""
        return self._names

    @property

@@ -556,7 +566,7 @@ class PlaceInfo5(PlaceInfo):
    return postal_address

def _process_place_info(self):
    """ Process sortedPlaceInfos to set self._name and self._names """
    """Process sortedPlaceInfos to set self._name and self._names"""
    places = self._plrevgeoloc.mapItem.sortedPlaceInfos

    # build a dictionary where key is placetype
@@ -1,3 +1,4 @@
__all__ = ["PyReplQuitter", "embed_repl"]
""" Custom Python REPL based on ptpython that allows quitting with custom keywords instead of `quit()` """

""" This file is distributed under the same license as the ptpython package:

@@ -8,6 +8,8 @@ from mako.template import Template

from ._constants import _DB_TABLE_NAMES

__all__ = ["get_query"]

QUERY_DIR = os.path.join(os.path.dirname(__file__), "queries")


@@ -6,6 +6,8 @@ from typing import Iterable, List, Optional, Tuple

import bitmath

__all__ = ["QueryOptions"]


@dataclass
class QueryOptions:

@@ -4,10 +4,12 @@ from dataclasses import dataclass

from ._constants import _PHOTOS_4_VERSION

__all__ = ["ScoreInfo"]


@dataclass(frozen=True)
class ScoreInfo:
    """ Computed photo score info associated with a photo from the Photos library """
    """Computed photo score info associated with a photo from the Photos library"""

    overall: float
    curation: float

@@ -36,4 +38,3 @@ class ScoreInfo:
    well_chosen_subject: float
    well_framed_subject: float
    well_timed_shot: float
@@ -23,6 +23,8 @@ from ._constants import (
    SEARCH_CATEGORY_YEAR,
)

__all__ = ["SearchInfo"]


class SearchInfo:
    """Info about search terms such as machine learning labels that Photos knows about a photo"""

@@ -4,6 +4,8 @@ import re
import sqlite3
from typing import Generator, List

__all__ = ["sqlgrep"]


def sqlgrep(
    filename: str,

@@ -13,6 +13,8 @@ from wurlitzer import pipes

from .utils import _get_os_version

__all__ = ["detect_text", "make_request_handler"]

ver, major, minor = _get_os_version()
if ver == "10" and int(major) < 15:
    vision = False

@@ -1,3 +1,4 @@
__all__ = ["get_preferred_uti_extension", "get_uti_for_extension"]
""" get UTI for a given file extension and the preferred extension for a given UTI """

""" Implementation note: runs only on macOS
@@ -1,5 +1,6 @@
""" Utility functions used in osxphotos """

import datetime
import fnmatch
import glob
import importlib

@@ -16,13 +17,30 @@ import sys
import unicodedata
import urllib.parse
from plistlib import load as plistload
from typing import Callable, Union
from typing import Callable, List, Optional, Union

import CoreFoundation
import objc
from Foundation import NSString
from Foundation import NSFileManager, NSPredicate, NSString

from ._constants import UNICODE_FORMAT
from .path_utils import sanitize_filestem_with_count

__all__ = [
    "dd_to_dms_str",
    "expand_and_validate_filepath",
    "get_last_library_path",
    "get_system_library_path",
    "increment_filename_with_count",
    "increment_filename",
    "lineno",
    "list_directory",
    "list_photo_libraries",
    "load_function",
    "noop",
    "normalize_fs_path",
    "normalize_unicode",
]

_DEBUG = False

@@ -248,7 +266,9 @@ def list_photo_libraries():
# On older MacOS versions, mdfind appears to ignore some libraries
# glob to find libraries in ~/Pictures then mdfind to find all the others
# TODO: make this more robust
lib_list = glob.glob(f"{str(pathlib.Path.home())}/Pictures/*.photoslibrary")
lib_list = list_directory(
    f"{pathlib.Path.home()}/Pictures/", glob="*.photoslibrary"
)

# On older OS, may not get all libraries so make sure we get the last one
last_lib = get_last_library_path()
|
||||
@@ -267,24 +287,95 @@ def list_photo_libraries():
|
||||
|
||||
def normalize_fs_path(path: str) -> str:
|
||||
"""Normalize filesystem paths with unicode in them"""
|
||||
with objc.autorelease_pool():
|
||||
normalized_path = NSString.fileSystemRepresentation(path)
|
||||
return normalized_path.decode("utf8")
|
||||
# macOS HFS+ uses NFD, APFS doesn't normalize but stick with NFD
|
||||
# ref: https://eclecticlight.co/2021/05/08/explainer-unicode-normalization-and-apfs/
|
||||
return unicodedata.normalize("NFD", path)
|
||||
|
||||
|
||||
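The rewritten normalize_fs_path above drops the Objective-C bridge call in favor of plain unicodedata normalization to NFD, the form the code standardizes on because HFS+ stores NFD and APFS does not normalize at all (per the comment in the hunk). A small stdlib-only illustration of why this matters when comparing filenames; the example string is hypothetical:

import unicodedata

name_nfc = "café.jpeg"                             # "é" as one precomposed code point (NFC)
name_nfd = unicodedata.normalize("NFD", name_nfc)  # "e" plus combining accent, as the filesystem may report it

print(name_nfc == name_nfd)                                 # False: byte-wise different
print(unicodedata.normalize("NFD", name_nfc) == name_nfd)   # True once both sides use the same form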
def findfiles(pattern, path_):
    """Returns list of filenames from path_ matched by pattern
    shell pattern. Matching is case-insensitive.
    If 'path_' is invalid/doesn't exist, returns []."""
    if not os.path.isdir(path_):
# def findfiles(pattern, path):
#     """Returns list of filenames from path matched by pattern
#     shell pattern. Matching is case-insensitive.
#     If 'path_' is invalid/doesn't exist, returns []."""
#     if not os.path.isdir(path):
#         return []

#     # paths need to be normalized for unicode as filesystem returns unicode in NFD form
#     pattern = normalize_fs_path(pattern)
#     rule = re.compile(fnmatch.translate(pattern), re.IGNORECASE)
#     files = os.listdir(path)
#     return [name for name in files if rule.match(name)]
def list_directory(
    directory: Union[str, pathlib.Path],
    startswith: Optional[str] = None,
    endswith: Optional[str] = None,
    contains: Optional[str] = None,
    glob: Optional[str] = None,
    include_path: bool = False,
    case_sensitive: bool = False,
) -> List[Union[str, pathlib.Path]]:
    """List directory contents and return list of files or directories matching search criteria.
    Accounts for case-insensitive filesystems, unicode filenames. directory can be a str or a pathlib.Path object.

    Args:
        directory: directory to search
        startswith: string to match at start of filename
        endswith: string to match at end of filename
        contains: string to match anywhere in filename
        glob: shell-style glob pattern to match filename
        include_path: if True, return full path to file
        case_sensitive: if True, match case-sensitively

    Returns: List of files or directories matching search criteria as either str or pathlib.Path objects depending on the input type;
    returns empty list if directory is invalid or doesn't exist.

    """
    is_pathlib = isinstance(directory, pathlib.Path)
    if is_pathlib:
        directory = str(directory)

    if not os.path.isdir(directory):
        return []
    # See: https://gist.github.com/techtonik/5694830

    # paths need to be normalized for unicode as filesystem returns unicode in NFD form
    pattern = normalize_fs_path(pattern)
    rule = re.compile(fnmatch.translate(pattern), re.IGNORECASE)
    files = [normalize_fs_path(p) for p in os.listdir(path_)]
    return [name for name in files if rule.match(name)]
    startswith = normalize_fs_path(startswith) if startswith else None
    endswith = normalize_fs_path(endswith) if endswith else None
    contains = normalize_fs_path(contains) if contains else None
    glob = normalize_fs_path(glob) if glob else None

    files = [normalize_fs_path(f) for f in os.listdir(directory)]
    if not case_sensitive:
        files_normalized = {f.lower(): f for f in files}
        files = [f.lower() for f in files]
        startswith = startswith.lower() if startswith else None
        endswith = endswith.lower() if endswith else None
        contains = contains.lower() if contains else None
        glob = glob.lower() if glob else None
    else:
        files_normalized = {f: f for f in files}

    if startswith:
        files = [f for f in files if f.startswith(startswith)]
    if endswith:
        endswith = normalize_fs_path(endswith)
        files = [f for f in files if f.endswith(endswith)]
    if contains:
        contains = normalize_fs_path(contains)
        files = [f for f in files if contains in f]
    if glob:
        glob = normalize_fs_path(glob)
        flags = re.IGNORECASE if not case_sensitive else 0
        rule = re.compile(fnmatch.translate(glob), flags)
        files = [f for f in files if rule.match(f)]

    files = [files_normalized[f] for f in files]

    if include_path:
        files = [os.path.join(directory, f) for f in files]
    if is_pathlib:
        files = [pathlib.Path(f) for f in files]

    return files
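list_directory replaces the old findfiles helper and gives the export code one unicode- and case-aware way to enumerate files. A brief usage sketch, assuming the helper is importable from osxphotos.utils as the __all__ list above suggests; the directory and filenames are hypothetical:

import pathlib
from osxphotos.utils import list_directory

# all .jpeg files in an export folder, matched case-insensitively,
# returned with their full paths
jpegs = list_directory("/Users/me/Desktop/export", glob="*.jpeg", include_path=True)

# pathlib.Path in, pathlib.Path objects out
lock_files = list_directory(
    pathlib.Path("/Users/me/Desktop/export"), startswith=".", endswith=".lock"
)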
def _open_sql_file(dbname):

@@ -325,47 +416,24 @@ def _db_is_locked(dbname):
    return locked


# OSXPHOTOS_XATTR_UUID = "com.osxphotos.uuid"

# def get_uuid_for_file(filepath):
#     """ returns UUID associated with an exported file
#     filepath: path to exported photo
#     """
#     attr = xattr.xattr(filepath)
#     try:
#         uuid_bytes = attr[OSXPHOTOS_XATTR_UUID]
#         uuid_str = uuid_bytes.decode('utf-8')
#     except KeyError:
#         uuid_str = None
#     return uuid_str

# def set_uuid_for_file(filepath, uuid):
#     """ sets the UUID associated with an exported file
#     filepath: path to exported photo
#     uuid: uuid string for photo
#     """
#     if not os.path.exists(filepath):
#         raise FileNotFoundError(f"Missing file: {filepath}")

#     attr = xattr.xattr(filepath)
#     uuid_bytes = bytes(uuid, 'utf-8')
#     attr.set(OSXPHOTOS_XATTR_UUID, uuid_bytes)
def normalize_unicode(value):
    """normalize unicode data"""
    if value is not None:
        if isinstance(value, (tuple, list)):
            return tuple(unicodedata.normalize(UNICODE_FORMAT, v) for v in value)
        elif isinstance(value, str):
            return unicodedata.normalize(UNICODE_FORMAT, value)
        else:
            return value
    else:
    if value is None:
        return None
    if isinstance(value, (tuple, list)):
        return tuple(unicodedata.normalize(UNICODE_FORMAT, v) for v in value)
    elif isinstance(value, str):
        return unicodedata.normalize(UNICODE_FORMAT, value)
    else:
        return value
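normalize_unicode is the counterpart to normalize_fs_path: query strings and database values are normalized to the same form (the UNICODE_FORMAT constant) before comparison, which is what makes the unicode query fix in #618 work in the PhotosDB hunks earlier in this diff. A stdlib-only sketch of the idea, with NFC assumed as the target form purely for illustration:

import unicodedata

TARGET_FORM = "NFC"  # assumption for this sketch; osxphotos uses its UNICODE_FORMAT constant

def norm(value):
    """Normalize a str, or every str in a tuple/list, to TARGET_FORM."""
    if value is None:
        return None
    if isinstance(value, (tuple, list)):
        return tuple(unicodedata.normalize(TARGET_FORM, v) for v in value)
    return unicodedata.normalize(TARGET_FORM, value) if isinstance(value, str) else value

# a keyword typed in decomposed form still matches one stored precomposed
stored = norm("Zürich")
typed = norm(unicodedata.normalize("NFD", "Zürich"))
assert stored == typed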
def increment_filename_with_count(filepath: Union[str,pathlib.Path], count: int = 0) -> str:
def increment_filename_with_count(
    filepath: Union[str, pathlib.Path],
    count: int = 0,
    lock: bool = False,
    dry_run: bool = False,
) -> str:
    """Return filename (1).ext, etc if filename.ext exists

    If file exists in filename's parent folder with same stem as filename,

@@ -374,6 +442,8 @@ def increment_filename_with_count(filepath: Union[str,pathlib.Path], count: int
    Args:
        filepath: str or pathlib.Path; full path, including file name
        count: int; starting increment value
        lock: bool; if True, create a lock file in form .filename.lock to prevent other processes from using the same filename
        dry_run: bool; if True, don't actually create lock file

    Returns:
        tuple of new filepath (or same if not incremented), count
@@ -381,19 +451,36 @@ def increment_filename_with_count(filepath: Union[str,pathlib.Path], count: int
    Note: This obviously is subject to race condition so using with caution.
    """
    dest = filepath if isinstance(filepath, pathlib.Path) else pathlib.Path(filepath)
    dest_files = findfiles(f"{dest.stem}*", str(dest.parent))
    dest_files = [normalize_fs_path(pathlib.Path(f).stem.lower()) for f in dest_files]
    dest_new = dest.stem
    if count:
        dest_new = f"{dest.stem} ({count})"
    while normalize_fs_path(dest_new.lower()) in dest_files:
    dest_files = list_directory(dest.parent, startswith=dest.stem)
    dest_files = [f.stem.lower() for f in dest_files]
    dest_new = f"{dest.stem} ({count})" if count else dest.stem
    dest_new = normalize_fs_path(dest_new)
    dest_new = sanitize_filestem_with_count(dest_new, dest.suffix)
    if lock and not dry_run:
        dest_lock = "." + dest_new + dest.suffix + ".lock"
        dest_lock = dest.parent / dest_lock
    else:
        dest_lock = pathlib.Path("")

    while dest_new.lower() in dest_files or (
        lock and not dry_run and dest_lock.exists()
    ):
        count += 1
        dest_new = f"{dest.stem} ({count})"
        dest_new = normalize_fs_path(f"{dest.stem} ({count})")
        dest_new = sanitize_filestem_with_count(dest_new, dest.suffix)
        if lock:
            dest_lock = "." + dest_new + dest.suffix + ".lock"
            dest_lock = dest.parent / dest_lock
    if lock and not dry_run:
        dest_lock.touch()
    dest = dest.parent / f"{dest_new}{dest.suffix}"
    return str(dest), count

    return normalize_fs_path(str(dest)), count
def increment_filename(filepath: Union[str, pathlib.Path]) -> str:
def increment_filename(
    filepath: Union[str, pathlib.Path], lock: bool = False, dry_run: bool = False
) -> str:
    """Return filename (1).ext, etc if filename.ext exists

    If file exists in filename's parent folder with same stem as filename,

@@ -401,13 +488,17 @@ def increment_filename(filepath: Union[str, pathlib.Path]) -> str:

    Args:
        filepath: str or pathlib.Path; full path, including file name
        lock: bool; if True, creates a lock file in form .filename.lock to prevent other processes from using the same filename
        dry_run: bool; if True, don't actually create lock file

    Returns:
        new filepath (or same if not incremented)

    Note: This obviously is subject to race condition so using with caution.
    Note: This obviously is subject to race condition so using with caution but using lock=True reduces the risk of race condition (but lock files must be cleaned up)
    """
    new_filepath, _ = increment_filename_with_count(filepath)
    new_filepath, _ = increment_filename_with_count(
        filepath, lock=lock, dry_run=dry_run
    )
    return new_filepath
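The new lock and dry_run parameters let concurrent export processes avoid handing out the same incremented filename before any of them has written the file (the docstring's ".filename.lock" reservation). A usage sketch, assuming the helper is importable from osxphotos.utils; the paths are hypothetical and the caller is responsible for removing the lock file afterwards:

import os
from osxphotos.utils import increment_filename

dest = "/Users/me/Desktop/export/IMG_1234.jpeg"

# with lock=True the chosen name is reserved by touching ".IMG_1234 (n).jpeg.lock",
# so another worker asking for the same destination is pushed to the next increment
safe_dest = increment_filename(dest, lock=True)
with open(safe_dest, "wb") as f:
    f.write(b"...exported image bytes...")

# clean up the reservation once the file exists
lock_file = os.path.join(
    os.path.dirname(safe_dest), "." + os.path.basename(safe_dest) + ".lock"
)
if os.path.exists(lock_file):
    os.remove(lock_file)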
@@ -448,3 +539,9 @@ def load_function(pyfile: str, function_name: str) -> Callable:
    sys.path = syspath

    return func


def format_sec_to_hhmmss(sec: float) -> str:
    """Format seconds to hh:mm:ss"""
    delta = datetime.timedelta(seconds=sec)
    return str(delta).split(".")[0]
setup.py
@@ -74,12 +74,11 @@ setup(
    "Topic :: Software Development :: Libraries :: Python Modules",
],
install_requires=[
    "Click>=8.0.1,<9.0",
    "Mako>=1.1.4,<1.2.0",
    "PyYAML>=5.4.1,<5.5.0",
    "bitmath>=1.3.3.1,<1.4.0.0",
    "bpylist2==3.0.2",
    "Click>=8.0.1,<9.0",
    "dataclasses==0.7;python_version<'3.7'",
    "Mako>=1.1.4,<1.2.0",
    "more-itertools>=8.8.0,<9.0.0",
    "objexplore>=1.5.5,<1.6.0",
    "osxmetadata>=0.99.34,<1.0.0",
@@ -87,15 +86,16 @@ setup(
    "photoscript>=0.1.4,<0.2.0",
    "ptpython>=3.0.20,<4.0.0",
    "pyobjc-core>=7.3,<9.0",
    "pyobjc-framework-AVFoundation>=7.3,<9.0",
    "pyobjc-framework-AppleScriptKit>=7.3,<9.0",
    "pyobjc-framework-AppleScriptObjC>=7.3,<9.0",
    "pyobjc-framework-AVFoundation>=7.3,<9.0",
    "pyobjc-framework-Cocoa>=7.3,<9.0",
    "pyobjc-framework-CoreServices>=7.2,<9.0",
    "pyobjc-framework-Metal>=7.3,<9.0",
    "pyobjc-framework-Photos>=7.3,<9.0",
    "pyobjc-framework-Quartz>=7.3,<9.0",
    "pyobjc-framework-Vision>=7.3,<9.0",
    "PyYAML>=5.4.1,<5.5.0",
    "rich>=10.6.0,<=11.0.0",
    "textx>=2.3.0,<3.0.0",
    "toml>=0.10.2,<0.11.0",
@@ -7,7 +7,7 @@
<key>hostuuid</key>
<string>585B80BF-8D1F-55EF-A9E8-6CF4E5523959</string>
<key>pid</key>
<integer>1961</integer>
<integer>14817</integer>
<key>processname</key>
<string>photolibraryd</string>
<key>uid</key>
After Width: | Height: | Size: 2.1 MiB
After Width: | Height: | Size: 2.8 MiB
After Width: | Height: | Size: 2.3 MiB
After Width: | Height: | Size: 2.8 MiB
@@ -3,24 +3,24 @@
<plist version="1.0">
<dict>
    <key>BackgroundHighlightCollection</key>
    <date>2021-09-14T04:40:42Z</date>
    <date>2022-02-04T13:51:40Z</date>
    <key>BackgroundHighlightEnrichment</key>
    <date>2021-09-14T04:40:42Z</date>
    <date>2022-02-04T13:51:39Z</date>
    <key>BackgroundJobAssetRevGeocode</key>
    <date>2021-09-14T04:40:42Z</date>
    <date>2022-02-04T13:51:40Z</date>
    <key>BackgroundJobSearch</key>
    <date>2021-09-14T04:40:42Z</date>
    <date>2022-02-04T13:51:40Z</date>
    <key>BackgroundPeopleSuggestion</key>
    <date>2021-09-14T04:40:41Z</date>
    <date>2022-02-04T13:51:39Z</date>
    <key>BackgroundUserBehaviorProcessor</key>
    <date>2021-09-14T04:40:42Z</date>
    <date>2022-02-04T13:51:40Z</date>
    <key>PhotoAnalysisGraphLastBackgroundGraphConsistencyUpdateJobDateKey</key>
    <date>2021-07-20T05:48:08Z</date>
    <key>PhotoAnalysisGraphLastBackgroundGraphRebuildJobDate</key>
    <date>2021-07-20T05:47:59Z</date>
    <key>PhotoAnalysisGraphLastBackgroundMemoryGenerationJobDate</key>
    <date>2021-09-14T04:40:43Z</date>
    <date>2022-02-04T13:51:40Z</date>
    <key>SiriPortraitDonation</key>
    <date>2021-09-14T04:40:42Z</date>
    <date>2022-02-04T13:51:40Z</date>
</dict>
</plist>
After Width: | Height: | Size: 191 KiB
After Width: | Height: | Size: 123 KiB
After Width: | Height: | Size: 178 KiB
After Width: | Height: | Size: 123 KiB
After Width: | Height: | Size: 58 KiB
After Width: | Height: | Size: 32 KiB
After Width: | Height: | Size: 54 KiB
After Width: | Height: | Size: 32 KiB |