Compare commits

...

39 Commits

Author SHA1 Message Date
Rhet Turnbull
1d6bc4e09e Additional fix for #615 2022-02-05 23:57:50 -08:00
Rhet Turnbull
3e14b718ef Updated docs [skip ci] 2022-02-05 23:12:42 -08:00
Rhet Turnbull
1ae6270561 Fixed exiftool to ignore unsupported file types, #615 2022-02-05 22:54:50 -08:00
Rhet Turnbull
55a601c07e Updated tests 2022-02-05 14:30:20 -08:00
Rhet Turnbull
7d67b81879 Updated CHANGELOG.md [skip ci] 2022-02-05 14:08:43 -08:00
Rhet Turnbull
cd02144ac3 Fix for --name searching only original_filename on Photos 5+, #594 2022-02-05 12:55:56 -08:00
Rhet Turnbull
9b247acd1c Fix for unicode in query strings, #618 2022-02-05 12:36:25 -08:00
Rhet Turnbull
942126ea3d Updated CHANGELOG.md [skip ci] 2022-02-05 10:56:18 -08:00
Rhet Turnbull
2b9ea11701 Updated docs [skip ci] 2022-02-05 10:39:35 -08:00
Rhet Turnbull
b3d3e14ffe Fix for #561, no really, I mean it this time 2022-02-05 10:36:23 -08:00
Rhet Turnbull
62ae5db9fd Updated CHANGELOG.md [skip ci] 2022-02-04 21:59:33 -08:00
Rhet Turnbull
77a49a09a1 Updated tests for #561 [skip ci] 2022-02-04 05:56:01 -08:00
Rhet Turnbull
06c5bbfcfd Updated docs [skip ci] 2022-02-03 22:49:56 -08:00
Rhet Turnbull
f3063d35be Fix for filenames with special characters, #561, #618 2022-02-03 22:46:11 -08:00
Rhet Turnbull
e32090bf39 Updated known issues [skip ci] 2022-02-01 06:53:25 -08:00
Rhet Turnbull
7ab500740b Added progress counter, #601 2022-01-29 19:02:25 -08:00
Rhet Turnbull
911bd30d28 Updated CHANGELOG.md [skip ci] 2022-01-29 19:02:00 -08:00
allcontributors[bot]
282857eae0 docs: add oPromessa as a contributor for ideas, test (#611)
* docs: update .all-contributorsrc [skip ci]

* docs: update README.md [skip ci]

Co-authored-by: allcontributors[bot] <46447321+allcontributors[bot]@users.noreply.github.com>
2022-01-29 14:08:36 -08:00
Rhet Turnbull
d8c2f99c06 Added --timestamp option for --verbose, #600 2022-01-29 11:59:41 -08:00
Rhet Turnbull
16d3f74366 Updated formatting for elapsed time, #604 2022-01-29 11:05:33 -08:00
Rhet Turnbull
5fc28139ea Updated docs [skip ci] 2022-01-29 10:55:41 -08:00
Rhet Turnbull
b7b6876688 Updated CHANGELOG.md [skip ci] 2022-01-29 10:03:31 -08:00
Rhet Turnbull
235dea329c Implemented #605, refactor out export2 2022-01-29 09:38:52 -08:00
Rhet Turnbull
5afdf6fc20 Fix for #564, --preview with --download-missing 2022-01-29 08:27:43 -08:00
Rhet Turnbull
385059e973 Updated CHANGELOG.md [skip ci] 2022-01-28 23:32:46 -08:00
Rhet Turnbull
62aed02070 Updated docs [skip ci] 2022-01-28 23:20:27 -08:00
Rhet Turnbull
6843b8661d Refactored photoexporter for performance, #591 2022-01-28 23:15:02 -08:00
Rhet Turnbull
9da747ea9d Refactoring to support #591 2022-01-27 21:37:12 -08:00
Rhet Turnbull
22964afc69 Performance improvements and refactoring, #462, partial for #591 2022-01-27 06:28:12 -08:00
Rhet Turnbull
3bc53fd92b Performance improvements, partial for #591 2022-01-25 20:37:58 -08:00
Rhet Turnbull
bd31120569 Version bump 2022-01-24 06:28:58 -08:00
Rhet Turnbull
6af124e4d3 Removed exportdb requirement from PhotoTemplate 2022-01-24 06:20:34 -08:00
Rhet Turnbull
b3b1d8f193 Updated CHANGELOG.md [skip ci] 2022-01-23 22:01:54 -08:00
Rhet Turnbull
785580115b Added query options to repl, #597 2022-01-23 21:57:51 -08:00
Rhet Turnbull
b4bd04c146 Added run command, #598 2022-01-23 18:38:16 -08:00
Rhet Turnbull
e88c6b8a59 Bug fix for get_photos_library_version 2022-01-23 18:06:19 -08:00
Rhet Turnbull
74868238f3 Performance improvements, added --profile 2022-01-23 17:14:55 -08:00
Xiaoliang Wu
61a300250d creat unit test for __all__ (#599) 2022-01-23 16:40:20 -08:00
Rhet Turnbull
d8dbc0866f Updated CHANGELOG.md [skip ci] 2022-01-22 14:43:11 -08:00
84 changed files with 6855 additions and 1008 deletions

View File

@@ -257,7 +257,9 @@
"avatar_url": "https://avatars.githubusercontent.com/u/21261491?v=4",
"profile": "https://github.com/oPromessa",
"contributions": [
"bug"
"bug",
"ideas",
"test"
]
},
{

View File

@@ -4,6 +4,97 @@ All notable changes to this project will be documented in this file. Dates are d
Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
#### [v0.45.6](https://github.com/RhetTbull/osxphotos/compare/v0.45.5...v0.45.6)
> 5 February 2022
- Fix for unicode in query strings, #618 [`9b247ac`](https://github.com/RhetTbull/osxphotos/commit/9b247acd1cc4b2def59fdd18a6fb3c8eb9914f11)
- Fix for --name searching only original_filename on Photos 5+, #594 [`cd02144`](https://github.com/RhetTbull/osxphotos/commit/cd02144ac33cc1c13a20358133971c84d35b8a57)
#### [v0.45.5](https://github.com/RhetTbull/osxphotos/compare/v0.45.4...v0.45.5)
> 5 February 2022
- Fix for #561, no really, I mean it this time [`b3d3e14`](https://github.com/RhetTbull/osxphotos/commit/b3d3e14ffe41fbb22edb614b24f3985f379766a2)
- Updated docs [skip ci] [`2b9ea11`](https://github.com/RhetTbull/osxphotos/commit/2b9ea11701799af9a661a8e2af70fca97235f487)
- Updated tests for #561 [skip ci] [`77a49a0`](https://github.com/RhetTbull/osxphotos/commit/77a49a09a1bee74113a7114c543fbc25fa410ffc)
#### [v0.45.4](https://github.com/RhetTbull/osxphotos/compare/v0.45.3...v0.45.4)
> 3 February 2022
- docs: add oPromessa as a contributor for ideas, test [`#611`](https://github.com/RhetTbull/osxphotos/pull/611)
- Fix for filenames with special characters, #561, #618 [`f3063d3`](https://github.com/RhetTbull/osxphotos/commit/f3063d35be3c96342d83dbd87ddd614a2001bff4)
- Updated docs [skip ci] [`06c5bbf`](https://github.com/RhetTbull/osxphotos/commit/06c5bbfcfdf591a4a5d43f1456adaa27385fe01a)
- Added progress counter, #601 [`7ab5007`](https://github.com/RhetTbull/osxphotos/commit/7ab500740b28594dcd778140e10991f839220e9d)
- Updated known issues [skip ci] [`e32090b`](https://github.com/RhetTbull/osxphotos/commit/e32090bf39cb786171b49443f878ffdbab774420)
#### [v0.45.3](https://github.com/RhetTbull/osxphotos/compare/v0.45.2...v0.45.3)
> 29 January 2022
- Added --timestamp option for --verbose, #600 [`d8c2f99`](https://github.com/RhetTbull/osxphotos/commit/d8c2f99c06bc6f72bf2cb1a13c5765824fe3cbba)
- Updated docs [skip ci] [`5fc2813`](https://github.com/RhetTbull/osxphotos/commit/5fc28139ea0374bc3e228c0432b8a41ada430389)
- Updated formatting for elapsed time, #604 [`16d3f74`](https://github.com/RhetTbull/osxphotos/commit/16d3f743664396d43b3b3028a5e7a919ec56d9e1)
#### [v0.45.2](https://github.com/RhetTbull/osxphotos/compare/v0.45.0...v0.45.2)
> 29 January 2022
- Implemented #605, refactor out export2 [`235dea3`](https://github.com/RhetTbull/osxphotos/commit/235dea329c98ab8fa61565c09a1b4a83e5d99043)
- Fix for #564, --preview with --download-missing [`5afdf6f`](https://github.com/RhetTbull/osxphotos/commit/5afdf6fc20a3cb6eb2b0217d8b3be20295eb7ba4)
#### [v0.45.0](https://github.com/RhetTbull/osxphotos/compare/v0.44.13...v0.45.0)
> 28 January 2022
- Performance improvements and refactoring, #462, partial for #591 [`22964af`](https://github.com/RhetTbull/osxphotos/commit/22964afc6988166218413125d7a62348bb858a83)
- Refactored photoexporter for performance, #591 [`6843b86`](https://github.com/RhetTbull/osxphotos/commit/6843b8661d41d42368794c77304fc07194e7af18)
- Performance improvements, partial for #591 [`3bc53fd`](https://github.com/RhetTbull/osxphotos/commit/3bc53fd92b3222c6959e7aa12310811db41b83fe)
#### [v0.44.13](https://github.com/RhetTbull/osxphotos/compare/v0.44.12...v0.44.13)
> 24 January 2022
- Removed exportdb requirement from PhotoTemplate [`6af124e`](https://github.com/RhetTbull/osxphotos/commit/6af124e4d3a0e26c48f435452920020cd42afa1c)
- Version bump [`bd31120`](https://github.com/RhetTbull/osxphotos/commit/bd3112056920806f565be2c0c12caf4f2aff5231)
#### [v0.44.12](https://github.com/RhetTbull/osxphotos/compare/v0.44.11...v0.44.12)
> 23 January 2022
- Added query options to repl, #597 [`7855801`](https://github.com/RhetTbull/osxphotos/commit/785580115b29f5ccb895de22be1243f56dbb43dc)
- Added run command, #598 [`b4bd04c`](https://github.com/RhetTbull/osxphotos/commit/b4bd04c1461d0b427937f541403305bc979bcf4f)
- Bug fix for get_photos_library_version [`e88c6b8`](https://github.com/RhetTbull/osxphotos/commit/e88c6b8a59dfd947f6cf3c7eac9c92519ab781a3)
#### [v0.44.11](https://github.com/RhetTbull/osxphotos/compare/v0.44.10...v0.44.11)
> 23 January 2022
- creat unit test for __all__ [`#599`](https://github.com/RhetTbull/osxphotos/pull/599)
- Performance improvements, added --profile [`7486823`](https://github.com/RhetTbull/osxphotos/commit/74868238f3b1ee18feb744f137f5c14ef8e36ffc)
#### [v0.44.10](https://github.com/RhetTbull/osxphotos/compare/v0.44.9...v0.44.10)
> 22 January 2022
- Create __all__ for all python files [`#589`](https://github.com/RhetTbull/osxphotos/pull/589)
- Create __all__ for the file cli.py [`#587`](https://github.com/RhetTbull/osxphotos/pull/587)
- docs: add xwu64 as a contributor for code [`#585`](https://github.com/RhetTbull/osxphotos/pull/585)
- add __all__ to files "adjustmentsinfo.py" and "albuminfo.py" [`#584`](https://github.com/RhetTbull/osxphotos/pull/584)
- More refactoring of export code, #462 [`6261a7b`](https://github.com/RhetTbull/osxphotos/commit/6261a7b5c96ac43aece66b72b9e27a90854accfa)
- Added ExportOptions to photoexporter.py, #462 [`9517876`](https://github.com/RhetTbull/osxphotos/commit/9517876bd06572238648a6362a309063b86007e7)
- Blackified files [`3bafdf7`](https://github.com/RhetTbull/osxphotos/commit/3bafdf7bfd5f7992b2e0c12496c55e7be1f57455)
- More refactoring of export code, #462 [`c2d726b`](https://github.com/RhetTbull/osxphotos/commit/c2d726beafabe76cf4d5fb3213447c900129b8c0)
- Refactored photoexporter sidecar writing, #462 [`458da0e`](https://github.com/RhetTbull/osxphotos/commit/458da0e9b2b82a78cec30191c5bf1ee2ed993acf)
#### [v0.44.9](https://github.com/RhetTbull/osxphotos/compare/v0.44.8...v0.44.9)
> 9 January 2022
- Added diff command [`3927f05`](https://github.com/RhetTbull/osxphotos/commit/3927f052670b2a1c31cced1f8278a0ffe519a3eb)
- Added uuid command [`a010ab5`](https://github.com/RhetTbull/osxphotos/commit/a010ab5a299470782b938e689a7ddc336513065e)
#### [v0.44.8](https://github.com/RhetTbull/osxphotos/compare/v0.44.7...v0.44.8)
> 9 January 2022

View File

@@ -1,7 +1,7 @@
include README.md
include README.rst
include osxphotos/templates/*
include osxphotos/*.json
include osxphotos/*.md
include osxphotos/phototemplate.tx
include osxphotos/phototemplate.md
include osxphotos/tutorial.md
include osxphotos/queries/*
include osxphotos/queries/*
include osxphotos/templates/*
include README.md
include README.rst

README.md
View File

@@ -38,6 +38,7 @@ OSXPhotos provides the ability to interact with and query Apple's Photos.app lib
+ [Raw Photos](#raw-photos)
+ [Template System](#template-system)
+ [ExifTool](#exiftoolExifTool)
+ [PhotoExporter](#photoexporter)
+ [Text Detection](#textdetection)
+ [Utility Functions](#utility-functions)
* [Examples](#examples)
@@ -600,6 +601,7 @@ Options:
library, 2. system library, 3.
~/Pictures/Photos Library.photoslibrary
-V, --verbose Print verbose output.
--timestamp Add time stamp to verbose output
--keyword KEYWORD Search for photos with keyword KEYWORD. If
more than one keyword, treated as "OR", e.g.
find photos matching any keyword
@@ -1723,7 +1725,7 @@ Substitution Description
{lf} A line feed: '\n', alias for {newline}
{cr} A carriage return: '\r'
{crlf} a carriage return + line feed: '\r\n'
{osxphotos_version} The osxphotos version, e.g. '0.44.10'
{osxphotos_version} The osxphotos version, e.g. '0.45.8'
{osxphotos_cmd_line} The full command line used to run osxphotos
The following substitutions may result in multiple values. Thus if specified for
@@ -3627,7 +3629,7 @@ The following template field substitutions are availabe for use the templating s
|{lf}|A line feed: '\n', alias for {newline}|
|{cr}|A carriage return: '\r'|
|{crlf}|a carriage return + line feed: '\r\n'|
|{osxphotos_version}|The osxphotos version, e.g. '0.44.10'|
|{osxphotos_version}|The osxphotos version, e.g. '0.45.8'|
|{osxphotos_cmd_line}|The full command line used to run osxphotos|
|{album}|Album(s) photo is contained in|
|{folder_album}|Folder path + album photo is contained in. e.g. 'Folder/Subfolder/Album' or just 'Album' if no enclosing folder|
@@ -3711,6 +3713,105 @@ osxphotos.exiftool also provides an `ExifToolCaching` class which caches all met
`ExifTool()` runs `exiftool` as a subprocess using the `-stay_open True` flag to keep the process running in the background. The subprocess will be cleaned up when your main script terminates. `ExifTool()` uses a singleton pattern to ensure that only one instance of `exiftool` is created. Multiple instances of `ExifTool()` will all use the same `exiftool` subprocess.
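For illustration, here is a minimal sketch of reading metadata with `ExifTool` and `ExifToolCaching`. The `asdict()` accessor and constructor usage shown are assumptions based on the interface described above; check `osxphotos.exiftool` for the exact API in your version.
```python
# Minimal sketch (assumed interface): read metadata via osxphotos.exiftool.
# ExifTool shares a single background exiftool subprocess; ExifToolCaching
# additionally caches results so repeated reads don't re-run exiftool.
from osxphotos.exiftool import ExifTool, ExifToolCaching

exif = ExifTool("/path/to/photo.jpg")            # path is hypothetical
metadata = exif.asdict()                         # asdict() assumed: dict of tag -> value
print(metadata.get("EXIF:DateTimeOriginal"))

cached = ExifToolCaching("/path/to/photo.jpg")
print(cached.asdict().get("XMP:Title"))          # a second asdict() call would hit the cache
```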
### <a name="photoexporter">PhotoExporter</a>
[PhotoInfo.export()](#photoinfo) provides a simple method to export a photo. This method actually calls `PhotoExporter.export()` to do the export. `PhotoExporter` provides many more options to configure the export and report results; it is what the osxphotos command line export tool uses.
#### `export(dest, filename=None, options: Optional[ExportOptions]=None) -> ExportResults`
Export a photo.
Args:
- dest: must be a valid destination path or an exception is raised
- filename (optional): name of the exported picture; if not provided, the current filename is used
- options (ExportOptions): optional ExportOptions instance
Returns: ExportResults instance
*Note*: to use dry run mode, you must set options.dry_run=True and also pass an in-memory export database and a no-op fileutil (e.g. ExportDBInMemory and FileUtilNoOp) in options.export_db and options.fileutil respectively.
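To illustrate the note above, here is a hedged sketch of a dry-run export. The `ExportDBInMemory` constructor argument is an assumption (consult `osxphotos.export_db` for the exact signature); `FileUtilNoOp` is passed as a class per the `fileutil` description below.
```python
# Sketch of a dry-run export; assumptions are marked in comments.
import osxphotos
from osxphotos import ExportDBInMemory, ExportOptions, FileUtilNoOp, PhotoExporter

photosdb = osxphotos.PhotosDB()
photo = photosdb.photos()[0]

options = ExportOptions(
    dry_run=True,                              # don't actually write any files
    export_db=ExportDBInMemory("export.db"),   # in-memory export DB; constructor arg is an assumption
    fileutil=FileUtilNoOp,                     # no-op file utilities
    verbose=print,                             # print verbose messages during export
)
results = PhotoExporter(photo).export("/tmp/export", options=options)
print(results.exported)                        # files that would have been exported
```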
#### `ExportOptions`
Options class for exporting photos with `export`
Attributes:
- convert_to_jpeg (bool): if True, converts non-jpeg images to jpeg
- description_template (str): optional template string that will be rendered for use as photo description
- download_missing: (bool, default=False): if True will attempt to export photo via applescript interaction with Photos if missing (see also use_photokit, use_photos_export)
- dry_run: (bool, default=False): set to True to run in "dry run" mode
- edited: (bool, default=False): if True will export the edited version of the photo otherwise exports the original version
- exiftool_flags (list of str): optional list of flags to pass to exiftool when using exiftool option, e.g. ["-m", "-F"]
- exiftool: (bool, default = False): if True, will use exiftool to write metadata to export file
- export_as_hardlink: (bool, default=False): if True, will hardlink files instead of copying them
- export_db: (ExportDB_ABC): instance of a class that conforms to ExportDB_ABC with methods for getting/setting data related to exported files to compare update state
- fileutil: (FileUtilABC): class that conforms to FileUtilABC with various file utilities
- ignore_date_modified (bool): for use with sidecar and exiftool; if True, sets EXIF:ModifyDate to EXIF:DateTimeOriginal even if date_modified is set
- ignore_signature (bool, default=False): ignore file signature when used with update (look only at filename)
- increment (bool, default=True): if True, will increment the file name until a non-existent name is found; if overwrite=False and increment=False, export will fail if the destination file already exists
- jpeg_ext (str): if set, will use this value for extension on jpegs converted to jpeg with convert_to_jpeg; if not set, uses jpeg; do not include the leading "."
- jpeg_quality (float in range 0.0 <= jpeg_quality <= 1.0): a value of 1.0 specifies best quality; a value of 0.0 specifies maximum compression.
- keyword_template (list of str): list of template strings that will be rendered and used as keywords
- live_photo (bool, default=False): if True, will also export the associated .mov for live photos
- location (bool): if True, include location in exported metadata
- merge_exif_keywords (bool): if True, merges keywords found in the file's EXIF data (requires exiftool)
- merge_exif_persons (bool): if True, merges persons found in the file's EXIF data (requires exiftool)
- overwrite (bool, default=False): if True will overwrite files if they already exist
- persons (bool): if True, include persons in exported metadata
- preview_suffix (str): optional string to append to end of filename for preview images
- preview (bool): if True, also exports preview image
- raw_photo (bool, default=False): if True, will also export the associated RAW photo
- render_options (RenderOptions): optional osxphotos.phototemplate.RenderOptions instance to specify options for rendering templates
- replace_keywords (bool): if True, keyword_template replaces any keywords, otherwise it's additive
- sidecar_drop_ext (bool, default=False): if True, drops the photo's extension from sidecar filename (e.g. 'IMG_1234.json' instead of 'IMG_1234.JPG.json')
- sidecar: bit field (int): set to one or more of SIDECAR_XMP, SIDECAR_JSON, SIDECAR_EXIFTOOL (values may be combined; see the sketch after this list)
- SIDECAR_JSON: if set, will write a JSON sidecar with data in a format readable by exiftool; sidecar filename will be dest/filename.json; includes exiftool tag group names (e.g. `exiftool -G -j`)
- SIDECAR_EXIFTOOL: if set, will write a JSON sidecar with data in a format readable by exiftool; sidecar filename will be dest/filename.json; does not include exiftool tag group names (e.g. `exiftool -j`)
- SIDECAR_XMP: if set, will write an XMP sidecar with IPTC data; sidecar filename will be dest/filename.xmp
- strip (bool): if True, strip whitespace from rendered templates
- timeout (int, default=120): timeout in seconds used with use_photos_export
- touch_file (bool, default=False): if True, sets the file's modification time to the photo's date
- update (bool, default=False): if True export will run in update mode, that is, it will not export the photo if the current version already exists in the destination
- use_albums_as_keywords (bool, default = False): if True, will include album names in keywords when exporting metadata with exiftool or sidecar
- use_persons_as_keywords (bool, default = False): if True, will include person names in keywords when exporting metadata with exiftool or sidecar
- use_photos_export (bool, default=False): if True will attempt to export photo via applescript interaction with Photos even if not missing (see also use_photokit, download_missing)
- use_photokit (bool, default=False): if True, will use photokit to export photos when use_photos_export is True
- verbose (Callable): optional callable function to use for printing verbose text during processing; if None (default), does not print output.
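As referenced in the `sidecar` item above, here is a short sketch combining several of these options. The SIDECAR_* constants are imported from `osxphotos._constants` as the osxphotos CLI does; treat the exact import path as an assumption for your version.
```python
# Sketch: combine sidecar flags (a bit field) with a few other common options.
from osxphotos import ExportOptions
from osxphotos._constants import SIDECAR_JSON, SIDECAR_XMP  # import path as used by the CLI

options = ExportOptions(
    sidecar=SIDECAR_JSON | SIDECAR_XMP,  # write both a JSON and an XMP sidecar
    convert_to_jpeg=True,                # convert non-JPEG originals to JPEG
    jpeg_quality=0.9,                    # 0.0 = max compression ... 1.0 = best quality
    exiftool=True,                       # write metadata to the exported file with exiftool
    overwrite=True,                      # overwrite existing files at the destination
)
```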
#### `ExportResults`
`PhotoExporter().export()` returns an instance of this class.
`ExportResults` has the following properties:
- exported: list of all exported files (A single call to export could export more than one file, e.g. original file, preview, live video, raw, etc.)
- new: list of new files exported when used with update=True
- updated: list of updated files when used with update=True
- skipped: list of skipped files when used with update=True
- exif_updated: list of updated files when used with update=True and exiftool
- touched: list of files touched during export (e.g. file date/time updated with touch_file=True)
- to_touch: Reserved for internal use of export
- converted_to_jpeg: list of files converted to jpeg when convert_to_jpeg=True
- sidecar_json_written: list of JSON sidecars written
- sidecar_json_skipped: list of JSON sidecars skipped when update=True
- sidecar_exiftool_written: list of exiftool sidecars written
- sidecar_exiftool_skipped: list of exiftool sidecars skipped when update=True
- sidecar_xmp_written: list of XMP sidecars written
- sidecar_xmp_skipped: list of XMP sidecars skipped when update=True
- missing: list of missing files
- error: list of tuples containing (filename, error) if error generated during export
- exiftool_warning: list of warnings generated by exiftool during export
- exiftool_error: list of errors generated by exiftool during export
- xattr_written: list of files with extended attributes written during export
- xattr_skipped: list of files where extended attributes were skipped when update=True
- deleted_files: reserved for use by osxphotos CLI
- deleted_directories: reserved for use by osxphotos CLI
- exported_album: reserved for use by osxphotos CLI
- skipped_album: reserved for use by osxphotos CLI
- missing_album: reserved for use by osxphotos CLI
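A short sketch of inspecting a few of these fields after an export (assuming default `ExportOptions` are sufficient for a basic export):
```python
# Sketch: run a basic export and report results from the ExportResults instance.
import osxphotos
from osxphotos import ExportOptions, PhotoExporter

photosdb = osxphotos.PhotosDB()
photo = photosdb.photos()[0]
results = PhotoExporter(photo).export("/tmp/export", options=ExportOptions())

for path in results.exported:
    print(f"exported: {path}")
for path, err in results.error:          # error entries are (filename, error) tuples
    print(f"error exporting {path}: {err}")
if results.missing:
    print(f"missing originals: {results.missing}")
```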
### <a name="textdetection">Text Detection</a>
The [PhotoInfo.detected_text()](#detected_text_method) and the `{detected_text}` template will perform text detection on the photos in your library. Text detection is a slow process so to avoid unnecessary re-processing of photos, osxphotos will cache the results of the text detection process as an extended attribute on the photo image file. Extended attributes do not modify the actual file. The extended attribute is named `osxphotos.metadata:detected_text` and can be viewed using the built-in [xattr](https://ss64.com/osx/xattr.html) command or my [osxmetadata](https://github.com/RhetTbull/osxmetadata) tool. If you want to remove the cached attribute, you can do so with osxmetadata as follows:
@@ -3847,7 +3948,7 @@ Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/d
<td align="center"><a href="https://github.com/mkirkland4874"><img src="https://avatars.githubusercontent.com/u/36466711?v=4?s=75" width="75px;" alt=""/><br /><sub><b>mkirkland4874</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3Amkirkland4874" title="Bug reports">🐛</a> <a href="#example-mkirkland4874" title="Examples">💡</a></td>
<td align="center"><a href="https://github.com/jcommisso07"><img src="https://avatars.githubusercontent.com/u/3111054?v=4?s=75" width="75px;" alt=""/><br /><sub><b>Joseph Commisso</b></sub></a><br /><a href="#data-jcommisso07" title="Data">🔣</a></td>
<td align="center"><a href="https://github.com/dssinger"><img src="https://avatars.githubusercontent.com/u/1817903?v=4?s=75" width="75px;" alt=""/><br /><sub><b>David Singer</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3Adssinger" title="Bug reports">🐛</a></td>
<td align="center"><a href="https://github.com/oPromessa"><img src="https://avatars.githubusercontent.com/u/21261491?v=4?s=75" width="75px;" alt=""/><br /><sub><b>oPromessa</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3AoPromessa" title="Bug reports">🐛</a></td>
<td align="center"><a href="https://github.com/oPromessa"><img src="https://avatars.githubusercontent.com/u/21261491?v=4?s=75" width="75px;" alt=""/><br /><sub><b>oPromessa</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3AoPromessa" title="Bug reports">🐛</a> <a href="#ideas-oPromessa" title="Ideas, Planning, & Feedback">🤔</a> <a href="https://github.com/RhetTbull/osxphotos/commits?author=oPromessa" title="Tests">⚠️</a></td>
<td align="center"><a href="http://spencerchang.me"><img src="https://avatars.githubusercontent.com/u/14796580?v=4?s=75" width="75px;" alt=""/><br /><sub><b>Spencer Chang</b></sub></a><br /><a href="https://github.com/RhetTbull/osxphotos/issues?q=author%3Aspencerc99" title="Bug reports">🐛</a></td>
</tr>
<tr>
@@ -3873,7 +3974,6 @@ My goal is make osxphotos as reliable and comprehensive as possible. The test s
- Audio-only files are not handled. It is possible to store audio-only files in Photos. osxphotos currently only handles images and videos. See [Issue #436](https://github.com/RhetTbull/osxphotos/issues/436)
- Face coordinates (mouth, left eye, right eye) may not be correct for images where the head is tilted. See [Issue #196](https://github.com/RhetTbull/osxphotos/issues/196).
- Raw images imported to Photos with an associated jpeg preview are not handled correctly by osxphotos. osxphotos query and export will operate on the jpeg preview instead of the raw image as will `PhotoInfo.path`. If the user selects "Use RAW as original" in Photos, the raw image will be exported or operated on but the jpeg will be ignored. See [Issue #101](https://github.com/RhetTbull/osxphotos/issues/101). Note: Beta version of fix for this bug is implemented in the current version of osxphotos.
- The `--download-missing` option for `osxphotos export` does not work correctly with burst images. It will download the primary image but not the other burst images. See [Issue #75](https://github.com/RhetTbull/osxphotos/issues/75).
## Implementation Notes

View File

@@ -1,4 +1,4 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
config: 716e0bf3a38e2e923691f2db50ed7ba7
config: bf43bf49b725c31ce72a8823e4f8012b
tags: 645f666f9bcd5a90fca523b33c5a78b7

View File

@@ -1,6 +1,6 @@
var DOCUMENTATION_OPTIONS = {
URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'),
VERSION: '0.44.10',
VERSION: '0.45.8',
LANGUAGE: 'None',
COLLAPSE_INDEX: false,
BUILDER: 'html',

View File

@@ -6,7 +6,7 @@
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" /><meta name="generator" content="Docutils 0.17.1: http://docutils.sourceforge.net/" />
<title>osxphotos command line interface (CLI) &#8212; osxphotos 0.44.10 documentation</title>
<title>osxphotos command line interface (CLI) &#8212; osxphotos 0.45.8 documentation</title>
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>

View File

@@ -5,7 +5,7 @@
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Index &#8212; osxphotos 0.44.10 documentation</title>
<title>Index &#8212; osxphotos 0.45.8 documentation</title>
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>

View File

@@ -6,7 +6,7 @@
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" /><meta name="generator" content="Docutils 0.17.1: http://docutils.sourceforge.net/" />
<title>Welcome to osxphotos's documentation! &#8212; osxphotos 0.44.10 documentation</title>
<title>Welcome to osxphotos's documentation! &#8212; osxphotos 0.45.8 documentation</title>
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>

View File

@@ -6,7 +6,7 @@
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" /><meta name="generator" content="Docutils 0.17.1: http://docutils.sourceforge.net/" />
<title>osxphotos &#8212; osxphotos 0.44.10 documentation</title>
<title>osxphotos &#8212; osxphotos 0.45.8 documentation</title>
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>

View File

@@ -6,7 +6,7 @@
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" /><meta name="generator" content="Docutils 0.17.1: http://docutils.sourceforge.net/" />
<title>osxphotos package &#8212; osxphotos 0.44.10 documentation</title>
<title>osxphotos package &#8212; osxphotos 0.45.8 documentation</title>
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>

View File

@@ -5,7 +5,7 @@
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Search &#8212; osxphotos 0.44.10 documentation</title>
<title>Search &#8212; osxphotos 0.45.8 documentation</title>
<link rel="stylesheet" type="text/css" href="_static/pygments.css" />
<link rel="stylesheet" type="text/css" href="_static/alabaster.css" />

View File

@@ -14,6 +14,7 @@ datas = [
("osxphotos/phototemplate.tx", "osxphotos"),
("osxphotos/phototemplate.md", "osxphotos"),
("osxphotos/tutorial.md", "osxphotos"),
("osxphotos/exiftool_filetypes.json", "osxphotos"),
]
package_imports = [["photoscript", ["photoscript.applescript"]]]
for package, files in package_imports:

View File

@@ -1,13 +1,45 @@
from ._constants import AlbumSortOrder
from ._version import __version__
from .exiftool import ExifTool
from .photoexporter import ExportResults, PhotoExporter
from .export_db import ExportDB, ExportDBInMemory, ExportDBNoOp
from .fileutil import FileUtil, FileUtilNoOp
from .momentinfo import MomentInfo
from .personinfo import PersonInfo
from .photoexporter import ExportOptions, ExportResults, PhotoExporter
from .photoinfo import PhotoInfo
from .photosdb import PhotosDB
from .photosdb._photosdb_process_comments import CommentInfo, LikeInfo
from .phototemplate import PhotoTemplate
from .placeinfo import PlaceInfo
from .queryoptions import QueryOptions
from .scoreinfo import ScoreInfo
from .searchinfo import SearchInfo
from .utils import _debug, _get_logger, _set_debug
# TODO: Add test for imageTimeZoneOffsetSeconds = None
# TODO: Add special albums and magic albums
__all__ = [
"__version__",
"_debug",
"_get_logger",
"_set_debug",
"AlbumSortOrder",
"CommentInfo",
"ExifTool",
"ExportDB",
"ExportDBInMemory",
"ExportDBNoOp",
"ExportOptions",
"ExportResults",
"FileUtil",
"FileUtilNoOp",
"LikeInfo",
"MomentInfo",
"PersonInfo",
"PhotoExporter",
"PhotoInfo",
"PhotosDB",
"PhotoTemplate",
"PlaceInfo",
"QueryOptions",
"ScoreInfo",
"SearchInfo",
]

View File

@@ -305,3 +305,21 @@ class AlbumSortOrder(Enum):
TEXT_DETECTION_CONFIDENCE_THRESHOLD = 0.75
# stat sort order for cProfile: https://docs.python.org/3/library/profile.html#pstats.Stats.sort_stats
PROFILE_SORT_KEYS = [
"calls",
"cumulative",
"cumtime",
"file",
"filename",
"module",
"ncalls",
"pcalls",
"line",
"name",
"nfl",
"stdname",
"time",
"tottime",
]

View File

@@ -1,3 +1,3 @@
""" version info """
__version__ = "0.44.10"
__version__ = "0.45.8"

View File

@@ -1,19 +1,25 @@
"""Command line interface for osxphotos """
import atexit
import code
import cProfile
import csv
import dataclasses
import datetime
import io
import json
import os
import os.path
import pathlib
import pprint
import pstats
import shlex
import shutil
import subprocess
import sys
import time
from runpy import run_module
from runpy import run_module, run_path
from typing import Dict, List
import bitmath
import click
@@ -43,6 +49,7 @@ from ._constants import (
OSXPHOTOS_EXPORT_DB,
OSXPHOTOS_URL,
POST_COMMAND_CATEGORIES,
PROFILE_SORT_KEYS,
SIDECAR_EXIFTOOL,
SIDECAR_JSON,
SIDECAR_XMP,
@@ -63,13 +70,19 @@ from .photoexporter import ExportOptions, ExportResults, PhotoExporter
from .photoinfo import PhotoInfo
from .photokit import check_photokit_authorization, request_photokit_authorization
from .photosalbum import PhotosAlbum
from .photosdb import PhotosDB
from .photosdb.photosdb_utils import get_photos_library_version
from .phototemplate import PhotoTemplate, RenderOptions
from .pyrepl import embed_repl
from .queryoptions import QueryOptions
from .sqlgrep import sqlgrep
from .uti import get_preferred_uti_extension
from .utils import expand_and_validate_filepath, load_function, normalize_fs_path
from .utils import (
expand_and_validate_filepath,
format_sec_to_hhmmss,
load_function,
normalize_fs_path,
)
__all__ = [
"verbose_",
@@ -79,7 +92,7 @@ __all__ = [
"TimeISO8601",
"FunctionCall",
"CLI_Obj",
"deleted_options",
"DELETED_OPTIONS",
"QUERY_OPTIONS",
"cli",
"export",
@@ -120,6 +133,10 @@ __all__ = [
# global variable to control verbose output
# set via --verbose/-V
VERBOSE = False
VERBOSE_TIMESTAMP = False
# used to show/hide hidden commands
OSXPHOTOS_HIDDEN = not bool(os.getenv("OSXPHOTOS_SHOW_HIDDEN", default=False))
# used by snap and diff commands
OSXPHOTOS_SNAPSHOT_DIR = "/private/tmp/osxphotos_snapshots"
@@ -131,8 +148,10 @@ def verbose_(*args, **kwargs):
"""print output if verbose flag set"""
if VERBOSE:
styled_args = []
timestamp = str(datetime.datetime.now()) + " -- " if VERBOSE_TIMESTAMP else ""
for arg in args:
if type(arg) == str:
arg = timestamp + arg
if "error" in arg.lower():
arg = click.style(arg, fg=CLI_COLOR_ERROR)
elif "warning" in arg.lower():
@@ -247,6 +266,10 @@ class FunctionCall(click.ParamType):
return (function, value)
class IncompatibleQueryOptions(Exception):
pass
# Click CLI object & context settings
class CLI_Obj:
def __init__(self, db=None, json=False, debug=False):
@@ -284,7 +307,7 @@ JSON_OPTION = click.option(
)
def deleted_options(f):
def DELETED_OPTIONS(f):
o = click.option
options = [
o(
@@ -645,7 +668,9 @@ def QUERY_OPTIONS(f):
@click.group(context_settings=CTX_SETTINGS)
@DB_OPTION
@JSON_OPTION
@click.option("--debug", required=False, is_flag=True, default=False, hidden=True)
@click.option(
"--debug", required=False, is_flag=True, default=False, hidden=OSXPHOTOS_HIDDEN
)
@click.version_option(__version__, "--version", "-v")
@click.pass_context
def cli(ctx, db, json_, debug):
@@ -655,13 +680,14 @@ def cli(ctx, db, json_, debug):
@cli.command(cls=ExportCommand)
@DB_OPTION
@click.option("--verbose", "-V", "verbose", is_flag=True, help="Print verbose output.")
@click.option("--timestamp", is_flag=True, help="Add time stamp to verbose output")
@QUERY_OPTIONS
@click.option(
"--missing",
is_flag=True,
help="Export only photos missing from the Photos library; must be used with --download-missing.",
)
@deleted_options
@DELETED_OPTIONS
@click.option(
"--update",
is_flag=True,
@@ -1144,7 +1170,32 @@ def cli(ctx, db, json_, debug):
type=click.Path(),
)
@click.option(
"--beta", is_flag=True, default=False, hidden=True, help="Enable beta options."
"--beta",
is_flag=True,
default=False,
hidden=OSXPHOTOS_HIDDEN,
help="Enable beta options.",
)
@click.option(
"--profile",
is_flag=True,
default=False,
hidden=OSXPHOTOS_HIDDEN,
help="Run export with code profiler.",
)
@click.option(
"--profile-sort",
default=None,
hidden=OSXPHOTOS_HIDDEN,
multiple=True,
metavar="SORT_KEY",
type=click.Choice(
PROFILE_SORT_KEYS,
case_sensitive=True,
),
help="Sort profiler output by SORT_KEY as specified at https://docs.python.org/3/library/profile.html#pstats.Stats.sort_stats. "
f"Can be specified multiple times. Valid options are: {PROFILE_SORT_KEYS}. "
"Default = 'cumulative'.",
)
@DB_ARGUMENT
@click.argument("dest", nargs=1, type=click.Path(exists=True))
@@ -1181,6 +1232,7 @@ def export(
from_time,
to_time,
verbose,
timestamp,
missing,
update,
ignore_signature,
@@ -1284,6 +1336,8 @@ def export(
preview,
preview_suffix,
preview_if_missing,
profile,
profile_sort,
):
"""Export photos from the Photos database.
Export path DEST is required.
@@ -1297,6 +1351,24 @@ def export(
to modify this behavior.
"""
if profile:
click.echo("Profiling...")
profile_sort = profile_sort or ["cumulative"]
click.echo(f"Profile sort_stats order: {profile_sort}")
pr = cProfile.Profile()
pr.enable()
def at_exit():
pr.disable()
click.echo("Profiling completed")
s = io.StringIO()
pstats.Stats(pr, stream=s).strip_dirs().sort_stats(
*profile_sort
).print_stats()
click.echo(s.getvalue())
atexit.register(at_exit)
# NOTE: because of the way ConfigOptions works, Click options must not
# set defaults which are not None or False. If defaults need to be set
# do so below after load_config and save_config are handled.
@@ -1307,7 +1379,9 @@ def export(
)
global VERBOSE
global VERBOSE_TIMESTAMP
VERBOSE = bool(verbose)
VERBOSE_TIMESTAMP = timestamp
if load_config:
try:
@@ -1603,7 +1677,9 @@ def export(
if any([exiftool, exiftool_merge_keywords, exiftool_merge_persons]):
verbose_(f"exiftool path: {exiftool_path}")
photos = movies = True # default searches for everything
# default searches for everything
photos = True
movies = True
if only_movies:
photos = False
if only_photos:
@@ -1820,10 +1896,12 @@ def export(
else None
)
photo_num = 0
# send progress bar output to /dev/null if verbose to hide the progress bar
fp = open(os.devnull, "w") if verbose else None
with click.progressbar(photos, file=fp) as bar:
with click.progressbar(photos, show_pos=True, file=fp) as bar:
for p in bar:
photo_num += 1
export_results = export_photo(
photo=p,
dest=dest,
@@ -1870,6 +1948,8 @@ def export(
export_preview=preview,
preview_suffix=preview_suffix,
preview_if_missing=preview_if_missing,
photo_num=photo_num,
num_photos=num_photos,
)
if post_function:
@@ -1963,7 +2043,6 @@ def export(
finder_tag_template=finder_tag_template,
strip=strip,
export_dir=dest,
export_db=export_db,
)
results.xattr_written.extend(tags_written)
results.xattr_skipped.extend(tags_skipped)
@@ -1975,7 +2054,6 @@ def export(
xattr_template,
strip=strip,
export_dir=dest,
export_db=export_db,
)
results.xattr_written.extend(xattr_written)
results.xattr_skipped.extend(xattr_skipped)
@@ -2003,7 +2081,7 @@ def export(
summary += f", touched date: {len(results.touched)}"
click.echo(summary)
stop_time = time.perf_counter()
click.echo(f"Elapsed time: {(stop_time-start_time):.3f} seconds")
click.echo(f"Elapsed time: {format_sec_to_hhmmss(stop_time-start_time)}")
else:
click.echo("Did not find any photos to export")
@@ -2046,6 +2124,18 @@ def export(
export_db.close()
def _export_with_profiler(args: Dict):
""" "Run export with cProfile"""
try:
args.pop("profile")
except KeyError:
pass
cProfile.runctx(
"_export(**args)", globals=globals(), locals=locals(), sort="tottime"
)
@cli.command()
@click.argument("topic", default=None, required=False, nargs=1)
@click.pass_context
@@ -2065,7 +2155,7 @@ def help(ctx, topic, **kw):
@DB_OPTION
@JSON_OPTION
@QUERY_OPTIONS
@deleted_options
@DELETED_OPTIONS
@click.option("--missing", is_flag=True, help="Search for photos missing from disk.")
@click.option(
"--not-missing",
@@ -2254,7 +2344,9 @@ def query(
return
# actually have something to query
photos = movies = True # default searches for everything
# default searches for everything
photos = True
movies = True
if only_movies:
photos = False
if only_photos:
@@ -2536,6 +2628,8 @@ def export_photo(
export_preview=False,
preview_suffix=None,
preview_if_missing=False,
photo_num=1,
num_photos=1,
):
"""Helper function for export that does the actual export
@@ -2580,6 +2674,8 @@ def export_photo(
export_preview: export the preview image generated by Photos
preview_suffix: str, template to use as suffix for preview images
preview_if_missing: bool, export preview if original is missing
photo_num: int, which number photo in total of num_photos is being exported
num_photos: int, total number of photos that will be exported
Returns:
list of path(s) of exported photo or None if photo was missing
@@ -2701,7 +2797,7 @@ def export_photo(
original_filename = str(original_filename)
verbose_(
f"Exporting {photo.original_filename} ({photo.filename}) as {original_filename}"
f"Exporting {photo.original_filename} ({photo.filename}) as {original_filename} ({photo_num}/{num_photos})"
)
results += export_photo_to_directory(
@@ -2875,7 +2971,7 @@ def _render_suffix_template(
return ""
try:
options = RenderOptions(filename=True, export_dir=dest, exportdb=export_db)
options = RenderOptions(filename=True, export_dir=dest)
rendered_suffix, unmatched = photo.render_template(suffix_template, options)
except ValueError as e:
raise click.BadOptionUsage(
@@ -2945,7 +3041,7 @@ def export_photo_to_directory(
"""Export photo to directory dest_path"""
results = ExportResults()
# TODO: can be updated to let export2 do all the missing logic
# TODO: can be updated to let export do all the missing logic
if export_original:
if missing and not preview_if_missing:
space = " " if not verbose else ""
@@ -2990,9 +3086,7 @@ def export_photo_to_directory(
results.missing.append(str(pathlib.Path(dest_path) / filename))
return results
render_options = RenderOptions(
export_dir=export_dir, dest_path=dest_path, exportdb=export_db
)
render_options = RenderOptions(export_dir=export_dir, dest_path=dest_path)
tries = 0
while tries <= retry:
@@ -3035,7 +3129,7 @@ def export_photo_to_directory(
verbose=verbose_,
)
exporter = PhotoExporter(photo)
export_results = exporter.export2(
export_results = exporter.export(
dest=dest_path, filename=filename, options=export_options
)
for warning_ in export_results.exiftool_warning:
@@ -3132,7 +3226,6 @@ def get_filenames_from_template(
edited_version=edited,
export_dir=export_dir,
dest_path=dest_path,
exportdb=export_db,
)
filenames, unmatched = photo.render_template(filename_template, options)
except ValueError as e:
@@ -3198,9 +3291,7 @@ def get_dirnames_from_template(
elif directory:
# got a directory template, render it and check results are valid
try:
options = RenderOptions(
dirname=True, edited_version=edited, exportdb=export_db
)
options = RenderOptions(dirname=True, edited_version=edited)
dirnames, unmatched = photo.render_template(directory, options)
except ValueError as e:
raise click.BadOptionUsage(
@@ -3457,8 +3548,7 @@ def cleanup_files(dest_path, files_to_keep, fileutil):
deleted_files = []
for p in pathlib.Path(dest_path).rglob("*"):
path = normalize_fs_path(str(p).lower())
if p.is_file() and path not in keepers:
if p.is_file() and normalize_fs_path(str(p).lower()) not in keepers:
verbose_(f"Deleting {p}")
fileutil.unlink(p)
deleted_files.append(str(p))
@@ -3487,7 +3577,6 @@ def write_finder_tags(
finder_tag_template=None,
strip=False,
export_dir=None,
export_db=None,
):
"""Write Finder tags (extended attributes) to files; only writes attributes if attributes on file differ from what would be written
@@ -3501,7 +3590,6 @@ def write_finder_tags(
exiftool_merge_keywords: if True, include any keywords in the exif data of the source image as keywords
finder_tag_template: list of templates to evaluate for determining Finder tags
export_dir: value to use for {export_dir} template
export_db: an ExportDB object
Returns:
(list of file paths that were updated with new Finder tags, list of file paths skipped because Finder tags didn't need updating)
@@ -3533,7 +3621,6 @@ def write_finder_tags(
none_str=_OSXPHOTOS_NONE_SENTINEL,
path_sep="/",
export_dir=export_dir,
exportdb=export_db,
)
rendered, unmatched = photo.render_template(template_str, options)
except ValueError as e:
@@ -3581,7 +3668,6 @@ def write_extended_attributes(
xattr_template,
strip=False,
export_dir=None,
export_db=None,
):
"""Writes extended attributes to exported files
@@ -3589,7 +3675,6 @@ def write_extended_attributes(
photo: a PhotoInfo object
strip: xattr_template: list of tuples: (attribute name, attribute template)
export_dir: value to use for {export_dir} template
exportdb: an ExportDB object
Returns:
tuple(list of file paths that were updated with new attributes, list of file paths skipped because attributes didn't need updating)
@@ -3599,10 +3684,7 @@ def write_extended_attributes(
for xattr, template_str in xattr_template:
try:
options = RenderOptions(
none_str=_OSXPHOTOS_NONE_SENTINEL,
path_sep="/",
export_dir=export_dir,
exportdb=export_db,
none_str=_OSXPHOTOS_NONE_SENTINEL, path_sep="/", export_dir=export_dir
)
rendered, unmatched = photo.render_template(template_str, options)
except ValueError as e:
@@ -3669,9 +3751,7 @@ def run_post_command(
# some categories, like error, return a tuple of (file, error str)
if isinstance(f, tuple):
f = f[0]
render_options = RenderOptions(
export_dir=export_dir, filepath=f, exportdb=export_db
)
render_options = RenderOptions(export_dir=export_dir, filepath=f)
template = PhotoTemplate(photo, exiftool_path=exiftool_path)
command, _ = template.render(command_template, options=render_options)
command = command[0] if command else None
@@ -3942,7 +4022,7 @@ def places(ctx, cli_obj, db, json_, photos_library):
@cli.command()
@DB_OPTION
@JSON_OPTION
@deleted_options
@DELETED_OPTIONS
@DB_ARGUMENT
@click.pass_obj
@click.pass_context
@@ -4144,7 +4224,7 @@ def _load_photos_db(dbpath):
return photosdb
def _get_photos(photosdb):
def _get_all_photos(photosdb):
"""get list of all photos in photosdb"""
photos = photosdb.photos(images=True, movies=True)
photos.extend(photosdb.photos(images=True, movies=True, intrash=True))
@@ -4179,7 +4259,42 @@ def _spotlight_photo(photo: PhotoInfo):
default=False,
help="Launch REPL with Emacs keybindings (default is vi bindings)",
)
def repl(ctx, cli_obj, db, emacs):
@click.option(
"--beta",
is_flag=True,
default=False,
hidden=True,
help="Enable beta options.",
)
@QUERY_OPTIONS
@DELETED_OPTIONS
@click.option("--missing", is_flag=True, help="Search for photos missing from disk.")
@click.option(
"--not-missing",
is_flag=True,
help="Search for photos present on disk (e.g. not missing).",
)
@click.option(
"--cloudasset",
is_flag=True,
help="Search for photos that are part of an iCloud library",
)
@click.option(
"--not-cloudasset",
is_flag=True,
help="Search for photos that are not part of an iCloud library",
)
@click.option(
"--incloud",
is_flag=True,
help="Search for photos that are in iCloud (have been synched)",
)
@click.option(
"--not-incloud",
is_flag=True,
help="Search for photos that are not in iCloud (have not been synched)",
)
def repl(ctx, cli_obj, db, emacs, beta, **kwargs):
"""Run interactive osxphotos REPL shell (useful for debugging, prototyping, and inspecting your Photos library)"""
import logging
@@ -4204,9 +4319,20 @@ def repl(ctx, cli_obj, db, emacs):
print(f"osxphotos version: {osxphotos._version.__version__}")
db = db or get_photos_db()
photosdb = _load_photos_db(db)
# enable beta features if requested
if beta:
photosdb._beta = beta
print("Beta mode enabled")
print("Getting photos")
tic = time.perf_counter()
photos = _get_photos(photosdb)
try:
query_options = _query_options_from_kwargs(**kwargs)
except IncompatibleQueryOptions:
click.echo("Incompatible query options", err=True)
click.echo(cli.commands["repl"].get_help(ctx), err=True)
sys.exit(1)
photos = _query_photos(photosdb, query_options)
all_photos = _get_all_photos(photosdb)
toc = time.perf_counter()
tictoc = toc - tic
@@ -4233,7 +4359,10 @@ def repl(ctx, cli_obj, db, emacs):
print("The following variables are defined:")
print(f"- photosdb: PhotosDB() instance for {photosdb.library_path}")
print(
f"- photos: list of PhotoInfo objects for all photos in photosdb, including those in the trash (len={len(photos)})"
f"- photos: list of PhotoInfo objects for all photos filtered with any query options passed on command line (len={len(photos)})"
)
print(
f"- all_photos: list of PhotoInfo objects for all photos in photosdb, including those in the trash (len={len(all_photos)})"
)
print(
f"- selected: list of PhotoInfo objects for any photos selected in Photos (len={len(selected)})"
@@ -4272,7 +4401,7 @@ def repl(ctx, cli_obj, db, emacs):
)
@cli.command(name="grep", hidden=True)
@cli.command(name="grep", hidden=OSXPHOTOS_HIDDEN)
@DB_OPTION
@click.pass_obj
@click.pass_context
@@ -4318,7 +4447,7 @@ def grep(ctx, cli_obj, db, ignore_case, print_filename, pattern):
print(", ".join([table, column, row_id, value]))
@cli.command(hidden=True)
@cli.command(hidden=OSXPHOTOS_HIDDEN)
@DB_OPTION
@DB_ARGUMENT
@click.option(
@@ -4544,3 +4673,113 @@ def diff(ctx, cli_obj, db, raw_output, style, db2, verbose):
line, "sql", theme=style, line_numbers=False, code_width=1000
)
console.print(syntax)
@cli.command(name="run")
@click.argument("python_file", nargs=1, type=click.Path(exists=True))
def run(python_file):
"""Run a python file using same environment as osxphotos"""
run_path(python_file, run_name="__main__")
def _query_options_from_kwargs(**kwargs) -> QueryOptions:
"""Validate query options and create a QueryOptions instance"""
# sanity check input args
nonexclusive = [
"keyword",
"person",
"album",
"folder",
"name",
"uuid",
"uuid_from_file",
"edited",
"external_edit",
"uti",
"has_raw",
"from_date",
"to_date",
"from_time",
"to_time",
"label",
"is_reference",
"query_eval",
"query_function",
"min_size",
"max_size",
"regex",
"selected",
"exif",
"duplicate",
]
exclusive = [
("favorite", "not_favorite"),
("hidden", "not_hidden"),
("missing", "not_missing"),
("only_photos", "only_movies"),
("burst", "not_burst"),
("live", "not_live"),
("cloudasset", "not_cloudasset"),
("incloud", "not_incloud"),
("portrait", "not_portrait"),
("screenshot", "not_screenshot"),
("slow_mo", "not_slow_mo"),
("time_lapse", "not_time_lapse"),
("hdr", "not_hdr"),
("selfie", "not_selfie"),
("panorama", "not_panorama"),
("deleted", "deleted_only"),
("shared", "not_shared"),
("has_comment", "no_comment"),
("has_likes", "no_likes"),
("in_album", "not_in_album"),
("location", "no_location"),
]
# print help if no non-exclusive term or a double exclusive term is given
# TODO: add option to validate requiring at least one query arg
if any(all([kwargs[b], kwargs[n]]) for b, n in exclusive) or any(
[
all([any(kwargs["title"]), kwargs["no_title"]]),
all([any(kwargs["description"]), kwargs["no_description"]]),
all([any(kwargs["place"]), kwargs["no_place"]]),
]
):
raise IncompatibleQueryOptions
# actually have something to query
include_photos = True
include_movies = True # default searches for everything
if kwargs["only_movies"]:
include_photos = False
if kwargs["only_photos"]:
include_movies = False
# load UUIDs if necessary and append to any uuids passed with --uuid
uuid = None
if kwargs["uuid_from_file"]:
uuid_list = list(kwargs["uuid"]) # Click option is a tuple
uuid_list.extend(load_uuid_from_file(kwargs["uuid_from_file"]))
uuid = tuple(uuid_list)
query_fields = [field.name for field in dataclasses.fields(QueryOptions)]
query_dict = {field: kwargs.get(field) for field in query_fields}
query_dict["photos"] = include_photos
query_dict["movies"] = include_movies
query_dict["uuid"] = uuid
return QueryOptions(**query_dict)
def _query_photos(photosdb: PhotosDB, query_options: QueryOptions) -> List:
"""Query photos given a QueryOptions instance"""
try:
photos = photosdb.query(query_options)
except ValueError as e:
if "Invalid query_eval CRITERIA:" in str(e):
msg = str(e).split(":")[1]
raise click.BadOptionUsage(
"query_eval", f"Invalid query-eval CRITERIA: {msg}"
)
else:
raise ValueError(e)
return photos

View File

@@ -11,6 +11,7 @@ import html
import json
import logging
import os
import pathlib
import re
import shutil
import subprocess
@@ -19,11 +20,12 @@ from functools import lru_cache # pylint: disable=syntax-error
__all__ = [
"escape_str",
"unescape_str",
"terminate_exiftool",
"get_exiftool_path",
"exiftool_can_write",
"ExifTool",
"ExifToolCaching",
"get_exiftool_path",
"terminate_exiftool",
"unescape_str",
]
# exiftool -stay_open commands outputs this EOF marker after command is run
@@ -33,6 +35,24 @@ EXIFTOOL_STAYOPEN_EOF_LEN = len(EXIFTOOL_STAYOPEN_EOF)
# list of exiftool processes to cleanup when exiting or when terminate is called
EXIFTOOL_PROCESSES = []
# exiftool supported file types, created by utils/exiftool_supported_types.py
EXIFTOOL_FILETYPES_JSON = "exiftool_filetypes.json"
with (pathlib.Path(__file__).parent / EXIFTOOL_FILETYPES_JSON).open("r") as f:
EXIFTOOL_SUPPORTED_FILETYPES = json.load(f)
def exiftool_can_write(suffix: str) -> bool:
"""Return True if exiftool supports writing to a file with the given suffix, otherwise False"""
if not suffix:
return False
suffix = suffix.lower()
if suffix[0] == ".":
suffix = suffix[1:]
return (
suffix in EXIFTOOL_SUPPORTED_FILETYPES
and EXIFTOOL_SUPPORTED_FILETYPES[suffix]["write"]
)
def escape_str(s):
"""escape string for use with exiftool -E"""

File diff suppressed because it is too large

View File

@@ -10,13 +10,17 @@ import sys
from abc import ABC, abstractmethod
from io import StringIO
from sqlite3 import Error
from typing import Union
from ._constants import OSXPHOTOS_EXPORT_DB
from ._version import __version__
from .utils import normalize_fs_path
__all__ = ["ExportDB_ABC", "ExportDBNoOp", "ExportDB", "ExportDBInMemory"]
OSXPHOTOS_EXPORTDB_VERSION = "4.0"
OSXPHOTOS_EXPORTDB_VERSION = "4.3"
OSXPHOTOS_EXPORTDB_VERSION_MIGRATE_FILEPATH = "4.3"
OSXPHOTOS_ABOUT_STRING = f"Created by osxphotos version {__version__} (https://github.com/RhetTbull/osxphotos) on {datetime.datetime.now()}"
@@ -104,12 +108,12 @@ class ExportDB_ABC(ABC):
self,
filename,
uuid,
orig_stat,
exif_stat,
converted_stat,
edited_stat,
info_json,
exif_json,
orig_stat=None,
exif_stat=None,
converted_stat=None,
edited_stat=None,
info_json=None,
exif_json=None,
):
pass
@@ -183,12 +187,12 @@ class ExportDBNoOp(ExportDB_ABC):
self,
filename,
uuid,
orig_stat,
exif_stat,
converted_stat,
edited_stat,
info_json,
exif_json,
orig_stat=None,
exif_stat=None,
converted_stat=None,
edited_stat=None,
info_json=None,
exif_json=None,
):
pass
@@ -211,12 +215,13 @@ class ExportDB(ExportDB_ABC):
"""query database for filename and return UUID
returns None if filename not found in database
"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filepath_normalized = self._normalize_filepath_relative(filename)
conn = self._conn
try:
c = conn.cursor()
c.execute(
"SELECT uuid FROM files WHERE filepath_normalized = ?", (filename,)
"SELECT uuid FROM files WHERE filepath_normalized = ?",
(filepath_normalized,),
)
results = c.fetchone()
uuid = results[0] if results else None
@@ -228,7 +233,7 @@ class ExportDB(ExportDB_ABC):
def set_uuid_for_file(self, filename, uuid):
"""set UUID of filename to uuid in the database"""
filename = str(pathlib.Path(filename).relative_to(self._path))
filename_normalized = filename.lower()
filename_normalized = self._normalize_filepath(filename)
conn = self._conn
try:
c = conn.cursor()
@@ -245,7 +250,7 @@ class ExportDB(ExportDB_ABC):
"""set stat info for filename
filename: filename to set the stat info for
stat: a tuple of length 3: mode, size, mtime"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
if len(stats) != 3:
raise ValueError(f"expected 3 elements for stat, got {len(stats)}")
@@ -266,7 +271,7 @@ class ExportDB(ExportDB_ABC):
"""get stat info for filename
returns: tuple of (mode, size, mtime)
"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
conn = self._conn
try:
c = conn.cursor()
@@ -302,7 +307,7 @@ class ExportDB(ExportDB_ABC):
"""set stat info for filename (after exiftool has updated it)
filename: filename to set the stat info for
stat: a tuple of length 3: mode, size, mtime"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
if len(stats) != 3:
raise ValueError(f"expected 3 elements for stat, got {len(stats)}")
@@ -323,7 +328,7 @@ class ExportDB(ExportDB_ABC):
"""get stat info for filename (after exiftool has updated it)
returns: tuple of (mode, size, mtime)
"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
conn = self._conn
try:
c = conn.cursor()
@@ -384,7 +389,7 @@ class ExportDB(ExportDB_ABC):
def get_exifdata_for_file(self, filename):
"""returns the exifdata JSON struct for a file"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
conn = self._conn
try:
c = conn.cursor()
@@ -402,7 +407,7 @@ class ExportDB(ExportDB_ABC):
def set_exifdata_for_file(self, filename, exifdata):
"""sets the exifdata JSON struct for a file"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
conn = self._conn
try:
c = conn.cursor()
@@ -416,7 +421,7 @@ class ExportDB(ExportDB_ABC):
def get_sidecar_for_file(self, filename):
"""returns the sidecar data and signature for a file"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
conn = self._conn
try:
c = conn.cursor()
@@ -444,7 +449,7 @@ class ExportDB(ExportDB_ABC):
def set_sidecar_for_file(self, filename, sidecar_data, sidecar_sig):
"""sets the sidecar data and signature for a file"""
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
conn = self._conn
try:
c = conn.cursor()
@@ -506,52 +511,65 @@ class ExportDB(ExportDB_ABC):
self,
filename,
uuid,
orig_stat,
exif_stat,
converted_stat,
edited_stat,
info_json,
exif_json,
orig_stat=None,
exif_stat=None,
converted_stat=None,
edited_stat=None,
info_json=None,
exif_json=None,
):
"""sets all the data for file and uuid at once"""
"""sets all the data for file and uuid at once; if any value is None, does not set it"""
filename = str(pathlib.Path(filename).relative_to(self._path))
filename_normalized = filename.lower()
filename_normalized = self._normalize_filepath(filename)
conn = self._conn
try:
c = conn.cursor()
# update files table (if needed);
# this statement works around the fact that there was no unique constraint on files.filepath_normalized
c.execute(
"INSERT OR REPLACE INTO files(filepath, filepath_normalized, uuid) VALUES (?, ?, ?);",
"""INSERT OR IGNORE INTO files(filepath, filepath_normalized, uuid) VALUES (?, ?, ?);""",
(filename, filename_normalized, uuid),
)
c.execute(
"UPDATE files "
+ "SET orig_mode = ?, orig_size = ?, orig_mtime = ? "
+ "WHERE filepath_normalized = ?;",
(*orig_stat, filename_normalized),
)
c.execute(
"UPDATE files "
+ "SET exif_mode = ?, exif_size = ?, exif_mtime = ? "
+ "WHERE filepath_normalized = ?;",
(*exif_stat, filename_normalized),
)
c.execute(
"INSERT OR REPLACE INTO converted(filepath_normalized, mode, size, mtime) VALUES (?, ?, ?, ?);",
(filename_normalized, *converted_stat),
)
c.execute(
"INSERT OR REPLACE INTO edited(filepath_normalized, mode, size, mtime) VALUES (?, ?, ?, ?);",
(filename_normalized, *edited_stat),
)
c.execute(
"INSERT OR REPLACE INTO info(uuid, json_info) VALUES (?, ?);",
(uuid, info_json),
)
c.execute(
"INSERT OR REPLACE INTO exifdata(filepath_normalized, json_exifdata) VALUES (?, ?);",
(filename_normalized, exif_json),
)
if orig_stat is not None:
c.execute(
"UPDATE files "
+ "SET orig_mode = ?, orig_size = ?, orig_mtime = ? "
+ "WHERE filepath_normalized = ?;",
(*orig_stat, filename_normalized),
)
if exif_stat is not None:
c.execute(
"UPDATE files "
+ "SET exif_mode = ?, exif_size = ?, exif_mtime = ? "
+ "WHERE filepath_normalized = ?;",
(*exif_stat, filename_normalized),
)
if converted_stat is not None:
c.execute(
"INSERT OR REPLACE INTO converted(filepath_normalized, mode, size, mtime) VALUES (?, ?, ?, ?);",
(filename_normalized, *converted_stat),
)
if edited_stat is not None:
c.execute(
"INSERT OR REPLACE INTO edited(filepath_normalized, mode, size, mtime) VALUES (?, ?, ?, ?);",
(filename_normalized, *edited_stat),
)
if info_json is not None:
c.execute(
"INSERT OR REPLACE INTO info(uuid, json_info) VALUES (?, ?);",
(uuid, info_json),
)
if exif_json is not None:
c.execute(
"INSERT OR REPLACE INTO exifdata(filepath_normalized, json_exifdata) VALUES (?, ?);",
(filename_normalized, exif_json),
)
conn.commit()
except Error as e:
logging.warning(e)
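The switch from INSERT OR REPLACE to INSERT OR IGNORE is what lets the conditional UPDATEs above preserve values that set_data was not given. A minimal standalone sqlite3 sketch of the difference, using an illustrative table rather than the real export DB schema:
import sqlite3

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute(
    "CREATE TABLE files (filepath TEXT, filepath_normalized TEXT UNIQUE, uuid TEXT, orig_size INTEGER)"
)
c.execute("INSERT INTO files VALUES (?, ?, ?, ?)", ("IMG_1.JPG", "img_1.jpg", "ABC123", 1024))

# OR REPLACE deletes the conflicting row and re-inserts it, losing orig_size
c.execute(
    "INSERT OR REPLACE INTO files(filepath, filepath_normalized, uuid) VALUES (?, ?, ?)",
    ("IMG_1.JPG", "img_1.jpg", "ABC123"),
)
print(c.execute("SELECT orig_size FROM files").fetchone())  # (None,)

# OR IGNORE leaves the existing row alone so only explicit UPDATEs change it
c.execute("UPDATE files SET orig_size = 1024 WHERE filepath_normalized = ?", ("img_1.jpg",))
c.execute(
    "INSERT OR IGNORE INTO files(filepath, filepath_normalized, uuid) VALUES (?, ?, ?)",
    ("IMG_1.JPG", "img_1.jpg", "ABC123"),
)
print(c.execute("SELECT orig_size FROM files").fetchone())  # (1024,)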
@@ -564,7 +582,7 @@ class ExportDB(ExportDB_ABC):
logging.warning(e)
def _set_stat_for_file(self, table, filename, stats):
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
if len(stats) != 3:
raise ValueError(f"expected 3 elements for stat, got {len(stats)}")
@@ -577,7 +595,7 @@ class ExportDB(ExportDB_ABC):
conn.commit()
def _get_stat_for_file(self, table, filename):
filename = str(pathlib.Path(filename).relative_to(self._path)).lower()
filename = self._normalize_filepath_relative(filename)
conn = self._conn
c = conn.cursor()
c.execute(
@@ -613,6 +631,8 @@ class ExportDB(ExportDB_ABC):
version_info = self._get_database_version(conn)
if version_info[1] < OSXPHOTOS_EXPORTDB_VERSION:
self._create_db_tables(conn)
if version_info[1] < OSXPHOTOS_EXPORTDB_VERSION_MIGRATE_FILEPATH:
self._migrate_normalized_filepath(conn)
self.was_upgraded = (version_info[1], OSXPHOTOS_EXPORTDB_VERSION)
else:
self.was_upgraded = ()
@@ -662,6 +682,22 @@ class ExportDB(ExportDB_ABC):
exif_size INTEGER,
exif_mtime REAL
); """,
"sql_files_table_migrate": """ CREATE TABLE IF NOT EXISTS files_migrate (
id INTEGER PRIMARY KEY,
filepath TEXT NOT NULL,
filepath_normalized TEXT NOT NULL,
uuid TEXT,
orig_mode INTEGER,
orig_size INTEGER,
orig_mtime REAL,
exif_mode INTEGER,
exif_size INTEGER,
exif_mtime REAL,
UNIQUE(filepath_normalized)
); """,
"sql_files_migrate": """ INSERT INTO files_migrate SELECT * FROM files;""",
"sql_files_drop_tables": """ DROP TABLE files;""",
"sql_files_alter": """ ALTER TABLE files_migrate RENAME TO files;""",
"sql_runs_table": """ CREATE TABLE IF NOT EXISTS runs (
id INTEGER PRIMARY KEY,
datetime TEXT,
@@ -753,6 +789,32 @@ class ExportDB(ExportDB_ABC):
except Error as e:
logging.warning(e)
def _normalize_filepath(self, filepath: Union[str, pathlib.Path]) -> str:
"""normalize filepath for unicode, lower case"""
return normalize_fs_path(str(filepath)).lower()
def _normalize_filepath_relative(self, filepath: Union[str, pathlib.Path]) -> str:
"""normalize filepath for unicode, relative path (to export dir), lower case"""
filepath = str(pathlib.Path(filepath).relative_to(self._path))
return normalize_fs_path(str(filepath)).lower()
def _migrate_normalized_filepath(self, conn):
"""Fix all filepath_normalized columns for unicode normalization"""
# Prior to database version 4.3, filepath_normalized was not normalized for unicode
c = conn.cursor()
for table in ["converted", "edited", "exifdata", "files", "sidecar"]:
old_values = c.execute(
f"SELECT filepath_normalized, id FROM {table}"
).fetchall()
new_values = [
(self._normalize_filepath(filepath_normalized), id_)
for filepath_normalized, id_ in old_values
]
c.executemany(
f"UPDATE {table} SET filepath_normalized=? WHERE id=?", new_values
)
conn.commit()
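For context, a minimal sketch of what this normalization buys: the same visible filename can arrive composed (NFC) or decomposed (NFD), and macOS filesystems are case-insensitive, so a naive string comparison misses matches that _normalize_filepath catches.
import unicodedata

nfc = "Caf\u00e9.jpg"   # "é" as a single code point
nfd = "Cafe\u0301.JPG"  # "e" plus combining acute, different case
print(nfc == nfd)       # False: naive comparison misses the match

def normalize(p: str) -> str:
    # same NFD + lower-case treatment applied to filepath_normalized
    return unicodedata.normalize("NFD", p).lower()

print(normalize(nfc) == normalize(nfd))  # True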
class ExportDBInMemory(ExportDB):
"""In memory version of ExportDB

View File

@@ -181,7 +181,6 @@ class FileUtilMacOS(FileUtilABC):
return False
s1 = cls._sig(os.stat(f1))
if s1[0] != stat.S_IFREG or s2[0] != stat.S_IFREG:
return False
return s1 == s2
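A rough sketch of the signature comparison used here; the exact contents of the signature tuple are an assumption (file-type bits, size, mtime), not necessarily what FileUtilMacOS._sig returns.
import os
import stat

def file_sig(path: str) -> tuple:
    # assumed signature: (file type bits, size, modification time)
    st = os.stat(path)
    return (stat.S_IFMT(st.st_mode), st.st_size, st.st_mtime)

def cmp_file_sig(path: str, sig: tuple) -> bool:
    s1 = file_sig(path)
    if s1[0] != stat.S_IFREG or sig[0] != stat.S_IFREG:
        return False  # only regular files are comparable
    return s1 == sig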

View File

@@ -1,10 +1,8 @@
""" PhotoExport class to export photos
"""
# TODO: the various sidecar_json, sidecar_xmp, etc args should all be collapsed to a sidecar param using a bit mask
import dataclasses
import glob
import hashlib
import json
import logging
@@ -14,7 +12,7 @@ import re
import tempfile
from collections import namedtuple # pylint: disable=syntax-error
from dataclasses import asdict, dataclass
from typing import TYPE_CHECKING, Callable, List, Optional
from typing import TYPE_CHECKING, Callable, List, Optional, Tuple
import photoscript
from mako.template import Template
@@ -34,7 +32,7 @@ from ._constants import (
)
from ._version import __version__
from .datetime_utils import datetime_tz_to_utc
from .exiftool import ExifTool
from .exiftool import ExifTool, exiftool_can_write
from .export_db import ExportDB_ABC, ExportDBNoOp
from .fileutil import FileUtil
from .photokit import (
@@ -46,7 +44,7 @@ from .photokit import (
)
from .phototemplate import RenderOptions
from .uti import get_preferred_uti_extension
from .utils import increment_filename, increment_filename_with_count, lineno
from .utils import increment_filename, lineno, list_directory
__all__ = [
"ExportError",
@@ -72,7 +70,7 @@ class ExportError(Exception):
@dataclass
class ExportOptions:
"""Options class for exporting photos with export2
"""Options class for exporting photos with export
Attributes:
convert_to_jpeg (bool): if True, converts non-jpeg images to jpeg
@@ -212,7 +210,7 @@ class StagedFiles:
class ExportResults:
"""Results class which holds export results for export2"""
"""Results class which holds export results for export"""
def __init__(
self,
@@ -222,6 +220,7 @@ class ExportResults:
skipped=None,
exif_updated=None,
touched=None,
to_touch=None,
converted_to_jpeg=None,
sidecar_json_written=None,
sidecar_json_skipped=None,
@@ -247,6 +246,7 @@ class ExportResults:
self.skipped = skipped or []
self.exif_updated = exif_updated or []
self.touched = touched or []
self.to_touch = to_touch or []
self.converted_to_jpeg = converted_to_jpeg or []
self.sidecar_json_written = sidecar_json_written or []
self.sidecar_json_skipped = sidecar_json_skipped or []
@@ -298,6 +298,7 @@ class ExportResults:
self.skipped += other.skipped
self.exif_updated += other.exif_updated
self.touched += other.touched
self.to_touch += other.to_touch
self.converted_to_jpeg += other.converted_to_jpeg
self.sidecar_json_written += other.sidecar_json_written
self.sidecar_json_skipped += other.sidecar_json_skipped
@@ -326,6 +327,7 @@ class ExportResults:
+ f",skipped={self.skipped}"
+ f",exif_updated={self.exif_updated}"
+ f",touched={self.touched}"
+ f",to_touch={self.to_touch}"
+ f",converted_to_jpeg={self.converted_to_jpeg}"
+ f",sidecar_json_written={self.sidecar_json_written}"
+ f",sidecar_json_skipped={self.sidecar_json_skipped}"
@@ -357,136 +359,15 @@ class PhotoExporter:
prefix=f"osxphotos_photo_exporter_{self.photo.uuid}_"
)
self._temp_dir_path = pathlib.Path(self._temp_dir.name)
self.fileutil = FileUtil
def export(
self,
dest,
filename=None,
edited=False,
live_photo=False,
raw_photo=False,
export_as_hardlink=False,
overwrite=False,
increment=True,
sidecar_json=False,
sidecar_exiftool=False,
sidecar_xmp=False,
download_missing=False,
use_photos_export=False,
use_photokit=True,
timeout=120,
exiftool=False,
use_albums_as_keywords=False,
use_persons_as_keywords=False,
keyword_template=None,
description_template=None,
render_options: Optional[RenderOptions] = None,
):
"""export photo
dest: must be valid destination path (or exception raised)
filename: (optional): name of exported picture; if not provided, will use current filename
**NOTE**: if provided, user must ensure file extension (suffix) is correct.
For example, if photo is .CR2 file, edited image may be .jpeg.
If you provide an extension different than what the actual file is,
export will print a warning but will export the photo using the
incorrect file extension (unless use_photos_export is true, in which case export will
use the extension provided by Photos upon export; in this case, an incorrect extension is
silently ignored).
e.g. to get the extension of the edited photo,
reference PhotoInfo.path_edited
edited: (boolean, default=False); if True will export the edited version of the photo, otherwise exports the original version
(or raise exception if no edited version)
live_photo: (boolean, default=False); if True, will also export the associated .mov for live photos
raw_photo: (boolean, default=False); if True, will also export the associated RAW photo
export_as_hardlink: (boolean, default=False); if True, will hardlink files instead of copying them
overwrite: (boolean, default=False); if True will overwrite files if they already exist
increment: (boolean, default=True); if True, will increment file name until a non-existent name is found
if overwrite=False and increment=False, export will fail if destination file already exists
sidecar_json: if set will write a json sidecar with data in format readable by exiftool
sidecar filename will be dest/filename.json; includes exiftool tag group names (e.g. `exiftool -G -j`)
sidecar_exiftool: if set will write a json sidecar with data in format readable by exiftool
sidecar filename will be dest/filename.json; does not include exiftool tag group names (e.g. `exiftool -j`)
sidecar_xmp: if set will write an XMP sidecar with IPTC data
sidecar filename will be dest/filename.xmp
use_photos_export: (boolean, default=False); if True will attempt to export photo via AppleScript or PhotoKit interaction with Photos
download_missing: (boolean, default=False); if True will attempt to export photo via AppleScript or PhotoKit interaction with Photos if missing
use_photokit: (boolean, default=True); if True will attempt to export photo via photokit instead of AppleScript when used with use_photos_export or download_missing
timeout: (int, default=120) timeout in seconds used with use_photos_export
exiftool: (boolean, default = False); if True, will use exiftool to write metadata to export file
returns list of full paths to the exported files
use_albums_as_keywords: (boolean, default = False); if True, will include album names in keywords
when exporting metadata with exiftool or sidecar
use_persons_as_keywords: (boolean, default = False); if True, will include person names in keywords
when exporting metadata with exiftool or sidecar
keyword_template: (list of strings); list of template strings that will be rendered and used as keywords
description_template: string; optional template string that will be rendered for use as photo description
render_options: an optional osxphotos.phototemplate.RenderOptions instance with options to pass to template renderer
Returns: list of photos exported
"""
# Implementation note: calls export2 to actually do the work
sidecar = 0
if sidecar_json:
sidecar |= SIDECAR_JSON
if sidecar_exiftool:
sidecar |= SIDECAR_EXIFTOOL
if sidecar_xmp:
sidecar |= SIDECAR_XMP
if not filename:
if not edited:
filename = self.photo.original_filename
else:
original_name = pathlib.Path(self.photo.original_filename)
if self.photo.path_edited:
ext = pathlib.Path(self.photo.path_edited).suffix
else:
uti = (
self.photo.uti_edited
if edited and self.photo.uti_edited
else self.photo.uti
)
ext = get_preferred_uti_extension(uti)
ext = "." + ext
filename = original_name.stem + "_edited" + ext
options = ExportOptions(
description_template=description_template,
download_missing=download_missing,
edited=edited,
exiftool=exiftool,
export_as_hardlink=export_as_hardlink,
increment=increment,
keyword_template=keyword_template,
live_photo=live_photo,
overwrite=overwrite,
raw_photo=raw_photo,
render_options=render_options,
sidecar=sidecar,
timeout=timeout,
use_albums_as_keywords=use_albums_as_keywords,
use_persons_as_keywords=use_persons_as_keywords,
use_photokit=use_photokit,
use_photos_export=use_photos_export,
)
results = self.export2(
dest,
filename=filename,
options=options,
)
return results.exported
def export2(
self,
dest,
filename=None,
options: Optional[ExportOptions] = None,
):
"""export photo, like export but with update and dry_run options
) -> ExportResults:
"""export photo
Args:
dest: must be valid destination path or exception raised
@@ -520,11 +401,10 @@ class PhotoExporter:
# when called from export(), won't get an export_db, so use no-op version
options.export_db = options.export_db or ExportDBNoOp()
export_db = options.export_db
# ensure there's a FileUtil class to use
options.fileutil = options.fileutil or FileUtil
fileutil = options.fileutil
self.fileutil = options.fileutil
self._render_options = options.render_options or RenderOptions()
@@ -551,87 +431,24 @@ class PhotoExporter:
dest = pathlib.Path(dest) / filename
# Is there something to convert with convert_to_jpeg?
if options.convert_to_jpeg and self.photo.isphoto:
something_to_convert = False
ext = "." + options.jpeg_ext if options.jpeg_ext else ".jpeg"
if export_original and self.photo.uti_original != "public.jpeg":
# not a jpeg but will convert to jpeg upon export so fix file extension
something_to_convert = True
dest = dest.parent / f"{dest.stem}{ext}"
if export_edited and self.photo.uti != "public.jpeg":
# in Big Sur+, edited HEICs are HEIC
something_to_convert = True
dest = dest.parent / f"{dest.stem}{ext}"
convert_to_jpeg = something_to_convert
else:
convert_to_jpeg = False
options = dataclasses.replace(options, convert_to_jpeg=convert_to_jpeg)
dest, options = self._should_convert_to_jpeg(dest, options)
dest, _ = self._validate_dest_path(
dest,
increment=options.increment,
update=options.update,
overwrite=options.overwrite,
)
dest = pathlib.Path(dest)
# stage files for export by finding path in local library or downloading from iCloud as appropriate
staged_files = self._stage_photos_for_export(options)
src = staged_files.edited if options.edited else staged_files.original
# get the right destination path depending on options.update, etc.
dest = self._get_dest_path(src, dest, options)
self._render_options.filepath = str(dest)
all_results = ExportResults()
staged_files = self._stage_photos_for_export(options)
src = staged_files.edited if options.edited else staged_files.original
if src:
# found source now try to find right destination
if options.update and dest.exists():
# destination exists, check to see if destination is the right UUID
dest_uuid = export_db.get_uuid_for_file(dest)
if dest_uuid is None and fileutil.cmp(src, dest):
# might be exporting into a pre-ExportDB folder or the DB got deleted
dest_uuid = self.photo.uuid
export_db.set_data(
filename=dest,
uuid=self.photo.uuid,
orig_stat=fileutil.file_sig(dest),
exif_stat=(None, None, None),
converted_stat=(None, None, None),
edited_stat=(None, None, None),
info_json=self.photo.json(),
exif_json=None,
)
if dest_uuid != self.photo.uuid:
# not the right file, find the right one
glob_str = str(dest.parent / f"{dest.stem} (*{dest.suffix}")
dest_files = glob.glob(glob_str)
for file_ in dest_files:
dest_uuid = export_db.get_uuid_for_file(file_)
if dest_uuid == self.photo.uuid:
dest = pathlib.Path(file_)
break
elif dest_uuid is None and fileutil.cmp(src, file_):
# files match, update the UUID
dest = pathlib.Path(file_)
export_db.set_data(
filename=dest,
uuid=self.photo.uuid,
orig_stat=fileutil.file_sig(dest),
exif_stat=(None, None, None),
converted_stat=(None, None, None),
edited_stat=(None, None, None),
info_json=self.photo.json(),
exif_json=None,
)
break
else:
# increment the destination file
dest = pathlib.Path(increment_filename(dest))
# export the dest file
results = self._export_photo(
all_results += self._export_photo(
src,
dest,
options=options,
)
all_results += results
# copy live photo associated .mov if requested
if (
@@ -642,13 +459,12 @@ class PhotoExporter:
):
live_name = dest.parent / f"{dest.stem}.mov"
src_live = staged_files.original_live
results = self._export_photo(
all_results += self._export_photo(
src_live,
live_name,
# don't try to convert the live photo
options=dataclasses.replace(options, convert_to_jpeg=False),
)
all_results += results
if (
export_edited
@@ -658,26 +474,23 @@ class PhotoExporter:
):
live_name = dest.parent / f"{dest.stem}.mov"
src_live = staged_files.edited_live
results = self._export_photo(
all_results += self._export_photo(
src_live,
live_name,
# don't try to convert the live photo
options=dataclasses.replace(options, convert_to_jpeg=False),
)
all_results += results
# copy associated RAW image if requested
if options.raw_photo and self.photo.has_raw and staged_files.raw:
raw_path = pathlib.Path(staged_files.raw)
raw_ext = raw_path.suffix
raw_name = dest.parent / f"{dest.stem}{raw_ext}"
if raw_path is not None:
results = self._export_photo(
raw_path,
raw_name,
options=options,
)
all_results += results
all_results += self._export_photo(
raw_path,
raw_name,
options=options,
)
# copy preview image if requested
if options.preview and staged_files.preview:
@@ -696,44 +509,35 @@ class PhotoExporter:
if options.overwrite or options.update
else pathlib.Path(increment_filename(preview_name))
)
if preview_path is not None:
results = self._export_photo(
preview_path,
preview_name,
options=options,
)
all_results += results
results = self._write_sidecar_files(dest=dest, options=options)
all_results += results
# if exiftool, write the metadata
if options.exiftool:
exif_files = (
all_results.new + all_results.updated + all_results.skipped
if options.update
else all_results.exported
all_results += self._export_photo(
preview_path,
preview_name,
options=options,
)
for exported_file in exif_files:
results = self._write_exif_metadata_to_files(
exported_file=exported_file, options=options
)
all_results += results
all_results += self._write_sidecar_files(dest=dest, options=options)
if options.touch_file:
for exif_file in all_results.exif_updated:
verbose(f"Updating file modification time for {exif_file}")
all_results.touched.append(exif_file)
ts = int(self.photo.date.timestamp())
fileutil.utime(exif_file, (ts, ts))
all_results.touched = list(set(all_results.touched))
all_results += self._touch_files(all_results, options)
return all_results
def _touch_files(
self, results: ExportResults, options: ExportOptions
) -> ExportResults:
"""touch file date/time to match photo creation date/time"""
fileutil = options.fileutil
touch_files = set(results.to_touch)
touch_results = ExportResults()
for touch_file in touch_files:
ts = int(self.photo.date.timestamp())
fileutil.utime(touch_file, (ts, ts))
touch_results.touched.append(touch_file)
return touch_results
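A standalone sketch of the touch step: the file's access and modification times are set to the photo's creation date with utime. The date and filename below are placeholders, not values from Photos.
import datetime
import os
import pathlib

target = pathlib.Path("exported.jpg")
target.touch()  # stand-in for a freshly exported file

photo_date = datetime.datetime(2021, 6, 1, 12, 30, 0)  # stand-in for self.photo.date
ts = int(photo_date.timestamp())
os.utime(target, (ts, ts))
print(datetime.datetime.fromtimestamp(target.stat().st_mtime))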
def _get_edited_filename(self, original_filename):
"""Return the filename for the exported edited photo
(used when filename isn't provided in call to export2)"""
(used when filename isn't provided in call to export)"""
# need to get the right extension for edited file
original_filename = pathlib.Path(original_filename)
if self.photo.path_edited:
@@ -745,34 +549,82 @@ class PhotoExporter:
edited_filename = original_filename.stem + "_edited" + ext
return edited_filename
def _validate_dest_path(self, dest, increment, update, overwrite, count=0):
"""If destination exists, add (1), (2), and so on to filename to get a valid destination
def _get_dest_path(
self, src: str, dest: pathlib.Path, options: ExportOptions
) -> pathlib.Path:
"""If destination exists find match in ExportDB, on disk, or add (1), (2), and so on to filename to get a valid destination
Args:
dest (str): Destination path
increment (bool): Whether to increment the filename if it already exists
update (bool): Whether running in update mode
overwrite (bool): Whether running in overwrite mode
count: optional counter to start from (if 0, start from 1)
src (str): source file path
dest (str): destination path
options (ExportOptions): Export options
Returns:
new dest path (pathlib.Path), increment count (int)
new dest path (pathlib.Path)
"""
# check to see if file exists and if so, add (1), (2), etc until we find one that works
# if overwrite==False and increment==False, export should fail if file exists
if dest.exists() and not any(
[options.increment, options.update, options.overwrite]
):
raise FileExistsError(
f"destination exists ({dest}); overwrite={options.overwrite}, increment={options.increment}"
)
# if not update or overwrite, check to see if file exists and if so, add (1), (2), etc
# until we find one that works
# Photos checks the stem and adds (1), (2), etc which avoids collision with sidecars
# e.g. exporting sidecar for file1.png and file1.jpeg
# if file1.png exists and exporting file1.jpeg,
# dest will be file1 (1).jpeg even though file1.jpeg doesn't exist to prevent sidecar collision
if increment and not update and not overwrite:
dest, count = increment_filename_with_count(dest, count=count)
dest = pathlib.Path(dest)
if options.increment and not options.update and not options.overwrite:
return pathlib.Path(increment_filename(dest))
# if overwrite==False and increment==False, export should fail if file exists
if dest.exists() and all([not x for x in [increment, update, overwrite]]):
raise FileExistsError(
f"destination exists ({dest}); overwrite={overwrite}, increment={increment}"
)
return dest, count
# if update and file exists, need to check to see if it's the right file by checking the export db
if options.update and dest.exists() and src:
export_db = options.export_db
fileutil = options.fileutil
# destination exists, check to see if destination is the right UUID
dest_uuid = export_db.get_uuid_for_file(dest)
if dest_uuid is None and fileutil.cmp(src, dest):
# might be exporting into a pre-ExportDB folder or the DB got deleted
dest_uuid = self.photo.uuid
export_db.set_data(
filename=dest,
uuid=self.photo.uuid,
orig_stat=fileutil.file_sig(dest),
info_json=self.photo.json(),
)
if dest_uuid != self.photo.uuid:
# not the right file, find the right one
# find files that match "dest_name (*.ext" (e.g. "dest_name (1).jpg", "dest_name (2).jpg)", ...)
dest_files = list_directory(
dest.parent,
startswith=f"{dest.stem} (",
endswith=dest.suffix,
include_path=True,
)
for file_ in dest_files:
dest_uuid = export_db.get_uuid_for_file(file_)
if dest_uuid == self.photo.uuid:
dest = pathlib.Path(file_)
break
elif dest_uuid is None and fileutil.cmp(src, file_):
# files match, update the UUID
dest = pathlib.Path(file_)
export_db.set_data(
filename=dest,
uuid=self.photo.uuid,
orig_stat=fileutil.file_sig(dest),
info_json=self.photo.json(),
)
break
else:
# increment the destination file
dest = pathlib.Path(increment_filename(dest))
# either dest was updated in the if clause above or not updated at all
return dest
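A sketch of how the incremented candidates ("name (1).jpg", "name (2).jpg", ...) can be found next to an existing destination using only pathlib; the real code uses osxphotos' list_directory with the startswith/endswith filters shown above.
import pathlib
from typing import List

def incremented_candidates(dest: pathlib.Path) -> List[pathlib.Path]:
    # files whose name starts with "stem (" and share the destination's suffix
    return sorted(
        p
        for p in dest.parent.iterdir()
        if p.name.startswith(f"{dest.stem} (") and p.suffix == dest.suffix
    )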
def _stage_photos_for_export(self, options: ExportOptions) -> StagedFiles:
"""Stages photos for export
@@ -810,12 +662,26 @@ class PhotoExporter:
if options.live_photo and self.photo.live_photo:
staged.edited_live = self.photo.path_edited_live_photo
if options.exiftool and not options.dry_run and not options.export_as_hardlink:
# copy files to temp dir for exiftool to process before export
# not needed for download_missing or use_photokit as those files are already staged to the temp dir
for file_type in [
"raw",
"preview",
"original",
"original_live",
"edited",
"edited_live",
]:
staged_file = getattr(staged, file_type)
if staged_file:
setattr(staged, file_type, self._copy_to_temp_file(staged_file))
# download any missing files
if options.download_missing:
live_photo = staged.edited_live if options.edited else staged.original_live
missing_options = ExportOptions(
edited=options.edited,
# TODO: missing previews are not generated/downloaded
preview=options.preview and not staged.preview,
raw_photo=options.raw_photo and not staged.raw,
live_photo=options.live_photo and not live_photo,
@@ -931,6 +797,9 @@ class PhotoExporter:
except Exception as e:
results.error.append((str(dest), f"{e} ({lineno(__file__)})"))
if options.preview and self.photo.path_derivatives:
results.preview = self.photo.path_derivatives[0]
return results
def _stage_photo_for_export_with_applescript(
@@ -1010,13 +879,48 @@ class PhotoExporter:
if results_attr:
setattr(results, results_attr, exported_file)
if options.preview and self.photo.path_derivatives:
results.preview = self.photo.path_derivatives[0]
return results
def _should_convert_to_jpeg(
self, dest: pathlib.Path, options: ExportOptions
) -> Tuple[pathlib.Path, ExportOptions]:
"""Determine if a file really should be converted to jpeg or not
and return the new destination and ExportOptions instance with the convert_to_jpeg flag set appropriately
"""
if not (options.convert_to_jpeg and self.photo.isphoto):
# nothing to convert
return dest, dataclasses.replace(options, convert_to_jpeg=False)
convert_to_jpeg = False
ext = "." + options.jpeg_ext if options.jpeg_ext else ".jpeg"
if not options.edited and self.photo.uti_original != "public.jpeg":
# not a jpeg but will convert to jpeg upon export so fix file extension
convert_to_jpeg = True
dest = dest.parent / f"{dest.stem}{ext}"
elif options.edited and self.photo.uti != "public.jpeg":
# in Big Sur+, edited HEICs are HEIC
convert_to_jpeg = True
dest = dest.parent / f"{dest.stem}{ext}"
return dest, dataclasses.replace(options, convert_to_jpeg=convert_to_jpeg)
def _is_temp_file(self, filepath: str) -> bool:
"""Returns True if file is in the PhotosExporter temp directory otherwise False"""
filepath = pathlib.Path(filepath)
return filepath.parent == self._temp_dir_path
def _copy_to_temp_file(self, filepath: str) -> str:
"""Copies filepath to a temp file preserving access and modification times"""
filepath = pathlib.Path(filepath)
dest = self._temp_dir_path / filepath.name
dest = increment_filename(dest)
self.fileutil.copy(filepath, dest)
stat = os.stat(filepath)
self.fileutil.utime(dest, (stat.st_atime, stat.st_mtime))
return str(dest)
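A comparable standalone approach for staging a file while preserving its timestamps; shutil.copy2 copies metadata (including times), so it is roughly equivalent to the copy-then-utime sequence above.
import pathlib
import shutil
import tempfile

def copy_to_temp(filepath: str, temp_dir: str) -> str:
    dest = pathlib.Path(temp_dir) / pathlib.Path(filepath).name
    shutil.copy2(filepath, dest)  # copies file data and metadata such as mtime/atime
    return str(dest)

with tempfile.TemporaryDirectory(prefix="osxphotos_sketch_") as tmp:
    print(copy_to_temp(__file__, tmp))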
def _export_photo(
self,
src,
@@ -1027,7 +931,9 @@ class PhotoExporter:
Does the actual copy or hardlink taking the appropriate
action depending on update, overwrite, export_as_hardlink
Assumes destination is the right destination (e.g. UUID matches)
sets UUID and JSON info for exported file using set_uuid_for_file, set_info_for_uuid
Sets UUID and JSON info for exported file using set_uuid_for_file, set_info_for_uuid
Expects that src is a temporary file (as set by _stage_photos_for_export) and
may modify the src (e.g. for convert_to_jpeg or exiftool)
Args:
src (str): src path
@@ -1052,9 +958,12 @@ class PhotoExporter:
exported_files = []
update_updated_files = []
update_new_files = []
update_skipped_files = []
update_skipped_files = []  # files skipped because they are already up to date
touched_files = []
converted_to_jpeg_files = []
exif_results = ExportResults()
converted_stat = None
edited_stat = None
dest_str = str(dest)
dest_exists = dest.exists()
@@ -1144,8 +1053,9 @@ class PhotoExporter:
sig = (sig[0], sig[1], int(self.photo.date.timestamp()))
if not fileutil.cmp_file_sig(src, sig):
touched_files.append(dest_str)
if not update_skipped_files:
converted_stat = (None, None, None)
# have file to export
edited_stat = (
fileutil.file_sig(src) if options.edited else (None, None, None)
)
@@ -1164,14 +1074,27 @@ class PhotoExporter:
raise ExportError(
f"Error hardlinking {src} to {dest}: {e} ({lineno(__file__)})"
) from e
elif options.convert_to_jpeg:
# use convert_to_jpeg to export the file
fileutil.convert_to_jpeg(
src, dest_str, compression_quality=options.jpeg_quality
)
converted_stat = fileutil.file_sig(dest_str)
converted_to_jpeg_files.append(dest_str)
else:
if options.convert_to_jpeg:
# use convert_to_jpeg to export the file
# convert to a temp file before copying
tmp_file = increment_filename(
self._temp_dir_path
/ f"{pathlib.Path(src).stem}_converted_to_jpeg.jpeg"
)
fileutil.convert_to_jpeg(
src, tmp_file, compression_quality=options.jpeg_quality
)
src = tmp_file
converted_stat = fileutil.file_sig(tmp_file)
converted_to_jpeg_files.append(dest_str)
if options.exiftool:
# if exiftool, write the metadata
exif_results = self._write_exif_metadata_to_file(
src, dest, options=options
)
try:
fileutil.copy(src, dest_str)
except Exception as e:
@@ -1179,28 +1102,26 @@ class PhotoExporter:
f"Error copying file {src} to {dest_str}: {e} ({lineno(__file__)})"
) from e
export_db.set_data(
filename=dest_str,
uuid=self.photo.uuid,
orig_stat=fileutil.file_sig(dest_str),
exif_stat=(None, None, None),
converted_stat=converted_stat,
edited_stat=edited_stat,
info_json=self.photo.json(),
exif_json=None,
)
if touched_files:
ts = int(self.photo.date.timestamp())
fileutil.utime(dest, (ts, ts))
export_db.set_data(
filename=dest_str,
uuid=self.photo.uuid,
orig_stat=fileutil.file_sig(dest_str),
converted_stat=converted_stat,
edited_stat=edited_stat,
info_json=self.photo.json(),
)
return ExportResults(
converted_to_jpeg=converted_to_jpeg_files,
error=exif_results.error,
exif_updated=exif_results.exif_updated,
exiftool_error=exif_results.exiftool_error,
exiftool_warning=exif_results.exiftool_warning,
exported=exported_files + update_new_files + update_updated_files,
new=update_new_files,
updated=update_updated_files,
skipped=update_skipped_files,
touched=touched_files,
converted_to_jpeg=converted_to_jpeg_files,
to_touch=touched_files,
updated=update_updated_files,
)
def _write_sidecar_files(
@@ -1318,21 +1239,50 @@ class PhotoExporter:
sidecar_xmp_skipped=sidecar_xmp_files_skipped,
)
def _write_exif_metadata_to_files(
def _write_exif_metadata_to_file(
self,
exported_file: str,
src,
dest,
options: ExportOptions,
) -> ExportResults:
"""Write exif metadata to files using exiftool."""
"""Write exif metadata to file using exiftool
Note: this method modifies src so src must be a copy of the original file;
it also does not write to dest (dest is the intended destination, used for purposes of
referencing the export database). This allows the exiftool update to be done on the
local machine before the file is copied to the export destination, which may be on a
network drive or other slower external storage."""
export_db = options.export_db
fileutil = options.fileutil
verbose = options.verbose or self._verbose
results = ExportResults()
exiftool_results = ExportResults()
# don't try to write if unsupported file type for exiftool
if not exiftool_can_write(os.path.splitext(src)[-1]):
exiftool_results.exiftool_warning.append(
(
dest,
f"Unsupported file type for exiftool, skipping exiftool for {dest}",
)
)
# set file signature so the file doesn't get re-exported with --update
export_db.set_data(
dest,
uuid=self.photo.uuid,
exif_stat=fileutil.file_sig(src),
exif_json=self._exiftool_json_sidecar(options=options),
)
return exiftool_results
# determine if we need to write the exif metadata
# if we are not updating, we always write
# else, need to check the database to determine if we need to write
run_exiftool = not options.update
if options.update:
files_are_different = False
old_data = export_db.get_exifdata_for_file(exported_file)
old_data = export_db.get_exifdata_for_file(dest)
if old_data is not None:
old_data = json.loads(old_data)[0]
current_data = json.loads(self._exiftool_json_sidecar(options=options))[
@@ -1344,44 +1294,31 @@ class PhotoExporter:
if old_data is None or files_are_different:
# didn't have old data, assume we need to write it
# or files were different
verbose(f"Writing metadata with exiftool for {exported_file}")
if not options.dry_run:
warning_, error_ = self._write_exif_data(
exported_file, options=options
)
if warning_:
results.exiftool_warning.append((exported_file, warning_))
if error_:
results.exiftool_error.append((exported_file, error_))
results.error.append((exported_file, error_))
export_db.set_exifdata_for_file(
exported_file, self._exiftool_json_sidecar(options=options)
)
export_db.set_stat_exif_for_file(
exported_file, fileutil.file_sig(exported_file)
)
results.exif_updated.append(exported_file)
run_exiftool = True
else:
verbose(f"Skipped up to date exiftool metadata for {exported_file}")
else:
verbose(f"Writing metadata with exiftool for {exported_file}")
if not options.dry_run:
warning_, error_ = self._write_exif_data(exported_file, options=options)
if warning_:
results.exiftool_warning.append((exported_file, warning_))
if error_:
results.exiftool_error.append((exported_file, error_))
results.error.append((exported_file, error_))
verbose(
f"Skipped up to date exiftool metadata for {pathlib.Path(dest).name}"
)
export_db.set_exifdata_for_file(
exported_file, self._exiftool_json_sidecar(options=options)
if run_exiftool:
verbose(f"Writing metadata with exiftool for {pathlib.Path(dest).name}")
if not options.dry_run:
warning_, error_ = self._write_exif_data(src, options=options)
if warning_:
exiftool_results.exiftool_warning.append((dest, warning_))
if error_:
exiftool_results.exiftool_error.append((dest, error_))
exiftool_results.error.append((dest, error_))
export_db.set_data(
dest,
uuid=self.photo.uuid,
exif_stat=fileutil.file_sig(src),
exif_json=self._exiftool_json_sidecar(options=options),
)
export_db.set_stat_exif_for_file(
exported_file, fileutil.file_sig(exported_file)
)
results.exif_updated.append(exported_file)
return results
exiftool_results.exif_updated.append(dest)
exiftool_results.to_touch.append(dest)
return exiftool_results
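A simplified sketch of the --update decision above: exiftool is only re-run when there is no stored exif JSON for the destination or the stored JSON differs from the freshly rendered sidecar. The JSON layout assumed here (a one-element list wrapping a tag dict, as exiftool's -j output is structured) mirrors the [0] indexing in the code above.
import json
from typing import Optional

def needs_exiftool(old_json: Optional[str], current_json: str) -> bool:
    """Return True if metadata should be (re)written with exiftool."""
    if old_json is None:
        return True  # nothing stored from a previous run
    old = json.loads(old_json)[0]
    current = json.loads(current_json)[0]
    return old != current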
def _write_exif_data(self, filepath: str, options: ExportOptions):
"""write exif data to image file at filepath
@@ -1910,7 +1847,7 @@ def _export_photo_uuid_applescript(
raise ValueError(f"dest {dest} must be a directory")
if not original ^ edited:
raise ValueError(f"edited or original must be True but not both")
raise ValueError("edited or original must be True but not both")
tmpdir = tempfile.TemporaryDirectory(prefix="osxphotos_")
@@ -1933,7 +1870,6 @@ def _export_photo_uuid_applescript(
if not exported_files or not filename:
# nothing got exported
raise ExportError(f"Could not export photo {uuid} ({lineno(__file__)})")
# need to find actual filename as sometimes Photos renames JPG to jpeg on export
# may be more than one file exported (e.g. if Live Photo, Photos exports both .jpeg and .mov)
# TemporaryDirectory will cleanup on return

View File

@@ -35,6 +35,9 @@ from ._constants import (
BURST_KEY,
BURST_NOT_SELECTED,
BURST_SELECTED,
SIDECAR_EXIFTOOL,
SIDECAR_JSON,
SIDECAR_XMP,
TEXT_DETECTION_CONFIDENCE_THRESHOLD,
)
from .adjustmentsinfo import AdjustmentsInfo
@@ -43,7 +46,7 @@ from .exifinfo import ExifInfo
from .exiftool import ExifToolCaching, get_exiftool_path
from .momentinfo import MomentInfo
from .personinfo import FaceInfo, PersonInfo
from .photoexporter import PhotoExporter
from .photoexporter import ExportOptions, PhotoExporter
from .phototemplate import PhotoTemplate, RenderOptions
from .placeinfo import PlaceInfo4, PlaceInfo5
from .query_builder import get_query
@@ -51,7 +54,7 @@ from .scoreinfo import ScoreInfo
from .searchinfo import SearchInfo
from .text_detection import detect_text
from .uti import get_preferred_uti_extension, get_uti_for_extension
from .utils import _debug, _get_resource_loc, findfiles
from .utils import _debug, _get_resource_loc, list_directory
__all__ = ["PhotoInfo", "PhotoInfoNone"]
@@ -366,7 +369,7 @@ class PhotoInfo:
# In Photos 5, raw is in same folder as original but with _4.ext
# Unless "Copy Items to the Photos Library" is not checked
# then RAW image is not renamed but has the same name as the jpeg but with the raw extension
# Current implementation uses findfiles to find images with the correct raw UTI extension
# Current implementation finds images with the correct raw UTI extension
# in same folder as the original and with same stem as original in form: original_stem*.raw_ext
# TODO: I don't like this -- would prefer a more deterministic approach but until I have more
# data on how Photos stores and retrieves RAW images, this seems to be working
@@ -402,8 +405,7 @@ class PhotoInfo:
# raw files have same name as original but with _4.raw_ext appended
# I believe the _4 maps to PHAssetResourceTypeAlternatePhoto = 4
# see: https://developer.apple.com/documentation/photokit/phassetresourcetype/phassetresourcetypealternatephoto?language=objc
glob_str = f"{filestem}_4*"
raw_file = findfiles(glob_str, filepath)
raw_file = list_directory(filepath, startswith=f"{filestem}_4")
if not raw_file:
photopath = None
else:
@@ -1490,28 +1492,48 @@ class PhotoInfo:
"""
exporter = PhotoExporter(self)
return exporter.export(
dest=dest,
filename=filename,
sidecar = 0
if sidecar_json:
sidecar |= SIDECAR_JSON
if sidecar_exiftool:
sidecar |= SIDECAR_EXIFTOOL
if sidecar_xmp:
sidecar |= SIDECAR_XMP
if not filename:
if not edited:
filename = self.original_filename
else:
original_name = pathlib.Path(self.original_filename)
if self.path_edited:
ext = pathlib.Path(self.path_edited).suffix
else:
uti = self.uti_edited if edited and self.uti_edited else self.uti
ext = get_preferred_uti_extension(uti)
ext = "." + ext
filename = original_name.stem + "_edited" + ext
options = ExportOptions(
description_template=description_template,
edited=edited,
live_photo=live_photo,
raw_photo=raw_photo,
export_as_hardlink=export_as_hardlink,
overwrite=overwrite,
increment=increment,
sidecar_json=sidecar_json,
sidecar_exiftool=sidecar_exiftool,
sidecar_xmp=sidecar_xmp,
use_photos_export=use_photos_export,
timeout=timeout,
exiftool=exiftool,
export_as_hardlink=export_as_hardlink,
increment=increment,
keyword_template=keyword_template,
live_photo=live_photo,
overwrite=overwrite,
raw_photo=raw_photo,
render_options=render_options,
sidecar=sidecar,
timeout=timeout,
use_albums_as_keywords=use_albums_as_keywords,
use_persons_as_keywords=use_persons_as_keywords,
keyword_template=keyword_template,
description_template=description_template,
render_options=render_options,
use_photos_export=use_photos_export,
)
results = exporter.export(dest, filename=filename, options=options)
return results.exported
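A small illustration of the sidecar bit mask built above: each sidecar type is a distinct bit, the boolean keyword arguments are folded into one integer with |=, and the exporter can test membership with &. The literal values below are illustrative, not necessarily the constants defined in osxphotos._constants.
SIDECAR_JSON = 1
SIDECAR_EXIFTOOL = 2
SIDECAR_XMP = 4

sidecar = 0
sidecar |= SIDECAR_JSON
sidecar |= SIDECAR_XMP
print(bool(sidecar & SIDECAR_JSON), bool(sidecar & SIDECAR_EXIFTOOL))  # True False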
def _get_album_uuids(self, project=False):
"""Return list of album UUIDs this photo is found in

View File

@@ -39,6 +39,7 @@ from .._constants import (
_PHOTOS_5_PROJECT_ALBUM_KIND,
_PHOTOS_5_ROOT_FOLDER_KIND,
_PHOTOS_5_SHARED_ALBUM_KIND,
_PHOTOS_5_VERSION,
_TESTED_OS_VERSIONS,
_UNKNOWN_PERSON,
BURST_KEY,
@@ -659,14 +660,18 @@ class PhotosDB:
for person in c:
pk = person[0]
fullname = person[2] if person[2] is not None else _UNKNOWN_PERSON
fullname = (
normalize_unicode(person[2])
if person[2] is not None
else _UNKNOWN_PERSON
)
self._dbpersons_pk[pk] = {
"pk": pk,
"uuid": person[1],
"fullname": fullname,
"facecount": person[3],
"keyface": person[5],
"displayname": person[4],
"displayname": normalize_unicode(person[4]),
"photo_uuid": None,
"keyface_uuid": None,
}
@@ -733,13 +738,6 @@ class PhotosDB:
except KeyError:
self._dbfaces_pk[pk] = [uuid]
if _debug():
logging.debug(f"Finished walking through persons")
logging.debug(pformat(self._dbpersons_pk))
logging.debug(pformat(self._dbpersons_fullname))
logging.debug(pformat(self._dbfaces_pk))
logging.debug(pformat(self._dbfaces_uuid))
# Get info on albums
verbose("Processing albums.")
c.execute(
@@ -876,14 +874,6 @@ class PhotosDB:
else:
self._dbalbum_folders[album] = {}
if _debug():
logging.debug(f"Finished walking through albums")
logging.debug(pformat(self._dbalbums_album))
logging.debug(pformat(self._dbalbums_uuid))
logging.debug(pformat(self._dbalbum_details))
logging.debug(pformat(self._dbalbum_folders))
logging.debug(pformat(self._dbfolder_details))
# Get info on keywords
verbose("Processing keywords.")
c.execute(
@@ -899,13 +889,16 @@ class PhotosDB:
RKMaster.uuid = RKVersion.masterUuid
"""
)
for keyword in c:
if not keyword[1] in self._dbkeywords_uuid:
self._dbkeywords_uuid[keyword[1]] = []
if not keyword[0] in self._dbkeywords_keyword:
self._dbkeywords_keyword[keyword[0]] = []
self._dbkeywords_uuid[keyword[1]].append(keyword[0])
self._dbkeywords_keyword[keyword[0]].append(keyword[1])
for keyword_title, keyword_uuid, _ in c:
keyword_title = normalize_unicode(keyword_title)
try:
self._dbkeywords_uuid[keyword_uuid].append(keyword_title)
except KeyError:
self._dbkeywords_uuid[keyword_uuid] = [keyword_title]
try:
self._dbkeywords_keyword[keyword_title].append(keyword_uuid)
except KeyError:
self._dbkeywords_keyword[keyword_title] = [keyword_uuid]
# Get info on disk volumes
c.execute("select RKVolume.modelId, RKVolume.name from RKVolume")
@@ -1027,13 +1020,11 @@ class PhotosDB:
for row in c:
uuid = row[0]
if _debug():
logging.debug(f"uuid = '{uuid}, master = '{row[2]}")
self._dbphotos[uuid] = {}
self._dbphotos[uuid]["_uuid"] = uuid # stored here for easier debugging
self._dbphotos[uuid]["modelID"] = row[1]
self._dbphotos[uuid]["masterUuid"] = row[2]
self._dbphotos[uuid]["filename"] = row[3]
self._dbphotos[uuid]["filename"] = normalize_unicode(row[3])
# There are sometimes negative values for lastmodifieddate in the database
# I don't know what these mean but they will raise exception in datetime if
@@ -1272,13 +1263,13 @@ class PhotosDB:
info["volumeId"] = row[1]
info["imagePath"] = row[2]
info["isMissing"] = row[3]
info["originalFilename"] = row[4]
info["originalFilename"] = normalize_unicode(row[4])
info["UTI"] = row[5]
info["modelID"] = row[6]
info["fileSize"] = row[7]
info["isTrulyRAW"] = row[8]
info["alternateMasterUuid"] = row[9]
info["filename"] = row[10]
info["filename"] = normalize_unicode(row[10])
self._dbphotos_master[uuid] = info
# get details needed to find path of the edited photos
@@ -1550,39 +1541,6 @@ class PhotosDB:
# done processing, dump debug data if requested
verbose("Done processing details from Photos library.")
if _debug():
logging.debug("Faces (_dbfaces_uuid):")
logging.debug(pformat(self._dbfaces_uuid))
logging.debug("Persons (_dbpersons_pk):")
logging.debug(pformat(self._dbpersons_pk))
logging.debug("Keywords by uuid (_dbkeywords_uuid):")
logging.debug(pformat(self._dbkeywords_uuid))
logging.debug("Keywords by keyword (_dbkeywords_keywords):")
logging.debug(pformat(self._dbkeywords_keyword))
logging.debug("Albums by uuid (_dbalbums_uuid):")
logging.debug(pformat(self._dbalbums_uuid))
logging.debug("Albums by album (_dbalbums_albums):")
logging.debug(pformat(self._dbalbums_album))
logging.debug("Album details (_dbalbum_details):")
logging.debug(pformat(self._dbalbum_details))
logging.debug("Album titles (_dbalbum_titles):")
logging.debug(pformat(self._dbalbum_titles))
logging.debug("Volumes (_dbvolumes):")
logging.debug(pformat(self._dbvolumes))
logging.debug("Photos (_dbphotos):")
logging.debug(pformat(self._dbphotos))
logging.debug("Burst Photos (dbphotos_burst:")
logging.debug(pformat(self._dbphotos_burst))
def _build_album_folder_hierarchy_4(self, uuid, folders=None):
"""recursively build folder/album hierarchy
@@ -1673,7 +1631,7 @@ class PhotosDB:
for person in c:
pk = person[0]
fullname = (
person[2]
normalize_unicode(person[2])
if (person[2] != "" and person[2] is not None)
else _UNKNOWN_PERSON
)
@@ -1683,7 +1641,7 @@ class PhotosDB:
"fullname": fullname,
"facecount": person[3],
"keyface": person[4],
"displayname": person[5],
"displayname": normalize_unicode(person[5]),
"photo_uuid": None,
"keyface_uuid": None,
}
@@ -1747,13 +1705,6 @@ class PhotosDB:
except KeyError:
self._dbfaces_pk[pk] = [uuid]
if _debug():
logging.debug(f"Finished walking through persons")
logging.debug(pformat(self._dbpersons_pk))
logging.debug(pformat(self._dbpersons_fullname))
logging.debug(pformat(self._dbfaces_pk))
logging.debug(pformat(self._dbfaces_uuid))
# get details about albums
verbose("Processing albums.")
c.execute(
@@ -1870,13 +1821,6 @@ class PhotosDB:
# shared albums can't be in folders
self._dbalbum_folders[album] = []
if _debug():
logging.debug(f"Finished walking through albums")
logging.debug(pformat(self._dbalbums_album))
logging.debug(pformat(self._dbalbums_uuid))
logging.debug(pformat(self._dbalbum_details))
logging.debug(pformat(self._dbalbum_folders))
# get details on keywords
verbose("Processing keywords.")
c.execute(
@@ -1886,29 +1830,22 @@ class PhotosDB:
JOIN Z_1KEYWORDS ON Z_1KEYWORDS.Z_1ASSETATTRIBUTES = ZADDITIONALASSETATTRIBUTES.Z_PK
JOIN ZKEYWORD ON ZKEYWORD.Z_PK = {keyword_join} """
)
for keyword in c:
keyword_title = normalize_unicode(keyword[0])
if not keyword[1] in self._dbkeywords_uuid:
self._dbkeywords_uuid[keyword[1]] = []
if not keyword_title in self._dbkeywords_keyword:
self._dbkeywords_keyword[keyword_title] = []
self._dbkeywords_uuid[keyword[1]].append(keyword[0])
self._dbkeywords_keyword[keyword_title].append(keyword[1])
if _debug():
logging.debug(f"Finished walking through keywords")
logging.debug(pformat(self._dbkeywords_keyword))
logging.debug(pformat(self._dbkeywords_uuid))
for keyword_title, keyword_uuid in c:
keyword_title = normalize_unicode(keyword_title)
try:
self._dbkeywords_uuid[keyword_uuid].append(keyword_title)
except KeyError:
self._dbkeywords_uuid[keyword_uuid] = [keyword_title]
try:
self._dbkeywords_keyword[keyword_title].append(keyword_uuid)
except KeyError:
self._dbkeywords_keyword[keyword_title] = [keyword_uuid]
# get details on disk volumes
c.execute("SELECT ZUUID, ZNAME from ZFILESYSTEMVOLUME")
for vol in c:
self._dbvolumes[vol[0]] = vol[1]
if _debug():
logging.debug(f"Finished walking through volumes")
logging.debug(self._dbvolumes)
# get details about photos
verbose("Processing photo details.")
c.execute(
@@ -2042,8 +1979,8 @@ class PhotosDB:
info["hidden"] = row[9]
info["favorite"] = row[10]
info["originalFilename"] = row[3]
info["filename"] = row[12]
info["originalFilename"] = normalize_unicode(row[3])
info["filename"] = normalize_unicode(row[12])
info["directory"] = row[11]
# set latitude and longitude
@@ -2521,48 +2458,6 @@ class PhotosDB:
# done processing, dump debug data if requested
verbose("Done processing details from Photos library.")
if _debug():
logging.debug("Faces (_dbfaces_uuid):")
logging.debug(pformat(self._dbfaces_uuid))
logging.debug("Persons (_dbpersons_pk):")
logging.debug(pformat(self._dbpersons_pk))
logging.debug("Keywords by uuid (_dbkeywords_uuid):")
logging.debug(pformat(self._dbkeywords_uuid))
logging.debug("Keywords by keyword (_dbkeywords_keywords):")
logging.debug(pformat(self._dbkeywords_keyword))
logging.debug("Albums by uuid (_dbalbums_uuid):")
logging.debug(pformat(self._dbalbums_uuid))
logging.debug("Albums by album (_dbalbums_albums):")
logging.debug(pformat(self._dbalbums_album))
logging.debug("Album details (_dbalbum_details):")
logging.debug(pformat(self._dbalbum_details))
logging.debug("Album titles (_dbalbum_titles):")
logging.debug(pformat(self._dbalbum_titles))
logging.debug("Album folders (_dbalbum_folders):")
logging.debug(pformat(self._dbalbum_folders))
logging.debug("Album parent folders (_dbalbum_parent_folders):")
logging.debug(pformat(self._dbalbum_parent_folders))
logging.debug("Albums pk (_dbalbums_pk):")
logging.debug(pformat(self._dbalbums_pk))
logging.debug("Volumes (_dbvolumes):")
logging.debug(pformat(self._dbvolumes))
logging.debug("Photos (_dbphotos):")
logging.debug(pformat(self._dbphotos))
logging.debug("Burst Photos (dbphotos_burst:")
logging.debug(pformat(self._dbphotos_burst))
def _process_moments(self):
"""Process data from ZMOMENT table"""
@@ -2623,8 +2518,8 @@ class PhotosDB:
moment_info["modificationDate"] = row[6]
moment_info["representativeDate"] = row[7]
moment_info["startDate"] = row[8]
moment_info["subtitle"] = row[9]
moment_info["title"] = row[10]
moment_info["subtitle"] = normalize_unicode(row[9])
moment_info["title"] = normalize_unicode(row[10])
moment_info["uuid"] = row[11]
# if both lat/lon == -180, then it means location undefined
@@ -3027,6 +2922,7 @@ class PhotosDB:
if keywords:
keyword_set = set()
for keyword in keywords:
keyword = normalize_unicode(keyword)
if keyword in self._dbkeywords_keyword:
keyword_set.update(self._dbkeywords_keyword[keyword])
photos_sets.append(keyword_set)
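A small illustration of why the query-side normalization matters: a keyword typed in decomposed form will not match the normalized keyword loaded from the database until both are normalized to the same form ("NFC" below is a stand-in for whatever osxphotos' UNICODE_FORMAT is set to).
import unicodedata

stored = unicodedata.normalize("NFC", "Zürich")  # keyword as normalized when loaded from the DB
typed = "Zu\u0308rich"                           # same keyword entered in decomposed (NFD) form
print(typed == stored)                                # False
print(unicodedata.normalize("NFC", typed) == stored)  # True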
@@ -3034,6 +2930,7 @@ class PhotosDB:
if persons:
person_set = set()
for person in persons:
person = normalize_unicode(person)
if person in self._dbpersons_fullname:
for pk in self._dbpersons_fullname[person]:
try:
@@ -3076,8 +2973,6 @@ class PhotosDB:
):
info = PhotoInfo(db=self, uuid=p, info=self._dbphotos[p])
photoinfo.append(info)
if _debug:
logging.debug(f"photoinfo: {pformat(photoinfo)}")
return photoinfo
@@ -3414,23 +3309,35 @@ class PhotosDB:
# case-insensitive
for n in name:
n = n.lower()
photo_list.extend(
[
p
for p in photos
if n in p.filename.lower()
or n in p.original_filename.lower()
]
)
if self._db_version >= _PHOTOS_5_VERSION:
# search only original_filename (#594)
photo_list.extend(
[p for p in photos if n in p.original_filename.lower()]
)
else:
photo_list.extend(
[
p
for p in photos
if n in p.filename.lower()
or n in p.original_filename.lower()
]
)
else:
for n in name:
photo_list.extend(
[
p
for p in photos
if n in p.filename or n in p.original_filename
]
)
if self._db_version >= _PHOTOS_5_VERSION:
# search only original_filename (#594)
photo_list.extend(
[p for p in photos if n in p.original_filename]
)
else:
photo_list.extend(
[
p
for p in photos
if n in p.filename or n in p.original_filename
]
)
photos = photo_list
if options.min_size:

View File

@@ -113,9 +113,8 @@ def get_photos_library_version(library_path):
return 3
if db_ver == int(_PHOTOS_4_VERSION):
return 4
if db_ver != int(_PHOTOS_5_VERSION):
raise UnknownLibraryVersion(f"db_ver = {db_ver}")
# assume it's a Photos 5+ library, get the model version to determine which version
model_ver = get_model_version(str(library_path / "database" / "Photos.sqlite"))
model_ver = int(model_ver)
if _PHOTOS_5_MODEL_VERSION[0] <= model_ver <= _PHOTOS_5_MODEL_VERSION[1]:

View File

@@ -17,7 +17,6 @@ from ._constants import _UNKNOWN_PERSON, TEXT_DETECTION_CONFIDENCE_THRESHOLD
from ._version import __version__
from .datetime_formatter import DateTimeFormatter
from .exiftool import ExifToolCaching
from .export_db import ExportDB_ABC, ExportDBInMemory
from .path_utils import sanitize_dirname, sanitize_filename, sanitize_pathpart
from .text_detection import detect_text
from .utils import expand_and_validate_filepath, load_function
@@ -300,7 +299,6 @@ class RenderOptions:
dest_path: set to the destination path of the photo (for use by {function} template), only valid with --filename
filepath: set to value for filepath of the exported photo if you want to evaluate {filepath} template
quote: quote path templates for execution in the shell
exportdb: ExportDB object
"""
none_str: str = "_"
@@ -315,7 +313,6 @@ class RenderOptions:
dest_path: Optional[str] = None
filepath: Optional[str] = None
quote: bool = False
exportdb: Optional[ExportDB_ABC] = None
class PhotoTemplateParser:
@@ -384,9 +381,6 @@ class PhotoTemplate:
self.filepath = options.filepath
self.quote = options.quote
self.dest_path = options.dest_path
self.exportdb = options.exportdb or ExportDBInMemory(
None, self.export_dir or "."
)
def render(
self,
@@ -420,7 +414,6 @@ class PhotoTemplate:
self.filepath = options.filepath
self.quote = options.quote
self.dest_path = options.dest_path
self.exportdb = options.exportdb or self.exportdb
try:
model = self.parser.parse(template)
@@ -1216,7 +1209,7 @@ class PhotoTemplate:
else:
values = list(obj)
elif field == "detected_text":
values = _get_detected_text(self.photo, self.exportdb, confidence=subfield)
values = _get_detected_text(self.photo, confidence=subfield)
else:
raise ValueError(f"Unhandled template value: {field}")
@@ -1459,7 +1452,7 @@ def _get_album_by_path(photo, folder_album_path):
return None
def _get_detected_text(photo, exportdb, confidence=TEXT_DETECTION_CONFIDENCE_THRESHOLD):
def _get_detected_text(photo, confidence=TEXT_DETECTION_CONFIDENCE_THRESHOLD):
"""Returns the detected text for a photo
{detected_text} uses this instead of PhotoInfo.detected_text() to cache the text for all confidence values
"""
@@ -1475,5 +1468,4 @@ def _get_detected_text(photo, exportdb, confidence=TEXT_DETECTION_CONFIDENCE_THR
# _detected_text caches the text detection results in an extended attribute
# so the first time this gets called is slow but repeated accesses are fast
detected_text = photo._detected_text()
exportdb.set_detected_text_for_uuid(photo.uuid, json.dumps(detected_text))
return [text for text, conf in detected_text if conf >= confidence]

View File

@@ -1,5 +1,6 @@
""" Utility functions used in osxphotos """
import datetime
import fnmatch
import glob
import importlib
@@ -16,28 +17,28 @@ import sys
import unicodedata
import urllib.parse
from plistlib import load as plistload
from typing import Callable, Union
from typing import Callable, List, Union, Optional
import CoreFoundation
import objc
from Foundation import NSString
from Foundation import NSFileManager, NSPredicate, NSString
from ._constants import UNICODE_FORMAT
__all__ = [
"noop",
"lineno",
"dd_to_dms_str",
"get_system_library_path",
"expand_and_validate_filepath",
"get_last_library_path",
"list_photo_libraries",
"normalize_fs_path",
"findfiles",
"normalize_unicode",
"get_system_library_path",
"increment_filename_with_count",
"increment_filename",
"expand_and_validate_filepath",
"lineno",
"list_directory",
"list_photo_libraries",
"load_function",
"noop",
"normalize_fs_path",
"normalize_unicode",
]
_DEBUG = False
@@ -264,7 +265,9 @@ def list_photo_libraries():
# On older MacOS versions, mdfind appears to ignore some libraries
# glob to find libraries in ~/Pictures then mdfind to find all the others
# TODO: make this more robust
lib_list = glob.glob(f"{str(pathlib.Path.home())}/Pictures/*.photoslibrary")
lib_list = list_directory(
f"{pathlib.Path.home()}/Pictures/", glob="*.photoslibrary"
)
# On older OS, may not get all libraries so make sure we get the last one
last_lib = get_last_library_path()
@@ -283,24 +286,95 @@ def list_photo_libraries():
def normalize_fs_path(path: str) -> str:
"""Normalize filesystem paths with unicode in them"""
with objc.autorelease_pool():
normalized_path = NSString.fileSystemRepresentation(path)
return normalized_path.decode("utf8")
# macOS HFS+ uses NFD; APFS doesn't normalize, so stick with NFD
# ref: https://eclecticlight.co/2021/05/08/explainer-unicode-normalization-and-apfs/
return unicodedata.normalize("NFD", path)
def findfiles(pattern, path_):
"""Returns list of filenames from path_ matched by pattern
shell pattern. Matching is case-insensitive.
If 'path_' is invalid/doesn't exist, returns []."""
if not os.path.isdir(path_):
# def findfiles(pattern, path):
# """Returns list of filenames from path matched by pattern
# shell pattern. Matching is case-insensitive.
# If 'path_' is invalid/doesn't exist, returns []."""
# if not os.path.isdir(path):
# return []
# # paths need to be normalized for unicode as filesystem returns unicode in NFD form
# pattern = normalize_fs_path(pattern)
# rule = re.compile(fnmatch.translate(pattern), re.IGNORECASE)
# files = os.listdir(path)
# return [name for name in files if rule.match(name)]
def list_directory(
directory: Union[str, pathlib.Path],
startswith: Optional[str] = None,
endswith: Optional[str] = None,
contains: Optional[str] = None,
glob: Optional[str] = None,
include_path: bool = False,
case_sensitive: bool = False,
) -> List[Union[str, pathlib.Path]]:
"""List directory contents and return list of files or directories matching search criteria.
Accounts for case-insensitive filesystems and unicode filenames. directory can be a str or a pathlib.Path object.
Args:
directory: directory to search
startswith: string to match at start of filename
endswith: string to match at end of filename
contains: string to match anywhere in filename
glob: shell-style glob pattern to match filename
include_path: if True, return full path to file
case_sensitive: if True, match case-sensitively
Returns: List of files or directories matching search criteria as either str or pathlib.Path objects depending on the input type;
returns empty list if directory is invalid or doesn't exist.
"""
is_pathlib = isinstance(directory, pathlib.Path)
if is_pathlib:
directory = str(directory)
if not os.path.isdir(directory):
return []
# See: https://gist.github.com/techtonik/5694830
# paths need to be normalized for unicode as filesystem returns unicode in NFD form
pattern = normalize_fs_path(pattern)
rule = re.compile(fnmatch.translate(pattern), re.IGNORECASE)
files = [normalize_fs_path(p) for p in os.listdir(path_)]
return [name for name in files if rule.match(name)]
startswith = normalize_fs_path(startswith) if startswith else None
endswith = normalize_fs_path(endswith) if endswith else None
contains = normalize_fs_path(contains) if contains else None
glob = normalize_fs_path(glob) if glob else None
files = [normalize_fs_path(f) for f in os.listdir(directory)]
if not case_sensitive:
files_normalized = {f.lower(): f for f in files}
files = [f.lower() for f in files]
startswith = startswith.lower() if startswith else None
endswith = endswith.lower() if endswith else None
contains = contains.lower() if contains else None
glob = glob.lower() if glob else None
else:
files_normalized = {f: f for f in files}
if startswith:
files = [f for f in files if f.startswith(startswith)]
if endswith:
endswith = normalize_fs_path(endswith)
files = [f for f in files if f.endswith(endswith)]
if contains:
contains = normalize_fs_path(contains)
files = [f for f in files if contains in f]
if glob:
glob = normalize_fs_path(glob)
flags = re.IGNORECASE if not case_sensitive else 0
rule = re.compile(fnmatch.translate(glob), flags)
files = [f for f in files if rule.match(f)]
files = [files_normalized[f] for f in files]
if include_path:
files = [os.path.join(directory, f) for f in files]
if is_pathlib:
files = [pathlib.Path(f) for f in files]
return files
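# A minimal usage sketch for list_directory (library and account names below are
# hypothetical; actual results depend on the directory contents):
#
#     >>> list_directory(f"{pathlib.Path.home()}/Pictures", glob="*.photoslibrary")  # doctest: +SKIP
#     ['Photos Library.photoslibrary']
#     >>> list_directory(pathlib.Path.home() / "Pictures", glob="*.photoslibrary", include_path=True)  # doctest: +SKIP
#     [PosixPath('/Users/someuser/Pictures/Photos Library.photoslibrary')]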
def _open_sql_file(dbname):
@@ -341,44 +415,16 @@ def _db_is_locked(dbname):
return locked
# OSXPHOTOS_XATTR_UUID = "com.osxphotos.uuid"
# def get_uuid_for_file(filepath):
# """ returns UUID associated with an exported file
# filepath: path to exported photo
# """
# attr = xattr.xattr(filepath)
# try:
# uuid_bytes = attr[OSXPHOTOS_XATTR_UUID]
# uuid_str = uuid_bytes.decode('utf-8')
# except KeyError:
# uuid_str = None
# return uuid_str
# def set_uuid_for_file(filepath, uuid):
# """ sets the UUID associated with an exported file
# filepath: path to exported photo
# uuid: uuid string for photo
# """
# if not os.path.exists(filepath):
# raise FileNotFoundError(f"Missing file: {filepath}")
# attr = xattr.xattr(filepath)
# uuid_bytes = bytes(uuid, 'utf-8')
# attr.set(OSXPHOTOS_XATTR_UUID, uuid_bytes)
def normalize_unicode(value):
"""normalize unicode data"""
if value is not None:
if isinstance(value, (tuple, list)):
return tuple(unicodedata.normalize(UNICODE_FORMAT, v) for v in value)
elif isinstance(value, str):
return unicodedata.normalize(UNICODE_FORMAT, value)
else:
return value
else:
if value is None:
return None
if isinstance(value, (tuple, list)):
return tuple(unicodedata.normalize(UNICODE_FORMAT, v) for v in value)
elif isinstance(value, str):
return unicodedata.normalize(UNICODE_FORMAT, value)
else:
return value
def increment_filename_with_count(
@@ -399,16 +445,16 @@ def increment_filename_with_count(
Note: This is subject to a race condition, so use with caution.
"""
dest = filepath if isinstance(filepath, pathlib.Path) else pathlib.Path(filepath)
dest_files = findfiles(f"{dest.stem}*", str(dest.parent))
dest_files = [normalize_fs_path(pathlib.Path(f).stem.lower()) for f in dest_files]
dest_new = dest.stem
if count:
dest_new = f"{dest.stem} ({count})"
while normalize_fs_path(dest_new.lower()) in dest_files:
dest_files = list_directory(dest.parent, startswith=dest.stem)
dest_files = [f.stem.lower() for f in dest_files]
dest_new = f"{dest.stem} ({count})" if count else dest.stem
dest_new = normalize_fs_path(dest_new)
while dest_new.lower() in dest_files:
count += 1
dest_new = f"{dest.stem} ({count})"
dest_new = normalize_fs_path(f"{dest.stem} ({count})")
dest = dest.parent / f"{dest_new}{dest.suffix}"
return str(dest), count
return normalize_fs_path(str(dest)), count
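# A minimal sketch of the increment behavior (assumes "wedding.jpg" and
# "wedding (1).jpg" already exist in /tmp/export and count starts at 0):
#
#     >>> increment_filename_with_count("/tmp/export/wedding.jpg")  # doctest: +SKIP
#     ('/tmp/export/wedding (2).jpg', 2)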
def increment_filename(filepath: Union[str, pathlib.Path]) -> str:
@@ -466,3 +512,9 @@ def load_function(pyfile: str, function_name: str) -> Callable:
sys.path = syspath
return func
def format_sec_to_hhmmss(sec: float) -> str:
"""Format seconds to hh:mm:ss"""
delta = datetime.timedelta(seconds=sec)
return str(delta).split(".")[0]
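# For example:
#
#     >>> format_sec_to_hhmmss(3661.5)
#     '1:01:01'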

View File

@@ -7,7 +7,7 @@
<key>hostuuid</key>
<string>585B80BF-8D1F-55EF-A9E8-6CF4E5523959</string>
<key>pid</key>
<integer>1961</integer>
<integer>14817</integer>
<key>processname</key>
<string>photolibraryd</string>
<key>uid</key>

Binary file not shown (new image, 2.1 MiB)
Binary file not shown (new image, 2.8 MiB)
Binary file not shown (new image, 2.3 MiB)
Binary file not shown (new image, 2.8 MiB)

View File

@@ -3,24 +3,24 @@
<plist version="1.0">
<dict>
<key>BackgroundHighlightCollection</key>
<date>2021-09-14T04:40:42Z</date>
<date>2022-02-04T13:51:40Z</date>
<key>BackgroundHighlightEnrichment</key>
<date>2021-09-14T04:40:42Z</date>
<date>2022-02-04T13:51:39Z</date>
<key>BackgroundJobAssetRevGeocode</key>
<date>2021-09-14T04:40:42Z</date>
<date>2022-02-04T13:51:40Z</date>
<key>BackgroundJobSearch</key>
<date>2021-09-14T04:40:42Z</date>
<date>2022-02-04T13:51:40Z</date>
<key>BackgroundPeopleSuggestion</key>
<date>2021-09-14T04:40:41Z</date>
<date>2022-02-04T13:51:39Z</date>
<key>BackgroundUserBehaviorProcessor</key>
<date>2021-09-14T04:40:42Z</date>
<date>2022-02-04T13:51:40Z</date>
<key>PhotoAnalysisGraphLastBackgroundGraphConsistencyUpdateJobDateKey</key>
<date>2021-07-20T05:48:08Z</date>
<key>PhotoAnalysisGraphLastBackgroundGraphRebuildJobDate</key>
<date>2021-07-20T05:47:59Z</date>
<key>PhotoAnalysisGraphLastBackgroundMemoryGenerationJobDate</key>
<date>2021-09-14T04:40:43Z</date>
<date>2022-02-04T13:51:40Z</date>
<key>SiriPortraitDonation</key>
<date>2021-09-14T04:40:42Z</date>
<date>2022-02-04T13:51:40Z</date>
</dict>
</plist>

Binary file not shown (new image, 191 KiB)
Binary file not shown (new image, 123 KiB)
Binary file not shown (new image, 178 KiB)
Binary file not shown (new image, 123 KiB)

25
tests/test___all__.py Normal file
View File

@@ -0,0 +1,25 @@
import re
import sys
from os import walk
from collections import Counter
FILE_PATTERN = r"^(?!_).*\.py$"
SOURCE_CODE_ROOT = "osxphotos"
def create_module_name(dirpath: str, filename: str) -> str:
prefix = dirpath[dirpath.rfind(SOURCE_CODE_ROOT):].replace('/', '.')
return f"{prefix}.{filename}".replace(".py", "")
def test_check_duplicate():
for dirpath, dirnames, filenames in walk(SOURCE_CODE_ROOT):
print("\n", sys.modules)
for filename in filenames:
if re.search(FILE_PATTERN, filename):
module = create_module_name(dirpath, filename)
if module in sys.modules:
all_list = sys.modules[module].__all__
all_set = set(all_list)
assert Counter(all_list) == Counter(all_set)
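# For example, with the source root set to "osxphotos":
#
#     >>> create_module_name("osxphotos", "utils.py")
#     'osxphotos.utils'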

View File

@@ -21,6 +21,7 @@ FOLDER_ALBUM_DICT = {
ALBUM_NAMES = [
"2018-10 - Sponsion, Museum, Frühstück, Römermuseum",
"2019-10/11 Paris Clermont",
"Água",
"AlbumInFolder",
"EmptyAlbum",
"I have a deleted twin",
@@ -38,6 +39,7 @@ ALBUM_NAMES = [
ALBUM_PARENT_DICT = {
"2018-10 - Sponsion, Museum, Frühstück, Römermuseum": None,
"2019-10/11 Paris Clermont": None,
"Água": None,
"AlbumInFolder": "SubFolder2",
"EmptyAlbum": None,
"I have a deleted twin": None,
@@ -54,6 +56,7 @@ ALBUM_PARENT_DICT = {
ALBUM_FOLDER_NAMES_DICT = {
"2018-10 - Sponsion, Museum, Frühstück, Römermuseum": [],
"2019-10/11 Paris Clermont": [],
"Água": [],
"AlbumInFolder": ["Folder1", "SubFolder2"],
"EmptyAlbum": [],
"I have a deleted twin": [],
@@ -70,6 +73,7 @@ ALBUM_FOLDER_NAMES_DICT = {
ALBUM_LEN_DICT = {
"2018-10 - Sponsion, Museum, Frühstück, Römermuseum": 1,
"2019-10/11 Paris Clermont": 1,
"Água": 3,
"AlbumInFolder": 2,
"EmptyAlbum": 0,
"I have a deleted twin": 1,
@@ -103,6 +107,11 @@ ALBUM_PHOTO_UUID_DICT = {
"4D521201-92AC-43E5-8F7C-59BC41C37A96",
"8E1D7BC9-9321-44F9-8CFB-4083F6B9232A",
],
"Água": [
"7FD37B5F-6FAA-4DB1-8A29-BF9C37E38091",
"2DFD33F1-A5D8-486F-A3A9-98C07995535A",
"54E76FCB-D353-4557-9997-0A457BCB4D48",
],
}
UUID_DICT = {

View File

@@ -24,10 +24,10 @@ PHOTOS_DB = "tests/Test-10.15.7.photoslibrary/database/photos.db"
PHOTOS_DB_PATH = "/Test-10.15.7.photoslibrary/database/photos.db"
PHOTOS_LIBRARY_PATH = "/Test-10.15.7.photoslibrary"
PHOTOS_DB_LEN = 25
PHOTOS_NOT_IN_TRASH_LEN = 23
PHOTOS_DB_LEN = 29
PHOTOS_NOT_IN_TRASH_LEN = 27
PHOTOS_IN_TRASH_LEN = 2
PHOTOS_DB_IMPORT_SESSIONS = 17
PHOTOS_DB_IMPORT_SESSIONS = 21
KEYWORDS = [
"Kids",
@@ -72,6 +72,7 @@ ALBUMS = [
"Sorted Oldest First",
"Sorted Title",
"Test Album", # there are 2 albums named "Test Album" for testing duplicate album names
"Água",
]
KEYWORDS_DICT = {
"Drink": 2,
@@ -115,6 +116,7 @@ ALBUM_DICT = {
"Sorted Oldest First": 3,
"Sorted Title": 3,
"Test Album": 2,
"Água": 3,
} # Note: there are 2 albums named "Test Album" for testing duplicate album names
UUID_DICT = {
@@ -1091,7 +1093,7 @@ def test_from_to_date(photosdb):
time.tzset()
photos = photosdb.photos(from_date=datetime.datetime(2018, 10, 28))
assert len(photos) == 16
assert len(photos) == 20
photos = photosdb.photos(to_date=datetime.datetime(2018, 10, 28))
assert len(photos) == 7

View File

@@ -8,6 +8,7 @@ from click.testing import CliRunner
import osxphotos
from osxphotos.exiftool import get_exiftool_path
from osxphotos.utils import normalize_unicode
CLI_PHOTOS_DB = "tests/Test-10.15.7.photoslibrary"
LIVE_PHOTOS_DB = "tests/Test-Cloud-10.15.1.photoslibrary"
@@ -79,64 +80,69 @@ CLI_OUTPUT_NO_SUBCOMMAND = [
CLI_OUTPUT_QUERY_UUID = '[{"uuid": "D79B8D77-BFFC-460B-9312-034F2877D35B", "filename": "D79B8D77-BFFC-460B-9312-034F2877D35B.jpeg", "original_filename": "Pumkins2.jpg", "date": "2018-09-28T16:07:07-04:00", "description": "Girl holding pumpkin", "title": "I found one!", "keywords": ["Kids"], "albums": ["Pumpkin Farm", "Test Album", "Multi Keyword"], "persons": ["Katie"], "path": "/tests/Test-10.15.7.photoslibrary/originals/D/D79B8D77-BFFC-460B-9312-034F2877D35B.jpeg", "ismissing": false, "hasadjustments": false, "external_edit": false, "favorite": false, "hidden": false, "latitude": 41.256566, "longitude": -95.940257, "path_edited": null, "shared": false, "isphoto": true, "ismovie": false, "uti": "public.jpeg", "burst": false, "live_photo": false, "path_live_photo": null, "iscloudasset": false, "incloud": null}]'
CLI_EXPORT_FILENAMES = [
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"St James Park.jpg",
"St James Park_edited.jpeg",
"Tulips.jpg",
"wedding.jpg",
"wedding_edited.jpeg",
"[2020-08-29] AAF035 (1).jpg",
"[2020-08-29] AAF035 (2).jpg",
"[2020-08-29] AAF035 (3).jpg",
"[2020-08-29] AAF035.jpg",
"DSC03584.dng",
"IMG_1693.tif",
"IMG_1994.JPG",
"IMG_1994.cr2",
"IMG_1997.JPG",
"IMG_1997.cr2",
"IMG_3092.heic",
"IMG_3092_edited.jpeg",
"IMG_4547.jpg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"Tulips_edited.jpeg",
"screenshot-really-a-png.jpeg",
"winebottle.jpeg",
"winebottle (1).jpeg",
"Frítest.jpg",
"Frítest (1).jpg",
"Frítest (2).jpg",
"Frítest (3).jpg",
"Frítest_edited.jpeg",
"Frítest_edited (1).jpeg",
"Frítest_edited.jpeg",
"Frítest.jpg",
"IMG_1693.tif",
"IMG_1994.cr2",
"IMG_1994.JPG",
"IMG_1997.cr2",
"IMG_1997.JPG",
"IMG_3092_edited.jpeg",
"IMG_3092.heic",
"IMG_4547.jpg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"screenshot-really-a-png.jpeg",
"St James Park_edited.jpeg",
"St James Park.jpg",
"Tulips_edited.jpeg",
"Tulips.jpg",
"wedding_edited.jpeg",
"wedding.jpg",
"winebottle (1).jpeg",
"winebottle.jpeg",
]
CLI_EXPORT_FILENAMES_DRY_RUN = [
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"St James Park.jpg",
"St James Park_edited.jpeg",
"Tulips.jpg",
"wedding.jpg",
"wedding_edited.jpeg",
"[2020-08-29] AAF035.jpg",
"DSC03584.dng",
"Frítest_edited.jpeg",
"Frítest.jpg",
"IMG_1693.tif",
"IMG_1994.JPG",
"IMG_1994.cr2",
"IMG_1997.JPG",
"IMG_1994.JPG",
"IMG_1997.cr2",
"IMG_3092.heic",
"IMG_1997.JPG",
"IMG_3092_edited.jpeg",
"IMG_3092.heic",
"IMG_4547.jpg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"Tulips_edited.jpeg",
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"screenshot-really-a-png.jpeg",
"St James Park_edited.jpeg",
"St James Park.jpg",
"Tulips_edited.jpeg",
"Tulips.jpg",
"wedding_edited.jpeg",
"wedding.jpg",
"winebottle.jpeg",
"winebottle.jpeg",
"Frítest.jpg",
"Frítest_edited.jpeg",
]
CLI_EXPORT_IGNORE_SIGNATURE_FILENAMES = ["Tulips.jpg", "wedding.jpg"]
@@ -154,225 +160,253 @@ CLI_EXPORT_ORIGINAL_SUFFIX_TEMPLATE = "{edited?_original,}"
CLI_EXPORT_PREVIEW_SUFFIX = "_lowres"
CLI_EXPORT_FILENAMES_EDITED_SUFFIX = [
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"St James Park.jpg",
"St James Park_bearbeiten.jpeg",
"Tulips.jpg",
"wedding.jpg",
"wedding_bearbeiten.jpeg",
"[2020-08-29] AAF035 (1).jpg",
"[2020-08-29] AAF035 (2).jpg",
"[2020-08-29] AAF035 (3).jpg",
"[2020-08-29] AAF035.jpg",
"DSC03584.dng",
"IMG_1693.tif",
"IMG_1994.JPG",
"IMG_1994.cr2",
"IMG_1997.JPG",
"IMG_1997.cr2",
"IMG_3092.heic",
"IMG_3092_bearbeiten.jpeg",
"IMG_4547.jpg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"Tulips_bearbeiten.jpeg",
"screenshot-really-a-png.jpeg",
"winebottle.jpeg",
"winebottle (1).jpeg",
"Frítest.jpg",
"Frítest (1).jpg",
"Frítest (2).jpg",
"Frítest (3).jpg",
"Frítest_bearbeiten.jpeg",
"Frítest_bearbeiten (1).jpeg",
"Frítest_bearbeiten.jpeg",
"Frítest.jpg",
"IMG_1693.tif",
"IMG_1994.cr2",
"IMG_1994.JPG",
"IMG_1997.cr2",
"IMG_1997.JPG",
"IMG_3092_bearbeiten.jpeg",
"IMG_3092.heic",
"IMG_4547.jpg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"screenshot-really-a-png.jpeg",
"St James Park_bearbeiten.jpeg",
"St James Park.jpg",
"Tulips_bearbeiten.jpeg",
"Tulips.jpg",
"wedding_bearbeiten.jpeg",
"wedding.jpg",
"winebottle (1).jpeg",
"winebottle.jpeg",
]
CLI_EXPORT_FILENAMES_EDITED_SUFFIX_TEMPLATE = [
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"St James Park.jpg",
"St James Park_edited.jpeg",
"Tulips.jpg",
"wedding.jpg",
"wedding_edited.jpeg",
"[2020-08-29] AAF035 (1).jpg",
"[2020-08-29] AAF035 (2).jpg",
"[2020-08-29] AAF035 (3).jpg",
"[2020-08-29] AAF035.jpg",
"DSC03584.dng",
"IMG_1693.tif",
"IMG_1994.JPG",
"IMG_1994.cr2",
"IMG_1997.JPG",
"IMG_1997.cr2",
"IMG_3092.heic",
"IMG_3092_edited.jpeg",
"IMG_4547.jpg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"Tulips_edited.jpeg",
"screenshot-really-a-png.jpeg",
"winebottle.jpeg",
"winebottle (1).jpeg",
"Frítest.jpg",
"Frítest (1).jpg",
"Frítest (2).jpg",
"Frítest (3).jpg",
"Frítest_edited.jpeg",
"Frítest_edited (1).jpeg",
"Frítest_edited.jpeg",
"Frítest.jpg",
"IMG_1693.tif",
"IMG_1994.cr2",
"IMG_1994.JPG",
"IMG_1997.cr2",
"IMG_1997.JPG",
"IMG_3092_edited.jpeg",
"IMG_3092.heic",
"IMG_4547.jpg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"screenshot-really-a-png.jpeg",
"St James Park_edited.jpeg",
"St James Park.jpg",
"Tulips_edited.jpeg",
"Tulips.jpg",
"wedding_edited.jpeg",
"wedding.jpg",
"winebottle (1).jpeg",
"winebottle.jpeg",
]
CLI_EXPORT_FILENAMES_ORIGINAL_SUFFIX = [
"Pumkins1_original.jpg",
"Pumkins2_original.jpg",
"Pumpkins3_original.jpg",
"St James Park_original.jpg",
"St James Park_edited.jpeg",
"Tulips_original.jpg",
"wedding_original.jpg",
"wedding_edited.jpeg",
"[2020-08-29] AAF035_original (1).jpg",
"[2020-08-29] AAF035_original (2).jpg",
"[2020-08-29] AAF035_original (3).jpg",
"[2020-08-29] AAF035_original.jpg",
"DSC03584_original.dng",
"IMG_1693_original.tif",
"IMG_1994_original.JPG",
"IMG_1994_original.cr2",
"IMG_1997_original.JPG",
"IMG_1997_original.cr2",
"IMG_3092_original.heic",
"IMG_3092_edited.jpeg",
"IMG_4547_original.jpg",
"Jellyfish_original.MOV",
"Jellyfish1_original.mp4",
"Tulips_edited.jpeg",
"screenshot-really-a-png_original.jpeg",
"winebottle_original.jpeg",
"winebottle_original (1).jpeg",
"Frítest_original.jpg",
"Frítest_edited (1).jpeg",
"Frítest_edited.jpeg",
"Frítest_original (1).jpg",
"Frítest_original (2).jpg",
"Frítest_original (3).jpg",
"Frítest_edited.jpeg",
"Frítest_edited (1).jpeg",
"Frítest_original.jpg",
"IMG_1693_original.tif",
"IMG_1994_original.cr2",
"IMG_1994_original.JPG",
"IMG_1997_original.cr2",
"IMG_1997_original.JPG",
"IMG_3092_edited.jpeg",
"IMG_3092_original.heic",
"IMG_4547_original.jpg",
"Jellyfish_original.MOV",
"Jellyfish1_original.mp4",
"Pumkins1_original.jpg",
"Pumkins2_original.jpg",
"Pumpkins3_original.jpg",
"screenshot-really-a-png_original.jpeg",
"St James Park_edited.jpeg",
"St James Park_original.jpg",
"Tulips_edited.jpeg",
"Tulips_original.jpg",
"wedding_edited.jpeg",
"wedding_original.jpg",
"winebottle_original (1).jpeg",
"winebottle_original.jpeg",
]
CLI_EXPORT_FILENAMES_ORIGINAL_SUFFIX_TEMPLATE = [
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"St James Park_original.jpg",
"St James Park_edited.jpeg",
"Tulips_original.jpg",
"wedding_original.jpg",
"wedding_edited.jpeg",
"Tulips_edited.jpeg",
"[2020-08-29] AAF035 (1).jpg",
"[2020-08-29] AAF035 (2).jpg",
"[2020-08-29] AAF035 (3).jpg",
"[2020-08-29] AAF035.jpg",
"DSC03584.dng",
"Frítest (1).jpg",
"Frítest_edited (1).jpeg",
"Frítest_edited.jpeg",
"Frítest_original (1).jpg",
"Frítest_original.jpg",
"Frítest.jpg",
"IMG_1693.tif",
"IMG_1994.JPG",
"IMG_1994.cr2",
"IMG_1997.JPG",
"IMG_1994.JPG",
"IMG_1997.cr2",
"IMG_3092_original.heic",
"IMG_1997.JPG",
"IMG_3092_edited.jpeg",
"IMG_3092_original.heic",
"IMG_4547.jpg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"screenshot-really-a-png.jpeg",
"winebottle.jpeg",
"St James Park_edited.jpeg",
"St James Park_original.jpg",
"Tulips_edited.jpeg",
"Tulips_original.jpg",
"wedding_edited.jpeg",
"wedding_original.jpg",
"winebottle (1).jpeg",
"Frítest.jpg",
"Frítest (1).jpg",
"Frítest_original.jpg",
"Frítest_edited.jpeg",
"Frítest_original (1).jpg",
"Frítest_edited (1).jpeg",
"winebottle.jpeg",
]
CLI_EXPORT_FILENAMES_CURRENT = [
"1793FAAB-DE75-4E25-886C-2BD66C780D6A_edited.jpeg", # Frítest.jpg
"1793FAAB-DE75-4E25-886C-2BD66C780D6A.jpeg", # Frítest.jpg
"1EB2B765-0765-43BA-A90C-0D0580E6172C.jpeg",
"2DFD33F1-A5D8-486F-A3A9-98C07995535A.jpeg",
"35329C57-B963-48D6-BB75-6AFF9370CBBC.mov",
"3DD2C897-F19E-4CA6-8C22-B027D5A71907.jpeg",
"4D521201-92AC-43E5-8F7C-59BC41C37A96.cr2",
"4D521201-92AC-43E5-8F7C-59BC41C37A96.jpeg",
"52083079-73D5-4921-AC1B-FE76F279133F.jpeg",
"54E76FCB-D353-4557-9997-0A457BCB4D48.jpeg",
"6191423D-8DB8-4D4C-92BE-9BBBA308AAC4_edited.jpeg",
"6191423D-8DB8-4D4C-92BE-9BBBA308AAC4.jpeg",
"7783E8E6-9CAC-40F3-BE22-81FB7051C266_edited.jpeg",
"7783E8E6-9CAC-40F3-BE22-81FB7051C266.heic",
"7F74DD34-5920-4DA3-B284-479887A34F66.jpeg",
"7FD37B5F-6FAA-4DB1-8A29-BF9C37E38091.jpeg",
"8846E3E6-8AC8-4857-8448-E3D025784410.tiff",
"A8266C97-9BAF-4AF4-99F3-0013832869B8.jpeg", # Frítest.jpg
"A92D9C26-3A50-4197-9388-CB5F7DB9FA91.cr2",
"A92D9C26-3A50-4197-9388-CB5F7DB9FA91.jpeg",
"D05A5FE3-15FB-49A1-A15D-AB3DA6F8B068.dng",
"D79B8D77-BFFC-460B-9312-034F2877D35B.jpeg",
"DC99FBDD-7A52-4100-A5BB-344131646C30.jpeg",
"DC99FBDD-7A52-4100-A5BB-344131646C30_edited.jpeg",
"E9BC5C36-7CD1-40A1-A72B-8B8FAC227D51.jpeg",
"E9BC5C36-7CD1-40A1-A72B-8B8FAC227D51_edited.jpeg",
"F12384F6-CD17-4151-ACBA-AE0E3688539E.jpeg",
"35329C57-B963-48D6-BB75-6AFF9370CBBC.mov",
"6191423D-8DB8-4D4C-92BE-9BBBA308AAC4_edited.jpeg",
"7783E8E6-9CAC-40F3-BE22-81FB7051C266.heic",
"7783E8E6-9CAC-40F3-BE22-81FB7051C266_edited.jpeg",
"7F74DD34-5920-4DA3-B284-479887A34F66.jpeg",
"8846E3E6-8AC8-4857-8448-E3D025784410.tiff",
"D1359D09-1373-4F3B-B0E3-1A4DE573E4A3.mp4",
"E2078879-A29C-4D6F-BACB-E3BBE6C3EB91.jpeg",
"52083079-73D5-4921-AC1B-FE76F279133F.jpeg",
"B13F4485-94E0-41CD-AF71-913095D62E31.jpeg", # Frítest.jpg
"1793FAAB-DE75-4E25-886C-2BD66C780D6A.jpeg", # Frítest.jpg
"1793FAAB-DE75-4E25-886C-2BD66C780D6A_edited.jpeg", # Frítest.jpg
"A8266C97-9BAF-4AF4-99F3-0013832869B8.jpeg", # Frítest.jpg
"D1D4040D-D141-44E8-93EA-E403D9F63E07.jpeg", # Frítest.jpg
"D05A5FE3-15FB-49A1-A15D-AB3DA6F8B068.dng",
"D1359D09-1373-4F3B-B0E3-1A4DE573E4A3.mp4",
"D1D4040D-D141-44E8-93EA-E403D9F63E07_edited.jpeg", # Frítest.jpg
"D1D4040D-D141-44E8-93EA-E403D9F63E07.jpeg", # Frítest.jpg
"D79B8D77-BFFC-460B-9312-034F2877D35B.jpeg",
"DC99FBDD-7A52-4100-A5BB-344131646C30_edited.jpeg",
"DC99FBDD-7A52-4100-A5BB-344131646C30.jpeg",
"E2078879-A29C-4D6F-BACB-E3BBE6C3EB91.jpeg",
"E9BC5C36-7CD1-40A1-A72B-8B8FAC227D51_edited.jpeg",
"E9BC5C36-7CD1-40A1-A72B-8B8FAC227D51.jpeg",
"F12384F6-CD17-4151-ACBA-AE0E3688539E.jpeg",
"F207D5DE-EFAD-4217-8424-0764AAC971D0.jpeg",
]
CLI_EXPORT_FILENAMES_CONVERT_TO_JPEG = [
"[2020-08-29] AAF035 (1).jpg",
"[2020-08-29] AAF035 (2).jpg",
"[2020-08-29] AAF035 (3).jpg",
"[2020-08-29] AAF035.jpg",
"DSC03584.jpeg",
"IMG_1693.jpeg",
"IMG_1994.JPG",
"IMG_1994.cr2",
"IMG_1997.JPG",
"IMG_1997.cr2",
"IMG_3092.jpeg",
"IMG_3092_edited.jpeg",
"IMG_4547.jpg",
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"St James Park.jpg",
"St James Park_edited.jpeg",
"Tulips.jpg",
"Tulips_edited.jpeg",
"wedding.jpg",
"wedding_edited.jpeg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"screenshot-really-a-png.jpeg",
"winebottle.jpeg",
"winebottle (1).jpeg",
"Frítest.jpg",
"Frítest (1).jpg",
"Frítest (2).jpg",
"Frítest (3).jpg",
"Frítest_edited (1).jpeg",
"Frítest_edited.jpeg",
"Frítest.jpg",
"IMG_1693.jpeg",
"IMG_1994.cr2",
"IMG_1994.JPG",
"IMG_1997.cr2",
"IMG_1997.JPG",
"IMG_3092_edited.jpeg",
"IMG_3092.jpeg",
"IMG_4547.jpg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"screenshot-really-a-png.jpeg",
"St James Park_edited.jpeg",
"St James Park.jpg",
"Tulips_edited.jpeg",
"Tulips.jpg",
"wedding_edited.jpeg",
"wedding.jpg",
"winebottle (1).jpeg",
"winebottle.jpeg",
]
CLI_EXPORT_FILENAMES_CONVERT_TO_JPEG_SKIP_RAW = [
"[2020-08-29] AAF035 (1).jpg",
"[2020-08-29] AAF035 (2).jpg",
"[2020-08-29] AAF035 (3).jpg",
"[2020-08-29] AAF035.jpg",
"DSC03584.jpeg",
"IMG_1693.jpeg",
"IMG_1994.JPG",
"IMG_1997.JPG",
"IMG_3092.jpeg",
"IMG_3092_edited.jpeg",
"IMG_4547.jpg",
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"St James Park.jpg",
"St James Park_edited.jpeg",
"Tulips.jpg",
"Tulips_edited.jpeg",
"wedding.jpg",
"wedding_edited.jpeg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"screenshot-really-a-png.jpeg",
"winebottle.jpeg",
"winebottle (1).jpeg",
"Frítest.jpg",
"Frítest (1).jpg",
"Frítest (2).jpg",
"Frítest (3).jpg",
"Frítest_edited.jpeg",
"Frítest_edited (1).jpeg",
"Frítest_edited.jpeg",
"Frítest.jpg",
"IMG_1693.jpeg",
"IMG_1994.JPG",
"IMG_1997.JPG",
"IMG_3092_edited.jpeg",
"IMG_3092.jpeg",
"IMG_4547.jpg",
"Jellyfish.MOV",
"Jellyfish1.mp4",
"Pumkins1.jpg",
"Pumkins2.jpg",
"Pumpkins3.jpg",
"screenshot-really-a-png.jpeg",
"St James Park_edited.jpeg",
"St James Park.jpg",
"Tulips_edited.jpeg",
"Tulips.jpg",
"wedding_edited.jpeg",
"wedding.jpg",
"winebottle (1).jpeg",
"winebottle.jpeg",
]
CLI_EXPORT_CONVERT_TO_JPEG_LARGE_FILE = "DSC03584.jpeg"
@@ -546,7 +580,7 @@ PHOTOS_NOT_IN_TRASH_LEN_14_6 = 12
PHOTOS_IN_TRASH_LEN_14_6 = 1
PHOTOS_MISSING_14_6 = 1
PHOTOS_NOT_IN_TRASH_LEN_15_7 = 23
PHOTOS_NOT_IN_TRASH_LEN_15_7 = 27
PHOTOS_IN_TRASH_LEN_15_7 = 2
PHOTOS_MISSING_15_7 = 2
PHOTOS_EDITED_15_7 = 6
@@ -732,6 +766,7 @@ ALBUMS_JSON = {
"Sorted Newest First": 3,
"Sorted Oldest First": 3,
"Sorted Title": 3,
"Água": 3,
},
"shared albums": {},
}
@@ -746,6 +781,7 @@ ALBUMS_STR = """albums:
2018-10 - Sponsion, Museum, Frühstück, Römermuseum: 1
2019-10/11 Paris Clermont: 1
EmptyAlbum: 0
Água: 3
shared albums: {}
"""
@@ -820,37 +856,45 @@ UUID_IS_REFERENCE = [
]
UUID_IN_ALBUM = [
"F12384F6-CD17-4151-ACBA-AE0E3688539E",
"8E1D7BC9-9321-44F9-8CFB-4083F6B9232A",
"1EB2B765-0765-43BA-A90C-0D0580E6172C",
"E9BC5C36-7CD1-40A1-A72B-8B8FAC227D51",
"A92D9C26-3A50-4197-9388-CB5F7DB9FA91",
"D79B8D77-BFFC-460B-9312-034F2877D35B",
"4D521201-92AC-43E5-8F7C-59BC41C37A96",
"D05A5FE3-15FB-49A1-A15D-AB3DA6F8B068",
"2DFD33F1-A5D8-486F-A3A9-98C07995535A",
"3DD2C897-F19E-4CA6-8C22-B027D5A71907",
"4D521201-92AC-43E5-8F7C-59BC41C37A96",
"54E76FCB-D353-4557-9997-0A457BCB4D48",
"7783E8E6-9CAC-40F3-BE22-81FB7051C266",
"7FD37B5F-6FAA-4DB1-8A29-BF9C37E38091",
"8E1D7BC9-9321-44F9-8CFB-4083F6B9232A",
"A92D9C26-3A50-4197-9388-CB5F7DB9FA91",
"D05A5FE3-15FB-49A1-A15D-AB3DA6F8B068",
"D79B8D77-BFFC-460B-9312-034F2877D35B",
"E9BC5C36-7CD1-40A1-A72B-8B8FAC227D51",
"F12384F6-CD17-4151-ACBA-AE0E3688539E",
]
UUID_NOT_IN_ALBUM = [
"A1DD1F98-2ECD-431F-9AC9-5AFEFE2D3A5C",
"DC99FBDD-7A52-4100-A5BB-344131646C30",
"D1359D09-1373-4F3B-B0E3-1A4DE573E4A3",
"E2078879-A29C-4D6F-BACB-E3BBE6C3EB91",
"6191423D-8DB8-4D4C-92BE-9BBBA308AAC4",
"35329C57-B963-48D6-BB75-6AFF9370CBBC",
"8846E3E6-8AC8-4857-8448-E3D025784410",
"7F74DD34-5920-4DA3-B284-479887A34F66",
"52083079-73D5-4921-AC1B-FE76F279133F",
"B13F4485-94E0-41CD-AF71-913095D62E31", # Frítest.jpg
"1793FAAB-DE75-4E25-886C-2BD66C780D6A", # Frítest.jpg
"35329C57-B963-48D6-BB75-6AFF9370CBBC",
"52083079-73D5-4921-AC1B-FE76F279133F",
"6191423D-8DB8-4D4C-92BE-9BBBA308AAC4",
"7F74DD34-5920-4DA3-B284-479887A34F66",
"8846E3E6-8AC8-4857-8448-E3D025784410",
"A1DD1F98-2ECD-431F-9AC9-5AFEFE2D3A5C",
"A8266C97-9BAF-4AF4-99F3-0013832869B8", # Frítest.jpg
"B13F4485-94E0-41CD-AF71-913095D62E31", # Frítest.jpg
"D1359D09-1373-4F3B-B0E3-1A4DE573E4A3",
"D1D4040D-D141-44E8-93EA-E403D9F63E07", # Frítest.jpg
"DC99FBDD-7A52-4100-A5BB-344131646C30",
"E2078879-A29C-4D6F-BACB-E3BBE6C3EB91",
"F207D5DE-EFAD-4217-8424-0764AAC971D0",
]
UUID_DUPLICATES = [
"7F74DD34-5920-4DA3-B284-479887A34F66",
"2DFD33F1-A5D8-486F-A3A9-98C07995535A",
"52083079-73D5-4921-AC1B-FE76F279133F",
"54E76FCB-D353-4557-9997-0A457BCB4D48",
"7F74DD34-5920-4DA3-B284-479887A34F66",
"A92D9C26-3A50-4197-9388-CB5F7DB9FA91",
"F207D5DE-EFAD-4217-8424-0764AAC971D0",
]
UUID_LOCATION = "D79B8D77-BFFC-460B-9312-034F2877D35B" # Pumkins2.jpg
@@ -2517,7 +2561,8 @@ def test_export_duplicate():
# pylint: disable=not-context-manager
with runner.isolated_filesystem():
result = runner.invoke(
export, [os.path.join(cwd, CLI_PHOTOS_DB), ".", "-V", "--duplicate"]
export,
[os.path.join(cwd, CLI_PHOTOS_DB), ".", "-V", "--duplicate", "--skip-raw"],
)
assert result.exit_code == 0
files = glob.glob("*")
@@ -4293,7 +4338,7 @@ def test_export_deleted_only_2():
def test_export_error(monkeypatch):
"""Test that export catches errors thrown by export2"""
"""Test that export catches errors thrown by export"""
# Note: I often comment out the try/except block in cli.py::export_photo_with_template when
# debugging to see exactly where the error is
# this test verifies I've re-enabled that code
@@ -4307,7 +4352,7 @@ def test_export_error(monkeypatch):
def throw_error(*args, **kwargs):
raise ValueError("Argh!")
monkeypatch.setattr(osxphotos.PhotoExporter, "export2", throw_error)
monkeypatch.setattr(osxphotos.PhotoExporter, "export", throw_error)
with runner.isolated_filesystem():
result = runner.invoke(
export,
@@ -4700,7 +4745,14 @@ def test_export_live_edited():
# basic export
result = runner.invoke(
export,
[os.path.join(cwd, PHOTOS_DB_RHET), ".", "-V", "--uuid", UUID_LIVE_EDITED],
[
os.path.join(cwd, PHOTOS_DB_RHET),
".",
"-V",
"--uuid",
UUID_LIVE_EDITED,
"--download-missing",
],
)
assert result.exit_code == 0
files = glob.glob("*")
@@ -5077,7 +5129,7 @@ def test_export_dry_run():
in result.output
)
for filepath in CLI_EXPORT_FILENAMES_DRY_RUN:
assert re.search(r"Exported.*" + f"{filepath}", result.output)
assert re.search(r"Exported.*" + f"{re.escape(filepath)}", result.output)
assert not os.path.isfile(normalize_fs_path(filepath))
@@ -6022,7 +6074,7 @@ def test_export_cleanup_empty_album():
def test_export_cleanup_accented_album_name():
"""test export with --cleanup flag and photos in album with accented unicode characters (#561)"""
"""test export with --cleanup flag and photos in album with accented unicode characters (#561, #618)"""
import pathlib
from osxphotos.cli import export
@@ -6045,6 +6097,89 @@ def test_export_cleanup_accented_album_name():
)
assert "Deleted: 0 files, 0 directories" in result.output
# do it again
result = runner.invoke(
export,
[
os.path.join(cwd, CLI_PHOTOS_DB),
tempdir,
"-V",
"--update",
"--cleanup",
"--directory",
"{folder_album}",
"--update",
],
)
assert "exported: 0, updated: 0" in result.output
assert "Deleted: 0 files, 0 directories" in result.output
@pytest.mark.skipif(exiftool is None, reason="exiftool not installed")
def test_export_cleanup_exiftool_accented_album_name_same_filenames():
"""test export with --cleanup flag and photos in album with accented unicode characters (#561, #618)"""
import pathlib
from osxphotos.cli import export
runner = CliRunner()
cwd = os.getcwd()
# pylint: disable=not-context-manager
with tempfile.TemporaryDirectory() as report_dir:
# keep report file out of export dir for --cleanup
report_file = os.path.join(report_dir, "test.csv")
with tempfile.TemporaryDirectory() as tempdir:
result = runner.invoke(
export,
[
os.path.join(cwd, CLI_PHOTOS_DB),
tempdir,
"-V",
"--cleanup",
"--directory",
"{album[/,.|:,.]}",
"--exiftool",
"--exiftool-merge-keywords",
"--exiftool-merge-persons",
"--keyword-template",
"{keyword}",
"--report",
report_file,
"--skip-original-if-edited",
"--update",
"--touch-file",
"--not-hidden",
],
)
assert "Deleted: 0 files, 0 directories" in result.output
# do it again
result = runner.invoke(
export,
[
os.path.join(cwd, CLI_PHOTOS_DB),
tempdir,
"-V",
"--cleanup",
"--directory",
"{album[/,.|:,.]}",
"--exiftool",
"--exiftool-merge-keywords",
"--exiftool-merge-persons",
"--keyword-template",
"{keyword}",
"--report",
report_file,
"--skip-original-if-edited",
"--update",
"--touch-file",
"--not-hidden",
],
)
assert "exported: 0, updated: 0" in result.output
assert "updated EXIF data: 0" in result.output
assert "Deleted: 0 files, 0 directories" in result.output
def test_save_load_config():
"""test --save-config, --load-config"""
@@ -6871,6 +7006,77 @@ def test_export_download_missing_file_exists():
assert "skipped: 1" in result.output
@pytest.mark.skipif(
"OSXPHOTOS_TEST_EXPORT" not in os.environ,
reason="Skip if not running on author's personal library.",
)
def test_export_download_missing_preview():
"""test --download-missing --preview, #564"""
import glob
import os
import os.path
import pathlib
from osxphotos.cli import export
runner = CliRunner()
cwd = os.getcwd()
# pylint: disable=not-context-manager
with runner.isolated_filesystem():
result = runner.invoke(
export,
[
os.path.join(cwd, PHOTOS_DB_RHET),
".",
"-V",
"--uuid",
UUID_DOWNLOAD_MISSING,
"--download-missing",
"--use-photos-export",
"--use-photokit",
"--preview",
],
)
assert result.exit_code == 0
assert "exported: 2" in result.output
@pytest.mark.skipif(
"OSXPHOTOS_TEST_EXPORT" not in os.environ,
reason="Skip if not running on author's personal library.",
)
def test_export_download_missing_preview_applescript():
"""test --download-missing --preview and applescript download, #564"""
import glob
import os
import os.path
import pathlib
from osxphotos.cli import export
runner = CliRunner()
cwd = os.getcwd()
# pylint: disable=not-context-manager
with runner.isolated_filesystem():
result = runner.invoke(
export,
[
os.path.join(cwd, PHOTOS_DB_RHET),
".",
"-V",
"--uuid",
UUID_DOWNLOAD_MISSING,
"--download-missing",
"--use-photos-export",
"--preview",
],
)
assert result.exit_code == 0
assert "exported: 2" in result.output
@pytest.mark.skipif(
"OSXPHOTOS_TEST_EXPORT" not in os.environ,
reason="Skip if not running on author's personal library.",
@@ -6930,6 +7136,30 @@ def test_query_name():
assert json_got[0]["original_filename"] == "DSC03584.dng"
def test_query_name_unicode():
"""test query --name with a unicode name"""
import json
import os
import os.path
import osxphotos
from osxphotos.cli import query
runner = CliRunner()
cwd = os.getcwd()
result = runner.invoke(
query,
["--json", "--db", os.path.join(cwd, PHOTOS_DB_15_7), "--name", "Frítest"],
)
assert result.exit_code == 0
json_got = json.loads(result.output)
assert len(json_got) == 4
assert normalize_unicode(json_got[0]["original_filename"]).startswith(
normalize_unicode("Frítest.jpg")
)
def test_query_name_i():
"""test query --name -i"""
import json
@@ -6959,6 +7189,46 @@ def test_query_name_i():
assert json_got[0]["original_filename"] == "DSC03584.dng"
def test_query_name_original_filename():
"""test query --name only searches original filename on Photos 5+"""
import json
import os
import os.path
from osxphotos.cli import query
runner = CliRunner()
cwd = os.getcwd()
result = runner.invoke(
query,
["--json", "--db", os.path.join(cwd, PHOTOS_DB_15_7), "--name", "AA"],
)
assert result.exit_code == 0
json_got = json.loads(result.output)
assert len(json_got) == 4
def test_query_name_original_filename_i():
"""test query --name only searches original filename on Photos 5+ with -i"""
import json
import os
import os.path
from osxphotos.cli import query
runner = CliRunner()
cwd = os.getcwd()
result = runner.invoke(
query,
["--json", "--db", os.path.join(cwd, PHOTOS_DB_15_7), "--name", "aa", "-i"],
)
assert result.exit_code == 0
json_got = json.loads(result.output)
assert len(json_got) == 4
def test_export_name():
"""test export --name"""
import glob

View File

@@ -40,7 +40,7 @@ def test_export_convert_raw_to_jpeg(photosdb):
photos = photosdb.photos(uuid=[UUID_DICT["raw"]])
export_options = ExportOptions(convert_to_jpeg=True)
results = PhotoExporter(photos[0]).export2(dest, options=export_options)
results = PhotoExporter(photos[0]).export(dest, options=export_options)
got_dest = pathlib.Path(results.exported[0])
assert got_dest.is_file()
@@ -58,7 +58,7 @@ def test_export_convert_heic_to_jpeg(photosdb):
photos = photosdb.photos(uuid=[UUID_DICT["heic"]])
export_options = ExportOptions(convert_to_jpeg=True)
results = PhotoExporter(photos[0]).export2(dest, options=export_options)
results = PhotoExporter(photos[0]).export(dest, options=export_options)
got_dest = pathlib.Path(results.exported[0])
assert got_dest.is_file()
@@ -86,7 +86,7 @@ def test_export_convert_live_heic_to_jpeg():
photo = photosdb.get_photo(UUID_LIVE_HEIC)
export_options = ExportOptions(convert_to_jpeg=True, live_photo=True)
results = PhotoExporter(photo).export2(dest, options=export_options)
results = PhotoExporter(photo).export(dest, options=export_options)
for name in NAMES_LIVE_HEIC:
assert f"{tempdir.name}/{name}" in results.exported

View File

@@ -74,6 +74,26 @@ def test_export_db():
assert db.get_stat_edited_for_file(filepath2) == (10, 11, 12)
assert sorted(db.get_previous_uuids()) == (["BAR-FOO", "FOO-BAR"])
# test set_data value=None doesn't overwrite existing data
db.set_data(
filepath2,
"BAR-FOO",
None,
None,
None,
None,
None,
None,
)
assert db.get_uuid_for_file(filepath2) == "BAR-FOO"
assert db.get_info_for_uuid("BAR-FOO") == INFO_DATA
assert db.get_exifdata_for_file(filepath2) == EXIF_DATA
assert db.get_stat_orig_for_file(filepath2) == (1, 2, 3)
assert db.get_stat_exif_for_file(filepath2) == (4, 5, 6)
assert db.get_stat_converted_for_file(filepath2) == (7, 8, 9)
assert db.get_stat_edited_for_file(filepath2) == (10, 11, 12)
assert sorted(db.get_previous_uuids()) == (["BAR-FOO", "FOO-BAR"])
# close and re-open
db.close()
db = ExportDB(dbname, tempdir.name)

View File

@@ -93,7 +93,7 @@ def test_exportresults_iadd():
def test_all_files():
""" test ExportResults.all_files() """
"""test ExportResults.all_files()"""
results = ExportResults()
for x in EXPORT_RESULT_ATTRIBUTES:
setattr(results, x, [f"{x}1"])
@@ -106,13 +106,3 @@ def test_all_files():
assert sorted(
results.all_files() + results.deleted_files + results.deleted_directories
) == sorted([f"{x}1" for x in EXPORT_RESULT_ATTRIBUTES])
def test_str():
""" test ExportResults.__str__ """
results = ExportResults()
assert (
str(results)
== "ExportResults(exported=[],new=[],updated=[],skipped=[],exif_updated=[],touched=[],converted_to_jpeg=[],sidecar_json_written=[],sidecar_json_skipped=[],sidecar_exiftool_written=[],sidecar_exiftool_skipped=[],sidecar_xmp_written=[],sidecar_xmp_skipped=[],missing=[],error=[],exiftool_warning=[],exiftool_error=[],deleted_files=[],deleted_directories=[],exported_album=[],skipped_album=[],missing_album=[])"
)

View File

@@ -41,7 +41,7 @@ def test_sidecar_xmp(photosdb):
dest = tempdir.name
photo = photosdb.get_photo(uuid)
export_options = ExportOptions(sidecar=SIDECAR_XMP)
PhotoExporter(photo).export2(
PhotoExporter(photo).export(
dest, photo.original_filename, options=export_options
)
filepath = str(pathlib.Path(dest) / photo.original_filename)

View File

@@ -1,27 +1,33 @@
import logging
import os.path
import pathlib
import tempfile
import pytest
import osxphotos
DB_LOCKED_10_12 = "./tests/Test-Lock-10_12.photoslibrary/database/photos.db"
DB_LOCKED_10_15 = "./tests/Test-Lock-10_15_1.photoslibrary/database/Photos.sqlite"
DB_UNLOCKED_10_15 = "./tests/Test-10.15.1.photoslibrary/database/photos.db"
UTI_DICT = {"public.jpeg": "jpeg", "com.canon.cr2-raw-image": "cr2"}
from osxphotos.utils import (
_dd_to_dms,
increment_filename,
increment_filename_with_count,
list_directory,
)
def test_debug_enable():
import logging
import osxphotos
osxphotos._set_debug(True)
logger = osxphotos._get_logger()
assert logger.isEnabledFor(logging.DEBUG)
def test_debug_disable():
import logging
import osxphotos
osxphotos._set_debug(False)
logger = osxphotos._get_logger()
assert not logger.isEnabledFor(logging.DEBUG)
@@ -29,14 +35,12 @@ def test_debug_disable():
def test_dd_to_dms():
# expands coverage for edge case in _dd_to_dms
from osxphotos.utils import _dd_to_dms
assert _dd_to_dms(-0.001) == (0, 0, -3.6)
@pytest.mark.skip(reason="Fails on some machines")
def test_get_system_library_path():
import osxphotos
_, major, _ = osxphotos.utils._get_os_version()
if int(major) < 15:
@@ -46,51 +50,73 @@ def test_get_system_library_path():
def test_db_is_locked_locked():
import osxphotos
assert osxphotos.utils._db_is_locked(DB_LOCKED_10_12)
assert osxphotos.utils._db_is_locked(DB_LOCKED_10_15)
def test_db_is_locked_unlocked():
import osxphotos
assert not osxphotos.utils._db_is_locked(DB_UNLOCKED_10_15)
def test_findfiles():
import os.path
import tempfile
from osxphotos.utils import findfiles
def test_list_directory():
"""test list_directory"""
temp_dir = tempfile.TemporaryDirectory(prefix="osxphotos_")
fd = open(os.path.join(temp_dir.name, "file1.jpg"), "w+")
fd.close
fd = open(os.path.join(temp_dir.name, "file2.JPG"), "w+")
fd.close
files = findfiles("*.jpg", temp_dir.name)
temp_dir_name = pathlib.Path(temp_dir.name)
file1 = (temp_dir_name / "file1.jpg").touch()
file2 = (temp_dir_name / "File2.JPG").touch()
file3 = (temp_dir_name / "File.png").touch()
file4 = (temp_dir_name / "document.pdf").touch()
files = list_directory(temp_dir.name, glob="*.jpg")
assert len(files) == 2
assert "file1.jpg" in files
assert "file2.JPG" in files
assert "File2.JPG" in files
assert isinstance(files[0], str)
files = list_directory(temp_dir.name, glob="*.jpg", case_sensitive=True)
assert len(files) == 1
assert "file1.jpg" in files
files = list_directory(temp_dir.name, startswith="file")
assert len(files) == 3
files = list_directory(temp_dir.name, endswith="jpg")
assert len(files) == 2
files = list_directory(temp_dir.name, contains="doc")
assert len(files) == 1
assert "document.pdf" in files
files = list_directory(temp_dir.name, startswith="File", case_sensitive=True)
assert len(files) == 2
files = list_directory(temp_dir.name, startswith="File", case_sensitive=False)
assert len(files) == 3
files = list_directory(temp_dir.name, startswith="document", include_path=True)
assert len(files) == 1
assert files[0] == str(pathlib.Path(temp_dir.name) / "document.pdf")
# test pathlib.Path
files = list_directory(temp_dir_name, glob="*.jpg")
assert isinstance(files[0], pathlib.Path)
files = list_directory(temp_dir.name, glob="FooBar*.jpg")
assert not files
def test_findfiles_invalid_dir():
import tempfile
from osxphotos.utils import findfiles
def test_list_directory_invalid():
temp_dir = tempfile.TemporaryDirectory(prefix="osxphotos_")
files = findfiles("*.jpg", f"{temp_dir.name}/no_such_dir")
files = list_directory(f"{temp_dir.name}/no_such_dir", glob="*.jpg")
assert len(files) == 0
def test_increment_filename():
# test that increment_filename works
import pathlib
import tempfile
from osxphotos.utils import increment_filename, increment_filename_with_count
with tempfile.TemporaryDirectory(prefix="osxphotos_") as temp_dir:
temp_dir = pathlib.Path(temp_dir)

View File

@@ -0,0 +1,57 @@
"""Read the "Supported File Types" table from exiftool.org and build a json file from the table"""
import json
import sys
import requests
from bs4 import BeautifulSoup
if __name__ == "__main__":
url = "https://www.exiftool.org/"
json_file = "exiftool_filetypes.json"
html_content = requests.get(url).text
soup = BeautifulSoup(html_content, "html.parser")
# uncomment to see all table classes
# print("Classes of each table:")
# for table in soup.find_all("table"):
# print(table.get("class"))
# strip footnotes in <span> tags
for span_tag in soup.findAll("span"):
span_tag.replace_with("")
# find the table for Supported File Types
table = soup.find("table", class_="sticky tight sm bm")
# get table headers
table_headers = [tx.text.lower() for tx in table.find_all("th")]
# get table data
table_data = []
for tr in table.find_all("tr"):
if row := [td.text for td in tr.find_all("td")]:
table_data.append(row)
# make a dictionary of the table data
supported_filetypes = {}
for row in table_data:
row_dict = dict(zip(table_headers, row))
for key, value in row_dict.items():
if value == "-":
row_dict[key] = None
row_dict["file type"] = row_dict["file type"].split(",")
row_dict["file type"] = [ft.strip() for ft in row_dict["file type"]]
row_dict["read"] = "R" in row_dict["support"]
row_dict["write"] = "W" in row_dict["support"]
row_dict["create"] = "C" in row_dict["support"]
filetypes = [ft.lower() for ft in row_dict["file type"]]
for filetype in filetypes:
supported_filetypes[filetype] = {"extension": filetype, **row_dict}
with open(json_file, "w") as jsonfile:
print(f"Writing {json_file}...")
json.dump(supported_filetypes, jsonfile, indent=4)
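# A hypothetical shape for one entry in exiftool_filetypes.json (keys other than
# "extension", "read", "write", and "create" come from the scraped table headers,
# so the exact names and values depend on exiftool.org at scrape time):
#
#   "jpg": {
#       "extension": "jpg",
#       "file type": ["JPG", "JPEG"],
#       "support": "R/W",
#       "read": true,
#       "write": true,
#       "create": false
#   }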