Compare commits

...

41 Commits

Author SHA1 Message Date
Rhet Turnbull
459d91d7b1 Partial fix for issue #213 2020-09-13 18:15:46 -07:00
Rhet Turnbull
eb00ffd737 Fixed exception handling in export 2020-09-13 12:19:21 -07:00
Rhet Turnbull
a1776fa148 Updated README.md 2020-09-07 06:59:49 -07:00
Rhet Turnbull
f1d20103ff Updated CHANGELOG.md 2020-09-07 06:55:05 -07:00
Rhet Turnbull
5f2d401048 Added --skip-original-if-edited for issue #159 2020-09-07 06:33:37 -07:00
Rhet Turnbull
58b3869a7c Still working on issue #208 2020-09-04 12:47:27 -07:00
Rhet Turnbull
c2fecc9d30 Fixed sidecar collisions, closes #210 2020-08-31 06:30:44 -07:00
Rhet Turnbull
1f343c1c11 Updated CHANGELOG.md 2020-08-31 05:43:19 -07:00
Rhet Turnbull
a36eb416b1 Normalize unicode for issue #208 2020-08-31 05:24:54 -07:00
Rhet Turnbull
c9b15186a0 Updated README.md 2020-08-29 22:04:09 -07:00
Rhet Turnbull
315fe6a6a3 Merge pull request #212 from dmd/patch-1
typo fix - thanks to @dmd
2020-08-29 21:59:23 -07:00
Rhet Turnbull
b611d34d19 Added force_download.py to examples 2020-08-29 21:53:57 -07:00
Daniel M. Drucker
001e474d56 typo fix 2020-08-29 16:58:49 -04:00
Rhet Turnbull
60d96a8f56 Added photoshop:SidecarForExtension to XMP, partial fix for #210 2020-08-25 21:46:07 -07:00
Rhet Turnbull
42e8fba125 Update README.md 2020-08-25 15:21:40 -07:00
Rhet Turnbull
a91617cce4 Updated CHANGELOG.md 2020-08-25 14:25:56 -07:00
Rhet Turnbull
0cc4beaede Fixed DST handling for from_date/to_date, closes #193 (again) 2020-08-25 06:43:06 -07:00
Rhet Turnbull
0f457a4082 Added raw timestamps to PhotoInfo._info 2020-08-24 06:00:57 -07:00
Rhet Turnbull
1f717b0579 Fixed portrait for Catalina/Big Sur; see issue #203 2020-08-23 16:34:23 -07:00
Rhet Turnbull
0cbd005bcd Merge pull request #207 from RhetTbull/issue206
Closes issue #206, adds --touch-file
2020-08-23 11:18:31 -07:00
Rhet Turnbull
1bf7105737 Fixed touch tests 2020-08-23 11:06:01 -07:00
Rhet Turnbull
6e5ea8e013 Fixed touch tests to use correct timezone 2020-08-23 08:37:12 -07:00
Rhet Turnbull
9f64262757 Finished --touch-file, closes #206 2020-08-23 08:27:21 -07:00
Rhet Turnbull
6c11e3fa5b --touch-file now working with --update 2020-08-22 08:12:26 -07:00
Rhet Turnbull
c9c9202205 Working on issue #206 2020-08-21 05:53:52 -07:00
Rhet Turnbull
ebd878a075 Working on issue 206 2020-08-20 06:39:48 -07:00
Rhet Turnbull
2cf3b6bb67 Updated tests/README.md 2020-08-19 06:06:04 -07:00
Rhet Turnbull
beb7970b3b Merge pull request #205 from PabloKohan/touch_files__fix_194
Touch files - fixes #194 -- thanks to @PabloKohan
2020-08-18 06:00:27 -07:00
Rhet Turnbull
2567974f5b Merge pull request #204 from PabloKohan/refactor_export_photo
Refactor/cleanup _export_photo - thanks to @PabloKohan
2020-08-18 05:59:57 -07:00
Pablo 'merKur' Kohan
78d494ff2c Touch file upon image date - Issue #194 2020-08-17 21:58:11 +03:00
Pablo 'merKur' Kohan
eefa1f181f Refactor/cleanup _export_photo 2020-08-17 21:54:47 +03:00
Rhet Turnbull
2bf5fae093 Working on fix for issue #203 2020-08-17 06:32:55 -07:00
Rhet Turnbull
9b13d1e00b Updated README.md 2020-08-16 23:03:00 -07:00
Rhet Turnbull
f2df6f1a12 Updated CHANGELOG.md 2020-08-16 23:01:04 -07:00
Rhet Turnbull
98e417023e Added ImportInfo for Photos 5+ 2020-08-16 22:57:33 -07:00
Rhet Turnbull
360c8d8e1b Update README.md 2020-08-15 15:20:47 -07:00
Rhet Turnbull
868cda8482 Update README.md 2020-08-15 15:14:45 -07:00
Rhet Turnbull
fa149dc7e1 Replaced call to which, closes #171 2020-08-09 18:09:32 -07:00
Rhet Turnbull
7467bbf62b Added contributors to README.md, closes #200 2020-08-09 17:56:40 -07:00
Rhet Turnbull
d2deefff83 Added tests for 10.15.6 2020-08-09 12:14:18 -07:00
Rhet Turnbull
f474dcd2cb Updated CHANGELOG.md 2020-08-09 11:04:59 -07:00
431 changed files with 5113 additions and 352 deletions

View File

@@ -4,6 +4,94 @@ All notable changes to this project will be documented in this file. Dates are d
Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
#### [v0.34.0](https://github.com/RhetTbull/osxphotos/compare/v0.33.8...v0.34.0)
> 7 September 2020
- Added --skip-original-if-edited for issue #159 [`5f2d401`](https://github.com/RhetTbull/osxphotos/commit/5f2d401048850fd68f31b37a7e71abc11ca80dc5)
- Still working on issue #208 [`58b3869`](https://github.com/RhetTbull/osxphotos/commit/58b3869a7cce7cb3f211599e544d7e5426ceb4a6)
#### [v0.33.8](https://github.com/RhetTbull/osxphotos/compare/v0.33.7...v0.33.8)
> 31 August 2020
- Fixed sidecar collisions, closes #210 [`#210`](https://github.com/RhetTbull/osxphotos/issues/210)
#### [v0.33.7](https://github.com/RhetTbull/osxphotos/compare/v0.33.5...v0.33.7)
> 31 August 2020
- typo fix - thanks to @dmd [`#212`](https://github.com/RhetTbull/osxphotos/pull/212)
- Normalize unicode for issue #208 [`a36eb41`](https://github.com/RhetTbull/osxphotos/commit/a36eb416b19284477922b6a5f837f4040327138b)
- Added force_download.py to examples [`b611d34`](https://github.com/RhetTbull/osxphotos/commit/b611d34d19db480af72f57ef55eacd0a32c8d1e8)
- Added photoshop:SidecarForExtension to XMP, partial fix for #210 [`60d96a8`](https://github.com/RhetTbull/osxphotos/commit/60d96a8f563882fba2365a6ab58c1276725eedaa)
- Updated README.md [`c9b1518`](https://github.com/RhetTbull/osxphotos/commit/c9b15186a022d91248451279e5f973e3f2dca4b4)
- Update README.md [`42e8fba`](https://github.com/RhetTbull/osxphotos/commit/42e8fba125a3c6b1bd0d538f2af511aabfbeb478)
#### [v0.33.5](https://github.com/RhetTbull/osxphotos/compare/v0.33.3...v0.33.5)
> 25 August 2020
- Fixed DST handling for from_date/to_date, closes #193 (again) [`#193`](https://github.com/RhetTbull/osxphotos/issues/193)
- Added raw timestamps to PhotoInfo._info [`0f457a4`](https://github.com/RhetTbull/osxphotos/commit/0f457a4082a4eebc42a5df2160a02ad987b6f96c)
#### [v0.33.3](https://github.com/RhetTbull/osxphotos/compare/v0.33.2...v0.33.3)
> 23 August 2020
- Fixed portrait for Catalina/Big Sur; see issue #203 [`1f717b0`](https://github.com/RhetTbull/osxphotos/commit/1f717b05794c2088c7c15d2aab0c5d24b6309c06)
#### [v0.33.2](https://github.com/RhetTbull/osxphotos/compare/v0.33.0...v0.33.2)
> 23 August 2020
- Closes issue #206, adds --touch-file [`#207`](https://github.com/RhetTbull/osxphotos/pull/207)
- Touch files - fixes #194 -- thanks to @PabloKohan [`#205`](https://github.com/RhetTbull/osxphotos/pull/205)
- Refactor/cleanup _export_photo - thanks to @PabloKohan [`#204`](https://github.com/RhetTbull/osxphotos/pull/204)
- Finished --touch-file, closes #206 [`#206`](https://github.com/RhetTbull/osxphotos/issues/206)
- Merge pull request #205 from PabloKohan/touch_files__fix_194 [`#194`](https://github.com/RhetTbull/osxphotos/issues/194)
- --touch-file now working with --update [`6c11e3f`](https://github.com/RhetTbull/osxphotos/commit/6c11e3fa5b5b05b98b9fdbb0e59e3a78c7dff980)
- Refactor/cleanup _export_photo [`eefa1f1`](https://github.com/RhetTbull/osxphotos/commit/eefa1f181f4fd7b027ae69abd2b764afb590c081)
- Fixed touch tests [`1bf7105`](https://github.com/RhetTbull/osxphotos/commit/1bf7105737fbd756064a2f9ef4d4bbd0b067978c)
- Working on issue 206 [`ebd878a`](https://github.com/RhetTbull/osxphotos/commit/ebd878a075983ef3df0b1ead1a725e01508721f8)
- Working on issue #206 [`c9c9202`](https://github.com/RhetTbull/osxphotos/commit/c9c920220545dc27c8cb1379d7bde15987cce72c)
#### [v0.33.0](https://github.com/RhetTbull/osxphotos/compare/v0.32.0...v0.33.0)
> 17 August 2020
- Replaced call to which, closes #171 [`#171`](https://github.com/RhetTbull/osxphotos/issues/171)
- Added contributors to README.md, closes #200 [`#200`](https://github.com/RhetTbull/osxphotos/issues/200)
- Added tests for 10.15.6 [`d2deeff`](https://github.com/RhetTbull/osxphotos/commit/d2deefff834e46e1a26adc01b1b025ac839dbc78)
- Added ImportInfo for Photos 5+ [`98e4170`](https://github.com/RhetTbull/osxphotos/commit/98e417023ec5bd8292b25040d0844f3706645950)
- Update README.md [`360c8d8`](https://github.com/RhetTbull/osxphotos/commit/360c8d8e1b4760e95a8b71b3a0bf0df4fb5adaf5)
- Update README.md [`868cda8`](https://github.com/RhetTbull/osxphotos/commit/868cda8482ce6b29dd00e04a209d40550e6b128b)
#### [v0.32.0](https://github.com/RhetTbull/osxphotos/compare/v0.31.2...v0.32.0)
> 9 August 2020
- Alpha support for MacOS Big Sur/10.16, see issue #187 [`6acf9ac`](https://github.com/RhetTbull/osxphotos/commit/6acf9acd6364e1996158179493d128ec0958e652)
#### [v0.31.2](https://github.com/RhetTbull/osxphotos/compare/v0.31.0...v0.31.2)
> 9 August 2020
- Fixed from_date and to_date to be timezone aware, closes #193 [`#193`](https://github.com/RhetTbull/osxphotos/issues/193)
- Added test for valid XMP file, closes #197 [`#197`](https://github.com/RhetTbull/osxphotos/issues/197)
- Dropped py36 due to datetime.fromisoformat [`a714ae0`](https://github.com/RhetTbull/osxphotos/commit/a714ae0af089b13acf70c4f29934393aa48ed222)
- Added --uuid-from-file to CLI [`840e993`](https://github.com/RhetTbull/osxphotos/commit/840e9937bede407ef55972a361618683245e086b)
- Added write_uuid_to_file.applescript to utils [`bea770b`](https://github.com/RhetTbull/osxphotos/commit/bea770b322d21cf3f8245d20e182006247cb71d6)
- Updated README.md [`002fce8`](https://github.com/RhetTbull/osxphotos/commit/002fce8e93edd936d4b866118ae6d4c94e5d6744)
- Added py37 [`d0ec862`](https://github.com/RhetTbull/osxphotos/commit/d0ec8620c721fe7576ab7d519a5eaac4d17a317e)
#### [v0.31.0](https://github.com/RhetTbull/osxphotos/compare/v0.30.13...v0.31.0)
> 27 July 2020
- Initial FaceInfo support for Issue #21 [`6f29cda`](https://github.com/RhetTbull/osxphotos/commit/6f29cda99f1b8d94a95597c7046620cf21fecae4)
- Updated Github Actions to run on PR [`9fc4f76`](https://github.com/RhetTbull/osxphotos/commit/9fc4f762193699dd45b586b51aa2d3066928aab1)
#### [v0.30.13](https://github.com/RhetTbull/osxphotos/compare/v0.30.12...v0.30.13)
> 23 July 2020

View File

@@ -15,6 +15,7 @@
+ [PhotoInfo](#photoinfo)
+ [ExifInfo](#exifinfo)
+ [AlbumInfo](#albuminfo)
+ [ImportInfo](#importinfo)
+ [FolderInfo](#folderinfo)
+ [PlaceInfo](#placeinfo)
+ [ScoreInfo](#scoreinfo)
@@ -52,10 +53,12 @@ OSXPhotos uses setuptools, thus simply run:
python3 setup.py install
You can also install directly from [pypi](https://pypi.org/project/osxphotos/):
pip install osxphotos
**WARNING** The git repo for this project is very large (> 1GB) because it contains multiple Photos libraries used for testing on different versions of MacOS. If you just want to use the osxphotos package in your own code, I recommend you install the latest version from [PyPI](https://pypi.org/project/osxphotos/). If you just want to use the command line utility, you can download a pre-built executable of the latest [release](https://github.com/RhetTbull/osxphotos/releases) or you can install via `pip` which also installs the command line app. If you aren't comfortable with running python on your Mac, start with the pre-built executable.
## Command Line Usage
This package will install a command line utility called `osxphotos` that allows you to query the Photos database. Alternatively, you can also run the command line utility like this: `python3 -m osxphotos`
@@ -202,14 +205,14 @@ Options:
                          both images and movies).
--only-photos             Search only for photos/images (default
                          searches both images and movies).
--from-date DATETIME      Search by start item date, e.g.
                          2000-01-12T12:00:00,
                          2001-01-12T12:00:00-07:00, or 2000-12-31
                          (ISO 8601).
--to-date DATETIME        Search by end item date, e.g.
                          2000-01-12T12:00:00,
                          2001-01-12T12:00:00-07:00, or 2000-12-31
                          (ISO 8601).
--deleted                 Include photos from the 'Recently Deleted'
                          folder.
--deleted-only            Include only photos from the 'Recently
@@ -221,6 +224,8 @@ Options:
--export-as-hardlink      Hardlink files instead of copying them.
                          Cannot be used with --exiftool which creates
                          copies of the files with embedded EXIF data.
--touch-file              Sets the file's modification time to match
                          photo date.
--overwrite               Overwrite existing files. Default behavior
                          is to add (1), (2), etc to filename if file
                          already exists. Use this with caution as it
@@ -231,6 +236,8 @@ Options:
                          DEST/2019/12/20/photoname.jpg).
--skip-edited             Do not export edited version of photo if an
                          edited version exists.
--skip-original-if-edited Do not export original if there is an edited
                          version (exports only the edited version).
--skip-bursts             Do not export all associated burst images in
                          the library if a photo is a burst photo.
--skip-live               Do not export the associated live video
@@ -670,7 +677,7 @@ if __name__ == "__main__":
#### Read a Photos library database
```python
osxphotos.PhotosDB()
osxphotos.PhotosDB(path)
osxphotos.PhotosDB(dbfile=path)
```
@@ -681,7 +688,7 @@ Pass the path to a Photos library or to a specific database file (e.g. "/Users/s
If an invalid path is passed, PhotosDB will raise a `FileNotFoundError` exception.
**Note**: If neither path nor dbfile is passed, PhotosDB will use get_last_library_path to open the last opened Photos library. This usually works but is not 100% reliable. It can also lead to loading a different library than expected if the user has held down the *option* key when opening Photos to switch libraries. You may therefore want to explicitly pass the path to `PhotosDB()`.
#### Open the default (last opened) Photos library
@@ -767,6 +774,10 @@ Returns list of shared album names found in photos database (e.g. albums shared
**Note**: *Only valid for Photos 5 / MacOS 10.15*; on Photos <= 4, prints warning and returns empty list.
#### `import_info`
Returns a list of [ImportInfo](#importinfo) objects representing the import sessions for the database.
#### `folder_info`
```python
# assumes photosdb is a PhotosDB object (see above)
@@ -1053,6 +1064,9 @@ Returns a list of albums the photo is contained in. See also [album_info](#album
#### `album_info`
Returns a list of [AlbumInfo](#AlbumInfo) objects representing the albums the photo is contained in. See also [albums](#albums).
#### `import_info`
Returns an [ImportInfo](#importinfo) object representing the import session associated with the photo or `None` if there is no associated import session.
#### `persons`
Returns a list of the names of the persons in the photo.
@@ -1378,6 +1392,15 @@ Returns the title or name of the album.
#### <a name="albumphotos">`photos`</a>
Returns a list of [PhotoInfo](#PhotoInfo) objects representing each photo contained in the album sorted in the same order as in Photos. (e.g. if photos were manually sorted in the Photos albums, photos returned by `photos` will be in same order as they appear in the Photos album)
#### `creation_date`
Returns the creation date of the album as a timezone-aware datetime.datetime object.
#### `start_date`
Returns the date of the earliest photo in the album as a timezone-aware datetime.datetime object.
#### `end_date`
Returns the date of the latest photo in the album as a timezone-aware datetime.datetime object.
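Because these properties return timezone-aware datetime objects, they can be compared and subtracted directly. A minimal sketch using stand-in values (not a real AlbumInfo, which would supply these from the Photos database):

```python
from datetime import datetime, timedelta, timezone

# Stand-in values for what start_date/end_date might return for an album
# whose photos were taken in a UTC-7 timezone.
start_date = datetime(2020, 8, 1, 9, 30, tzinfo=timezone(timedelta(hours=-7)))
end_date = datetime(2020, 8, 15, 18, 0, tzinfo=timezone(timedelta(hours=-7)))

# Timezone-aware datetimes subtract safely, even across different offsets.
span = end_date - start_date
print(span.days)  # whole days between the earliest and latest photo
```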
#### `folder_list`
Returns a hierarchical list of [FolderInfo](#FolderInfo) objects representing the folders the album is contained in. For example, if album "AlbumInFolder" is in SubFolder2 of Folder1 as illustrated below, would return a list of `FolderInfo` objects representing ["Folder1", "SubFolder2"]
@@ -1403,6 +1426,25 @@ Photos Library
#### `parent`
Returns a [FolderInfo](#FolderInfo) object representing the album's parent folder or `None` if the album is not in a folder.
### ImportInfo
PhotosDB.import_info returns a list of ImportInfo objects. Each ImportInfo object represents an import session in the library. PhotoInfo.import_info returns a single ImportInfo object representing the import session for the photo (or `None` if no associated import session).
**Note**: Photos 5+ only. Not implemented for Photos version <= 4.
#### `uuid`
Returns the universally unique identifier (uuid) of the import session. This is how Photos keeps track of individual objects within the database.
#### <a name="importphotos">`photos`</a>
Returns a list of [PhotoInfo](#PhotoInfo) objects representing each photo contained in the import session.
#### `creation_date`
Returns the creation date of the import session as a timezone-aware datetime.datetime object.
#### `start_date`
Returns the start date as a timezone-aware datetime.datetime object for when the import session began.
#### `end_date`
Returns the end date as a timezone-aware datetime.datetime object for when the import session completed.
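A common use of these objects is listing the most recent import sessions and how many photos each brought in. The sketch below uses hypothetical namedtuple stand-ins rather than real ImportInfo/PhotoInfo objects (which require a Photos library) to show the shape of that iteration:

```python
from collections import namedtuple
from datetime import datetime, timezone

# Hypothetical stand-in for ImportInfo; real ones come from PhotosDB.import_info.
ImportSession = namedtuple("ImportSession", ["uuid", "creation_date", "photos"])

sessions = [
    ImportSession("A1B2", datetime(2020, 8, 16, tzinfo=timezone.utc), ["IMG_001.jpg"]),
    ImportSession("C3D4", datetime(2020, 8, 23, tzinfo=timezone.utc), ["IMG_002.jpg", "IMG_003.jpg"]),
]

# Most recent session first, e.g. to show "latest imports".
for session in sorted(sessions, key=lambda s: s.creation_date, reverse=True):
    print(session.uuid, len(session.photos))
```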
### FolderInfo
PhotosDB.folder_info returns a list of FolderInfo objects representing the top level folders in the library. Each FolderInfo object represents a single folder in the Photos library.
@@ -1807,6 +1849,7 @@ if __name__ == "__main__":
## Related Projects
- [rhettbull/photosmeta](https://github.com/rhettbull/photosmeta): uses osxphotos and [exiftool](https://exiftool.org/) to apply metadata from Photos as exif data in the photo files. Can also export photos while preserving metadata and also apply Photos keywords as spotlight tags to make it easier to search for photos using spotlight. This is mostly made obsolete by osxphotos. The one feature that photosmeta has that osxphotos does not is ability to update the metadata of the actual photo files in the Photos library without exporting them. (Use with caution!)
- [rhettbull/PhotoScript](https://github.com/RhetTbull/PhotoScript): python wrapper around Photos' applescript API allowing automation of Photos (including creation/deletion of items) from python.
- [patrikhson/photo-export](https://github.com/patrikhson/photo-export): Exports older versions of Photos databases. Provided the inspiration for osxphotos.
- [orangeturtle739/photos-export](https://github.com/orangeturtle739/photos-export): Set of scripts to export Photos libraries.
- [ndbroadbent/icloud_photos_downloader](https://github.com/ndbroadbent/icloud_photos_downloader): Download photos from iCloud. Currently unmaintained.
@@ -1824,6 +1867,21 @@ If you have an interesting example that shows usage of this package, submit an i
Testing against "real world" Photos libraries would be especially helpful. If you discover issues in testing against your Photos libraries, please open an issue. I've done extensive testing against my own Photos library but that's a single data point and I'm certain there are issues lurking in various edge cases I haven't discovered yet.
### Contributors
Thank you to the following people who have contributed to improving osxphotos! If I've inadvertently left you off, please open an issue or send me a note.
- [britiscurious](https://github.com/britiscurious)
- [Michel Wortmann](https://github.com/mwort)
- [hshore29](https://github.com/hshore29)
- [Pablo 'merKur' Kohan](https://github.com/PabloKohan)
- [Jean-Yves Stervinou](https://github.com/jystervinou)
- [Thibault Deutsch](https://github.com/dethi)
- [grundsch](https://github.com/grundsch)
- [Ag Primatic](https://github.com/agprimatic)
- [Daniel M. Drucker](https://github.com/dmd)
## Known Bugs
My goal is to make osxphotos as reliable and comprehensive as possible. The test suite currently has over 600 tests--but there are still some [bugs](https://github.com/RhetTbull/osxphotos/issues?q=is%3Aissue+is%3Aopen+label%3Abug) or incomplete features lurking. If you find bugs please open an [issue](https://github.com/RhetTbull/osxphotos/issues). Notable issues include:
@@ -1838,7 +1896,7 @@ This package works by creating a copy of the sqlite3 database that photos uses t
If Apple changes the database format this will likely break.
Apple does provide a framework ([PhotoKit](https://developer.apple.com/documentation/photokit?language=objc)) for querying the user's Photos library and I attempted to create the functionality in this package using this framework but unfortunately PhotoKit does not provide access to much of the needed metadata (such as Faces/Persons) and Apple's System Integrity Protection (SIP) made the interface unreliable. If you'd like to experiment with the PhotoKit interface, here's some sample [code](https://gist.github.com/RhetTbull/41cc85e5bdeb30f761147ce32fba5c94). While copying the sqlite file is a bit kludgy, it allows osxphotos to provide access to all available metadata.
For additional details about how osxphotos is implemented or if you would like to extend the code, see the [wiki](https://github.com/RhetTbull/osxphotos/wiki).

View File

@@ -0,0 +1,42 @@
""" use osxphotos to force the download of photos from iCloud
downloads images to a temporary directory then deletes them
resulting in the photo being downloaded to Photos library
"""
import os
import sys
import tempfile
import osxphotos
def main():
    photosdb = osxphotos.PhotosDB()
    tempdir = tempfile.TemporaryDirectory()
    photos = photosdb.photos()
    downloaded = 0
    missing = [photo for photo in photos if photo.ismissing and not photo.shared]
    if not missing:
        print("Did not find any missing photos to download")
        sys.exit(0)
    print(f"Downloading {len(missing)} photos")
    for photo in missing:
        if photo.ismissing:
            print(f"Downloading photo {photo.original_filename}")
            downloaded += 1
            exported = photo.export(tempdir.name, use_photos_export=True, timeout=300)
            if photo.hasadjustments:
                exported.extend(
                    photo.export(
                        tempdir.name, use_photos_export=True, edited=True, timeout=300
                    )
                )
            for filename in exported:
                print(f"Removing temporary file {filename}")
                os.unlink(filename)
    print(f"Downloaded {downloaded} photos")
    tempdir.cleanup()


if __name__ == "__main__":
    main()

View File

@@ -10,6 +10,7 @@ import pathlib
import pprint
import sys
import time
import unicodedata
import click
import yaml
@@ -22,7 +23,12 @@ from pathvalidate import (
import osxphotos
from ._constants import (
    _EXIF_TOOL_URL,
    _PHOTOS_4_VERSION,
    _UNKNOWN_PLACE,
    UNICODE_FORMAT,
)
from ._export_db import ExportDB, ExportDBInMemory
from ._version import __version__
from .datetime_formatter import DateTimeFormatter
@@ -40,10 +46,24 @@ OSXPHOTOS_EXPORT_DB = ".osxphotos_export.db"
def verbose(*args, **kwargs):
    """ print output if verbose flag set """
    if VERBOSE:
        click.echo(*args, **kwargs)
def normalize_unicode(value):
    """ normalize unicode data """
    if value is not None:
        if isinstance(value, tuple):
            return tuple(unicodedata.normalize(UNICODE_FORMAT, v) for v in value)
        elif isinstance(value, str):
            return unicodedata.normalize(UNICODE_FORMAT, value)
        else:
            return value
    else:
        return None
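The effect of this normalization can be demonstrated with plain unicodedata. Assuming UNICODE_FORMAT is "NFC" (the constant's value is not shown in this diff), composed and decomposed spellings of the same character compare equal only after normalizing:

```python
import unicodedata

# "é" as one composed code point vs. "e" + combining acute accent.
composed = "caf\u00e9"
decomposed = "cafe\u0301"

print(composed == decomposed)  # False: different code point sequences
# Assuming UNICODE_FORMAT == "NFC", both normalize to the composed form.
print(unicodedata.normalize("NFC", decomposed) == composed)  # True
```

This is why exported filenames and keyword comparisons need a consistent normalization form: macOS filesystems and the Photos database may store the same visible string in different forms.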
def get_photos_db(*db_options):
    """ Return path to photos db, select first non-None db_options
    If no db_options are non-None, try to find library to use in
@@ -592,9 +612,9 @@ def keywords(ctx, cli_obj, db, json_, photos_library):
    photosdb = osxphotos.PhotosDB(dbfile=db)
    keywords = {"keywords": photosdb.keywords_as_dict}
    if json_ or cli_obj.json:
        click.echo(json.dumps(keywords, ensure_ascii=False))
    else:
        click.echo(yaml.dump(keywords, sort_keys=False, allow_unicode=True))
@cli.command()
@@ -621,9 +641,9 @@ def albums(ctx, cli_obj, db, json_, photos_library):
    albums["shared albums"] = photosdb.albums_shared_as_dict
    if json_ or cli_obj.json:
        click.echo(json.dumps(albums, ensure_ascii=False))
    else:
        click.echo(yaml.dump(albums, sort_keys=False, allow_unicode=True))
@cli.command()
@@ -647,9 +667,9 @@ def persons(ctx, cli_obj, db, json_, photos_library):
    photosdb = osxphotos.PhotosDB(dbfile=db)
    persons = {"persons": photosdb.persons_as_dict}
    if json_ or cli_obj.json:
        click.echo(json.dumps(persons, ensure_ascii=False))
    else:
        click.echo(yaml.dump(persons, sort_keys=False, allow_unicode=True))
@cli.command()
@@ -673,9 +693,9 @@ def labels(ctx, cli_obj, db, json_, photos_library):
    photosdb = osxphotos.PhotosDB(dbfile=db)
    labels = {"labels": photosdb.labels_as_dict}
    if json_ or cli_obj.json:
        click.echo(json.dumps(labels, ensure_ascii=False))
    else:
        click.echo(yaml.dump(labels, sort_keys=False, allow_unicode=True))
@cli.command()
@@ -733,9 +753,9 @@ def info(ctx, cli_obj, db, json_, photos_library):
    info["persons"] = persons
    if cli_obj.json or json_:
        click.echo(json.dumps(info, ensure_ascii=False))
    else:
        click.echo(yaml.dump(info, sort_keys=False, allow_unicode=True))
@cli.command()
@@ -783,9 +803,9 @@ def places(ctx, cli_obj, db, json_, photos_library):
    # below needed to make CliRunner work for testing
    cli_json = cli_obj.json if cli_obj is not None else None
    if json_ or cli_json:
        click.echo(json.dumps(places, ensure_ascii=False))
    else:
        click.echo(yaml.dump(places, sort_keys=False, allow_unicode=True))
@cli.command() @cli.command()
@@ -848,7 +868,7 @@ def _list_libraries(json_=False, error=True):
"system_library": sys_lib, "system_library": sys_lib,
"last_library": last_lib, "last_library": last_lib,
} }
click.echo(json.dumps(libs)) click.echo(json.dumps(libs, ensure_ascii=False))
else: else:
last_lib_flag = sys_lib_flag = False last_lib_flag = sys_lib_flag = False
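The `ensure_ascii=False` / `allow_unicode=True` changes above affect how non-ASCII album, person, and label names are printed. A minimal sketch of the difference, using a hypothetical album name:

```python
import json

# hypothetical album dictionary with a non-ASCII name
albums = {"albums": {"Café": 2}}

escaped = json.dumps(albums)                       # default: escapes to \uXXXX
readable = json.dumps(albums, ensure_ascii=False)  # keeps the characters as-is
```

With the default, the output contains the escape sequence `\u00e9` rather than the character itself, which is what users saw before these commits.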
@@ -1121,6 +1141,11 @@ def query(
     help="Hardlink files instead of copying them. "
     "Cannot be used with --exiftool which creates copies of the files with embedded EXIF data.",
 )
+@click.option(
+    "--touch-file",
+    is_flag=True,
+    help="Sets the file's modification time to match photo date.",
+)
 @click.option(
     "--overwrite",
     is_flag=True,
@@ -1140,6 +1165,11 @@ def query(
     is_flag=True,
     help="Do not export edited version of photo if an edited version exists.",
 )
+@click.option(
+    "--skip-original-if-edited",
+    is_flag=True,
+    help="Do not export original if there is an edited version (exports only the edited version).",
+)
 @click.option(
     "--skip-bursts",
     is_flag=True,
@@ -1264,6 +1294,13 @@ def query(
     "to a filesystem that doesn't support Mac OS extended attributes. Only use this if you get "
     "an error while exporting.",
 )
+@click.option(
+    "--use-photos-export",
+    is_flag=True,
+    default=False,
+    hidden=True,
+    help="Force the use of AppleScript to export even if not missing (see also --download-missing).",
+)
 @DB_ARGUMENT
 @click.argument("dest", nargs=1, type=click.Path(exists=True))
 @click.pass_obj
@@ -1299,9 +1336,11 @@ def export(
     update,
     dry_run,
     export_as_hardlink,
+    touch_file,
     overwrite,
     export_by_date,
     skip_edited,
+    skip_original_if_edited,
     skip_bursts,
     skip_live,
     skip_raw,
@@ -1344,6 +1383,7 @@ def export(
     label,
     deleted,
     deleted_only,
+    use_photos_export,
 ):
     """ Export photos from the Photos database.
     Export path DEST is required.
@@ -1383,6 +1423,7 @@ def export(
         (export_as_hardlink, exiftool),
         (any(place), no_place),
         (deleted, deleted_only),
+        (skip_edited, skip_original_if_edited),
     ]
     if any(all(bb) for bb in exclusive):
         click.echo("Incompatible export options", err=True)
@@ -1514,7 +1555,6 @@ def export(
         deleted_only=deleted_only,
     )
-    results_exported = []
     if photos:
         if export_bursts:
             # add the burst_photos to the export set
@@ -1533,10 +1573,12 @@ def export(
         # because the original code used --original-name as an option
         original_name = not current_name
+        results_exported = []
         results_new = []
         results_updated = []
         results_skipped = []
         results_exif_updated = []
+        results_touched = []
         if verbose_:
             for p in photos:
                 results = export_photo(
@@ -1549,6 +1591,7 @@ def export(
                     export_as_hardlink=export_as_hardlink,
                     overwrite=overwrite,
                     export_edited=export_edited,
+                    skip_original_if_edited=skip_original_if_edited,
                     original_name=original_name,
                     export_live=export_live,
                     download_missing=download_missing,
@@ -1564,13 +1607,16 @@ def export(
                     export_db=export_db,
                     fileutil=fileutil,
                     dry_run=dry_run,
+                    touch_file=touch_file,
                     edited_suffix=edited_suffix,
+                    use_photos_export=use_photos_export,
                 )
                 results_exported.extend(results.exported)
                 results_new.extend(results.new)
                 results_updated.extend(results.updated)
                 results_skipped.extend(results.skipped)
                 results_exif_updated.extend(results.exif_updated)
+                results_touched.extend(results.touched)
         else:
             # show progress bar
@@ -1586,6 +1632,7 @@ def export(
                     export_as_hardlink=export_as_hardlink,
                     overwrite=overwrite,
                     export_edited=export_edited,
+                    skip_original_if_edited=skip_original_if_edited,
                     original_name=original_name,
                     export_live=export_live,
                     download_missing=download_missing,
@@ -1601,31 +1648,38 @@ def export(
                     export_db=export_db,
                     fileutil=fileutil,
                     dry_run=dry_run,
+                    touch_file=touch_file,
                     edited_suffix=edited_suffix,
+                    use_photos_export=use_photos_export,
                 )
                 results_exported.extend(results.exported)
                 results_new.extend(results.new)
                 results_updated.extend(results.updated)
                 results_skipped.extend(results.skipped)
                 results_exif_updated.extend(results.exif_updated)
+                results_touched.extend(results.touched)
         stop_time = time.perf_counter()
         # print summary results
         if update:
             photo_str_new = "photos" if len(results_new) != 1 else "photo"
-            photo_str_updated = "photos" if len(results_new) != 1 else "photo"
+            photo_str_updated = "photos" if len(results_updated) != 1 else "photo"
             photo_str_skipped = "photos" if len(results_skipped) != 1 else "photo"
             photo_str_exif_updated = (
                 "photos" if len(results_exif_updated) != 1 else "photo"
             )
-            click.echo(
+            summary = (
                 f"Exported: {len(results_new)} {photo_str_new}, "
-                + f"updated: {len(results_updated)} {photo_str_updated}, "
-                + f"skipped: {len(results_skipped)} {photo_str_skipped}, "
-                + f"updated EXIF data: {len(results_exif_updated)} {photo_str_exif_updated}"
+                f"updated: {len(results_updated)} {photo_str_updated}, "
+                f"skipped: {len(results_skipped)} {photo_str_skipped}, "
+                f"updated EXIF data: {len(results_exif_updated)} {photo_str_exif_updated}"
             )
         else:
             photo_str = "photos" if len(results_exported) != 1 else "photo"
-            click.echo(f"Exported: {len(results_exported)} {photo_str}")
+            summary = f"Exported: {len(results_exported)} {photo_str}"
+        photo_str_touched = "photos" if len(results_touched) != 1 else "photo"
+        if touch_file:
+            summary += f", touched date: {len(results_touched)} {photo_str_touched}"
+        click.echo(summary)
         click.echo(f"Elapsed time: {(stop_time-start_time):.3f} seconds")
     else:
         click.echo("Did not find any photos to export")
@@ -1834,6 +1888,15 @@ def _query(
         to_date=to_date,
     )
+    person = normalize_unicode(person)
+    keyword = normalize_unicode(keyword)
+    album = normalize_unicode(album)
+    folder = normalize_unicode(folder)
+    title = normalize_unicode(title)
+    description = normalize_unicode(description)
+    place = normalize_unicode(place)
+    label = normalize_unicode(label)
     if album:
         photos = get_photos_by_attribute(photos, "albums", album, ignore_case)
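The `normalize_unicode` calls added to `_query` matter because macOS APIs often return decomposed (NFD) strings while user input is usually composed (NFC); without normalization, two equal-looking strings fail to match. A sketch of the underlying issue using the standard library (the `UNICODE_FORMAT = "NFC"` constant added in this changeset picks the composed form):

```python
import unicodedata

composed = "José"                                    # NFC: é is one code point
decomposed = unicodedata.normalize("NFD", composed)  # e + combining accent
renormalized = unicodedata.normalize("NFC", decomposed)
```

The two forms display identically but compare unequal until both sides are normalized to the same form.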
@@ -2058,6 +2121,7 @@ def export_photo(
     export_as_hardlink=None,
     overwrite=None,
     export_edited=None,
+    skip_original_if_edited=None,
     original_name=None,
     export_live=None,
     download_missing=None,
@@ -2073,7 +2137,9 @@ def export_photo(
     export_db=None,
     fileutil=FileUtil,
     dry_run=None,
+    touch_file=None,
     edited_suffix="_edited",
+    use_photos_export=False,
 ):
     """ Helper function for export that does the actual export
@@ -2094,6 +2160,8 @@ def export_photo(
         filename_template: template use to determine output file
         no_extended_attributes: boolean; if True, exports photo without preserving extended attributes
         export_raw: boolean; if True exports RAW image associate with the photo
+        export_edited: boolean; if True exports edited version of photo if there is one
+        skip_original_if_edited: boolean; if True does not export original if photo has been edited
         album_keyword: boolean; if True, exports album names as keywords in metadata
         person_keyword: boolean; if True, exports person names as keywords in metadata
         keyword_template: list of strings; if provided use rendered template strings as keywords
@@ -2101,6 +2169,8 @@ def export_photo(
         export_db: export database instance compatible with ExportDB_ABC
         fileutil: file util class compatible with FileUtilABC
         dry_run: boolean; if True, doesn't actually export or update any files
+        touch_file: boolean; sets file's modification time to match photo date
+        use_photos_export: boolean; if True forces the use of AppleScript to export even if photo not missing
     Returns:
         list of path(s) of exported photo or None if photo was missing
@@ -2114,30 +2184,33 @@ def export_photo(
     if not download_missing:
         if photo.ismissing:
             space = " " if not verbose_ else ""
-            verbose(f"{space}Skipping missing photo {photo.filename}")
-            return ExportResults([], [], [], [], [])
+            verbose(f"{space}Skipping missing photo {photo.original_filename}")
+            return ExportResults([], [], [], [], [], [])
         elif not os.path.exists(photo.path):
             space = " " if not verbose_ else ""
             verbose(
                 f"{space}WARNING: file {photo.path} is missing but ismissing=False, "
-                f"skipping {photo.filename}"
+                f"skipping {photo.original_filename}"
             )
-            return ExportResults([], [], [], [], [])
+            return ExportResults([], [], [], [], [], [])
     elif photo.ismissing and not photo.iscloudasset or not photo.incloud:
         verbose(
-            f"Skipping missing {photo.filename}: not iCloud asset or missing from cloud"
+            f"Skipping missing {photo.original_filename}: not iCloud asset or missing from cloud"
         )
-        return ExportResults([], [], [], [], [])
+        return ExportResults([], [], [], [], [], [])
     results_exported = []
     results_new = []
     results_updated = []
     results_skipped = []
     results_exif_updated = []
+    results_touched = []
+    export_original = not (skip_original_if_edited and photo.hasadjustments)
     filenames = get_filenames_from_template(photo, filename_template, original_name)
     for filename in filenames:
-        verbose(f"Exporting {photo.filename} as (unknown)")
+        verbose(f"Exporting {photo.original_filename} ({photo.filename}) as (unknown)")
         dest_paths = get_dirnames_from_template(
             photo, directory, export_by_date, dest, dry_run
@@ -2152,56 +2225,64 @@ def export_photo(
         # if download_missing and the photo is missing or path doesn't exist,
         # try to download with Photos
-        use_photos_export = download_missing and (
-            photo.ismissing or not os.path.exists(photo.path)
+        use_photos_export = (
+            download_missing and (photo.ismissing or not os.path.exists(photo.path))
+            if not use_photos_export
+            else True
         )
         # export the photo to each path in dest_paths
         for dest_path in dest_paths:
-            export_results = photo.export2(
-                dest_path,
-                filename,
-                sidecar_json=sidecar_json,
-                sidecar_xmp=sidecar_xmp,
-                live_photo=export_live,
-                raw_photo=export_raw,
-                export_as_hardlink=export_as_hardlink,
-                overwrite=overwrite,
-                use_photos_export=use_photos_export,
-                exiftool=exiftool,
-                no_xattr=no_extended_attributes,
-                use_albums_as_keywords=album_keyword,
-                use_persons_as_keywords=person_keyword,
-                keyword_template=keyword_template,
-                description_template=description_template,
-                update=update,
-                export_db=export_db,
-                fileutil=fileutil,
-                dry_run=dry_run,
-            )
+            if not export_original:
+                verbose(f"Skipping original version of {photo.original_filename}")
+            else:
+                export_results = photo.export2(
+                    dest_path,
+                    filename,
+                    sidecar_json=sidecar_json,
+                    sidecar_xmp=sidecar_xmp,
+                    live_photo=export_live,
+                    raw_photo=export_raw,
+                    export_as_hardlink=export_as_hardlink,
+                    overwrite=overwrite,
+                    use_photos_export=use_photos_export,
+                    exiftool=exiftool,
+                    no_xattr=no_extended_attributes,
+                    use_albums_as_keywords=album_keyword,
+                    use_persons_as_keywords=person_keyword,
+                    keyword_template=keyword_template,
+                    description_template=description_template,
+                    update=update,
+                    export_db=export_db,
+                    fileutil=fileutil,
+                    dry_run=dry_run,
+                    touch_file=touch_file,
+                )
                 results_exported.extend(export_results.exported)
                 results_new.extend(export_results.new)
                 results_updated.extend(export_results.updated)
                 results_skipped.extend(export_results.skipped)
                 results_exif_updated.extend(export_results.exif_updated)
+                results_touched.extend(export_results.touched)
                 if verbose_:
                     for exported in export_results.exported:
                         verbose(f"Exported {exported}")
                     for new in export_results.new:
                         verbose(f"Exported new file {new}")
                     for updated in export_results.updated:
                         verbose(f"Exported updated file {updated}")
                     for skipped in export_results.skipped:
                         verbose(f"Skipped up to date file {skipped}")
+                    for touched in export_results.touched:
+                        verbose(f"Touched date on file {touched}")
         # if export-edited, also export the edited version
         # verify the photo has adjustments and valid path to avoid raising an exception
         if export_edited and photo.hasadjustments:
             # if download_missing and the photo is missing or path doesn't exist,
             # try to download with Photos
-            use_photos_export = download_missing and photo.path_edited is None
             if not download_missing and photo.path_edited is None:
                 verbose(f"Skipping missing edited photo for (unknown)")
             else:
@@ -2234,6 +2315,7 @@ def export_photo(
                     export_db=export_db,
                     fileutil=fileutil,
                     dry_run=dry_run,
+                    touch_file=touch_file,
                 )
                 results_exported.extend(export_results_edited.exported)
@@ -2241,6 +2323,7 @@ def export_photo(
                 results_updated.extend(export_results_edited.updated)
                 results_skipped.extend(export_results_edited.skipped)
                 results_exif_updated.extend(export_results_edited.exif_updated)
+                results_touched.extend(export_results_edited.touched)
                 if verbose_:
                     for exported in export_results_edited.exported:
@@ -2251,6 +2334,8 @@ def export_photo(
                         verbose(f"Exported updated file {updated}")
                     for skipped in export_results_edited.skipped:
                         verbose(f"Skipped up to date file {skipped}")
+                    for touched in export_results_edited.touched:
+                        verbose(f"Touched date on file {touched}")
     return ExportResults(
         results_exported,
@@ -2258,6 +2343,7 @@ def export_photo(
         results_updated,
         results_skipped,
         results_exif_updated,
+        results_touched,
     )

View File

@@ -3,6 +3,14 @@ Constants used by osxphotos
 """
 import os.path
+from datetime import datetime
+
+# Time delta: add this to Photos times to get unix time
+# Apple Epoch is Jan 1, 2001
+TIME_DELTA = (datetime(2001, 1, 1, 0, 0) - datetime(1970, 1, 1, 0, 0)).total_seconds()
+
+# Unicode format to use for comparing strings
+UNICODE_FORMAT = "NFC"
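Photos stores timestamps as seconds since the Apple epoch (2001-01-01) rather than the Unix epoch, so `TIME_DELTA` must be added before handing a value to `datetime.fromtimestamp`. A quick illustration of the constant and the conversion:

```python
from datetime import datetime, timezone

# seconds between the Unix epoch (1970) and the Apple epoch (2001)
TIME_DELTA = (datetime(2001, 1, 1, 0, 0) - datetime(1970, 1, 1, 0, 0)).total_seconds()

# a Photos timestamp of 0 is the Apple epoch itself
dt = datetime.fromtimestamp(0 + TIME_DELTA, tz=timezone.utc)
```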
 # which Photos library database versions have been tested
 # Photos 2.0 (10.12.6) == 2622
@@ -36,11 +44,17 @@ _DB_TABLE_NAMES = {
         "ASSET": "ZGENERICASSET",
         "KEYWORD_JOIN": "Z_1KEYWORDS.Z_37KEYWORDS",
         "ALBUM_JOIN": "Z_26ASSETS.Z_34ASSETS",
+        "ALBUM_SORT_ORDER": "Z_26ASSETS.Z_FOK_34ASSETS",
+        "IMPORT_FOK": "ZGENERICASSET.Z_FOK_IMPORTSESSION",
+        "DEPTH_STATE": "ZGENERICASSET.ZDEPTHSTATES",
     },
     6: {
         "ASSET": "ZASSET",
         "KEYWORD_JOIN": "Z_1KEYWORDS.Z_36KEYWORDS",
         "ALBUM_JOIN": "Z_26ASSETS.Z_3ASSETS",
+        "ALBUM_SORT_ORDER": "Z_26ASSETS.Z_FOK_3ASSETS",
+        "IMPORT_FOK": "null",
+        "DEPTH_STATE": "ZASSET.ZDEPTHTYPE",
     },
 }
@@ -71,6 +85,7 @@ _PHOTOS_5_ALBUM_KIND = 2  # normal user album
 _PHOTOS_5_SHARED_ALBUM_KIND = 1505  # shared album
 _PHOTOS_5_FOLDER_KIND = 4000  # user folder
 _PHOTOS_5_ROOT_FOLDER_KIND = 3999  # root folder
+_PHOTOS_5_IMPORT_SESSION_ALBUM_KIND = 1506  # import session
 _PHOTOS_4_ALBUM_KIND = 3  # RKAlbum.albumSubclass
 _PHOTOS_4_TOP_LEVEL_ALBUM = "TopLevelAlbums"

View File

@@ -189,7 +189,12 @@ class ExportDB(ExportDB_ABC):
                 (filename,),
             )
             results = c.fetchone()
-            stats = results[0:3] if results else None
+            if results:
+                stats = results[0:3]
+                mtime = int(stats[2]) if stats[2] is not None else None
+                stats = (stats[0], stats[1], mtime)
+            else:
+                stats = (None, None, None)
         except Error as e:
             logging.warning(e)
             stats = (None, None, None)
@@ -232,7 +237,12 @@ class ExportDB(ExportDB_ABC):
                 (filename,),
             )
             results = c.fetchone()
-            stats = results[0:3] if results else None
+            if results:
+                stats = results[0:3]
+                mtime = int(stats[2]) if stats[2] is not None else None
+                stats = (stats[0], stats[1], mtime)
+            else:
+                stats = (None, None, None)
         except Error as e:
             logging.warning(e)
             stats = (None, None, None)
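The change above coerces the stored modification time to `int` before returning it, so comparisons against on-disk mtimes (which `FileUtil._sig` also truncates to `int`) stay stable when a copy tool drops fractional seconds. The transformation in isolation, with made-up database rows:

```python
def normalize_stats(results):
    # mirror the diff: coerce stored mtime to int, pass None through
    if results:
        stats = results[0:3]
        mtime = int(stats[2]) if stats[2] is not None else None
        return (stats[0], stats[1], mtime)
    return (None, None, None)

# hypothetical rows as fetched from the export database: (mode, size, mtime)
row_float = (33188, 1024, 1599234567.891)
row_none = (33188, 1024, None)
```

Note that a missing row now also yields the `(None, None, None)` triple instead of bare `None`, so callers can unpack unconditionally.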

View File

@@ -1,3 +1,3 @@
 """ version info """
-__version__ = "0.32.0"
+__version__ = "0.34.2"

View File

@@ -10,7 +10,7 @@ Represents a single Folder in the Photos library and provides access to the fold
 PhotosDB.folders() returns a list of FolderInfo objects
 """
-import logging
+from datetime import datetime, timedelta, timezone
 from ._constants import (
     _PHOTOS_4_ALBUM_KIND,
@@ -18,11 +18,34 @@ from ._constants import (
     _PHOTOS_4_VERSION,
     _PHOTOS_5_ALBUM_KIND,
     _PHOTOS_5_FOLDER_KIND,
+    TIME_DELTA,
 )
+from .datetime_utils import get_local_tz
-class AlbumInfo:
-    """
-    Info about a specific Album, contains all the details about the album
-    including folders, photos, etc.
-    """
+def sort_list_by_keys(values, sort_keys):
+    """ Sorts list values by a second list sort_keys
+        e.g. given ["a","c","b"], [1, 3, 2], returns ["a", "b", "c"]
+
+    Args:
+        values: a list of values to be sorted
+        sort_keys: a list of keys to sort values by
+
+    Returns:
+        list of values, sorted by sort_keys
+
+    Raises:
+        ValueError: raised if len(values) != len(sort_keys)
+    """
+    if len(values) != len(sort_keys):
+        raise ValueError("values and sort_keys must have same length")
+    return list(zip(*sorted(zip(sort_keys, values))))[1]
+
+
+class AlbumInfoBaseClass:
+    """
+    Base class for AlbumInfo, ImportInfo
+
+    Info about a specific Album, contains all the details about the album
+    including folders, photos, etc.
+    """
@@ -31,33 +54,109 @@ class AlbumInfo:
         self._uuid = uuid
         self._db = db
         self._title = self._db._dbalbum_details[uuid]["title"]
-
-    @property
-    def title(self):
-        """ return title / name of album """
-        return self._title
+        self._creation_date_timestamp = self._db._dbalbum_details[uuid]["creation_date"]
+        self._start_date_timestamp = self._db._dbalbum_details[uuid]["start_date"]
+        self._end_date_timestamp = self._db._dbalbum_details[uuid]["end_date"]
+        self._local_tz = get_local_tz(
+            datetime.fromtimestamp(self._creation_date_timestamp + TIME_DELTA)
+        )

     @property
     def uuid(self):
         """ return uuid of album """
         return self._uuid

+    @property
+    def creation_date(self):
+        """ return creation date of album """
+        try:
+            return self._creation_date
+        except AttributeError:
+            try:
+                self._creation_date = (
+                    datetime.fromtimestamp(
+                        self._creation_date_timestamp + TIME_DELTA
+                    ).astimezone(tz=self._local_tz)
+                    if self._creation_date_timestamp
+                    else datetime(1970, 1, 1, 0, 0, 0).astimezone(
+                        tz=timezone(timedelta(0))
+                    )
+                )
+            except ValueError:
+                self._creation_date = datetime(1970, 1, 1, 0, 0, 0).astimezone(
+                    tz=timezone(timedelta(0))
+                )
+        return self._creation_date
+
+    @property
+    def start_date(self):
+        """ For Albums, return start date (earliest image) of album or None for albums with no images
+            For Import Sessions, return start date of import session (when import began) """
+        try:
+            return self._start_date
+        except AttributeError:
+            try:
+                self._start_date = (
+                    datetime.fromtimestamp(
+                        self._start_date_timestamp + TIME_DELTA
+                    ).astimezone(tz=self._local_tz)
+                    if self._start_date_timestamp
+                    else None
+                )
+            except ValueError:
+                self._start_date = None
+        return self._start_date
+
+    @property
+    def end_date(self):
+        """ For Albums, return end date (most recent image) of album or None for albums with no images
+            For Import Sessions, return end date of import session (when import was completed) """
+        try:
+            return self._end_date
+        except AttributeError:
+            try:
+                self._end_date = (
+                    datetime.fromtimestamp(
+                        self._end_date_timestamp + TIME_DELTA
+                    ).astimezone(tz=self._local_tz)
+                    if self._end_date_timestamp
+                    else None
+                )
+            except ValueError:
+                self._end_date = None
+        return self._end_date
+
     @property
     def photos(self):
-        """ return list of photos contained in album """
-        try:
-            return self._photos
-        except AttributeError:
-            if self.uuid in self._db._dbalbums_album:
-                uuid, sort_order = zip(*self._db._dbalbums_album[self.uuid])
-                self._photos = self._db.photos(uuid=uuid)
-                # PhotosDB.photos does not preserve order when passing in list of uuids
-                # so need to build photo list one a time
-                # sort uuids by sort order
-                sorted_uuid = sorted(zip(sort_order, uuid))
-                self._photos = [
-                    self._db.photos(uuid=[uuid])[0] for _, uuid in sorted_uuid
-                ]
-            else:
-                self._photos = []
-        return self._photos
+        return []
+
+    def __len__(self):
+        """ return number of photos contained in album """
+        return len(self.photos)
+
+
+class AlbumInfo(AlbumInfoBaseClass):
+    """
+    Info about a specific Album, contains all the details about the album
+    including folders, photos, etc.
+    """
+
+    @property
+    def title(self):
+        """ return title / name of album """
+        return self._title
+
+    @property
+    def photos(self):
+        """ return list of photos contained in album sorted in same sort order as Photos """
+        try:
+            return self._photos
+        except AttributeError:
+            if self.uuid in self._db._dbalbums_album:
+                uuid, sort_order = zip(*self._db._dbalbums_album[self.uuid])
+                sorted_uuid = sort_list_by_keys(uuid, sort_order)
+                self._photos = self._db.photos_by_uuid(sorted_uuid)
+            else:
+                self._photos = []
+        return self._photos
@@ -110,9 +209,24 @@ class AlbumInfo:
             )
         return self._parent
-    def __len__(self):
-        """ return number of photos contained in album """
-        return len(self.photos)
+
+class ImportInfo(AlbumInfoBaseClass):
+    @property
+    def photos(self):
+        """ return list of photos contained in import session """
+        try:
+            return self._photos
+        except AttributeError:
+            uuid_list, sort_order = zip(
+                *[
+                    (uuid, self._db._dbphotos[uuid]["fok_import_session"])
+                    for uuid in self._db._dbphotos
+                    if self._db._dbphotos[uuid]["import_uuid"] == self.uuid
+                ]
+            )
+            sorted_uuid = sort_list_by_keys(uuid_list, sort_order)
+            self._photos = self._db.photos_by_uuid(sorted_uuid)
+        return self._photos
 class FolderInfo:
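All of the new date and photos properties above rely on the same lazy-caching idiom: return a cached private attribute, and compute it only on the first `AttributeError`. A stripped-down sketch of the pattern (the class and data here are hypothetical):

```python
class Lazy:
    @property
    def photos(self):
        try:
            return self._photos          # cached after first access
        except AttributeError:
            self._photos = self._load()  # computed exactly once
        return self._photos

    def _load(self):
        self.loads = getattr(self, "loads", 0) + 1
        return ["img_001", "img_002"]

obj = Lazy()
first = obj.photos
second = obj.photos
```

The expensive lookup runs once no matter how many times the property is read, which matters here because building a photo list requires database queries.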

View File

@@ -2,14 +2,23 @@
 import datetime
-def get_local_tz():
-    """ return local timezone as datetime.timezone tzinfo """
-    local_tz = (
-        datetime.datetime.now(datetime.timezone(datetime.timedelta(0)))
-        .astimezone()
-        .tzinfo
-    )
-    return local_tz
+def get_local_tz(dt):
+    """ return local timezone as datetime.timezone tzinfo for dt
+
+    Args:
+        dt: datetime.datetime
+
+    Returns:
+        local timezone for dt as datetime.timezone
+
+    Raises:
+        ValueError if dt is not timezone naive
+    """
+    if not datetime_has_tz(dt):
+        return dt.astimezone().tzinfo
+    else:
+        raise ValueError("dt must be naive datetime.datetime object")
 def datetime_remove_tz(dt):
@@ -20,8 +29,7 @@ def datetime_remove_tz(dt):
     if type(dt) != datetime.datetime:
         raise TypeError(f"dt must be type datetime.datetime, not {type(dt)}")
-    dt_new = dt.replace(tzinfo=None)
-    return dt_new
+    return dt.replace(tzinfo=None)
 def datetime_has_tz(dt):
@@ -32,9 +40,7 @@ def datetime_has_tz(dt):
     if type(dt) != datetime.datetime:
         raise TypeError(f"dt must be type datetime.datetime, not {type(dt)}")
-    if dt.tzinfo is not None and dt.tzinfo.utcoffset(dt) is not None:
-        return True
-    return False
+    return dt.tzinfo is not None and dt.tzinfo.utcoffset(dt) is not None
 def datetime_naive_to_local(dt):
@@ -53,5 +59,4 @@ def datetime_naive_to_local(dt):
             f"{dt} has tzinfo {dt.tzinfo} and offset {dt.tizinfo.utcoffset(dt)}"
         )
-    dt_local = dt.replace(tzinfo=get_local_tz())
-    return dt_local
+    return dt.replace(tzinfo=get_local_tz(dt))
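The reworked `get_local_tz(dt)` derives the local zone for a specific naive datetime, so DST is resolved per date rather than from "now" — which is what the from_date/to_date DST fix in this changeset relies on. A sketch of the naive-vs-aware distinction it enforces:

```python
import datetime

def datetime_has_tz(dt):
    # aware only if tzinfo exists and yields a utcoffset
    return dt.tzinfo is not None and dt.tzinfo.utcoffset(dt) is not None

naive = datetime.datetime(2020, 8, 25, 6, 43)
aware = naive.replace(tzinfo=datetime.timezone.utc)

# astimezone() on a naive datetime assumes local time and attaches the
# local zone in effect on that date, which is how get_local_tz(dt) works
local_tz = naive.astimezone().tzinfo
localized = naive.replace(tzinfo=local_tz)
```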

View File

@@ -8,6 +8,7 @@
 import json
 import logging
 import os
+import shutil
 import subprocess
 import sys
 from functools import lru_cache  # pylint: disable=syntax-error
@@ -22,8 +23,7 @@ EXIFTOOL_STAYOPEN_EOF_LEN = len(EXIFTOOL_STAYOPEN_EOF)
 @lru_cache(maxsize=1)
 def get_exiftool_path():
     """ return path of exiftool, cache result """
-    result = subprocess.run(["which", "exiftool"], stdout=subprocess.PIPE)
-    exiftool_path = result.stdout.decode("utf-8")
+    exiftool_path = shutil.which('exiftool')
     if _debug():
         logging.debug("exiftool path = %s" % (exiftool_path))
     if exiftool_path:
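Replacing the `subprocess.run(["which", ...])` call with `shutil.which` avoids spawning a subprocess and returns `None` when the tool is absent, rather than stdout text that may carry a trailing newline — so the `if exiftool_path:` check and later use of the path are both reliable. Behavior sketch:

```python
import shutil

# absent tools yield None, so a simple truthiness check suffices
missing = shutil.which("this-command-does-not-exist-xyz")

# a found tool yields a clean path (no trailing newline, unlike
# decoding the stdout of a `which` subprocess)
found = shutil.which("sh")
```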

View File

@@ -29,7 +29,17 @@ class FileUtilABC(ABC):
     @classmethod
     @abstractmethod
-    def cmp_sig(cls, file1, file2):
+    def utime(cls, path, times):
+        pass
+
+    @classmethod
+    @abstractmethod
+    def cmp(cls, file1, file2, mtime1=None):
+        pass
+
+    @classmethod
+    @abstractmethod
+    def cmp_file_sig(cls, file1, file2):
         pass
     @classmethod
@@ -104,11 +114,37 @@ class FileUtilMacOS(FileUtilABC):
         os.unlink(filepath)
     @classmethod
-    def cmp_sig(cls, f1, s2):
+    def utime(cls, path, times):
+        """ Set the access and modified time of path. """
+        os.utime(path, times)
+
+    @classmethod
+    def cmp(cls, f1, f2, mtime1=None):
+        """Does shallow compare (file signatures) of f1 to file f2.
+        Arguments:
+        f1 -- File name
+        f2 -- File name
+        mtime1 -- optional, pass alternate file modification timestamp for f1; will be converted to int
+        Return value:
+        True if the file signatures as returned by stat are the same, False otherwise.
+        Does not do a byte-by-byte comparison.
+        """
+        s1 = cls._sig(os.stat(f1))
+        if mtime1 is not None:
+            s1 = (s1[0], s1[1], int(mtime1))
+        s2 = cls._sig(os.stat(f2))
+        if s1[0] != stat.S_IFREG or s2[0] != stat.S_IFREG:
+            return False
+        return s1 == s2
+
+    @classmethod
+    def cmp_file_sig(cls, f1, s2):
         """Compare file f1 to signature s2.
         Arguments:
         f1 -- File name
-        s2 -- stats as returned by sig
+        s2 -- stats as returned by _sig
         Return value:
         True if the files are the same, False otherwise.
@@ -130,7 +166,12 @@ class FileUtilMacOS(FileUtilABC):
     @staticmethod
     def _sig(st):
-        return (stat.S_IFMT(st.st_mode), st.st_size, st.st_mtime)
+        """ return tuple of (mode, size, mtime) of file based on os.stat
+        Args:
+            st: os.stat signature
+        """
+        # use int(st.st_mtime) because ditto does not copy fractional portion of mtime
+        return (stat.S_IFMT(st.st_mode), st.st_size, int(st.st_mtime))
class FileUtil(FileUtilMacOS): class FileUtil(FileUtilMacOS):
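The `int(st.st_mtime)` truncation matters because copiers such as `ditto` drop the fractional part of the timestamp; comparing whole seconds keeps a file and its copy looking identical. A small demonstration:

```python
import os
import tempfile

# Set a fractional mtime, then observe that truncating to whole seconds
# (as _sig now does) discards the sub-second component that tools like
# ditto do not preserve when copying.
fd, path = tempfile.mkstemp()
os.close(fd)
os.utime(path, (1600000000.75, 1600000000.75))

st = os.stat(path)
assert int(st.st_mtime) == 1600000000
```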
@@ -141,8 +182,8 @@ class FileUtil(FileUtilMacOS):
 class FileUtilNoOp(FileUtil):
     """ No-Op implementation of FileUtil for testing / dry-run mode
-        all methods with exception of cmp_sig and file_cmp are no-op
-        cmp_sig functions as FileUtil.cmp_sig does
+        all methods with exception of cmp, cmp_file_sig and file_cmp are no-op
+        cmp and cmp_file_sig function as the FileUtil methods do
         file_cmp returns mock data
     """
@@ -172,6 +213,10 @@ class FileUtilNoOp(FileUtil):
     def unlink(cls, dest):
         cls.verbose(f"unlink: {dest}")

+    @classmethod
+    def utime(cls, path, times):
+        cls.verbose(f"utime: {path}, {times}")
+
     @classmethod
     def file_sig(cls, file1):
         cls.verbose(f"file_sig: {file1}")


@@ -11,7 +11,6 @@
 # TODO: should this be its own PhotoExporter class?

-import filecmp
 import glob
 import json
 import logging
@@ -37,7 +36,8 @@ from ..fileutil import FileUtil
 from ..utils import dd_to_dms_str, findfiles

 ExportResults = namedtuple(
-    "ExportResults", ["exported", "new", "updated", "skipped", "exif_updated"]
+    "ExportResults",
+    ["exported", "new", "updated", "skipped", "exif_updated", "touched"],
 )
@@ -106,7 +106,7 @@ def _export_photo_uuid_applescript(
     )
     dest = pathlib.Path(dest)
-    if not dest.is_dir:
+    if not dest.is_dir():
         raise ValueError(f"dest {dest} must be a directory")
     if not original ^ edited:
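The `dest.is_dir` → `dest.is_dir()` fix is worth calling out: without the parentheses the guard tests a bound method, which is always truthy, so the `ValueError` could never fire. A minimal demonstration:

```python
import pathlib

p = pathlib.Path("definitely_missing_directory_xyz")

# Bug pattern: `not p.is_dir` tests the method object itself, which is
# always truthy, so the check silently passes for any path.
assert bool(p.is_dir) is True

# Fixed pattern: calling the method actually stats the path.
assert p.is_dir() is False
```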
@@ -305,6 +305,7 @@ def export2(
     export_db=None,
     fileutil=FileUtil,
     dry_run=False,
+    touch_file=False,
 ):
     """ export photo, like export but with update and dry_run options
     dest: must be valid destination path or exception raised
@@ -347,6 +348,7 @@ def export2(
         for getting/setting data related to exported files to compare update state
     fileutil: (FileUtilABC); class that conforms to FileUtilABC with various file utilities
     dry_run: (boolean, default=False); set to True to run in "dry run" mode
+    touch_file: (boolean, default=False); if True, sets the file's modification time to the photo's date
     Returns: ExportResults namedtuple with fields: exported, new, updated, skipped
     where each field is a list of file paths
@@ -375,6 +377,9 @@ def export2(
     # list of all files skipped because they do not need to be updated (for use with update=True)
     update_skipped_files = []

+    # list of all files with utime touched (touch_file = True)
+    touched_files = []
+
     # check edited and raise exception trying to export edited version of
     # photo that hasn't been edited
     if edited and not self.hasadjustments:
@@ -482,7 +487,7 @@ def export2(
     if update and dest.exists():
         # destination exists, check to see if destination is the right UUID
         dest_uuid = export_db.get_uuid_for_file(dest)
-        if dest_uuid is None and filecmp.cmp(src, dest):
+        if dest_uuid is None and fileutil.cmp(src, dest):
             # might be exporting into a pre-ExportDB folder or the DB got deleted
             logging.debug(
                 f"Found matching file with blank uuid: {self.uuid}, {dest}"
@@ -514,7 +519,7 @@ def export2(
             dest = pathlib.Path(file_)
             found_match = True
             break
-        elif dest_uuid is None and filecmp.cmp(src, file_):
+        elif dest_uuid is None and fileutil.cmp(src, file_):
             # files match, update the UUID
             logging.debug(
                 f"Found matching file with blank uuid: {self.uuid}, {file_}"
@@ -558,12 +563,14 @@ def export2(
                 no_xattr,
                 export_as_hardlink,
                 exiftool,
+                touch_file,
                 fileutil,
             )
             exported_files = results.exported
             update_new_files = results.new
             update_updated_files = results.updated
             update_skipped_files = results.skipped
+            touched_files = results.touched

     # copy live photo associated .mov if requested
     if live_photo and self.live_photo:
@@ -583,12 +590,14 @@ def export2(
                     no_xattr,
                     export_as_hardlink,
                     exiftool,
+                    touch_file,
                     fileutil,
                 )
                 exported_files.extend(results.exported)
                 update_new_files.extend(results.new)
                 update_updated_files.extend(results.updated)
                 update_skipped_files.extend(results.skipped)
+                touched_files.extend(results.touched)
             else:
                 logging.debug(f"Skipping missing live movie for (unknown)")
@@ -608,17 +617,19 @@ def export2(
                     no_xattr,
                     export_as_hardlink,
                     exiftool,
+                    touch_file,
                     fileutil,
                 )
                 exported_files.extend(results.exported)
                 update_new_files.extend(results.new)
                 update_updated_files.extend(results.updated)
                 update_skipped_files.extend(results.skipped)
+                touched_files.extend(results.touched)
             else:
                 logging.debug(f"Skipping missing RAW photo for (unknown)")
     else:
         # use_photo_export
-        exported = None
+        exported = []
         # export live_photo .mov file?
         live_photo = True if live_photo and self.live_photo else False
         if edited:
@@ -628,7 +639,7 @@ def export2(
                 filestem = dest.stem
             else:
                 # didn't get passed a filename, add _edited
-                filestem = f"{dest.stem}_edited"
+                filestem = f"{dest.stem}{edited_identifier}"
             dest = dest.parent / f"{filestem}.jpeg"
             exported = _export_photo_uuid_applescript(
@@ -657,8 +668,16 @@ def export2(
                 dry_run=dry_run,
             )
-        if exported is not None:
+        if exported:
+            if touch_file:
+                for exported_file in exported:
+                    touched_files.append(exported_file)
+                    ts = int(self.date.timestamp())
+                    fileutil.utime(exported_file, (ts, ts))
             exported_files.extend(exported)
+            if update:
+                update_new_files.extend(exported)
         else:
             logging.warning(
                 f"Error exporting photo {self.uuid} to {dest} with use_photos_export"
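`fileutil.utime(exported_file, (ts, ts))` sets both access and modification times to the photo's date. A sketch with plain `os.utime` (`photo_date` is a made-up value standing in for `self.date`):

```python
import os
import tempfile
from datetime import datetime

photo_date = datetime(2020, 8, 23, 11, 18, 31)  # hypothetical photo date
ts = int(photo_date.timestamp())

fd, path = tempfile.mkstemp()
os.close(fd)
os.utime(path, (ts, ts))  # (atime, mtime)

assert int(os.stat(path).st_mtime) == ts
```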
@@ -669,7 +688,7 @@ def export2(
     if sidecar_json:
         logging.debug("writing exiftool_json_sidecar")
-        sidecar_filename = dest.parent / pathlib.Path(f"{dest.stem}.json")
+        sidecar_filename = dest.parent / pathlib.Path(f"{dest.stem}{dest.suffix}.json")
         sidecar_str = self._exiftool_json_sidecar(
             use_albums_as_keywords=use_albums_as_keywords,
             use_persons_as_keywords=use_persons_as_keywords,
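Per the commit "Fixed sidecar collisions, closes #210": keeping the photo's own suffix in the sidecar name stops two files that share a stem (say a JPEG and a RAW) from fighting over one sidecar. A sketch of the naming scheme (`sidecar_name` is a hypothetical helper, not osxphotos API):

```python
import pathlib

def sidecar_name(dest, ext):
    # keep the photo's suffix so files sharing a stem get distinct sidecars
    return dest.parent / pathlib.Path(f"{dest.stem}{dest.suffix}.{ext}")

jpg = pathlib.Path("/export/IMG_1234.jpg")
raw = pathlib.Path("/export/IMG_1234.cr2")

assert sidecar_name(jpg, "json").name == "IMG_1234.jpg.json"
assert sidecar_name(raw, "json").name == "IMG_1234.cr2.json"
```

Under the old `{dest.stem}.json` scheme both paths would have produced `IMG_1234.json`.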
@@ -685,12 +704,13 @@ def export2(
     if sidecar_xmp:
         logging.debug("writing xmp_sidecar")
-        sidecar_filename = dest.parent / pathlib.Path(f"{dest.stem}.xmp")
+        sidecar_filename = dest.parent / pathlib.Path(f"{dest.stem}{dest.suffix}.xmp")
         sidecar_str = self._xmp_sidecar(
             use_albums_as_keywords=use_albums_as_keywords,
             use_persons_as_keywords=use_persons_as_keywords,
             keyword_template=keyword_template,
             description_template=description_template,
+            extension=dest.suffix[1:] if dest.suffix else None,
         )
         if not dry_run:
             try:
@@ -760,6 +780,7 @@ def export2(
                 keyword_template=keyword_template,
                 description_template=description_template,
             )
+
             export_db.set_exifdata_for_file(
                 exported_file,
                 self._exiftool_json_sidecar(
@@ -774,13 +795,23 @@ def export2(
             )
             exif_files_updated.append(exported_file)

-    return ExportResults(
+    if touch_file:
+        for exif_file in exif_files_updated:
+            touched_files.append(exif_file)
+            ts = int(self.date.timestamp())
+            fileutil.utime(exif_file, (ts, ts))
+
+    touched_files = list(set(touched_files))
+    results = ExportResults(
         exported_files,
         update_new_files,
         update_updated_files,
         update_skipped_files,
         exif_files_updated,
+        touched_files,
     )
+    return results
 def _export_photo(
@@ -793,11 +824,12 @@ def _export_photo(
     no_xattr,
     export_as_hardlink,
     exiftool,
+    touch_file,
     fileutil=FileUtil,
 ):
     """ Helper function for export()
     Does the actual copy or hardlink, taking the appropriate
-    action depending on update, overwrite
+    action depending on update, overwrite, export_as_hardlink
     Assumes destination is the right destination (e.g. UUID matches)
     sets UUID and JSON info for exported file using set_uuid_for_file, set_info_for_uuid
@@ -810,6 +842,7 @@ def _export_photo(
     no_xattr: don't copy extended attributes
     export_as_hardlink: bool
     exiftool: bool
+    touch_file: bool
     fileutil: FileUtil class that conforms to fileutil.FileUtilABC

     Returns:
@@ -820,143 +853,99 @@ def _export_photo(
     update_updated_files = []
     update_new_files = []
     update_skipped_files = []
+    touched_files = []
     dest_str = str(dest)
     dest_exists = dest.exists()
     if export_as_hardlink:
-        # use hardlink instead of copy
-        if not update:
-            # not update, do the hardlink
-            if overwrite and dest.exists():
-                # need to remove the destination first
-                # dest.unlink()
-                fileutil.unlink(dest)
-            logging.debug(f"Not update: export_as_hardlink linking file {src} {dest}")
-            fileutil.hardlink(src, dest)
-            export_db.set_data(
-                dest_str,
-                self.uuid,
-                fileutil.file_sig(dest_str),
-                (None, None, None),
-                self.json(),
-                None,
-            )
-            exported_files.append(dest_str)
-        elif dest_exists and dest.samefile(src):
-            # update, hardlink and it already points to the right file, do nothing
-            logging.debug(
-                f"Update: skipping samefile with export_as_hardlink {src} {dest}"
-            )
-            update_skipped_files.append(dest_str)
-        elif dest_exists:
-            # update, not the same file (e.g. user may not have used export_as_hardlink last time it was run)
-            logging.debug(
-                f"Update: removing existing file prior to export_as_hardlink {src} {dest}"
-            )
-            # dest.unlink()
-            fileutil.unlink(dest)
-            fileutil.hardlink(src, dest)
-            export_db.set_data(
-                dest_str,
-                self.uuid,
-                fileutil.file_sig(dest_str),
-                (None, None, None),
-                self.json(),
-                None,
-            )
-            update_updated_files.append(dest_str)
-            exported_files.append(dest_str)
-        else:
-            # update, hardlink, destination doesn't exist (new file)
-            logging.debug(
-                f"Update: exporting new file with export_as_hardlink {src} {dest}"
-            )
-            fileutil.hardlink(src, dest)
-            export_db.set_data(
-                dest_str,
-                self.uuid,
-                fileutil.file_sig(dest_str),
-                (None, None, None),
-                self.json(),
-                None,
-            )
-            exported_files.append(dest_str)
-            update_new_files.append(dest_str)
+        op_desc = "export_as_hardlink"
     else:
-        if not update:
-            # not update, do the copy
-            if overwrite and dest.exists():
-                # need to remove the destination first
-                # dest.unlink()
-                fileutil.unlink(dest)
-            logging.debug(f"Not update: copying file {src} {dest}")
-            fileutil.copy(src, dest_str, norsrc=no_xattr)
-            exported_files.append(dest_str)
-            export_db.set_data(
-                dest_str,
-                self.uuid,
-                fileutil.file_sig(dest_str),
-                (None, None, None),
-                self.json(),
-                None,
-            )
-        # elif dest_exists and not exiftool and cmp_file(dest_str, export_db.get_stat_orig_for_file(dest_str)):
-        elif (
-            dest_exists
-            and not exiftool
-            and filecmp.cmp(src, dest)
-            and not dest.samefile(src)
-        ):
-            # destination exists but is identical
-            logging.debug(f"Update: skipping identical original files {src} {dest}")
-            # call set_stat because code can reach this spot if no export DB but exporting a RAW or live photo
-            # potentially re-writes the data in the database but ensures database is complete
-            export_db.set_stat_orig_for_file(dest_str, fileutil.file_sig(dest_str))
-            update_skipped_files.append(dest_str)
-        elif (
-            dest_exists
-            and exiftool
-            and fileutil.cmp_sig(dest_str, export_db.get_stat_exif_for_file(dest_str))
-            and not dest.samefile(src)
-        ):
-            # destination exists but is identical
-            logging.debug(f"Update: skipping identical exiftool files {src} {dest}")
-            update_skipped_files.append(dest_str)
-        elif dest_exists:
-            # destination exists but is different or is a hardlink
-            logging.debug(f"Update: removing existing file prior to copy {src} {dest}")
-            stat_src = os.stat(src)
-            stat_dest = os.stat(dest)
-            # dest.unlink()
-            fileutil.unlink(dest)
-            fileutil.copy(src, dest_str, norsrc=no_xattr)
-            export_db.set_data(
-                dest_str,
-                self.uuid,
-                fileutil.file_sig(dest_str),
-                (None, None, None),
-                self.json(),
-                None,
-            )
-            exported_files.append(dest_str)
-            update_updated_files.append(dest_str)
-        else:
-            # destination doesn't exist, copy the file
-            logging.debug(f"Update: copying new file {src} {dest}")
-            fileutil.copy(src, dest_str, norsrc=no_xattr)
-            export_db.set_data(
-                dest_str,
-                self.uuid,
-                fileutil.file_sig(dest_str),
-                (None, None, None),
-                self.json(),
-                None,
-            )
-            exported_files.append(dest_str)
-            update_new_files.append(dest_str)
+        op_desc = "export_by_copying"
+
+    if not update:
+        # not update, export the file
+        logging.debug(f"Exporting file with {op_desc} {src} {dest}")
+        exported_files.append(dest_str)
+        if touch_file:
+            sig = fileutil.file_sig(src)
+            sig = (sig[0], sig[1], int(self.date.timestamp()))
+            if not fileutil.cmp_file_sig(src, sig):
+                touched_files.append(dest_str)
+    else:  # updating
+        if not dest_exists:
+            # update, destination doesn't exist (new file)
+            logging.debug(f"Update: exporting new file with {op_desc} {src} {dest}")
+            update_new_files.append(dest_str)
+            if touch_file:
+                touched_files.append(dest_str)
+        else:
+            # update, destination exists, but we might not need to replace it...
+            if exiftool:
+                sig_exif = export_db.get_stat_exif_for_file(dest_str)
+                cmp_orig = fileutil.cmp_file_sig(dest_str, sig_exif)
+                sig_exif = (sig_exif[0], sig_exif[1], int(self.date.timestamp()))
+                cmp_touch = fileutil.cmp_file_sig(dest_str, sig_exif)
+            else:
+                cmp_orig = fileutil.cmp(src, dest)
+                cmp_touch = fileutil.cmp(src, dest, mtime1=int(self.date.timestamp()))
+            sig_cmp = cmp_touch if touch_file else cmp_orig
+            if (export_as_hardlink and dest.samefile(src)) or (
+                not export_as_hardlink and not dest.samefile(src) and sig_cmp
+            ):
+                # destination exists and signatures match, skip it
+                update_skipped_files.append(dest_str)
+            else:
+                # destination exists but signature is different
+                if touch_file and cmp_orig and not cmp_touch:
+                    # destination exists, signature matches original but does not match expected touch time
+                    # skip exporting but update touch time
+                    update_skipped_files.append(dest_str)
+                    touched_files.append(dest_str)
+                elif not touch_file and cmp_touch and not cmp_orig:
+                    # destination exists, signature matches expected touch but not original
+                    # user likely exported with touch_file and is now exporting without touch_file
+                    # don't update the file because it's the same but leave touch time
+                    update_skipped_files.append(dest_str)
+                else:
+                    # destination exists but is different
+                    update_updated_files.append(dest_str)
+                    if touch_file:
+                        touched_files.append(dest_str)
+
+    if not update_skipped_files:
+        if dest_exists and (update or overwrite):
+            # need to remove the destination first
+            logging.debug(
+                f"Update: removing existing file prior to {op_desc} {src} {dest}"
+            )
+            fileutil.unlink(dest)
+        if export_as_hardlink:
+            fileutil.hardlink(src, dest)
+        else:
+            fileutil.copy(src, dest_str, norsrc=no_xattr)
+        export_db.set_data(
+            dest_str,
+            self.uuid,
+            fileutil.file_sig(dest_str),
+            (None, None, None),
+            self.json(),
+            None,
+        )
+
+    if touched_files:
+        ts = int(self.date.timestamp())
+        fileutil.utime(dest, (ts, ts))
+
     return ExportResults(
-        exported_files, update_new_files, update_updated_files, update_skipped_files, []
+        exported_files + update_new_files + update_updated_files,
+        update_new_files,
+        update_updated_files,
+        update_skipped_files,
+        [],
+        touched_files,
     )
@@ -1132,6 +1121,7 @@ def _xmp_sidecar(
     use_persons_as_keywords=False,
     keyword_template=None,
     description_template=None,
+    extension=None,
 ):
     """ returns string for XMP sidecar
     use_albums_as_keywords: treat album names as keywords
@@ -1139,10 +1129,12 @@ def _xmp_sidecar(
     keyword_template: (list of strings); list of template strings to render as keywords
     description_template: string; optional template string that will be rendered for use as photo description """

-    # TODO: add additional fields to XMP file?
     xmp_template = Template(filename=os.path.join(_TEMPLATE_DIR, _XMP_TEMPLATE_NAME))

+    if extension is None:
+        extension = pathlib.Path(self.original_filename)
+        extension = extension.suffix[1:] if extension.suffix else None
+
     if description_template is not None:
         description = self.render_template(
             description_template, expand_inplace=True, inplace_sep=", "
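The fallback above derives the extension from `original_filename` via `pathlib`. `ext_for` below is a hypothetical standalone version of that two-line dance:

```python
import pathlib

def ext_for(filename):
    # pathlib's suffix includes the leading dot ('.jpeg'); strip it, and
    # fall back to None for extension-less names, as _xmp_sidecar now does
    suffix = pathlib.Path(filename).suffix
    return suffix[1:] if suffix else None

assert ext_for("IMG_1234.JPG") == "JPG"
assert ext_for("archive.tar.gz") == "gz"   # suffix is only the last part
assert ext_for("README") is None
```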
@@ -1211,6 +1203,7 @@ def _xmp_sidecar(
         keywords=keyword_list,
         persons=person_list,
         subjects=subject_list,
+        extension=extension,
     )

     # remove extra lines that mako inserts from template


@@ -5,16 +5,12 @@ PhotosDB.photos() returns a list of PhotoInfo objects
 """

 import dataclasses
-import glob
 import json
 import logging
 import os
 import os.path
 import pathlib
-import subprocess
-import sys
 from datetime import timedelta, timezone
-from pprint import pformat

 import yaml
@@ -25,10 +21,11 @@ from .._constants import (
     _PHOTOS_4_ROOT_FOLDER,
     _PHOTOS_4_VERSION,
     _PHOTOS_5_ALBUM_KIND,
+    _PHOTOS_5_IMPORT_SESSION_ALBUM_KIND,
     _PHOTOS_5_SHARED_ALBUM_KIND,
     _PHOTOS_5_SHARED_PHOTO_PATH,
 )
-from ..albuminfo import AlbumInfo
+from ..albuminfo import AlbumInfo, ImportInfo
 from ..personinfo import FaceInfo, PersonInfo
 from ..phototemplate import PhotoTemplate
 from ..placeinfo import PlaceInfo4, PlaceInfo5
@@ -88,7 +85,7 @@ class PhotoInfo:
     def date(self):
         """ image creation date as timezone aware datetime object """
         return self._info["imageDate"]

     @property
     def date_modified(self):
         """ image modification date as timezone aware datetime object
@@ -357,7 +354,7 @@ class PhotoInfo:
         except AttributeError:
             try:
                 faces = self._db._db_faceinfo_uuid[self._uuid]
                 self._faceinfo = [FaceInfo(db=self._db, pk=pk) for pk in faces]
             except KeyError:
                 # no faces
                 self._faceinfo = []
@@ -387,6 +384,19 @@ class PhotoInfo:
         ]
         return self._album_info

+    @property
+    def import_info(self):
+        """ ImportInfo object representing import session for the photo or None if no import session """
+        try:
+            return self._import_info
+        except AttributeError:
+            self._import_info = (
+                ImportInfo(db=self._db, uuid=self._info["import_uuid"])
+                if self._info["import_uuid"] is not None
+                else None
+            )
+            return self._import_info
+
     @property
     def keywords(self):
         """ list of keywords for picture """
@@ -745,7 +755,7 @@ class PhotoInfo:
         """ Return list of album UUIDs this photo is found in
             Filters out albums in the trash and any special album types
            Returns: list of album UUIDs
         """
         if self._db._db_version <= _PHOTOS_4_VERSION:


@@ -4,7 +4,7 @@
 import logging

 from .._constants import _DB_TABLE_NAMES, _PHOTOS_4_VERSION
-from ..utils import _open_sql_file
+from ..utils import _open_sql_file, normalize_unicode
 from .photosdb_utils import get_db_version
@@ -121,7 +121,7 @@ def _process_faceinfo_4(photosdb):
     face["asset_uuid"] = asset_uuid
     face["uuid"] = row[2]
     face["person"] = person_id
-    face["fullname"] = row[3]
+    face["fullname"] = normalize_unicode(row[3])
     face["sourcewidth"] = row[7]
     face["sourceheight"] = row[8]
     face["centerx"] = row[9]
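Per the commit "Normalize unicode for issue #208", strings read from the database are normalized so composed and decomposed forms compare equal. The real `normalize_unicode` lives in `osxphotos.utils`; the `normalize` helper below is an assumed sketch using NFC (the chosen normalization form is an assumption here):

```python
import unicodedata

def normalize(s):
    # assumed sketch: fold to NFC so composed and decomposed forms match
    return unicodedata.normalize("NFC", s) if s is not None else None

composed = "caf\u00e9"      # 'é' as one code point
decomposed = "cafe\u0301"   # 'e' followed by a combining acute accent

assert composed != decomposed
assert normalize(composed) == normalize(decomposed)
```

Without normalization, a person name typed in Photos (often NFC) can fail to match the same name stored decomposed (NFD) by macOS.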
@@ -282,7 +282,7 @@ def _process_faceinfo_5(photosdb):
     face["asset_uuid"] = asset_uuid
     face["uuid"] = row[2]
     face["person"] = person_pk
-    face["fullname"] = row[4]
+    face["fullname"] = normalize_unicode(row[4])
     face["agetype"] = row[5]
     face["baldtype"] = row[6]
     face["eyemakeuptype"] = row[7]


@@ -10,7 +10,7 @@ import uuid as uuidlib
 from pprint import pformat

 from .._constants import _PHOTOS_4_VERSION, SEARCH_CATEGORY_LABEL
-from ..utils import _db_is_locked, _debug, _open_sql_file
+from ..utils import _db_is_locked, _debug, _open_sql_file, normalize_unicode

 """
 This module should be imported in the class definition of PhotosDB in photosdb.py
@@ -112,8 +112,8 @@ def _process_searchinfo(self):
         record["groupid"] = row[3]
         record["category"] = row[4]
         record["owning_groupid"] = row[5]
-        record["content_string"] = row[6].replace("\x00", "")
-        record["normalized_string"] = row[7].replace("\x00", "")
+        record["content_string"] = normalize_unicode(row[6].replace("\x00", ""))
+        record["normalized_string"] = normalize_unicode(row[7].replace("\x00", ""))
         record["lookup_identifier"] = row[8]
         try:
@@ -147,9 +147,10 @@ def _process_searchinfo(self):
             "_db_searchinfo_labels_normalized: \n"
             + pformat(self._db_searchinfo_labels_normalized)
         )

     conn.close()

 @property
 def labels(self):
     """ return list of all search info labels found in the library """


@@ -8,7 +8,6 @@ import os
 import os.path
 import pathlib
 import platform
-import sqlite3
 import sys
 import tempfile
 from datetime import datetime, timedelta, timezone
@@ -26,15 +25,15 @@ from .._constants import (
     _PHOTOS_4_VERSION,
     _PHOTOS_5_ALBUM_KIND,
     _PHOTOS_5_FOLDER_KIND,
+    _PHOTOS_5_IMPORT_SESSION_ALBUM_KIND,
     _PHOTOS_5_ROOT_FOLDER_KIND,
     _PHOTOS_5_SHARED_ALBUM_KIND,
-    _PHOTOS_5_VERSION,
-    _TESTED_DB_VERSIONS,
     _TESTED_OS_VERSIONS,
     _UNKNOWN_PERSON,
+    TIME_DELTA,
 )
 from .._version import __version__
-from ..albuminfo import AlbumInfo, FolderInfo
+from ..albuminfo import AlbumInfo, FolderInfo, ImportInfo
 from ..datetime_utils import datetime_has_tz, datetime_naive_to_local
 from ..personinfo import PersonInfo
 from ..photoinfo import PhotoInfo
@@ -45,8 +44,9 @@ from ..utils import (
     _get_os_version,
     _open_sql_file,
     get_last_library_path,
+    normalize_unicode,
 )
-from .photosdb_utils import get_db_version, get_db_model_version
+from .photosdb_utils import get_db_model_version, get_db_version

 # TODO: Add test for imageTimeZoneOffsetSeconds = None
 # TODO: Add test for __str__
@@ -485,6 +485,18 @@ class PhotosDB:
         self._albums_shared = self._get_albums(shared=True)
         return self._albums_shared

+    @property
+    def import_info(self):
+        """ return list of ImportInfo objects for each import session in the database """
+        try:
+            return self._import_info
+        except AttributeError:
+            self._import_info = [
+                ImportInfo(db=self, uuid=album)
+                for album in self._get_album_uuids(import_session=True)
+            ]
+            return self._import_info
+
     @property
     def db_version(self):
         """ return the database version as stored in LiGlobals table """
@@ -514,6 +526,7 @@ class PhotosDB:
         """ If sqlite shared memory and write-ahead log files exist, those are copied too """
         # required because python's sqlite3 implementation can't read a locked file
         # _, suffix = os.path.splitext(fname)
+        dest_name = dest_path = ""
         try:
             dest_name = pathlib.Path(fname).name
             dest_path = os.path.join(self._tempdir_name, dest_name)
@@ -536,9 +549,6 @@ class PhotosDB:
         """ process the Photos database to extract info
             works on Photos version <= 4.0 """

-        # Epoch is Jan 1, 2001
-        td = (datetime(2001, 1, 1, 0, 0) - datetime(1970, 1, 1, 0, 0)).total_seconds()
-
         (conn, c) = _open_sql_file(self._tmp_db)

         # get info to associate persons with photos
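The per-method `td` computation is replaced by the shared `TIME_DELTA` constant; both express the offset between the Unix epoch (1970) and the Jan 1, 2001 epoch that the Photos database stores timestamps against:

```python
from datetime import datetime, timezone

# seconds between the Unix epoch (1970) and the 2001 epoch Photos uses
TIME_DELTA = (datetime(2001, 1, 1) - datetime(1970, 1, 1)).total_seconds()

assert TIME_DELTA == 978307200.0
# a raw database value of 0 is midnight, Jan 1, 2001
assert datetime.fromtimestamp(0 + TIME_DELTA, tz=timezone.utc) == datetime(
    2001, 1, 1, tzinfo=timezone.utc
)
```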
@@ -685,7 +695,8 @@ class PhotosDB:
                 isInTrash,
                 folderUuid,
                 albumType,
-                albumSubclass
+                albumSubclass,
+                createDate
                 FROM RKAlbum """
         )
@@ -698,11 +709,12 @@ class PhotosDB:
         # 5: folderUuid
         # 6: albumType
         # 7: albumSubclass -- if 3, normal user album
+        # 8: createDate
         for album in c:
             self._dbalbum_details[album[0]] = {
                 "_uuid": album[0],
-                "title": album[1],
+                "title": normalize_unicode(album[1]),
                 "cloudlibrarystate": album[2],
                 "cloudidentifier": album[3],
                 "intrash": False if album[4] == 0 else True,
@@ -715,6 +727,9 @@ class PhotosDB:
                 "albumSubclass": album[7],
                 # for compatibility with Photos 5 where album kind is ZKIND
                 "kind": album[7],
+                "creation_date": album[8],
+                "start_date": None,  # Photos 5 only
+                "end_date": None,  # Photos 5 only
             }

         # get details about folders
@@ -746,7 +761,7 @@ class PhotosDB:
             self._dbfolder_details[uuid] = {
                 "_uuid": row[0],
                 "modelId": row[1],
-                "name": row[2],
+                "name": normalize_unicode(row[2]),
                 "isMagic": row[3],
                 "intrash": row[4],
                 "folderType": row[5],
@@ -918,9 +933,10 @@ class PhotosDB:
             # There are sometimes negative values for lastmodifieddate in the database
             # I don't know what these mean but they will raise exception in datetime if
             # not accounted for
+            self._dbphotos[uuid]["lastmodifieddate_timestamp"] = row[4]
             try:
                 self._dbphotos[uuid]["lastmodifieddate"] = datetime.fromtimestamp(
-                    row[4] + td
+                    row[4] + TIME_DELTA
                 )
             except ValueError:
                 self._dbphotos[uuid]["lastmodifieddate"] = None
@@ -928,9 +944,10 @@ class PhotosDB:
self._dbphotos[uuid]["lastmodifieddate"] = None self._dbphotos[uuid]["lastmodifieddate"] = None
self._dbphotos[uuid]["imageTimeZoneOffsetSeconds"] = row[9] self._dbphotos[uuid]["imageTimeZoneOffsetSeconds"] = row[9]
self._dbphotos[uuid]["imageDate_timestamp"] = row[5]
try: try:
imagedate = datetime.fromtimestamp(row[5] + td) imagedate = datetime.fromtimestamp(row[5] + TIME_DELTA)
seconds = self._dbphotos[uuid]["imageTimeZoneOffsetSeconds"] or 0 seconds = self._dbphotos[uuid]["imageTimeZoneOffsetSeconds"] or 0
delta = timedelta(seconds=seconds) delta = timedelta(seconds=seconds)
tz = timezone(delta) tz = timezone(delta)
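The `TIME_DELTA` constant that replaces the local `td` variable in this hunk is the offset between the Photos database epoch (Jan 1, 2001, the Core Data epoch) and the Unix epoch. A minimal standalone sketch of the conversion; the helper name `photos_timestamp_to_datetime` is illustrative, not the module's actual API:

```python
from datetime import datetime, timedelta, timezone

# Photos stores dates as seconds since Jan 1, 2001 (the Core Data epoch);
# TIME_DELTA shifts them onto the Unix epoch expected by fromtimestamp().
TIME_DELTA = (datetime(2001, 1, 1, 0, 0) - datetime(1970, 1, 1, 0, 0)).total_seconds()


def photos_timestamp_to_datetime(ts, tz_offset_seconds=0):
    """Convert a raw Photos timestamp to a timezone-aware datetime."""
    tz = timezone(timedelta(seconds=tz_offset_seconds))
    return datetime.fromtimestamp(ts + TIME_DELTA).astimezone(tz)


print(TIME_DELTA)  # 978307200.0
```

A raw timestamp of `0` therefore maps to `2001-01-01T00:00:00` UTC, which is why negative or `None` values in the database need the `ValueError`/`TypeError` guards seen in the diff.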
@@ -947,7 +964,7 @@ class PhotosDB:
self._dbphotos[uuid]["volumeId"] = row[10] self._dbphotos[uuid]["volumeId"] = row[10]
self._dbphotos[uuid]["imagePath"] = row[11] self._dbphotos[uuid]["imagePath"] = row[11]
self._dbphotos[uuid]["extendedDescription"] = row[12] self._dbphotos[uuid]["extendedDescription"] = row[12]
self._dbphotos[uuid]["name"] = row[13] self._dbphotos[uuid]["name"] = normalize_unicode(row[13])
self._dbphotos[uuid]["isMissing"] = row[14] self._dbphotos[uuid]["isMissing"] = row[14]
self._dbphotos[uuid]["originalFilename"] = row[15] self._dbphotos[uuid]["originalFilename"] = row[15]
self._dbphotos[uuid]["favorite"] = row[16] self._dbphotos[uuid]["favorite"] = row[16]
@@ -1066,6 +1083,11 @@ class PhotosDB:
self._dbphotos[uuid]["original_orientation"] = row[38] self._dbphotos[uuid]["original_orientation"] = row[38]
self._dbphotos[uuid]["original_filesize"] = row[39] self._dbphotos[uuid]["original_filesize"] = row[39]
# import session not yet handled for Photos 4
self._dbphotos[uuid]["import_session"] = None
self._dbphotos[uuid]["import_uuid"] = None
self._dbphotos[uuid]["fok_import_session"] = None
# get additional details from RKMaster, needed for RAW processing # get additional details from RKMaster, needed for RAW processing
c.execute( c.execute(
""" SELECT """ SELECT
@@ -1419,16 +1441,17 @@ class PhotosDB:
         if _debug():
             logging.debug(f"_process_database5")

-        # Epoch is Jan 1, 2001
-        td = (datetime(2001, 1, 1, 0, 0) - datetime(1970, 1, 1, 0, 0)).total_seconds()
+        (conn, c) = _open_sql_file(self._tmp_db)

+        # some of the tables/columns have different names in different versions of Photos
         photos_ver = get_db_model_version(self._tmp_db)
         self._photos_ver = photos_ver
         asset_table = _DB_TABLE_NAMES[photos_ver]["ASSET"]
         keyword_join = _DB_TABLE_NAMES[photos_ver]["KEYWORD_JOIN"]
         album_join = _DB_TABLE_NAMES[photos_ver]["ALBUM_JOIN"]
-        (conn, c) = _open_sql_file(self._tmp_db)
+        album_sort = _DB_TABLE_NAMES[photos_ver]["ALBUM_SORT_ORDER"]
+        import_fok = _DB_TABLE_NAMES[photos_ver]["IMPORT_FOK"]
+        depth_state = _DB_TABLE_NAMES[photos_ver]["DEPTH_STATE"]

         # Look for all combinations of persons and pictures
         if _debug():
@@ -1539,7 +1562,7 @@ class PhotosDB:
f""" SELECT f""" SELECT
ZGENERICALBUM.ZUUID, ZGENERICALBUM.ZUUID,
{asset_table}.ZUUID, {asset_table}.ZUUID,
{album_join} {album_sort}
FROM {asset_table} FROM {asset_table}
JOIN Z_26ASSETS ON {album_join} = {asset_table}.Z_PK JOIN Z_26ASSETS ON {album_join} = {asset_table}.Z_PK
JOIN ZGENERICALBUM ON ZGENERICALBUM.Z_PK = Z_26ASSETS.Z_26ALBUMS JOIN ZGENERICALBUM ON ZGENERICALBUM.Z_PK = Z_26ASSETS.Z_26ALBUMS
@@ -1577,13 +1600,16 @@ class PhotosDB:
"ZKIND, " # 6 "ZKIND, " # 6
"ZPARENTFOLDER, " # 7 "ZPARENTFOLDER, " # 7
"Z_PK, " # 8 "Z_PK, " # 8
"ZTRASHEDSTATE " # 9 "ZTRASHEDSTATE, " # 9
"ZCREATIONDATE, " # 10
"ZSTARTDATE, " # 11
"ZENDDATE " # 12
"FROM ZGENERICALBUM " "FROM ZGENERICALBUM "
) )
for album in c: for album in c:
self._dbalbum_details[album[0]] = { self._dbalbum_details[album[0]] = {
"_uuid": album[0], "_uuid": album[0],
"title": album[1], "title": normalize_unicode(album[1]),
"cloudlocalstate": album[2], "cloudlocalstate": album[2],
"cloudownerfirstname": album[3], "cloudownerfirstname": album[3],
"cloudownderlastname": album[4], "cloudownderlastname": album[4],
@@ -1594,6 +1620,9 @@ class PhotosDB:
"parentfolder": album[7], "parentfolder": album[7],
"pk": album[8], "pk": album[8],
"intrash": False if album[9] == 0 else True, "intrash": False if album[9] == 0 else True,
"creation_date": album[10],
"start_date": album[11],
"end_date": album[12],
} }
# add cross-reference by pk to uuid # add cross-reference by pk to uuid
@@ -1655,12 +1684,13 @@ class PhotosDB:
                 JOIN ZKEYWORD ON ZKEYWORD.Z_PK = {keyword_join} """
         )

         for keyword in c:
+            keyword_title = normalize_unicode(keyword[0])
             if not keyword[1] in self._dbkeywords_uuid:
                 self._dbkeywords_uuid[keyword[1]] = []
-            if not keyword[0] in self._dbkeywords_keyword:
-                self._dbkeywords_keyword[keyword[0]] = []
+            if not keyword_title in self._dbkeywords_keyword:
+                self._dbkeywords_keyword[keyword_title] = []
             self._dbkeywords_uuid[keyword[1]].append(keyword[0])
-            self._dbkeywords_keyword[keyword[0]].append(keyword[1])
+            self._dbkeywords_keyword[keyword_title].append(keyword[1])

         if _debug():
             logging.debug(f"Finished walking through keywords")
@@ -1714,7 +1744,8 @@ class PhotosDB:
                 ZADDITIONALASSETATTRIBUTES.ZORIGINALHEIGHT,
                 ZADDITIONALASSETATTRIBUTES.ZORIGINALWIDTH,
                 ZADDITIONALASSETATTRIBUTES.ZORIGINALORIENTATION,
-                ZADDITIONALASSETATTRIBUTES.ZORIGINALFILESIZE
+                ZADDITIONALASSETATTRIBUTES.ZORIGINALFILESIZE,
+                {depth_state}
                 FROM {asset_table}
                 JOIN ZADDITIONALASSETATTRIBUTES ON ZADDITIONALASSETATTRIBUTES.ZASSET = {asset_table}.Z_PK
                 ORDER BY {asset_table}.ZUUID """
@@ -1757,6 +1788,7 @@ class PhotosDB:
         # 33    ZADDITIONALASSETATTRIBUTES.ZORIGINALWIDTH,
         # 34    ZADDITIONALASSETATTRIBUTES.ZORIGINALORIENTATION,
         # 35    ZADDITIONALASSETATTRIBUTES.ZORIGINALFILESIZE
+        # 36    ZGENERICASSET.ZDEPTHSTATES / ZASSET.ZDEPTHTYPE

         for row in c:
             uuid = row[0]
@@ -1765,22 +1797,24 @@ class PhotosDB:
info["modelID"] = None info["modelID"] = None
info["masterUuid"] = None info["masterUuid"] = None
info["masterFingerprint"] = row[1] info["masterFingerprint"] = row[1]
info["name"] = row[2] info["name"] = normalize_unicode(row[2])
# There are sometimes negative values for lastmodifieddate in the database # There are sometimes negative values for lastmodifieddate in the database
# I don't know what these mean but they will raise exception in datetime if # I don't know what these mean but they will raise exception in datetime if
# not accounted for # not accounted for
info["lastmodifieddate_timestamp"] = row[4]
try: try:
info["lastmodifieddate"] = datetime.fromtimestamp(row[4] + td) info["lastmodifieddate"] = datetime.fromtimestamp(row[4] + TIME_DELTA)
except ValueError: except ValueError:
info["lastmodifieddate"] = None info["lastmodifieddate"] = None
except TypeError: except TypeError:
info["lastmodifieddate"] = None info["lastmodifieddate"] = None
info["imageTimeZoneOffsetSeconds"] = row[6] info["imageTimeZoneOffsetSeconds"] = row[6]
info["imageDate_timestamp"] = row[5]
try: try:
imagedate = datetime.fromtimestamp(row[5] + td) imagedate = datetime.fromtimestamp(row[5] + TIME_DELTA)
seconds = info["imageTimeZoneOffsetSeconds"] or 0 seconds = info["imageTimeZoneOffsetSeconds"] or 0
delta = timedelta(seconds=seconds) delta = timedelta(seconds=seconds)
tz = timezone(delta) tz = timezone(delta)
@@ -1876,10 +1910,10 @@ class PhotosDB:
             # 3 = HDR photo
             # 4 = non-HDR version of the photo
             # 6 = panorama
-            # 8 = portrait
+            # > 6 = portrait (sometimes, see ZDEPTHSTATE/ZDEPTHTYPE)
             info["customRenderedValue"] = row[22]
             info["hdr"] = True if row[22] == 3 else False
-            info["portrait"] = True if row[22] == 8 else False
+            info["portrait"] = True if row[36] != 0 else False
             # Set panorama from either KindSubType or RenderedValue
             info["panorama"] = True if row[21] == 1 or row[22] == 6 else False
@@ -1925,6 +1959,12 @@ class PhotosDB:
info["original_orientation"] = row[34] info["original_orientation"] = row[34]
info["original_filesize"] = row[35] info["original_filesize"] = row[35]
# initialize import session info which will be filled in later
# not every photo has an import session so initialize all records now
info["import_session"] = None
info["fok_import_session"] = None
info["import_uuid"] = None
# associated RAW image info # associated RAW image info
# will be filled in later # will be filled in later
info["has_raw"] = False info["has_raw"] = False
@@ -1951,6 +1991,32 @@ class PhotosDB:
         # else:
         #     info["burst"] = False

+        # get info on import sessions
+        # 0     ZGENERICASSET.ZUUID
+        # 1     ZGENERICASSET.ZIMPORTSESSION
+        # 2     ZGENERICASSET.Z_FOK_IMPORTSESSION
+        # 3     ZGENERICALBUM.ZUUID,
+        c.execute(
+            f"""SELECT
+                {asset_table}.ZUUID,
+                {asset_table}.ZIMPORTSESSION,
+                {import_fok},
+                ZGENERICALBUM.ZUUID
+                FROM
+                {asset_table}
+                JOIN ZGENERICALBUM ON ZGENERICALBUM.Z_PK = {asset_table}.ZIMPORTSESSION
+            """
+        )
+        for row in c:
+            uuid = row[0]
+            try:
+                self._dbphotos[uuid]["import_session"] = row[1]
+                self._dbphotos[uuid]["fok_import_session"] = row[2]
+                self._dbphotos[uuid]["import_uuid"] = row[3]
+            except KeyError:
+                logging.debug(f"No info record for uuid {uuid} for import session")
+
         # Get extended description
         c.execute(
             f"""SELECT {asset_table}.ZUUID,
@@ -1963,7 +2029,7 @@ class PhotosDB:
         for row in c:
             uuid = row[0]
             if uuid in self._dbphotos:
-                self._dbphotos[uuid]["extendedDescription"] = row[1]
+                self._dbphotos[uuid]["extendedDescription"] = normalize_unicode(row[1])
             else:
                 if _debug():
                     logging.debug(
@@ -2362,16 +2428,26 @@ class PhotosDB:
         hierarchy = _recurse_folder_hierarchy(folders)
         return hierarchy

-    def _get_album_uuids(self, shared=False):
+    def _get_album_uuids(self, shared=False, import_session=False):
         """ Return list of album UUIDs found in photos database

         Filters out albums in the trash and any special album types

         Args:
             shared: boolean; if True, returns shared albums, else normal albums
+            import_session: boolean, if True, returns import session albums, else normal or shared albums
+
+        Note: flags (shared, import_session) are mutually exclusive
+
+        Raises:
+            ValueError: raised if mutually exclusive flags passed

         Returns: list of album UUIDs
         """
+        if shared and import_session:
+            raise ValueError(
+                "flags are mutually exclusive: pass zero or one of shared, import_session"
+            )
+
         if self._db_version <= _PHOTOS_4_VERSION:
             version4 = True
             if shared:
@@ -2379,11 +2455,21 @@ class PhotosDB:
f"Shared albums not implemented for Photos library version {self._db_version}" f"Shared albums not implemented for Photos library version {self._db_version}"
) )
return [] # not implemented for _PHOTOS_4_VERSION return [] # not implemented for _PHOTOS_4_VERSION
elif import_session:
logging.warning(
f"Import sessions not implemented for Photos library version {self._db_version}"
)
return [] # not implemented for _PHOTOS_4_VERSION
else: else:
album_kind = _PHOTOS_4_ALBUM_KIND album_kind = _PHOTOS_4_ALBUM_KIND
else: else:
version4 = False version4 = False
album_kind = _PHOTOS_5_SHARED_ALBUM_KIND if shared else _PHOTOS_5_ALBUM_KIND if shared:
album_kind = _PHOTOS_5_SHARED_ALBUM_KIND
elif import_session:
album_kind = _PHOTOS_5_IMPORT_SESSION_ALBUM_KIND
else:
album_kind = _PHOTOS_5_ALBUM_KIND
album_list = [] album_list = []
# look through _dbalbum_details because _dbalbums_album won't have empty albums it # look through _dbalbum_details because _dbalbums_album won't have empty albums it
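The rewritten `album_kind` selection reads as an explicit if/elif/else chain now that there are three album kinds, guarded by the mutual-exclusivity check added at the top of the method. A standalone sketch of the same selection logic; the numeric kind values here are illustrative placeholders, not the constants from osxphotos `_constants`:

```python
def pick_album_kind(shared=False, import_session=False):
    """Select an album ZKIND value; flags are mutually exclusive."""
    # hypothetical placeholder values for illustration only
    _PHOTOS_5_ALBUM_KIND = 2
    _PHOTOS_5_SHARED_ALBUM_KIND = 1505
    _PHOTOS_5_IMPORT_SESSION_ALBUM_KIND = 1506

    if shared and import_session:
        raise ValueError(
            "flags are mutually exclusive: pass zero or one of shared, import_session"
        )
    if shared:
        return _PHOTOS_5_SHARED_ALBUM_KIND
    if import_session:
        return _PHOTOS_5_IMPORT_SESSION_ALBUM_KIND
    return _PHOTOS_5_ALBUM_KIND
```

Raising `ValueError` on conflicting flags keeps the error at the call site instead of silently preferring one flag over the other.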


@@ -621,6 +621,9 @@ class PhotoTemplate:
""" return list of values for a multi-valued template field """ """ return list of values for a multi-valued template field """
if field == "album": if field == "album":
values = self.photo.albums values = self.photo.albums
values = [
value.replace("/", ":") for value in values
] # TODO: temp fix for issue #213
elif field == "keyword": elif field == "keyword":
values = self.photo.keywords values = self.photo.keywords
elif field == "person": elif field == "person":
@@ -638,11 +641,13 @@ class PhotoTemplate:
                 if album.folder_names:
                     # album in folder
                     folder = path_sep.join(album.folder_names)
-                    folder += path_sep + album.title
+                    folder += path_sep + album.title.replace(
+                        "/", ":"
+                    )  # TODO: temp fix for issue #213
                     values.append(folder)
                 else:
                     # album not in folder
-                    values.append(album.title)
+                    values.append(album.title.replace("/", ":"))
         else:
             raise ValueError(f"Unhandleded template value: {field}")


@@ -11,6 +11,9 @@ from collections import namedtuple # pylint: disable=syntax-error
 import yaml
 from bpylist import archiver

+from ._constants import UNICODE_FORMAT
+from .utils import normalize_unicode
+
 # postal address information, returned by PlaceInfo.address
 PostalAddress = namedtuple(
     "PostalAddress",
@@ -76,12 +79,12 @@ class PLRevGeoLocationInfo:
         geoServiceProvider,
         postalAddress,
     ):
-        self.addressString = addressString
+        self.addressString = normalize_unicode(addressString)
         self.countryCode = countryCode
         self.mapItem = mapItem
         self.isHome = isHome
-        self.compoundNames = compoundNames
-        self.compoundSecondaryNames = compoundSecondaryNames
+        self.compoundNames = normalize_unicode(compoundNames)
+        self.compoundSecondaryNames = normalize_unicode(compoundSecondaryNames)
         self.version = version
         self.geoServiceProvider = geoServiceProvider
         self.postalAddress = postalAddress
@@ -183,7 +186,7 @@ class PLRevGeoMapItemAdditionalPlaceInfo:
     def __init__(self, area, name, placeType, dominantOrderType):
         self.area = area
-        self.name = name
+        self.name = normalize_unicode(name)
         self.placeType = placeType
         self.dominantOrderType = dominantOrderType
@@ -232,13 +235,13 @@ class CNPostalAddress:
         _subLocality,
     ):
         self._ISOCountryCode = _ISOCountryCode
-        self._city = _city
-        self._country = _country
-        self._postalCode = _postalCode
-        self._state = _state
-        self._street = _street
-        self._subAdministrativeArea = _subAdministrativeArea
-        self._subLocality = _subLocality
+        self._city = normalize_unicode(_city)
+        self._country = normalize_unicode(_country)
+        self._postalCode = normalize_unicode(_postalCode)
+        self._state = normalize_unicode(_state)
+        self._street = normalize_unicode(_street)
+        self._subAdministrativeArea = normalize_unicode(_subAdministrativeArea)
+        self._subLocality = normalize_unicode(_subLocality)

     def __eq__(self, other):
         return all(
@@ -414,9 +417,9 @@ class PlaceInfo4(PlaceInfo):
             # 2: type
             # 3: area
             try:
-                places_dict[p[2]].append((p[1], p[3]))
+                places_dict[p[2]].append((normalize_unicode(p[1]), p[3]))
             except KeyError:
-                places_dict[p[2]] = [(p[1], p[3])]
+                places_dict[p[2]] = [(normalize_unicode(p[1]), p[3])]

         # build list to populate PlaceNames tuple
         # initialize with empty lists for each field in PlaceNames


@@ -1,5 +1,13 @@
 <!-- Created with osxphotos https://github.com/RhetTbull/osxphotos -->
+<%def name="photoshop_sidecar_for_extension(extension)">
+    % if extension is None:
+<photoshop:SidecarForExtension></photoshop:SidecarForExtension>
+    % else:
+<photoshop:SidecarForExtension>${extension}</photoshop:SidecarForExtension>
+    % endif
+</%def>
 <%def name="dc_description(desc)">
     % if desc is None:
 <dc:description></dc:description>
@@ -86,6 +94,7 @@
<rdf:Description rdf:about="" <rdf:Description rdf:about=""
xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:photoshop="http://ns.adobe.com/photoshop/1.0/"> xmlns:photoshop="http://ns.adobe.com/photoshop/1.0/">
${photoshop_sidecar_for_extension(extension)}
${dc_description(description)} ${dc_description(description)}
${dc_title(photo.title)} ${dc_title(photo.title)}
${dc_subject(subjects)} ${dc_subject(subjects)}


@@ -10,6 +10,7 @@ import sqlite3
 import subprocess
 import sys
 import tempfile
+import unicodedata
 import urllib.parse
 from plistlib import load as plistload
@@ -18,6 +19,7 @@ import CoreServices
 import objc
 from Foundation import *

+from ._constants import UNICODE_FORMAT
 from .fileutil import FileUtil

 _DEBUG = False
@@ -352,3 +354,13 @@ def _db_is_locked(dbname):
 #     attr = xattr.xattr(filepath)
 #     uuid_bytes = bytes(uuid, 'utf-8')
 #     attr.set(OSXPHOTOS_XATTR_UUID, uuid_bytes)
+
+
+def normalize_unicode(value):
+    """ normalize unicode data """
+    if value is not None:
+        if not isinstance(value, str):
+            raise ValueError("value must be str")
+        return unicodedata.normalize(UNICODE_FORMAT, value)
+    else:
+        return None
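The point of `normalize_unicode` is to make strings that render identically also compare equal: macOS and the Photos database can store the same accented character either precomposed or as a base letter plus a combining mark (issue #208). A self-contained sketch, assuming `UNICODE_FORMAT` is `"NFC"` (the actual value comes from osxphotos `_constants`):

```python
import unicodedata

UNICODE_FORMAT = "NFC"  # assumed; the real value is defined in osxphotos._constants


def normalize_unicode(value):
    """normalize unicode data"""
    if value is not None:
        if not isinstance(value, str):
            raise ValueError("value must be str")
        return unicodedata.normalize(UNICODE_FORMAT, value)
    return None


composed = "caf\u00e9"      # é as a single precomposed code point (U+00E9)
decomposed = "cafe\u0301"   # e followed by a combining acute accent (U+0301)
assert composed != decomposed                                   # raw strings differ
assert normalize_unicode(composed) == normalize_unicode(decomposed)  # normalized forms match
```

This is why the diff routes album titles, keywords, folder names, and place names through the same helper: lookups keyed on those strings would otherwise miss entries that differ only in normalization form.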


@@ -17,6 +17,9 @@ Some of the export tests rely on photos in my local library and will look for `O
 One test for locale does not run on GitHub's automated workflow and will look for `OSXPHOTOS_TEST_LOCALE=1` to determine if it should be run. If you want to run this test, set the environment variable.

+## Test Photo Libraries
+
+**Important**: The test code uses several test photo libraries created on various versions of macOS. If you need to inspect one of these or modify one for a test, make a copy of the library (for example, copy it to your ~/Pictures folder), then open the copy in Photos. Once done, copy the revised library back to the tests/ folder. If you do not do this, the Photos background process photoanalysisd will forever try to process the library, resulting in updates to the database that will cause git to see changes to the file you didn't intend. I'm not aware of any way to disassociate photoanalysisd from the library once you've opened it in Photos.
+
 ## Attribution ##

 These tests utilize a test Photos library. The test library is populated with photos from [flickr](https://www.flickr.com) and from my own photo library. All images used are licensed under the Creative Commons 2.0 Attribution [license](https://creativecommons.org/licenses/by/2.0/).


@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>LibrarySchemaVersion</key>
<integer>5001</integer>
<key>MetaSchemaVersion</key>
<integer>3</integer>
</dict>
</plist>


@@ -0,0 +1,16 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>hostname</key>
<string>Rhets-MacBook-Pro.local</string>
<key>hostuuid</key>
<string>9575E48B-8D5F-5654-ABAC-4431B1167324</string>
<key>pid</key>
<integer>80508</integer>
<key>processname</key>
<string>photolibraryd</string>
<key>uid</key>
<integer>501</integer>
</dict>
</plist>


@@ -0,0 +1,26 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>insertAlbum</key>
<array/>
<key>insertAsset</key>
<array/>
<key>insertHighlight</key>
<array/>
<key>insertMemory</key>
<array/>
<key>insertMoment</key>
<array/>
<key>removeAlbum</key>
<array/>
<key>removeAsset</key>
<array/>
<key>removeHighlight</key>
<array/>
<key>removeMemory</key>
<array/>
<key>removeMoment</key>
<array/>
</dict>
</plist>


@@ -0,0 +1,14 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>embeddingVersion</key>
<string>1</string>
<key>localeIdentifier</key>
<string>en_US</string>
<key>sceneTaxonomySHA</key>
<string>87914a047c69fbe8013fad2c70fa70c6c03b08b56190fe4054c880e6b9f57cc3</string>
<key>searchIndexVersion</key>
<string>10</string>
</dict>
</plist>


@@ -0,0 +1,26 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>MigrationService</key>
<dict>
<key>State</key>
<integer>4</integer>
</dict>
<key>MigrationService.LastCompletedTask</key>
<integer>12</integer>
<key>MigrationService.ValidationCounts</key>
<dict>
<key>MigrationDetectedFaceprint</key>
<integer>6</integer>
<key>MigrationManagedAsset</key>
<integer>0</integer>
<key>MigrationSceneClassification</key>
<integer>44</integer>
<key>MigrationUnmanagedAdjustment</key>
<integer>0</integer>
<key>RDVersion.cloudLocalState.CPLIsNotPushed</key>
<integer>7</integer>
</dict>
</dict>
</plist>


@@ -0,0 +1,53 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>CollapsedSidebarSectionIdentifiers</key>
<array/>
<key>ExpandedSidebarItemIdentifiers</key>
<array>
<string>92D68107-B6C7-453B-96D2-97B0F26D5B8B/L0/020</string>
<string>88A5F8B8-5B9A-43C7-BB85-3952B81580EB/L0/020</string>
<string>29EF7A97-7E76-4D5F-A5E0-CC0A93E8524C/L0/020</string>
<string>2C2AF115-BD1D-4434-A747-D1C8BD8E2045/L0/020</string>
<string>CB051A4C-2CB7-4B90-B59B-08CC4D0C2823/L0/020</string>
</array>
<key>Photos</key>
<dict>
<key>CollapsedSidebarSectionIdentifiers</key>
<array/>
<key>ExpandedSidebarItemIdentifiers</key>
<array>
<string>TopLevelAlbums</string>
<string>TopLevelSlideshows</string>
</array>
<key>IPXWorkspaceControllerZoomLevelsKey</key>
<dict>
<key>kZoomLevelIdentifierAlbums</key>
<integer>7</integer>
<key>kZoomLevelIdentifierVersions</key>
<integer>7</integer>
</dict>
<key>lastAddToDestination</key>
<dict>
<key>key</key>
<integer>1</integer>
<key>lastKnownDisplayName</key>
<string>September 28, 2018</string>
<key>type</key>
<string>album</string>
<key>uuid</key>
<string>DFFKmHt3Tk+AGzZLe2Xq+g</string>
</dict>
<key>lastKnownItemCounts</key>
<dict>
<key>other</key>
<integer>0</integer>
<key>photos</key>
<integer>7</integer>
<key>videos</key>
<integer>0</integer>
</dict>
</dict>
</dict>
</plist>


@@ -0,0 +1,26 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>BackgroundHighlightCollection</key>
<date>2020-06-24T04:02:13Z</date>
<key>BackgroundHighlightEnrichment</key>
<date>2020-06-24T04:02:12Z</date>
<key>BackgroundJobAssetRevGeocode</key>
<date>2020-06-24T04:02:13Z</date>
<key>BackgroundJobSearch</key>
<date>2020-06-24T04:02:13Z</date>
<key>BackgroundPeopleSuggestion</key>
<date>2020-06-24T04:02:12Z</date>
<key>BackgroundUserBehaviorProcessor</key>
<date>2020-06-24T04:02:13Z</date>
<key>PhotoAnalysisGraphLastBackgroundGraphConsistencyUpdateJobDateKey</key>
<date>2020-05-30T02:16:06Z</date>
<key>PhotoAnalysisGraphLastBackgroundGraphRebuildJobDate</key>
<date>2020-05-29T04:31:37Z</date>
<key>PhotoAnalysisGraphLastBackgroundMemoryGenerationJobDate</key>
<date>2020-06-24T04:02:13Z</date>
<key>SiriPortraitDonation</key>
<date>2020-06-24T04:02:13Z</date>
</dict>
</plist>


@@ -0,0 +1,8 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>revgeoprovider</key>
<string>7618</string>
</dict>
</plist>

Some files were not shown because too many files have changed in this diff.