init exiv2

Yaha 2023-06-02 00:25:13 +08:00
commit c25967a256
1448 changed files with 721418 additions and 0 deletions

.clang-format.optional Normal file

@ -0,0 +1,28 @@
---
BasedOnStyle: Google
Language: Cpp
Standard: Cpp03
TabWidth: 4
UseTab: Never
ColumnLimit: 120
NamespaceIndentation: All
AccessModifierOffset: -4
ContinuationIndentWidth: 4
IndentWidth: 4
BreakBeforeBraces: Custom
BraceWrapping:
AfterStruct: true
AfterClass: true
AfterFunction: true
AfterControlStatement: false
AfterEnum: true
AfterNamespace: true
AllowShortFunctionsOnASingleLine: None
AllowShortBlocksOnASingleLine: false
AllowShortIfStatementsOnASingleLine: false
AllowShortLoopsOnASingleLine: false
...
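
As a quick hand-written illustration (not a file from this commit), C++ formatted under the options above puts braces on their own line after namespaces, classes and functions but not after control statements, indents namespace contents, and uses 4-space indentation throughout:

```cpp
// Illustrative sketch only: the layout produced by the settings above.
namespace Exiv2
{
    class Example
    {
    public:
        int sum(int a, int b)
        {
            if (a > 0) {  // AfterControlStatement: false keeps this brace on the same line
                return a + b;
            }
            return b;
        }
    };
}  // namespace Exiv2
```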

.github/CODEOWNERS vendored Normal file

@ -0,0 +1,40 @@
contrib/vs2019 @sridharb1
# https://help.github.com/en/github/creating-cloning-and-archiving-repositories/about-code-owners
# # This is a comment.
# # Each line is a file pattern followed by one or more owners.
#
# # These owners will be the default owners for everything in
# # the repo. Unless a later match takes precedence,
# # @global-owner1 and @global-owner2 will be requested for
# # review when someone opens a pull request.
# * @global-owner1 @global-owner2
#
# # Order is important; the last matching pattern takes the most
# # precedence. When someone opens a pull request that only
# # modifies JS files, only @js-owner and not the global
# # owner(s) will be requested for a review.
# *.js @js-owner
#
# # You can also use email addresses if you prefer. They'll be
# # used to look up users just like we do for commit author
# # emails.
# *.go docs@example.com
#
# # In this example, @doctocat owns any files in the build/logs
# # directory at the root of the repository and any of its
# # subdirectories.
# /build/logs/ @doctocat
#
# # The `docs/*` pattern will match files like
# # `docs/getting-started.md` but not further nested files like
# # `docs/build-app/troubleshooting.md`.
# docs/* docs@example.com
#
# # In this example, @octocat owns any file in an apps directory
# # anywhere in your repository.
# apps/ @octocat
#
# # In this example, @doctocat owns any file in the `/docs`
# # directory in the root of your repository.
# /docs/ @doctocat


@ -0,0 +1,3 @@
# Reusing existing QL Pack
- import: codeql-suites/cpp-code-scanning.qls
from: codeql-cpp


@ -0,0 +1,23 @@
<!DOCTYPE qhelp SYSTEM "qhelp.dtd">
<qhelp>
<overview>
<p>
A C++ iterator is a lot like a C pointer: if you dereference it without
first checking that it's valid then it can cause a crash.
</p>
</overview>
<recommendation>
<p>
Always check that the iterator is valid before dereferencing it.
</p>
</recommendation>
<example>
<p>
<a href="https://github.com/Exiv2/exiv2/issues/1763">Issue 1763</a> was caused by
<a href="https://github.com/Exiv2/exiv2/blob/9b3ed3f9564b4ea51b43c78671435bde6b862e08/src/canonmn_int.cpp#L2755">this
dereference</a> of the iterator <tt>pos</tt>.
The bug was <a href="https://github.com/Exiv2/exiv2/pull/1767">fixed</a> by not dereferencing
<tt>pos</tt> if <tt>pos == metadata->end()</tt>.
</p>
</example>
</qhelp>
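
The recommended pattern boils down to the sketch below. It is a minimal, self-contained illustration that uses `std::map::find` in place of Exiv2's `findKey`, so the names here are illustrative rather than the library's API:

```cpp
#include <iostream>
#include <map>
#include <string>

int main() {
    std::map<std::string, int> metadata{{"Exif.Image.Make", 1}};

    auto pos = metadata.find("Exif.Photo.LensModel");
    if (pos == metadata.end()) {       // validate the iterator first
        std::cout << "key not found\n";
        return 0;
    }
    std::cout << pos->second << '\n';  // safe: only dereferenced after the check
    return 0;
}
```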


@ -0,0 +1,47 @@
/**
* @name NULL iterator deref
* @description Dereferencing an iterator without checking that it's valid could cause a crash.
* @kind problem
* @problem.severity warning
* @id cpp/null-iterator-deref
* @tags security
* external/cwe/cwe-476
*/
import cpp
import semmle.code.cpp.controlflow.Guards
import semmle.code.cpp.dataflow.DataFlow
// Holds if `cond` is a condition like `use == table.end()`.
// `eq_is_true` is `true` for `==`, `false` for `!=`.
// Note: the `==` is actually an overloaded `operator==`.
predicate end_condition(GuardCondition cond, Expr use, FunctionCall endCall, boolean eq_is_true) {
exists(FunctionCall eq |
exists(string opName | eq.getTarget().getName() = opName |
opName = "operator==" and eq_is_true = true
or
opName = "operator!=" and eq_is_true = false
) and
DataFlow::localExprFlow(use, eq.getAnArgument()) and
DataFlow::localExprFlow(endCall, eq.getAnArgument()) and
endCall.getTarget().getName() = "end" and
DataFlow::localExprFlow(eq, cond)
)
}
from FunctionCall call, Expr use
where
call.getTarget().getName() = "findKey" and
DataFlow::localExprFlow(call, use) and
use != call and
not use.(AssignExpr).getRValue() = call and
not end_condition(_, use, _, _) and
not exists(
Expr cond_use, FunctionCall endCall, GuardCondition cond, BasicBlock block, boolean branch
|
end_condition(cond, cond_use, endCall, branch) and
DataFlow::localExprFlow(call, cond_use) and
cond.controls(block, branch.booleanNot()) and
block.contains(use)
)
select call, "Iterator returned by findKey might cause a null deref $@.", use, "here"


@ -0,0 +1,4 @@
name: exiv2-cpp-queries
version: 0.0.0
libraryPathDependencies: codeql-cpp
suites: exiv2-cpp-suite


@ -0,0 +1,30 @@
<!DOCTYPE qhelp SYSTEM "qhelp.dtd">
<qhelp>
<overview>
<p>
The <a href="https://en.cppreference.com/w/cpp/container/vector/operator_at"><tt>operator[]</tt></a> method of <a href="https://en.cppreference.com/w/cpp/container/vector"><tt>std::vector</tt></a> does not do any bounds checking on the index. It is safer to use the <a href="https://en.cppreference.com/w/cpp/container/vector/at"><tt>at()</tt></a> method, which does do bounds checking.
</p>
</overview>
<recommendation>
<p>
Use the <a href="https://en.cppreference.com/w/cpp/container/vector/at"><tt>at()</tt></a> method, rather than <a href="https://en.cppreference.com/w/cpp/container/vector/operator_at"><tt>operator[]</tt></a>.
</p>
<p>
Some uses of <a href="https://en.cppreference.com/w/cpp/container/vector/operator_at"><tt>operator[]</tt></a> are safe because they are protected by a bounds check. The query recognises the following safe coding patterns:
</p>
<ul>
<li><tt>if (!x.empty()) { ...x[0]... }</tt></li>
<li><tt>if (x.length()) { ...x[0]... }</tt></li>
<li><tt>if (x.size() > 2) { ...x[2]... }</tt></li>
<li><tt>if (x.size() == 2) { ...x[1]... }</tt></li>
<li><tt>if (x.size() != 0) { ...x[0]... }</tt></li>
<li><tt>if (i < x.size()) { ... x[i] ... }</tt></li>
<li><tt>if (!x.empty()) { ... x[x.size() - 1] ... }</tt></li>
</ul>
</recommendation>
<example>
<p>
<a href="https://github.com/Exiv2/exiv2/issues/1706">#1706</a> was caused by a lack of bounds-checking on <a href="https://github.com/Exiv2/exiv2/blob/15098f4ef50cc721ad0018218acab2ff06e60beb/include/exiv2/value.hpp#L1639">this array access</a>. The bug was <a href="https://github.com/Exiv2/exiv2/pull/1735">fixed</a> calling the <a href="https://en.cppreference.com/w/cpp/container/vector/at"><tt>at()</tt></a> method instead.
</p>
</example>
</qhelp>
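
To make the difference concrete, here is a small self-contained sketch (not taken from the Exiv2 sources) showing `at()` catching an out-of-range index and one of the guarded `operator[]` patterns listed above:

```cpp
#include <iostream>
#include <stdexcept>
#include <vector>

int main() {
    std::vector<int> v{10, 20};

    // v[5] would be undefined behaviour: operator[] performs no bounds check.
    // at() throws std::out_of_range instead, which is the safer default.
    try {
        std::cout << v.at(5) << '\n';
    } catch (const std::out_of_range& e) {
        std::cout << "caught: " << e.what() << '\n';
    }

    // One of the guarded patterns above: the size check makes operator[] safe here.
    if (v.size() > 1) {
        std::cout << v[1] << '\n';
    }
    return 0;
}
```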


@ -0,0 +1,181 @@
/**
* @name Unsafe vector access
* @description std::vector::operator[] does not do any runtime
* bounds-checking, so it is safer to use std::vector::at()
* @kind problem
* @problem.severity warning
* @id cpp/unsafe-vector-access
* @tags security
* external/cwe/cwe-125
*/
import cpp
import semmle.code.cpp.controlflow.Guards
import semmle.code.cpp.dataflow.DataFlow
import semmle.code.cpp.rangeanalysis.SimpleRangeAnalysis
import semmle.code.cpp.rangeanalysis.RangeAnalysisUtils
import semmle.code.cpp.valuenumbering.GlobalValueNumbering
import semmle.code.cpp.valuenumbering.HashCons
// A call to `operator[]`.
class ArrayIndexCall extends FunctionCall {
ClassTemplateInstantiation ti;
TemplateClass tc;
ArrayIndexCall() {
this.getTarget().getName() = "operator[]" and
ti = this.getQualifier().getType().getUnspecifiedType() and
tc = ti.getTemplate() and
tc.getSimpleName() != "map" and
tc.getSimpleName() != "match_results"
}
ClassTemplateInstantiation getClassTemplateInstantiation() { result = ti }
TemplateClass getTemplateClass() { result = tc }
}
// A call to `size` or `length`.
class SizeCall extends FunctionCall {
string fname;
SizeCall() {
fname = this.getTarget().getName() and
(fname = "size" or fname = "length")
}
}
// `x[i]` is safe if `x` is a `std::array` (fixed-size array)
// and `i` is within the bounds of the array.
predicate indexK_with_fixedarray(ClassTemplateInstantiation t, ArrayIndexCall call) {
t = call.getClassTemplateInstantiation() and
exists(Expr idx |
t.getSimpleName() = "array" and
idx = call.getArgument(0) and
lowerBound(idx) >= 0 and
upperBound(idx) < t.getTemplateArgument(1).(Literal).getValue().toInt()
)
}
// Holds if `cond` is a Boolean condition that checks the size of
// the array. It handles the following code patterns:
//
// 1. `if (!x.empty()) { ... }`
// 2. `if (x.length()) { ... }`
// 3. `if (x.size() > 2) { ... }`
// 4. `if (x.size() == 2) { ... }`
// 5. `if (x.size() != 0) { ... }`
//
// It is safe to assume that `x.size() >= minsize` on the exit `branch`.
predicate minimum_size_cond(Expr cond, Expr arrayExpr, int minsize, boolean branch) {
// `if (!x.empty()) { ...x[0]... }`
exists(FunctionCall emptyCall |
cond = emptyCall and
arrayExpr = emptyCall.getQualifier() and
emptyCall.getTarget().getName() = "empty" and
minsize = 1 and
branch = false
)
or
// `if (x.length()) { ...x[0]... }`
exists(SizeCall sizeCall |
cond = sizeCall and
arrayExpr = sizeCall.getQualifier() and
minsize = 1 and
branch = true
)
or
// `if (x.size() > 2) { ...x[2]... }`
exists(SizeCall sizeCall, Expr k, RelationStrictness strict |
arrayExpr = sizeCall.getQualifier() and
relOpWithSwapAndNegate(cond, sizeCall, k, Greater(), strict, branch)
|
strict = Strict() and minsize = 1 + k.getValue().toInt()
or
strict = Nonstrict() and minsize = k.getValue().toInt()
)
or
// `if (x.size() == 2) { ...x[1]... }`
exists(SizeCall sizeCall, Expr k |
arrayExpr = sizeCall.getQualifier() and
eqOpWithSwapAndNegate(cond, sizeCall, k, true, branch) and
minsize = k.getValue().toInt()
)
or
// `if (x.size() != 0) { ...x[0]... }`
exists(SizeCall sizeCall, Expr k |
arrayExpr = sizeCall.getQualifier() and
eqOpWithSwapAndNegate(cond, sizeCall, k, false, branch) and
k.getValue().toInt() = 0 and
minsize = 1
)
}
// Array accesses like these are safe:
// `if (!x.empty()) { ... x[0] ... }`
// `if (x.size() > 2) { ... x[2] ... }`
predicate indexK_with_check(GuardCondition guard, ArrayIndexCall call) {
exists(Expr arrayExpr, BasicBlock block, int i, int minsize, boolean branch |
minimum_size_cond(guard, arrayExpr, minsize, branch) and
(
globalValueNumber(arrayExpr) = globalValueNumber(call.getQualifier()) or
hashCons(arrayExpr) = hashCons(call.getQualifier())
) and
guard.controls(block, branch) and
block.contains(call) and
i = call.getArgument(0).getValue().toInt() and
0 <= i and
i < minsize
)
}
// Array accesses like this are safe:
// `if (i < x.size()) { ... x[i] ... }`
predicate indexI_with_check(GuardCondition guard, ArrayIndexCall call) {
exists(Expr idx, SizeCall sizeCall, BasicBlock block, boolean branch |
relOpWithSwapAndNegate(guard, idx, sizeCall, Lesser(), Strict(), branch) and
(
globalValueNumber(sizeCall.getQualifier()) = globalValueNumber(call.getQualifier()) and
globalValueNumber(idx) = globalValueNumber(call.getArgument(0))
or
hashCons(sizeCall.getQualifier()) = hashCons(call.getQualifier()) and
hashCons(idx) = hashCons(call.getArgument(0))
) and
guard.controls(block, branch) and
block.contains(call)
)
}
// Array accesses like this are safe:
// `if (!x.empty()) { ... x[x.size() - 1] ... }`
predicate index_last_with_check(GuardCondition guard, ArrayIndexCall call) {
exists(Expr arrayExpr, SubExpr minusExpr, SizeCall sizeCall, BasicBlock block, boolean branch |
minimum_size_cond(guard, arrayExpr, _, branch) and
(
globalValueNumber(arrayExpr) = globalValueNumber(call.getQualifier()) or
hashCons(arrayExpr) = hashCons(call.getQualifier())
) and
guard.controls(block, branch) and
block.contains(call) and
minusExpr = call.getArgument(0) and
minusExpr.getRightOperand().getValue().toInt() = 1 and
DataFlow::localExprFlow(sizeCall, minusExpr.getLeftOperand()) and
(
globalValueNumber(sizeCall.getQualifier()) = globalValueNumber(call.getQualifier()) or
hashCons(sizeCall.getQualifier()) = hashCons(call.getQualifier())
)
)
}
from ArrayIndexCall call
where
not indexK_with_fixedarray(_, call) and
not indexK_with_check(_, call) and
not indexI_with_check(_, call) and
not index_last_with_check(_, call) and
// Ignore accesses like this: `vsnprintf(&buffer[0], buffer.size(), format, args)`
// That's pointer arithmetic, not a deref, so it's usually a false positive.
not exists(AddressOfExpr addrExpr | addrExpr.getOperand() = call) and
// Ignore results in the xmpsdk directory.
not call.getLocation().getFile().getRelativePath().matches("xmpsdk/%")
select call, "Unsafe use of operator[]. Use the at() method instead."

.github/codeql/codeql-config.yml vendored Normal file

@ -0,0 +1,5 @@
name: "Exiv2 CodeQL config"
queries:
- uses: ./.github/codeql-queries/exiv2-code-scanning.qls
- uses: ./.github/codeql-queries/exiv2-cpp-queries

.github/workflows/codeql-analysis.yml vendored Normal file

@ -0,0 +1,61 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
name: "CodeQL"
on:
push:
branches: [0.27-maintenance, main]
pull_request:
# The branches below must be a subset of the branches above
branches: [0.27-maintenance, main]
workflow_dispatch:
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
language: [ 'cpp' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python' ]
# Learn more...
# https://docs.github.com/en/github/finding-security-vulnerabilities-and-errors-in-your-code/configuring-code-scanning#overriding-automatic-language-detection
steps:
- name: Checkout repository
uses: actions/checkout@v3
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
with:
languages: ${{ matrix.language }}
config-file: .github/codeql/codeql-config.yml
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# queries: ./path/to/local/query, your-org/your-repo/queries@main
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v1
# Command-line programs to run using the OS shell.
# 📚 https://git.io/JvXDl
# ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
# and modify them (or add more) to build your code if your project
# uses a compiled language
#- run: |
# make bootstrap
# make release
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1


@ -0,0 +1,46 @@
# To trigger this workflow manually, go to this url and click "Run workflow":
# https://github.com/Exiv2/exiv2/actions/workflows/nightly_Linux_distributions.yml
on:
workflow_dispatch:
# Uncomment to run this workflow daily at 4am.
#
# schedule:
# - cron: 0 4 * * *
name: CI for different Linux distributions
jobs:
distros:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
# arch suffering this issue: https://github.com/abseil/abseil-cpp/issues/709
# centos:8 had linking issues with gtest
container_image: ["fedora:latest", "debian:10", "archlinux:base", "ubuntu:20.04", "centos:8", "opensuse/tumbleweed", "alpine:3.13"]
compiler: [g++, clang++]
build_type: [Release, Debug]
shared_libraries: [ON, OFF]
container:
image: ${{ matrix.container_image }}
env:
CMAKE_FLAGS: -DEXIV2_TEAM_EXTRA_WARNINGS=OFF -DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_ENABLE_CURL=ON -DEXIV2_BUILD_UNIT_TESTS=OFF -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=OFF -DEXIV2_ENABLE_PNG=ON -DCMAKE_INSTALL_PREFIX=install
steps:
- name: install tar in opensuse
run: |
distro_id=$(grep '^ID=' /etc/os-release|awk -F = '{print $2}'|sed 's/\"//g')
echo $distro_id
if [[ "$distro_id" == "opensuse-tumbleweed" ]]; then zypper --non-interactive install tar gzip; fi
- uses: actions/checkout@v3
- name: install dependencies
run: ./ci/install_dependencies.sh
- name: build and compile
run: |
mkdir build && cd build
cmake $CMAKE_FLAGS -DCMAKE_BUILD_TYPE=${{ matrix.build_type }} -DBUILD_SHARED_LIBS=${{ matrix.shared_libraries }} -DCMAKE_CXX_COMPILER=${{ matrix.compiler }} ..
make -j $(nproc)
make install

.github/workflows/on_PR_linux_fuzz.yml vendored Normal file

@ -0,0 +1,30 @@
# Builds and runs the fuzz target for a short amount of time. This is
# mainly to protect the fuzz target from bitrot, but hopefully will
# also help to quickly catch some bugs before the PR is merged.
name: Linux-Ubuntu Quick Fuzz on PRs
on:
pull_request:
workflow_dispatch:
jobs:
Linux:
name: 'Ubuntu 20.04 - clang/libFuzzer'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: install dependencies
run: sudo ./ci/install_dependencies.sh
- name: build and compile
run: |
mkdir build && cd build
cmake -DEXIV2_ENABLE_PNG=ON -DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_ENABLE_CURL=ON -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DCMAKE_CXX_COMPILER=$(which clang++) -DEXIV2_BUILD_FUZZ_TESTS=ON -DEXIV2_TEAM_USE_SANITIZERS=ON ..
make -j $(nproc)
- name: Fuzz
run: |
cd build
mkdir corpus
LSAN_OPTIONS=suppressions=../fuzz/knownleaks.txt ./bin/fuzz-read-print-write corpus ../test/data/ -dict=../fuzz/exiv2.dict -jobs=$(nproc) -workers=$(nproc) -max_total_time=120 -max_len=4096
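
For context, the fuzz binary invoked above is a libFuzzer harness; its entry point has the standard shape sketched below (hypothetical, not the repository's actual harness code):

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical libFuzzer entry point. The repository's harness
// (fuzz-read-print-write) exercises Exiv2 on the input bytes; this sketch
// only shows the entry-point shape that libFuzzer requires.
extern "C" int LLVMFuzzerTestOneInput(const uint8_t* data, size_t size) {
    if (size == 0)
        return 0;
    // ... exercise the code under test with (data, size), catching exceptions ...
    return 0;
}
```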


@ -0,0 +1,52 @@
name: Linux-Ubuntu Matrix on PRs
on: [pull_request]
jobs:
Linux:
name: 'Ubuntu 20.04 - GCC, BuildType:${{matrix.build_type}}, SHARED:${{matrix.shared_libraries}}'
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
build_type: [Release, Debug]
shared_libraries: [ON, OFF]
steps:
- uses: actions/checkout@v3
- name: install dependencies
run: |
sudo apt-get install ninja-build
pip3 install conan==1.59.0
- name: Conan common config
run: |
conan config install https://github.com/conan-io/conanclientcert.git
conan profile new --detect default
conan profile update settings.build_type=${{matrix.build_type}} default
conan profile update settings.compiler.libcxx=libstdc++11 default
- name: Run Conan
run: |
mkdir build && cd build
conan profile list
conan profile show default
conan install .. -o webready=True --build missing
- name: Build
run: |
cd build
cmake -GNinja -DCMAKE_BUILD_TYPE=${{matrix.build_type}} -DBUILD_SHARED_LIBS=${{matrix.shared_libraries}} -DEXIV2_ENABLE_PNG=ON -DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_ENABLE_CURL=ON -DEXIV2_BUILD_UNIT_TESTS=ON -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DCMAKE_INSTALL_PREFIX=install ..
cmake --build .
- name: Install
run: |
cd build
cmake --build . --target install
tree install
- name: Test
run: |
ctest --test-dir build --output-on-failure


@ -0,0 +1,162 @@
name: Linux Special Builds on PRs
on: [pull_request]
jobs:
special_debugRelease:
name: 'Ubuntu 20.04 - GCC - Debug+Coverage'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 2
# Trying to deal with warning: -> Issue detecting commit SHA. Please run actions/checkout with fetch-depth > 1 or set to 0
- name: install dependencies
run: |
sudo apt-get install ninja-build
pip3 install conan==1.59.0
- name: Conan common config
run: |
conan config install https://github.com/conan-io/conanclientcert.git
conan profile new --detect default
conan profile update settings.compiler.libcxx=libstdc++11 default
- name: Run Conan
run: |
mkdir build && cd build
conan profile list
conan profile show default
conan install .. -o webready=True --build missing
- name: Build
run: |
cd build
cmake -GNinja -DCMAKE_BUILD_TYPE=Debug -DBUILD_SHARED_LIBS=ON -DEXIV2_ENABLE_PNG=ON -DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_ENABLE_CURL=ON -DEXIV2_BUILD_UNIT_TESTS=ON -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DBUILD_WITH_COVERAGE=ON -DCMAKE_INSTALL_PREFIX=install ..
cmake --build .
- name: Tests + Upload coverage
run: |
cd build
ctest --output-on-failure
pip install gcovr
gcovr -r ./../ -x --exclude-unreachable-branches --exclude-throw-branches -o coverage.xml .
curl https://keybase.io/codecovsecurity/pgp_keys.asc | gpg --import
curl -Os https://uploader.codecov.io/latest/linux/codecov
curl -Os https://uploader.codecov.io/latest/linux/codecov.SHA256SUM
curl -Os https://uploader.codecov.io/latest/linux/codecov.SHA256SUM.sig
gpg --verify codecov.SHA256SUM.sig codecov.SHA256SUM
shasum -a 256 -c codecov.SHA256SUM
chmod +x codecov
./codecov -f build/coverage.xml
special_releaseValgrind:
name: 'Ubuntu 20.04 - GCC - Release+Valgrind'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: install dependencies
run: |
sudo apt-get update
sudo apt-get install valgrind ninja-build
pip3 install conan==1.59.0
- name: Conan common config
run: |
conan config install https://github.com/conan-io/conanclientcert.git
conan profile new --detect default
conan profile update settings.compiler.libcxx=libstdc++11 default
- name: Run Conan
run: |
mkdir build && cd build
conan profile list
conan profile show default
conan install .. -o webready=True --build missing
- name: Build
run: |
cd build
cmake -GNinja -DCMAKE_BUILD_TYPE=Release -DBUILD_SHARED_LIBS=ON -DEXIV2_ENABLE_PNG=ON -DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_ENABLE_CURL=ON -DEXIV2_BUILD_UNIT_TESTS=ON -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DBUILD_WITH_COVERAGE=OFF -DCMAKE_INSTALL_PREFIX=install ..
cmake --build .
- name: Tests with valgrind
run: |
cd build/bin
valgrind ./unit_tests
special_releaseSanitizers:
name: 'Ubuntu 20.04 - GCC - Release+Sanitizers'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: install dependencies
run: |
sudo apt-get install ninja-build
pip3 install conan==1.59.0
- name: Conan common config
run: |
conan config install https://github.com/conan-io/conanclientcert.git
conan profile new --detect default
conan profile update settings.compiler.libcxx=libstdc++11 default
- name: Run Conan
run: |
mkdir build && cd build
conan profile list
conan profile show default
conan install .. -o webready=True --build missing
- name: Build
run: |
cd build
cmake -GNinja -DCMAKE_BUILD_TYPE=Release -DBUILD_SHARED_LIBS=ON -DEXIV2_ENABLE_PNG=ON -DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_ENABLE_CURL=ON -DEXIV2_BUILD_UNIT_TESTS=ON -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DBUILD_WITH_COVERAGE=OFF -DEXIV2_TEAM_USE_SANITIZERS=ON -DCMAKE_INSTALL_PREFIX=install ..
cmake --build .
- name: Tests
run: |
ctest --test-dir build --output-on-failure
special_allEnabled:
name: 'Ubuntu 20.04 - GCC - All Options Enabled'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: install dependencies
run: |
sudo apt-get update
sudo apt-get install valgrind doxygen graphviz gettext
pip3 install conan==1.59.0
- name: Conan common config
run: |
conan config install https://github.com/conan-io/conanclientcert.git
conan profile new --detect default
conan profile update settings.compiler.libcxx=libstdc++11 default
- name: Run Conan
run: |
mkdir build && cd build
conan profile list
conan profile show default
conan install .. -o webready=True --build missing
- name: Build
run: |
cd build
cmake -DCMAKE_BUILD_TYPE=Release -DBUILD_SHARED_LIBS=ON -DEXIV2_ENABLE_PNG=ON -DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_ENABLE_CURL=ON -DEXIV2_BUILD_UNIT_TESTS=ON -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DBUILD_WITH_COVERAGE=ON -DEXIV2_BUILD_DOC=ON -DEXIV2_ENABLE_NLS=ON -DCMAKE_CXX_FLAGS="-DEXIV2_DEBUG_MESSAGES" ..
make -j
- name: Generate documentation
run: |
make doc

.github/workflows/on_PR_mac_matrix.yml vendored Normal file

@ -0,0 +1,43 @@
name: Mac Matrix on PRs
on: [pull_request]
jobs:
windows:
name: 'MacOS - clang, BuildType:${{matrix.build_type}}, SHARED:${{matrix.shared_libraries}}'
runs-on: macos-latest
strategy:
fail-fast: false
matrix:
build_type: [Release, Debug]
shared_libraries: [ON, OFF]
steps:
- uses: actions/checkout@v3
- name: install dependencies
run: |
brew install ninja
pushd /tmp
curl -LO https://github.com/google/googletest/archive/release-1.8.0.tar.gz
tar xzf release-1.8.0.tar.gz
mkdir -p googletest-release-1.8.0/build
pushd googletest-release-1.8.0/build
cmake .. ; make ; make install
popd
popd
- name: Build
run: |
mkdir build && cd build
cmake -GNinja -DCMAKE_BUILD_TYPE=${{matrix.build_type}} -DBUILD_SHARED_LIBS=${{matrix.shared_libraries}} -DEXIV2_ENABLE_PNG=ON -DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_ENABLE_CURL=ON -DEXIV2_BUILD_UNIT_TESTS=ON -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DCMAKE_INSTALL_PREFIX=install -DCMAKE_CXX_FLAGS="-Wno-deprecated-declarations" ..
cmake --build .
- name: Install
run: |
cmake --build build --target install
- name: Test
run: |
ctest --test-dir build --output-on-failure


@ -0,0 +1,198 @@
name: Win Matrix on PRs
on:
pull_request:
push:
branches:
- 0.27-maintenance
tags:
- '!*'
jobs:
windows:
name: 'Win10 Arch: ${{matrix.platform}} BuildType:${{matrix.build_type}} - SHARED:${{matrix.shared_libraries}}'
runs-on: windows-2022
strategy:
fail-fast: false
matrix:
build_type: [Release, Debug]
shared_libraries: [ON, OFF]
platform: [ x64, x86 ]
steps:
- uses: actions/checkout@v3
- name: Set up Visual Studio shell
uses: egor-tensin/vs-shell@v2
with:
arch: ${{matrix.platform}}
- name: Set up Ninja
uses: ashutoshvarma/setup-ninja@master
with:
version: 1.10.0
- name: Set up Python
uses: actions/setup-python@v3
with:
python-version: 3.7
- name: Restore Conan cache
uses: actions/cache@v2
with:
path: ${{github.workspace}}/conanCache
key: ${{runner.os}}-${{matrix.platform}}-${{matrix.build_type}}-Shared${{matrix.shared_libraries}}-${{ hashFiles('conanfile.py') }}
- name: Install Conan & Common config
run: |
pip.exe install "conan==1.59.0"
conan profile new --detect default
conan profile update settings.build_type=${{matrix.build_type}} default
conan profile update settings.compiler="Visual Studio" default
conan profile update settings.compiler.version=17 default
conan config set storage.path=$Env:GITHUB_WORKSPACE/conanCache
- name: Conan Arch conditional config
if: ${{matrix.platform == 'x86'}}
run: |
conan profile update settings.arch=x86 default
conan profile update settings.arch_build=x86 default
- name: Run Conan
run: |
md build
cd build
conan profile list
conan install .. --build missing
dir ..
tree /f ../conanCache
- name: Build
run: |
cmake -GNinja `
-DCMAKE_BUILD_TYPE=${{matrix.build_type}} `
-DBUILD_SHARED_LIBS=${{matrix.shared_libraries}} `
-DEXIV2_ENABLE_NLS=OFF `
-DEXIV2_ENABLE_WIN_UNICODE=OFF `
-DEXIV2_ENABLE_WEBREADY=ON `
-DEXIV2_ENABLE_BMFF=ON `
-DEXIV2_BUILD_UNIT_TESTS=ON `
-DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON `
-DCMAKE_INSTALL_PREFIX=install `
-S . -B build && `
cmake --build build --parallel
- name: Install
run: |
cd build
cmake --install .
tree /f install
- name: Test
if: ${{matrix.platform == 'x64'}}
run: |
ctest --test-dir build --output-on-failure
msys2:
runs-on: windows-latest
strategy:
fail-fast: false
matrix:
build_type: [Release, Debug]
shared_libraries: [ON, OFF]
sys: [MINGW64]
name: MSYS2 ${{matrix.sys}} - BuildType:${{matrix.build_type}} - SHARED:${{matrix.shared_libraries}}
defaults:
run:
shell: msys2 {0}
steps:
- uses: actions/checkout@v3
- name: Set up MSYS2
uses: msys2/setup-msys2@v2
with:
msystem: ${{matrix.sys}}
update: true
install: >-
base-devel
pacboy: >-
toolchain:p
cmake:p
ninja:p
expat:p
gettext:p
gtest:p
libiconv:p
python-lxml:p
zlib:p
- name: Build
run: |
cmake -GNinja \
-DCMAKE_CXX_FLAGS=-Wno-deprecated \
-DCMAKE_BUILD_TYPE=${{matrix.build_type}} \
-DBUILD_SHARED_LIBS=${{matrix.shared_libraries}} \
-DEXIV2_BUILD_SAMPLES=ON \
-DEXIV2_ENABLE_NLS=ON \
-DEXIV2_ENABLE_WIN_UNICODE=ON \
-DEXIV2_ENABLE_WEBREADY=ON \
-DEXIV2_ENABLE_BMFF=ON \
-DEXIV2_BUILD_UNIT_TESTS=ON \
-S . -B build && \
cmake --build build --parallel
- name: Test
run: |
ctest --test-dir build --output-on-failure
cygwin:
runs-on: windows-latest
strategy:
fail-fast: false
matrix:
build_type: [Release]
shared_libraries: [ON]
platform: [x64]
name: Cygwin ${{matrix.platform}} - BuildType:${{matrix.build_type}} - SHARED:${{matrix.shared_libraries}}
env:
SHELLOPTS: igncr
defaults:
run:
shell: C:\tools\cygwin\bin\bash.exe -eo pipefail '{0}'
steps:
# Make sure we don't check out scripts using Windows CRLF line endings
- run: git config --global core.autocrlf input
shell: pwsh
- uses: actions/checkout@v3
- name: Set up Cygwin
uses: egor-tensin/setup-cygwin@v3
with:
platform: ${{matrix.platform}}
packages: >-
gcc-g++
cmake
ninja
libexpat-devel
libxml2-devel
libxslt-devel
python38-lxml
zlib-devel
- name: Build
run: |
cmake -GNinja \
-DCMAKE_CXX_FLAGS=-Wno-deprecated \
-DCMAKE_BUILD_TYPE=${{matrix.build_type}} \
-DBUILD_SHARED_LIBS=${{matrix.shared_libraries}} \
-DEXIV2_ENABLE_NLS=OFF \
-DEXIV2_ENABLE_WIN_UNICODE=OFF \
-DEXIV2_ENABLE_WEBREADY=ON \
-DEXIV2_ENABLE_BMFF=ON \
-DEXIV2_BUILD_UNIT_TESTS=OFF \
-S . -B build && \
cmake --build build --parallel
- name: Test
run: |
ctest --test-dir build --output-on-failure


@ -0,0 +1,152 @@
# Basic CI for all platforms on push
# Note that we want to run this as fast as possible and just for the more common configurations. On
# PRs, we will test things more intensively :)
# - Only running UnitTests and not regression tests
on: [push]
name: Basic CI for all platforms on push
jobs:
windows:
name: 'Win10 Arch:x64 BuildType:Release - SHARED'
runs-on: windows-latest
steps:
- uses: actions/checkout@v3
- name: Setup Ninja
uses: ashutoshvarma/setup-ninja@master
with:
version: 1.10.0
- name: Set up Visual Studio shell
uses: egor-tensin/vs-shell@v2
with:
arch: x64
- name: Set up Python
uses: actions/setup-python@v3
with:
python-version: 3.9
- name: Install Conan & Common config
run: |
pip.exe install "conan==1.59.0"
conan config install https://github.com/conan-io/conanclientcert.git
conan profile new --detect default
conan profile show default
conan profile update settings.compiler="Visual Studio" default
conan profile update settings.compiler.version=17 default
conan config set storage.path=$Env:GITHUB_WORKSPACE/conanCache
- name: Run Conan
run: |
md build
cd build
conan profile list
conan install .. --build missing
- name: Build
run: |
cmake -GNinja `
-DCMAKE_BUILD_TYPE=Release `
-DBUILD_SHARED_LIBS=ON `
-DEXIV2_BUILD_SAMPLES=ON `
-DEXIV2_ENABLE_NLS=OFF `
-DEXIV2_ENABLE_PNG=ON `
-DEXIV2_ENABLE_WEBREADY=ON `
-DEXIV2_ENABLE_BMFF=ON `
-DEXIV2_BUILD_UNIT_TESTS=ON `
-DEXIV2_ENABLE_WIN_UNICODE=OFF `
-DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON `
-DCMAKE_INSTALL_PREFIX=install .. `
-S . -B build && `
cmake --build build --parallel
- name: Test
run: |
ctest --test-dir build --output-on-failure
Linux:
name: 'Ubuntu 20.04 - GCC - Arch:x64 BuildType:Release - SHARED'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: install dependencies
run: |
sudo apt-get install ninja-build
pip3 install conan==1.59.0
- name: Conan
run: |
mkdir build && cd build
conan profile new --detect default
conan profile update settings.compiler.libcxx=libstdc++11 default
conan profile show default
conan install .. -o webready=True --build missing
- name: build and compile
run: |
cd build && \
cmake -GNinja \
-DCMAKE_BUILD_TYPE=Release \
-DBUILD_SHARED_LIBS=ON \
-DEXIV2_BUILD_SAMPLES=ON \
-DEXIV2_ENABLE_PNG=ON \
-DEXIV2_ENABLE_WEBREADY=ON \
-DEXIV2_ENABLE_CURL=ON \
-DEXIV2_BUILD_UNIT_TESTS=ON \
-DEXIV2_ENABLE_BMFF=ON \
-DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON \
-DCMAKE_INSTALL_PREFIX=install \
-DCMAKE_CXX_FLAGS=-Wno-deprecated \
.. && \
cmake --build . --parallel
- name: Test
run: |
ctest --test-dir build --output-on-failure
MacOS:
name: 'MacOS - clang - Arch:x64 BuildType:Release - SHARED'
runs-on: macos-latest
steps:
- uses: actions/checkout@v3
- name: install dependencies
run: |
brew install ninja
pushd /tmp
curl -LO https://github.com/google/googletest/archive/release-1.8.0.tar.gz
tar xzf release-1.8.0.tar.gz
mkdir -p googletest-release-1.8.0/build
pushd googletest-release-1.8.0/build
cmake .. ; make ; make install
popd
popd
- name: build and compile
run: |
mkdir build && cd build && \
cmake -GNinja \
-DCMAKE_BUILD_TYPE=Release \
-DBUILD_SHARED_LIBS=ON \
-DEXIV2_BUILD_SAMPLES=ON \
-DEXIV2_ENABLE_PNG=ON \
-DEXIV2_ENABLE_WEBREADY=ON \
-DEXIV2_ENABLE_CURL=ON \
-DEXIV2_BUILD_UNIT_TESTS=ON \
-DEXIV2_ENABLE_BMFF=ON \
-DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON \
-DCMAKE_INSTALL_PREFIX=install \
-DCMAKE_CXX_FLAGS="-Wno-deprecated-declarations" \
.. && \
cmake --build . --parallel
- name: Test
run: |
ctest --test-dir build --output-on-failure


@ -0,0 +1,54 @@
name: Linux Special Builds for 0.27-maintenance branch on push
on:
push:
branches:
- 0.27-maintenance
tags:
- '!*'
jobs:
special_debugRelease:
name: 'Ubuntu 20.04 - GCC - Debug+Coverage'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: install dependencies
run: |
pip3 install conan==1.59.0
- name: Conan common config
run: |
conan config install https://github.com/conan-io/conanclientcert.git
conan profile new --detect default
conan profile update settings.compiler.libcxx=libstdc++11 default
- name: Run Conan
run: |
mkdir build && cd build
conan profile list
conan profile show default
conan install .. -o webready=True --build missing
- name: Build
run: |
cd build
cmake -DCMAKE_BUILD_TYPE=Debug -DBUILD_SHARED_LIBS=ON -DEXIV2_ENABLE_PNG=ON -DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_ENABLE_CURL=ON -DEXIV2_BUILD_UNIT_TESTS=ON -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DBUILD_WITH_COVERAGE=ON -DCMAKE_INSTALL_PREFIX=install ..
make -j
- name: Tests + Upload coverage
run: |
cd build
ctest --output-on-failure
pip install gcovr
gcovr -r ./../ -x --exclude-unreachable-branches --exclude-throw-branches -o coverage.xml .
curl https://keybase.io/codecovsecurity/pgp_keys.asc | gpg --import
curl -Os https://uploader.codecov.io/latest/linux/codecov
curl -Os https://uploader.codecov.io/latest/linux/codecov.SHA256SUM
curl -Os https://uploader.codecov.io/latest/linux/codecov.SHA256SUM.sig
gpg --verify codecov.SHA256SUM.sig codecov.SHA256SUM
shasum -a 256 -c codecov.SHA256SUM
chmod +x codecov
./codecov -f build/coverage.xml

.github/workflows/release.yml vendored Normal file

@ -0,0 +1,229 @@
name: Release
on:
push:
tags:
- v[0-9]+.[0-9]+.[0-9]+
schedule:
- cron: '30 4 * * *'
workflow_dispatch:
inputs:
tag_name:
description: 'Tag name for release'
required: false
default: 0.27-nightly
jobs:
Linux:
name: 'Build Linux Release'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Install dependencies
run: |
sudo apt-get install ninja-build gettext doxygen graphviz
pip3 install conan==1.59.0
- name: Conan common config
run: |
conan config install https://github.com/conan-io/conanclientcert.git
conan profile new --detect default
conan profile update settings.build_type=Release default
conan profile update settings.compiler.libcxx=libstdc++11 default
- name: Run Conan
run: |
mkdir build && cd build
conan profile list
conan profile show default
conan install .. -o webready=False --build missing
- name: Build packaged release
run: |
cd build
cmake -GNinja -DEXIV2_TEAM_PACKAGING=ON -DBUILD_SHARED_LIBS=ON -DEXIV2_ENABLE_WEBREADY=OFF -DEXIV2_ENABLE_NLS=ON -DCMAKE_BUILD_TYPE=Release -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DEXIV2_BUILD_DOC=ON ..
cmake --build . --target doc
cmake --build . --target package
tree -L 3
- uses: actions/upload-artifact@v3
with:
name: exiv2-linux64
path: ./build/exiv2-*.tar.gz
if-no-files-found: error
retention-days: 1
macOS:
name: 'Build macOS Release'
runs-on: macos-latest
steps:
- uses: actions/checkout@v3
- name: Install dependencies
run: |
brew install ninja tree gettext doxygen graphviz
- name: Build packaged release
run: |
mkdir build && cd build
cmake -GNinja -DEXIV2_TEAM_PACKAGING=ON -DBUILD_SHARED_LIBS=ON -DEXIV2_ENABLE_WEBREADY=OFF -DEXIV2_ENABLE_NLS=ON -DCMAKE_BUILD_TYPE=Release -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DEXIV2_BUILD_DOC=ON -DCMAKE_CXX_FLAGS="-Wno-deprecated-declarations" ..
cmake --build . --target doc
cmake --build . --target package
tree -L 3
- uses: actions/upload-artifact@v3
with:
name: exiv2-macos
path: ./build/exiv2-*.tar.gz
if-no-files-found: error
retention-days: 1
Windows:
name: 'Build Windows Release'
runs-on: windows-latest
steps:
- uses: actions/checkout@v3
- name: Set up Visual Studio shell
uses: egor-tensin/vs-shell@v2
- name: Setup Ninja
uses: ashutoshvarma/setup-ninja@master
with:
version: 1.10.0
- name: Set up Python
uses: actions/setup-python@v3
with:
python-version: 3.7
- name: Restore conan cache
uses: actions/cache@v2
with:
path: ${{github.workspace}}/conanCache
key: ${{runner.os}}-packaged-win-release-${{ hashFiles('conanfile.py') }}
- name: Install doxygen
run: |
choco install doxygen.install
choco install graphviz
- name: Install Conan & Common config
run: |
pip.exe install "conan==1.59.0"
conan config install https://github.com/conan-io/conanclientcert.git
conan profile new --detect default
conan profile update settings.build_type=Release default
conan config set storage.path=$Env:GITHUB_WORKSPACE/conanCache
conan config get storage.path
tree /f ./conanCache
- name: Run Conan
run: |
md build
cd build
conan profile list
conan install .. --build missing
dir ..
tree /f ../conanCache
- name: Build packaged release
run: |
cd build
cmake -GNinja -DEXIV2_TEAM_PACKAGING=ON -DBUILD_SHARED_LIBS=ON -DEXIV2_ENABLE_WEBREADY=OFF -DEXIV2_ENABLE_NLS=OFF -DCMAKE_BUILD_TYPE=Release -DEXIV2_ENABLE_BMFF=ON -DEXIV2_TEAM_WARNINGS_AS_ERRORS=ON -DEXIV2_BUILD_DOC=ON ..
cmake --build . --target doc
cmake --build . --target package
tree -L 3
- uses: actions/upload-artifact@v3
with:
name: exiv2-win
path: ./build/exiv2-*.zip
if-no-files-found: error
retention-days: 1
publish:
needs: [Linux, macOS, Windows]
runs-on: ubuntu-20.04
permissions:
contents: write
steps:
- if: github.event_name == 'workflow_dispatch'
run: echo "TAG_NAME=${{ github.event.inputs.tag_name }}" >> $GITHUB_ENV
- if: github.event_name == 'schedule'
run: echo 'TAG_NAME=0.27-nightly' >> $GITHUB_ENV
- if: github.event_name == 'push'
run: |
TAG_NAME=${{ github.ref }}
echo "TAG_NAME=${TAG_NAME#refs/tags/}" >> $GITHUB_ENV
- if: env.TAG_NAME == '0.27-nightly'
run: |
echo 'BODY<<EOF' >> $GITHUB_ENV
echo '## Exiv2 0.27-nightly prerelease build.' >> $GITHUB_ENV
echo 'Please help us improve exiv2 by reporting any issues you encounter :wink:' >> $GITHUB_ENV
echo 'EOF' >> $GITHUB_ENV
- if: env.TAG_NAME != '0.27-nightly'
run: |
echo 'BODY<<EOF' >> $GITHUB_ENV
echo '## Exiv2 Release ${{ env.TAG_NAME }}' >> $GITHUB_ENV
echo 'See [ChangeLog](doc/ChangeLog) for more information about the changes in this release.' >> $GITHUB_ENV
echo 'EOF' >> $GITHUB_ENV
- name: Cleanup old 0.27-nightly
if: env.TAG_NAME == '0.27-nightly'
uses: actions/github-script@v6
with:
script: |
try{
const rel_id = await github.repos.getReleaseByTag({
...context.repo,
tag: "0.27-nightly"
}).then(result => result.data.id);
console.log( "Found existing 0.27-nightly release with id: ", rel_id);
await github.repos.deleteRelease({
...context.repo,
release_id: rel_id
});
console.log( "Deletion of release successful")
}catch(error){
console.log( "Deletion of release failed");
console.log( "Failed with error\n", error);
}
try{
await github.git.deleteRef({
...context.repo,
ref: "tags/0.27-nightly"
});
console.log( "Deletion of tag successful")
}catch(error){
console.log( "Deletion of tag failed");
console.log( "Failed with error\n", error);
}
- uses: actions/download-artifact@v3
- name: List downloaded files
run: tree -L 3
- uses: softprops/action-gh-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
# needs a newer release, but add it once available
#fail_on_unmatched_files: true
body: ${{ env.BODY }}
prerelease: ${{ env.TAG_NAME == '0.27-nightly' }}
tag_name: ${{ env.TAG_NAME }}
files: |
./exiv2-linux64/exiv2-*
./exiv2-macos/exiv2-*
./exiv2-win/exiv2-*

.gitignore vendored Normal file

@ -0,0 +1,26 @@
*~
*.gmo
*.la
*.lo
*.o
*.swp
*.pyc
*.txt.user
.DS_Store
.idea/*
cmake-*
build*
asan_build*
xcode_build*
msvc_build*
cygwin_build*
mingw_build*
linux_build*
po/POTFILES
po/remove-potcdate.sed
po/stamp-po
src/doxygen.hpp
test/tmp/*
doc/html
contrib/vms/.vagrant
/.vscode

.mergify.yml Normal file

@ -0,0 +1,23 @@
pull_request_rules:
- name: forward patches to main branch
conditions:
- base=0.27-maintenance
- label=forward-to-main
actions:
backport:
branches:
- main
assignees:
- "{{ author }}"
- name: delete head branch after merge
conditions:
- merged
actions:
delete_head_branch: {}
- name: remove outdated reviews
conditions: []
actions:
dismiss_reviews:
changes_requested: False

AUTHORS Normal file

@ -0,0 +1 @@
See doc/ChangeLog. Authors and other contributors are mentioned there.

CMakeLists.txt Normal file

@ -0,0 +1,144 @@
# Minimum version imposed by Debian:9
cmake_minimum_required( VERSION 3.7.2 )
project(exiv2
VERSION 0.27.7
LANGUAGES CXX
)
if(NOT CMAKE_BUILD_TYPE)
set (CMAKE_BUILD_TYPE Release)
endif()
include(cmake/mainSetup.cmake REQUIRED)
# options and their default values
option( BUILD_SHARED_LIBS "Build exiv2lib as a shared library" ON )
option( EXIV2_ENABLE_XMP "Build with XMP metadata support" ON )
option( EXIV2_ENABLE_EXTERNAL_XMP "Use external version of XMP" OFF )
option( EXIV2_ENABLE_PNG "Build with png support (requires libz)" ON )
option( EXIV2_ENABLE_NLS "Build native language support (requires gettext)" OFF )
option( EXIV2_ENABLE_PRINTUCS2 "Build with Printucs2" ON )
option( EXIV2_ENABLE_LENSDATA "Build including lens data" ON )
option( EXIV2_ENABLE_VIDEO "Build video support into library" OFF )
option( EXIV2_ENABLE_DYNAMIC_RUNTIME "Use dynamic runtime (used for static libs)" ON )
option( EXIV2_ENABLE_WIN_UNICODE "Use Unicode paths (wstring) on Windows" OFF )
option( EXIV2_ENABLE_WEBREADY "Build webready support into library" OFF )
option( EXIV2_ENABLE_CURL "USE Libcurl for HttpIo (WEBREADY)" OFF )
option( EXIV2_ENABLE_SSH "USE Libssh for SshIo (WEBREADY)" OFF )
option( EXIV2_ENABLE_BMFF "Build with BMFF support" OFF )
option( EXIV2_BUILD_SAMPLES "Build sample applications" ON )
option( EXIV2_BUILD_EXIV2_COMMAND "Build exiv2 command-line executable" ON )
option( EXIV2_BUILD_UNIT_TESTS "Build unit tests" OFF )
option( EXIV2_BUILD_FUZZ_TESTS "Build fuzz tests (libFuzzer)" OFF )
option( EXIV2_BUILD_DOC "Add 'doc' target to generate documentation" OFF )
# Only intended to be used by Exiv2 developers/contributors
option( EXIV2_TEAM_EXTRA_WARNINGS "Add more sanity checks using compiler flags" OFF )
option( EXIV2_TEAM_WARNINGS_AS_ERRORS "Treat warnings as errors" OFF )
option( EXIV2_TEAM_USE_SANITIZERS "Enable ASAN and UBSAN when available" OFF )
option( EXIV2_TEAM_PACKAGING "Additional stuff for generating packages" OFF )
set(EXTRA_COMPILE_FLAGS " ")
mark_as_advanced(
EXIV2_TEAM_EXTRA_WARNINGS
EXIV2_TEAM_WARNINGS_AS_ERRORS
EXIV2_ENABLE_EXTERNAL_XMP
EXTRA_COMPILE_FLAGS
EXIV2_TEAM_USE_SANITIZERS
)
option( BUILD_WITH_CCACHE "Use ccache to speed up compilations" OFF )
option( BUILD_WITH_COVERAGE "Add compiler flags to generate coverage stats" OFF )
set( PACKAGE_BUGREPORT "http://github.com/exiv2/exiv2" )
set( PACKAGE_URL "https://exiv2.org")
set( PROJECT_DESCRIPTION "Exif/IPTC/Xmp C++ metadata library and tools plus ICC Profiles, Previews and more.")
if ( EXIV2_ENABLE_EXTERNAL_XMP )
set(EXIV2_ENABLE_XMP OFF)
endif()
if( EXIV2_BUILD_UNIT_TESTS )
set(CMAKE_WINDOWS_EXPORT_ALL_SYMBOLS ON) # Requires CMake 3.3.3
endif()
include(cmake/findDependencies.cmake REQUIRED)
include(cmake/compilerFlags.cmake REQUIRED)
include(cmake/generateConfigFile.cmake REQUIRED)
if (EXIV2_BUILD_DOC)
include(cmake/generateDoc.cmake REQUIRED)
generate_documentation("${PROJECT_SOURCE_DIR}/cmake/Doxyfile.in")
endif()
include_directories(${CMAKE_BINARY_DIR}) # Make the exv_conf.h file visible for the full project
if( EXIV2_ENABLE_XMP )
add_subdirectory( xmpsdk )
endif()
include(cmake/compilerFlagsExiv2.cmake REQUIRED)
add_subdirectory( include )
add_subdirectory( src )
if( EXIV2_BUILD_UNIT_TESTS )
add_subdirectory ( unitTests )
endif()
if( EXIV2_BUILD_FUZZ_TESTS )
if ((NOT COMPILER_IS_CLANG) OR (NOT EXIV2_TEAM_USE_SANITIZERS))
message(FATAL_ERROR "You need to build with Clang and sanitizers for the fuzzers to work. "
"Use Clang and -DEXIV2_TEAM_USE_SANITIZERS=ON")
endif()
add_subdirectory ( fuzz )
endif()
if( EXIV2_BUILD_SAMPLES )
add_subdirectory( samples )
get_directory_property(SAMPLES DIRECTORY samples DEFINITION APPLICATIONS)
if (Python3_Interpreter_FOUND)
add_test(NAME bashTests
WORKING_DIRECTORY ${PROJECT_SOURCE_DIR}/tests
COMMAND cmake -E env EXIV2_BINDIR=${CMAKE_RUNTIME_OUTPUT_DIRECTORY} ${Python3_EXECUTABLE} runner.py --verbose bash_tests
)
add_test(NAME bugfixTests
WORKING_DIRECTORY ${PROJECT_SOURCE_DIR}/tests
COMMAND cmake -E env EXIV2_BINDIR=${CMAKE_RUNTIME_OUTPUT_DIRECTORY} ${Python3_EXECUTABLE} runner.py --verbose bugfixes
)
add_test(NAME tiffTests
WORKING_DIRECTORY ${PROJECT_SOURCE_DIR}/tests
COMMAND cmake -E env EXIV2_BINDIR=${CMAKE_RUNTIME_OUTPUT_DIRECTORY} ${Python3_EXECUTABLE} runner.py --verbose tiff_test
)
add_test(NAME versionTests
WORKING_DIRECTORY ${PROJECT_SOURCE_DIR}/tests
COMMAND cmake -E env EXIV2_BINDIR=${CMAKE_RUNTIME_OUTPUT_DIRECTORY} ${Python3_EXECUTABLE} runner.py --verbose bash_tests/version_test.py
)
endif()
endif()
if( EXIV2_ENABLE_NLS )
add_subdirectory( po )
endif()
if (EXIV2_TEAM_PACKAGING)
include(cmake/packaging.cmake)
endif()
join_paths(libdir_for_pc_file "\${prefix}" "${CMAKE_INSTALL_LIBDIR}")
join_paths(includedir_for_pc_file "\${prefix}" "${CMAKE_INSTALL_INCLUDEDIR}")
configure_file(cmake/exiv2.pc.in exiv2.pc @ONLY)
install(FILES ${CMAKE_CURRENT_BINARY_DIR}/exiv2.pc DESTINATION ${CMAKE_INSTALL_LIBDIR}/pkgconfig)
# ******************************************************************************
# Man page
install( FILES ${PROJECT_SOURCE_DIR}/man/man1/exiv2.1 DESTINATION ${CMAKE_INSTALL_MANDIR}/man1 )
include(cmake/printSummary.cmake)
# That's all Folks!
##

CODING_GUIDELINES.md Normal file

@ -0,0 +1,59 @@
Coding Guidelines
======================
# Contents #
* [1. General Guidelines](#10-general-guidelines)
* [2. Integer and Array Overflows](#20-integer-and-array-overflows)
* [3. Code Formatting](#30-code-formatting)
* [3.1 Guidelines to Apply clang-format](#31-guidelines-to-apply-clang-format)
# 1. General Guidelines #
- All new code must be properly tested: via unit tests (based on the Gtest framework; see `$REPO/unitTest`) or system tests (scripts exercising the main exiv2 application; see `$REPO/test`).
- Code should be simple to read and to understand.
- Do not invoke undefined behavior. [Optional] Ensure that with UBSAN, i.e. compile your code with `-fsanitize=undefined` and run the test suite.
- Ensure that your code has no memory errors. [Optional] Use ASAN for that, i.e. compile your code with `-fsanitize=address`.
# 2. Integer and Array Overflows #
- All new code that is added must be resistant to integer overflows: if you multiply, add, subtract, divide or bitshift integers you must ensure that no overflow can occur. Please keep in mind that signed integer overflow is undefined behavior, thus you must check for overflows before performing the arithmetic operation, otherwise the compiler is free to optimize away a check that happens after the overflow (this has already happened). A sketch of such pre-checks is shown at the end of this section.
- All new code must be resistant to buffer overflows. Thus a range check must be performed before you access arrays.
- Distrust any data that you extract from images or from external sources. E.g. if the metadata of an image gives you an offset to other information inside that file, do not assume that this offset will not result in an out-of-bounds read.
- New code must not assume the endianness or the word size of the system it is being run on. I.e. don't assume that `sizeof(int) == 8` or that the following will work:
```cpp
const uint32_t some_var = get_var();
const uint16_t lower_2_bytes = *(const uint16_t*) &some_var;
```
since this will give you the upper two bytes on big endian systems.
If in doubt, then use the fixed size integer types like `int32_t`, `uint64_t`, `size_t`, etc.
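A minimal sketch of such pre-checks (hand-written for this document, not code from the repository):
```cpp
#include <cstddef>
#include <cstdint>
#include <limits>
#include <vector>

// Check for a potential overflow *before* multiplying. For signed types,
// overflow is undefined behavior and a check placed after the multiplication
// may be optimized away; for unsigned types the value silently wraps.
bool checkedMul(std::size_t a, std::size_t b, std::size_t& result) {
    if (b != 0 && a > std::numeric_limits<std::size_t>::max() / b)
        return false;  // a * b would overflow
    result = a * b;
    return true;
}

// Range-check untrusted offsets (e.g. values read from an image) before
// using them to index into a buffer.
bool byteAt(const std::vector<std::uint8_t>& buf, std::size_t offset, std::uint8_t& value) {
    if (offset >= buf.size())
        return false;  // do not trust offsets extracted from the file
    value = buf[offset];
    return true;
}
```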
# 3. Code Formatting #
The project contains a `.clang-format.optional` file defining the code formatting of the project (more details about how this file was defined can be found in this [PR](https://github.com/Exiv2/exiv2/pull/152)). We do not provide it via the standard name (`.clang-format`), since we do not enforce code formatting and do not want editors to automatically format everything.
Nevertheless, we suggest that you respect the code formatting by symlinking `.clang-format.optional` to `.clang-format` and applying `clang-format` to new or existing code. You can do this by using the `clang-format` command-line tool or by using one of the integration plugins provided by various editors or IDEs. Currently we know about these integrations:
- [QtCreator](http://doc.qt.io/qtcreator/creator-beautifier.html) -> beautifier -> clang-format
- [vim-clang-format](https://github.com/rhysd/vim-clang-format)
- [Emacs](https://clang.llvm.org/docs/ClangFormat.html#emacs-integration)
- Visual Studio: [1](http://clang.llvm.org/docs/ClangFormat.html#visual-studio-integration), [2](https://marketplace.visualstudio.com/items?itemName=xaver.clang-format)
Note that sometimes the formatting applied to complex code might result in unexpected output. If you know how to improve the current `.clang-format` file to deal with such cases, then please contribute! Otherwise, you have two options:
1. Apply `clang-format` to individual blocks of code (avoid applying it to the whole complex piece of code).
2. Indicate which parts of the code should not be formatted by `clang-format`:
```cpp
// clang-format off
void unformatted_code ;
// clang-format on
```
## 3.1 Guidelines to Apply clang-format ##
- New files should follow the clang-format style.
- Old files will be completely re-formatted only if we need to touch several lines/functions/methods of that file. In that case, we suggest first creating a PR that just re-formats the files that will be touched, and then another PR with the code changes.
- If we only need to fix a small portion of a file then we do not apply clang-format at all, or we just do it in the code block that we touch.
More information about clang-format:
- The [clang-format](https://clang.llvm.org/docs/ClangFormat.html) tool.
- The [clang-format style options](https://clang.llvm.org/docs/ClangFormatStyleOptions.html).

CONTRIBUTING.md Normal file

@ -0,0 +1,145 @@
Contributing to the Exiv2 Project
======================
# Contents #
* [1. Introduction](#1-introduction)
* [2. Contributing code via GitHub](#2-contributing-code-via-github)
* [3. Contributing code via email](#3-contributing-code-via-email)
* [4. Contributing Lens Data](#4-contributing-lens-data)
* [5. Reporting Bugs](#5-reporting-bugs)
# 1. Introduction #
We welcome any help, for example contributing lens data (images), code contributions and bug reports.
# 2. Contributing code via GitHub #
Code contributions can be performed via *pull requests* (PR) on GitHub (if you cannot or do not want to use GitHub, see [3. Contributing code via email](#3-contributing-code-via-email)).
For this to work you first need to [create a user account on GitHub](https://help.github.com/articles/signing-up-for-a-new-github-account/) if you don't already have one.
A pull request should preferably contain only one new feature or bug fix. Since it is not uncommon to work on several PRs at the same time
it is recommended to create a new _branch_ for each PR. In this way PRs can easily be separated and the review and merge process becomes cleaner.
As a rule-of-thumb:
- PRs should be kept at a manageable size. Try to focus on just one goal per PR. If you find yourself doing several things in a PR that were not expected,
then try to split the different tasks into different PRs.
- Commits should always change a *single* logical unit so that cherry-picking & reverting is simple.
- Commit messages should be as informative and concise as possible. The first line of the commit message should be < 80 characters and
describe the commit briefly. If the 80 characters are too short for a summary, then consider splitting the commit. Optionally, add one blank line
below the short summary and write a more detailed explanation if necessary.
See the [GIT_GUIDELINES.md](git_guidelines.md) file for a more detailed description of the git workflow.
Below we outline the recommended steps in the code contribution workflow. We use `your-username` to refer to your username on GitHub, `exiv2_upstream` as the name we
give to the upstream remote repository for Exiv2 (we could have picked any name, but try to avoid already-used names, in particular `origin` and `master`), and
`my-new-feature` as the name of the branch that we create (the branch name should reflect the code change being made).
**Important**: If your PR lives for a long time, then don't press the button _Update branch_ in the Pull Request view; instead, follow the steps below, as
that avoids additional merge commits.
Once you have a GitHub login:
1. Fork the Exiv2 git repository to your GitHub account by pressing the _Fork_ button at: [https://github.com/Exiv2/exiv2](https://github.com/Exiv2/exiv2)
(more details [here](https://guides.github.com/activities/forking/)).
2. Then start a terminal (or use your favorite git GUI app) and clone your fork of the Exiv2 repo:
$ git clone https://github.com/your-username/exiv2.git
$ cd exiv2
3. Add a new remote pointing to upstream exiv2 repository:
$ git remote add exiv2_upstream https://github.com/Exiv2/exiv2.git
and verify that you have the two remotes configured correctly:
$ git remote -v
exiv2_upstream https://github.com/Exiv2/exiv2.git (fetch)
exiv2_upstream https://github.com/Exiv2/exiv2.git (push)
origin https://github.com/your-username/exiv2.git (fetch)
origin https://github.com/your-username/exiv2.git (push)
4. Next, create a branch for your PR from `exiv2_upstream/master` (which we also need to fetch first):
$ git fetch exiv2_upstream master
$ git checkout -b my-new-feature exiv2_upstream/master --no-track
NB: This is an important step to avoid dragging in old commits!
5. Configure the project and check that it builds (if not, please report a bug):
$ rm -rf build
$ mkdir build && cd build
$ cmake -DCMAKE_BUILD_TYPE=Release ..
$ make
6. Now, make your change(s), add tests for your changes, and commit each change:
...
$ git commit -m "Commit message 1"
...
$ git commit -m "Commit message 2"
7. Make sure the tests pass:
$ make tests # Integration tests
$ ./bin/unit_tests # Unit tests
Exiv2's (new) test system is described in more detail in the [doc.md](tests/doc.md) and [writing_tests.md](tests/writing_tests.md) files, and a description of the old
test system can be found in the Redmine wiki: [How do I run the test suite for Exiv2](http://dev.exiv2.org/projects/exiv2/wiki/How_do_I_run_the_test_suite_for_Exiv2)
8. Push the changes to your fork on GitHub:
$ git push origin my-new-feature
9. Create the PR by pressing the _New pull request_ button on: `https://github.com/your-username/exiv2`. Please select the option "Allow edits from maintainers" as outlined
[here](https://help.github.com/en/articles/allowing-changes-to-a-pull-request-branch-created-from-a-fork).
10. Now wait for one or more Exiv2 project members to respond to your PR. Follow the discussion on your PR at [GitHub](https://github.com/Exiv2/exiv2/pulls).
You may have to do some updates to your PR until it gets accepted.
11. After the PR has been reviewed you must _rebase_ your repo copy since there may have been several changes to the upstream repository.
Switch to your branch again
$ git checkout my-new-feature
And rebase it on top of master:
$ git pull --rebase exiv2_upstream master
When you perform a rebase the commit history is rewritten and, therefore, the next time you try to push your branch to your fork repository you will need to use
the `--force-with-lease` option:
$ git push --force-with-lease
Also, follow the coding guidelines outlined in [CODING_GUIDELINES.md](CODING_GUIDELINES.md).
# 3. Contributing Code via email #
If you cannot or do not want to use GitHub, you can still submit patches via email by using our [sourcehut mirror](https://git.sr.ht/~d4n/exiv2).
Prepare your changes in your local clone of the [GitHub](https://github.com/Exiv2/exiv2.git) or [sourcehut](https://git.sr.ht/~d4n/exiv2) repository following our
[CODING_GUIDELINES.md](CODING_GUIDELINES.md) and [GIT_GUIDELINES.md](git_guidelines.md). Send your patches to the
[~d4n/exiv2-patches@lists.sr.ht](mailto:~d4n/exiv2-patches@lists.sr.ht) mailing list. Please use `git send-email` as outlined in https://git-send-email.io/ to
simplify the integration of your patches.
# 4. Contributing Lens Data #
In order for the Exiv2 project to support a new lens we need an example image containing the Exif metadata of that lens. This is a good way for
non-programmers to contribute to the project and example images can be submitted using the following procedure:
1. Create a new Issue by pressing the _New issue_ button here: [https://github.com/Exiv2/exiv2/issues](https://github.com/Exiv2/exiv2/issues),
2. In the new Issue, enter/add the lens mount and full lens name for each lens,
3. Take a (small) .jpg image (with the lens cap on) with each lens and transfer the .jpg file(s) to disk __without processing them__ in any desktop or server software (this is important to preserve the Exif metadata in the file),
4. Attach the .jpg image(s) to the Issue (one can just drag-and-drop the image(s) or paste it/them from the clipboard).
Note that we are not only interested in non-supported lenses since we also look for example images to expand and improve the Exiv2 code tests.
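Before attaching an image, you can check that the lens information survived the transfer by inspecting the Exif metadata, for example with the exiv2 command-line tool (the file name below is a placeholder; `-g` filters the output by key):
```bash
# print all Exif keys whose name contains "Lens"
exiv2 -pa -g Lens lens_example.jpg
```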
# 5. Reporting Bugs #
Bugs should be reported by creating Issues on GitHub. However, before reporting a bug, first check the Issue list to see whether the bug is already known, and only create a new Issue if you cannot find a previous report.
When reporting a bug, describe the problem in as much detail as possible, and if the bug is triggered by an input file, attach that file to the GitHub Issue as well.

340
COPYING Normal file
View File

@ -0,0 +1,340 @@
GNU GENERAL PUBLIC LICENSE
Version 2, June 1991
Copyright (C) 1989, 1991 Free Software Foundation, Inc.
51 Franklin Street, 5th Floor, Boston, MA 02110-1301 USA.
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Library General Public License instead.) You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.
We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.
Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.
Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.
The precise terms and conditions for copying, distribution and
modification follow.
GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.
1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.
You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.
2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.
b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.
c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.
In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:
a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,
b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,
c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)
The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.
If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.
4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.
5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.
6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.
If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.
9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.
10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.
NO WARRANTY
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) 19yy <name of author>
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) 19yy name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the program
`Gnomovision' (which makes passes at compilers) written by James Hacker.
<signature of Ty Coon>, 1 April 1989
Ty Coon, President of Vice
This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Library General
Public License instead of this License.

274
GIT_GUIDELINES.md Normal file
View File

@ -0,0 +1,274 @@
# Guidelines for using git
## Commit messages
The first line of a commit message should be < 80 characters long and briefly
describe the whole commit. Optionally, you can prefix the summary with a
tag/module in square brackets (e.g. travis if your commit changed something for
Travis, or testsuite for the testsuite, or tiff-parser, etc.). If the commit
requires additional explanation, a blank line can be put below the summary
followed by a more thorough explanation.
A commit message can look like this:
```
[travis] Fix mac osx jobs
- Specify concrete ubuntu and mac versions
- Use latest conan version
- Fix the profiles for linux and mac
- Use new version of expat (avilable in conan-center)
- Install urllib3 as suggested in python guidelines
- Use virtualenv with python3
```
The advantage of this approach is that we always see the brief summary via `git
log --oneline` and on GitHub. The 80 characters limit ensures that the message
does not wrap.
Please avoid overly generic commit messages like "fixed a bug", instead write
e.g. "fixed an overflow in the TIFF parser". If your commit fixes a specific
issue on GitHub, then provide its number in the commit message. A message of the
form "fixes #aNumber" results in GitHub automatically closing issue #aNumber once
the pull request is merged (please write that in the detailed description below the
summary). If the commit fixes an issue that has a CVE assigned, then you must
mention the CVE number in the commit message. Please also mention it in commit
messages for accompanying commits (like adding regression tests), so that
downstream package maintainers can cherry-pick the respective commits easily.
If you have trouble finding a brief summary that fits into 80 characters, then
you should probably split your commit.
## When to commit
Commits should be atomic, i.e. they should make one self-contained
change. Consider the following example: you want to fix an issue, which requires
changes in two separate files. You also want to reformat both files using
clang-format and add a regression test or a unit test.
This would result in the following commits:
1. the fix for the issue in the two source files
2. addition of a unit test or regression test (provided that it does not require
additional changes to other logical units)
3. application of clang-format to the first source file
4. application of clang-format to the second source file
We can summarize this in the following guidelines:
- Large formatting changes should be separate commits for each source file (that
way changes can be reviewed easily, as formatting changes result in very noisy
diffs)
- Changes made in different source files which do not make sense without each other
should be committed together
- Changes made in the same file which do not belong together should not be
committed together
- If changes are requested during the code review, then they should be incorporated
into the previously created commits where applicable. For example, if a variable's
name should be changed, then that change should be included in the already created
commit. A bigger change, like a new function or class, will probably require a
separate commit.
- Please keep in mind that your commits might be cherry-picked into an older
branch. Therefore split your commits accordingly, so that changes into
separate modules go into separate commits.
- Every commit should keep the code base in a buildable state. The test suite
needn't pass on every commit, but must pass before being merged into
`master`.
These are however not strict rules and it always depends on the case. If in
doubt: ask.
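As a rough sketch of how the four commits from the example above might be created (file names, paths and commit messages are made up for illustration):
```bash
# 1. the fix itself, spanning both source files
git add src/foo.cpp src/bar.cpp
git commit -m "Fix out-of-bounds read in the TIFF parser"

# 2. the accompanying regression or unit test
git add tests/bugfixes/test_foo.py
git commit -m "Add regression test for the TIFF parser fix"

# 3. and 4. formatting, one commit per source file
clang-format -i src/foo.cpp
git commit -am "Apply clang-format to foo.cpp"
clang-format -i src/bar.cpp
git commit -am "Apply clang-format to bar.cpp"
```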
## Keeping the history linear
We prefer to keep the git log nearly linear with the individual pull requests
still visible, since they usually form one logical unit. It should look roughly
like this:
```
* 9f74f247 Merge pull request #227 from frli8848/master
|\
| * 73ac02d7 Added test for Sigma lenses
| * fc8b45dd Added the Sigma 120-300mm F2.8 DG OS HSM | S for Nikon mount.
| * 34a3be02 Added Sigma 50mm F1.4 DG HSM | A mount/UPC code (for Nikon mount).
| * 21522702 Added Sigma 20mm F1.4 DG HSM | A mount/UPC code (for Nikon mount).
|/
* f9d421b1 Merge pull request #109 from D4N/error_codes_enum
|\
| * 3965a44d Replace error variable names in test suite with enum error codes
| * a15f090f Modified test suite so that case sensitive keys are possible
| * efe2ccdc Replaced all hardcoded error codes with ker... constants
| * d897997b Force error code usage to construct a Exiv2::BasicError
| * d3c3c036 Incorporated error codes into errList
| * b80fa1b4 Added error codes from src/error.cpp into an enumeration
|/
* efee9a2b Merge pull request #205 from D4N/CVE-2017-1000127_reproducer
```
As can be seen, the two pull requests are still distinguishable but the history
is still nearly linear. This ensures that cherry-picking and bisecting works
without issues.
To ensure such a linear history, do **not** use GitHub's `Update Branch` button!
This creates a merge commit in your pull request's branch and can result in
rather complicated logs, like this:
```
* |
|\ \
| * |
* | |
|\ \ \
| |/ /
|/| |
| * |
| * |
| * |
| * |
| * |
|/ /
* |
|\ \
| |/
|/|
| *
| *
| *
|/
*
```
Instead of using the `Update Branch` button use `git pull --rebase`. For the
following example, we'll assume that we are working in a branch called
`feature_xyz` that should be merged into the branch `master`. Furthermore the
remote `origin` is a fork of exiv2 and the remote `upstream` is the "official"
exiv2 repository.
Before we start working, the `master` branch looks like this:
```
$ git log master --oneline --graph
* efee9a2b (master) Merge pull request #something
|\
| * ead7f309 A commit on master
|/
* 55001c8d Merge pull request #something else
```
We create a new branch `feature_xyz` based on `master`, create two new commits
`My commit 1` and `My commit 2` and submit a pull request into master. The log
of the branch `feature_xyz` now looks like this:
```
$ git log feature_xyz --oneline --graph
* 893fffa5 (HEAD -> feature_xyz) My commit 2
* a2a22fb9 My commit 1
* efee9a2b (master) Merge pull request #something
|\
| * ead7f309 A commit on master
|/
* 55001c8d Merge pull request #something else
```
If now new commits are pushed to `master`, resulting in this log:
```
$ git log master --oneline --graph
* 0d636cc9 (HEAD -> master) Hotfix for issue #something completely different
* efee9a2b Merge pull request #something
|\
| * ead7f309 A commit on master
|/
* 55001c8d Merge pull request #something else
```
then the branch `feature_xyz` is out of date with `master`, because it lacks the
commit `0d636cc9`. We could now merge both branches (via the CLI or GitHub's
`Update Branch` button), but that will result in a messy history. Thus **don't**
do it! If you do it, you'll have to remove the merge commits manually.
Instead run: `git pull --rebase upstream master` in the `feature_xyz`
branch. Git will pull the new commit `0d636cc9` from master into your branch
`feature_xyz` and apply the two commits `My commit 1` and `My commit 2` on top
of it:
```
$ git log feature_xyz --oneline --graph
* 22a7a8c2 (HEAD -> feature_xyz) My commit 2
* efe2ccdc My commit 1
* 0d636cc9 (master) Hotfix for issue #something completely different
* efee9a2b Merge pull request #something
|\
| * ead7f309 A commit on master
|/
* 55001c8d Merge pull request #something else
```
Please note that the hashes of `My commit 1` and `My commit 2` changed! That
happened because their parent changed. Therefore you have to force push via
`git push --force` the next time you push your branch upstream.
## Merging pull requests
Most pull requests should be merged by creating a merge commit (the default on
GitHub). Small pull requests (= only one commit) can be rebased on top of
master.
## Branches and tags
- The `master` branch is the current "main" development branch. It is protected
so that changes can be only included via reviewed pull requests. New releases
are made by tagging a specific commit on `master`.
- Releases are tagged with a tag of the form `v$major.$minor`. The tag is not
changed when changes are backported.
- For each release a branch of the form `$major.$minor` should be created to
store backported changes. It should be branched of from `master` at the commit
which was tagged with `v$major.$minor`.
- All other branches are development branches for pull requests, experiments,
etc. They should be deleted once the pull request got merged or the branch is
no longer useful.
- Exiv2 team members can create branches for pull requests in the main
repository if they want to collaborate with others (e.g. for big changes that
require a lot of work). No one should `push --force` in these branches
without coordinating with others, and even then only `push --force-with-lease`
should be used.
When only one person will work on a pull request, then the branch can be
created in their personal fork or in the main repository (note that branches
in the main repository get automatic continuous integration).
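A short sketch of the release-branch convention described above, using the existing `v0.27` tag as an example (this assumes you have push rights to the main repository, called `upstream` here):
```bash
# create the maintenance branch at the commit tagged v0.27 and publish it
git checkout -b 0.27 v0.27
git push upstream 0.27
```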
## Backporting changes
We try to backport critical bugfixes to the latest released version on a
best-effort basis. We lack the manpower to support older releases, but accept
patches for these.
Bugfixes for crashes, memory corruptions, overflows and other potentially
dangerous bugs **must** be backported. The same applies to bugfixes for issues
that got a CVE assigned.
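A backport is then typically a cherry-pick of the fix from `master` onto the release branch (the commit hash and branch name below are placeholders):
```bash
git checkout 0.27
git cherry-pick <sha-of-the-bugfix-commit>
git push upstream 0.27
```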
## Final remarks
Since git is a fully distributed version control system, all changes stay on
your machine until you push them. Thus, if you are in doubt whether a trickier
step with git might screw up your repository, you can simply create a backup of
your whole exiv2 folder. In case the tricky step went downhill, you can restore
your working copy of exiv2 and no one will ever know (unless you did a `git
push`)!
## Additional material
- [The git book](https://git-scm.com/book/en/v2/)
- `man git` and `man git $command`
- [amending and interactive
rebase](https://git-scm.com/book/en/v2/Git-Tools-Rewriting-History)
- [interactive
staging](https://git-scm.com/book/en/v2/Git-Tools-Interactive-Staging) (for
Emacs users: consider using [magit](https://magit.vc/) for interactive
staging)

575
README-CONAN.md Normal file
View File

@ -0,0 +1,575 @@
| Travis | AppVeyor | GitLab| Codecov| Repology| Chat |
|:-------------:|:-------------:|:-----:|:------:|:-------:|:----:|
| [![Build Status](https://travis-ci.org/Exiv2/exiv2.svg?branch=0.27-maintenance)](https://travis-ci.org/Exiv2/exiv2) | [![Build status](https://ci.appveyor.com/api/projects/status/d6vxf2n0cp3v88al/branch/0.27-maintenance?svg=true)](https://ci.appveyor.com/project/piponazo/exiv2-wutfp/branch/0.27-maintenance) | [![pipeline status](https://gitlab.com/D4N/exiv2/badges/0.27-maintenance/pipeline.svg)](https://gitlab.com/D4N/exiv2/commits/0.27-maintenance) | [![codecov](https://codecov.io/gh/Exiv2/exiv2/branch/0.27-maintenance/graph/badge.svg)](https://codecov.io/gh/Exiv2/exiv2) | [![Packaging status](https://repology.org/badge/tiny-repos/exiv2.svg)](https://repology.org/metapackage/exiv2/versions) | [![#exiv2-chat on matrix.org](matrix-standard-vector-logo-xs.png)](https://matrix.to/#/#exiv2-chat:matrix.org) |
![Exiv2](exiv2.png)
# Building Exiv2 and dependencies with conan
Conan is a portable package manager for C/C++ libraries. It can be used to create all dependencies needed to build Exiv2, without needing to install system packages.
This document provides a step-by-step guide to show you the basic usage of conan. For more details about the tool,
please visit the [Conan documentation website](http://docs.conan.io/en/latest/).
Although we provide step-by-step instructions to enable you to build Exiv2 with conan, we recommend that you read conan's documentation to understand the main concepts: [Getting started with Conan](http://docs.conan.io/en/latest/getting_started.html)
To build Exiv2 with conan, you will also need to install CMake. https://cmake.org/download/
_**We do not recommend using conan on MinGW, Cygwin, Unix or to cross compile from Linux to those platforms.**<br>
The build procedures for those platforms are discussed here: See [README.md](README.md)_
<name id="TOC"></a>
----
### TABLE OF CONTENTS
1. [Step by Step Guide](#1)
1. [Install conan](#1-1)
2. [Test conan installation](#1-2)
3. [Create a build directory](#1-3)
4. [Build dependencies, create build environment, build and test](#1-4)
2. [Platform Notes](#2)
1. [Linux Notes](#2-1)
2. [Visual Studio Notes](#2-2)
3. [Conan Architecture](#3)
1. [conanfile.py](#3-1)
2. [Conan Recipes](#3-2)
3. [Conan server search path](#3-3)
4. [Configuring conan on your machine](#3-4)
4. [Building Exiv2 with Adobe XMPsdk 2016](#4)
1. [Add a remote directory to conan's recipe search path](#4-1)
2. [Build dependencies and install conan artefacts in your build directory](#4-2)
3. [Execute cmake to generate build files for your environment](#4-3)
4. [Build Exiv2 and link Adobe XMPsdk library](#4-4)
5. [Webready Support](#5)
<name id="1"></a>
----
# 1 Step by Step Guide
<name id="1-1"></a>
##### 1.1) Install conan:
```bash
$ pip install conan
```
For other installation methods (brew, installers, from sources), visit the
[conan installation documentation](http://docs.conan.io/en/latest/installation.html).
To upgrade the version of conan:
```bash
$ pip install conan --upgrade
```
<name id="1-2"></a>
##### 1.2) Test conan installation
```bash
$ conan --version
Conan version 1.23.0
```
<name id="1-3"></a>
##### 1.3) Create a build directory
Create a build directory and run the conan commands:
```bash
$ mkdir build
$ cd build
$ conan profile list
```
_**Visual Studio Users**_
_The profile msvc2019Release64 in `%USERPROFILE%\.conan\profiles\msvc2019Release64` is:_
```ini
[build_requires]
[settings]
arch=x86_64
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=16
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]
```
_Profiles for Visual Studio are discussed in detail here: [Visual Studio Notes](#2-2)_
<name id="1-4"></a>
##### 1.4) Build dependencies, create build environment, build and test
| | Build Steps | Linux and macOS | Visual Studio |
|:-- |:-------------------------------------------------------------------------|-----------------------|------------------------------|
| _**1**_ | Get conan to fetch dependencies<br><br>The output can be quite<br>long as conan downloads and/or builds<br>zlib, expat, curl and other dependencies.| $ conan install ..<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;--build missing | c:\\..\\build> conan install .. --build missing<br>&nbsp;&nbsp;&nbsp;&nbsp;--profile msvc2019Release64 |
| _**2**_ | Get cmake to generate<br>makefiles or sln/vcxproj | $ cmake .. | c:\\..\\build> cmake&nbsp;..&nbsp;-G&nbsp;"Visual Studio 16 2019"
| _**3**_ | Build | $ cmake --build . | c:\\..\\build>&nbsp;cmake&nbsp;--build&nbsp;.&nbsp;--config&nbsp;Release<br>You may prefer to open exiv2.sln and build using the IDE. |
| _**4**_ | Optionally Run Test Suite<br/>Test documentation: [README.md](README.md) | $ ctest | c:\\..\\build>&nbsp;ctest -C Release |
[TOC](#TOC)
<name id="2"></a>
## 2) Platform Notes
<name id="2-1"></a>
### 2.1) Linux Notes
##### Default Profile
When you run conan install for the first time, it will detect and write the default profile ~/.conan/profiles/default. On my Ubuntu system with GCC 4.9, this is:
```ini
[settings]
os=Linux
os_build=Linux
arch=x86_64
arch_build=x86_64
compiler=gcc
compiler.version=4.9
compiler.libcxx=libstdc++
build_type=Release
[options]
[build_requires]
[env]
```
##### Changing profile settings
One of the most important **profile** settings to be adjusted in your conan profile when working on Linux is the field:
```bash
compiler.libcxx=libstdc++11 # Possible values: libstdc++, libstdc++11, libc++
```
With the arrival of the C++11 standard and the growing popularity of the *clang* compiler, it is increasingly important to choose which version of the standard library to use (this corresponds to the `-stdlib` compiler flag).
Recommended **libcxx** settings for conan with different compilers:
```bash
compiler.libcxx=libstdc++11 # will use -stdlib=libstdc++ and define _GLIBCXX_USE_CXX11_ABI=1
compiler.libcxx=libstdc++ # will use -stdlib=libstdc++ and define _GLIBCXX_USE_CXX11_ABI=0
compiler.libcxx=libc++ # will use -stdlib=libc++
```
As a rule of thumb, set `compiler.libcxx=libstdc++11` when using a version of gcc >= 5.1.
More information about how the standard library's [dual ABI in GCC](https://gcc.gnu.org/onlinedocs/libstdc++/manual/using_dual_abi.html) works can be found in the GCC documentation.
Please be aware that normally, when using gcc >= 5.1, \_GLIBCXX\_USE\_CXX11\_ABI is set to 1 by default. However, some Linux
distributions might set that definition to 0 by default. If you get linking errors about standard C++ containers or
algorithms when building the Exiv2 dependencies with conan, this might indicate a mismatch between the value set in
**compiler.libcxx** and the default value used by your distribution.
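If you run into such a mismatch, the setting can also be overridden on the conan command line instead of editing the profile (using the same `-s` mechanism shown in the Visual Studio notes below):
```bash
$ conan install .. -s compiler.libcxx=libstdc++11 --build missing
```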
[TOC](#TOC)
<name id="2-2"></a>
### 2.2) Visual Studio Notes
We recommend that you install python as discussed here: [https://github.com/Exiv2/exiv2/pull/1403#issuecomment-731836146](https://github.com/Exiv2/exiv2/pull/1403#issuecomment-731836146)
### Profiles for Visual Studio
Exiv2 v0.27 can be built with VS 2008, 2010, 2012, 2013, 2015, 2017 and 2019.
Exiv2 v0.28 is being "modernised" to C++11 and will not support C++98.
We don't expect Exiv2 v0.28 to build with VS versions earlier than VS 2015.
You create profiles in %HOMEPATH%\.conan\profiles with a text editor. For your convenience, you'll find profiles in `<exiv2dir>\cmake\msvc_conan_profiles`.
```
Profile := msvc{Edition}{Type}{Bits}
Edition := { 2019 | 2017 | 2015 }
Type := { Release | Debug }
Bits := { 64 | 32 }
Examples: msvc2019Release64 msvc2017Release32 msvc2015Debug32
```
The profile msvc2019Release64 is as follows:
```ini
[build_requires]
[settings]
arch=x86_64
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=16
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]
```
### Tools for Visual Studio 2019
You will need cmake version 3.14 (and up) and conan 1.14 (and up).
Additionally, when I upgraded to conan 1.14.3, I had to manually update the file `settings.yml` as follows. For me: `%USERPROFILE% == C:\Users\rmills`:
```bat
copy/y %USERPROFILE%\.conan\settings.yml.new %USERPROFILE%\.conan\settings.yml
```
### CMake Generators for Visual Studio
In the step-by-step guide, the command `$ cmake ..` uses
the default CMake generator. Always use the generator for your version of Visual Studio. For example:
```bat
c:\..\build> conan install .. --build missing --profile msvc2019Release64
c:\..\build> cmake .. -G "Visual Studio 16 2019" -A x64
c:\..\build> cmake --build . --config Release
```
CMake provides Generators for different editions of Visual Studio. The 64 and 32 bit Generators have different names:
| Architecture | Visual Studio 2019 | Visual Studio 2017 | Visual Studio 2015 |
|:-------------|--------------------|--------------------|--------------------|
| 64 bit | -G "Visual Studio 16 2019" -A x64 | -G "Visual Studio 15 2017 Win64" | -G "Visual Studio 14 2015 Win64" |
| 32 bit | -G "Visual Studio 16 2019" -A Win32 | -G "Visual Studio 15 2017" | -G "Visual Studio 14 2015" |
### Recommended settings for Visual Studio
##### 64 bit Release Build
| | Visual Studio 2019 | Visual Studio 2017 | Visual Studio 2015|
|:---------|--------------------|--------------------|--------------|
| _**conan install .. --profile**_ | msvc2019Release64 | msvc2017Release64 | msvc2015Release64 |
| _**cmake**_ | -G "Visual Studio 16 2019" -A x64 | -G "Visual Studio 15 2017 Win64" | -G "Visual Studio 14 2015 Win64" |
| _**profile**_<br><br><br><br><br><br><br>_ | arch=x86\_64<br>arch\_build=x86\_64<br>build\_type=Release<br>compiler.runtime=MD<br>compiler.version=16<br>compiler=Visual Studio<br>os=Windows<br>os\_build=Windows | arch=x86\_64<br>arch\_build=x86\_64<br>build\_type=Release<br>compiler.runtime=MD<br>compiler.version=15<br>compiler=Visual Studio<br>os=Windows<br>os\_build=Windows | arch=x86\_64<br>arch\_build=x86\_64<br>build\_type=Release<br>compiler.runtime=MD<br>compiler.version=14 <br>compiler=Visual Studio<br>os=Windows<br>os\_build=Windows |
##### Debug Builds
|| Visual Studio 2019 | Visual Studio 2017 | Visual Studio 2015 |
|:-------|-------|------|--------------|
| _**conan install .. --profile**_ | msvc2019Debug64 | msvc2017Debug64 | msvc2015Debug64 |
| _**profile**_<br>_ | build\_type=Debug<br>compiler.runtime=MDd | build\_type=Debug<br>compiler.runtime=MDd | build_type=Debug<br>compiler.runtime=MDd |
##### 32bit Builds
|| Visual Studio 2019 | Visual Studio 2017 | Visual Studio 2015 |
|:-----------|--------------------|--------------------|--------------------|
| _**conan install .. --profile**_ | msvc2019Release32 | msvc2017Release32 | msvc2015Release32 |
| _**cmake**_ | -G "Visual Studio 16 2019" -A Win32 | -G "Visual Studio 15 2017" | -G "Visual Studio 14 2015" |
| _**profile**_<br>_ | arch=x86<br>arch\_build=x86 | arch=x86<br>arch\_build=x86 | arch=x86<br>arch\_build=x86 |
##### Static Builds
The default builds of Exiv2 and sample applications build and use DLLs.
To build static libraries, use the cmake option `-DBUILD_SHARED_LIBS=Off`. You will probably also want to use the static run-time. The default is to use the dynamic run-time library.
```bash
$ cmake .. -DBUILD_SHARED_LIBS=Off -DEXIV2_ENABLE_DYNAMIC_RUNTIME=Off
```
If you wish to use the static C run-time library, use the following option in the conan profile.
| | Static Release | Static Debug |
|:--- |:--------- |:-------------------|
| **profile setting** | compiler.runtime=MT | compiler.runtime=MTd |
If you receive a linker warning concerning `LIBCMT`, it is because you are attempting to link libraries which have been built with different run-time libraries.
You should link everything with the dynamic or static run-time. You can link a static library with the dynamic run-time if you wish.
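Putting these pieces together, a 64-bit static Release build with the static run-time might look like this (a sketch that combines the profile, settings and options above):
```bash
$ conan install .. --profile msvc2019Release64 -s compiler.runtime=MT --build missing
$ cmake .. -G "Visual Studio 16 2019" -A x64 -DBUILD_SHARED_LIBS=Off -DEXIV2_ENABLE_DYNAMIC_RUNTIME=Off
$ cmake --build . --config Release
```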
### Changing profile settings with the conan command
It is recommended that you use profiles provided in `<exiv2dir>\cmake\msvc_conan_profiles`.
You can modify profile settings on the command line.
The following example demonstrates making substantial changes to profile settings by performing a 32 bit build using Visual Studio 2015 with a 2017 profile! This example is not considered good practice; it is an illustration of conan's flexibility, which may be useful when your build environment is automated.
```bash
$ conan install .. --profile msvc2017Release64 -s arch_build=x86 -s arch=x86 -s compiler.version=14
$ cmake .. -G "Visual Studio 2015"
$ cmake --build . --config Release
```
[TOC](#TOC)
<name id="3">
## 3 Conan Architecture
<name id="3-1">
##### 3.1) conanfile.py
In the root level of the **Exiv2** repository, the file `conanfile.py` defines C/C++ dependencies with the syntax: `Library/version@user/channel`
For example, **zlib**:
```python
self.requires('zlib/1.2.11@conan/stable')
```
[TOC](#TOC)
<name id="3-2">
##### 3.2) Conan _**Recipes**_
Conan searches remote servers for a _**recipe**_ to build a dependency.
A _**recipe**_ is a python file which indicates how to build a library from sources. The recipe
understands configurations: Platform/Compiler/Settings. If the remote server has a pre-compiled package for
your configuration, it will be downloaded. Otherwise, conan will compile the libraries on your machine using instructions in the recipe.
To illustrate, here is a list of packages returned by the command `$ conan search`:
```bash
$ conan search --remote conan-center zlib/1.2.11@conan/stable
```
The output should be:
```bash
Existing packages for recipe zlib/1.2.11@conan/stable:
Package_ID: 0000193ac313953e78a4f8e82528100030ca70ee
[options]
shared: False
[settings]
arch: x86_64
build_type: Debug
compiler: gcc
compiler.version: 4.9
os: Linux
Outdated from recipe: False
Package_ID: 014be746b283391f79d11e4e8af3154344b58223
[options]
shared: False
[settings]
arch: x86_64
build_type: Debug
compiler: gcc
compiler.exception: seh
compiler.threads: posix
compiler.version: 5
os: Windows
Outdated from recipe: False
... deleted ....
```
[TOC](#TOC)
<name id="3-3">
##### 3.3) Conan server search path
Conan searches remote servers for a _**recipe**_ to build the dependency. You can list them with the command:
```bash
$ conan remote list
```
You can add servers to the conan server search path:
```bash
$ conan remote add conan-piponazo https://api.bintray.com/conan/piponazo/piponazo
```
[TOC](#TOC)
<name id="3-4">
##### 3.4) Configuring conan on your machine
Conan stores its configuration and local builds in the directory ~/.conan (%HOMEPATH%\\.conan on Windows).
Conan installs several files and two directories:
```bash
$HOME/.conan/profiles Configuration files for compilers/platforms
$HOME/.conan/data Dependencies are built/stored in this directory
```
[TOC](#TOC)
<name id="3-5">
##### 3.5) Running `conan install` for the first time
The first time you run `$ conan install`, it will auto-detect your configuration and store a default profile in the file
$HOME/.conan/profiles/default
Normally you will want to define new profiles to choose different compilers (msvc, gcc, clang), different
build types (Release, Debug) and runtimes (MD, MT, MDd, MTd).
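One way to create such a profile is to start from the auto-detected settings and then adjust them (a sketch using conan 1.x commands; the profile name `gcc_debug` is made up):
```bash
$ conan profile new gcc_debug --detect
$ conan profile update settings.build_type=Debug gcc_debug
$ conan install .. --profile gcc_debug --build missing
```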
When you run `conan install` for the first time, the expected output should be something like this:
```bash
$ conan install .. --build missing
Expat/2.2.5@pix4d/stable: Retrieving from predefined remote 'conan-center'
Expat/2.2.5@pix4d/stable: Trying with 'conan-center'...
Downloading conanmanifest.txt
[==================================================] 220B/220B
Downloading conanfile.py
[==================================================] 1.7KB/1.7KB
zlib/1.2.11@conan/stable: Retrieving from predefined remote 'conan-center'
zlib/1.2.11@conan/stable: Trying with 'conan-center'...
Downloading conanmanifest.txt
[==================================================] 121B/121B
Downloading conanfile.py
[==================================================] 5.7KB/5.7KB
libcurl/7.56.1@bincrafters/stable: Retrieving from predefined remote 'bincrafters'
libcurl/7.56.1@bincrafters/stable: Trying with 'bincrafters'...
Downloading conanmanifest.txt
...
PROJECT: Installing D:\Dev\Windows\projects\exiv2\conanfile.py
Requirements
Expat/2.2.5@pix4d/stable from 'conan-center'
OpenSSL/1.0.2n@conan/stable from 'conan-center'
gtest/1.8.0@bincrafters/stable from 'conan-center'
libcurl/7.56.1@bincrafters/stable from 'bincrafters'
zlib/1.2.11@conan/stable from 'conan-center'
Packages
Expat/2.2.5@pix4d/stable:6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7
OpenSSL/1.0.2n@conan/stable:606fdb601e335c2001bdf31d478826b644747077
gtest/1.8.0@bincrafters/stable:a35f8fa327837a5f1466eaf165e1b6347f6e1e51
libcurl/7.56.1@bincrafters/stable:e37838f02fd790447943465f1c9317fd1c59b95c
zlib/1.2.11@conan/stable:6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7
PROJECT: Retrieving package 6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7
Expat/2.2.5@pix4d/stable: Looking for package 6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7 in remote 'conan-center'
Downloading conanmanifest.txt
[==================================================] 323B/323B
Downloading conaninfo.txt
[==================================================] 438B/438B
Downloading conan_package.tgz
[==================================================] 133.6KB/133.6KB
Expat/2.2.5@pix4d/stable: Package installed 6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7
PROJECT: Retrieving package a35f8fa327837a5f1466eaf165e1b6347f6e1e51
gtest/1.8.0@bincrafters/stable: Looking for package a35f8fa327837a5f1466eaf165e1b6347f6e1e51 in remote 'conan-center'
Downloading conanmanifest.txt
[==================================================] 3.5KB/3.5KB
Downloading conaninfo.txt
[==================================================] 478B/478B
Downloading conan_package.tgz
[==================================================] 1001.1KB/1001.1KB
gtest/1.8.0@bincrafters/stable: Package installed a35f8fa327837a5f1466eaf165e1b6347f6e1e51
PROJECT: Retrieving package 6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7
...
PROJECT: Generator cmake created conanbuildinfo.cmake
PROJECT: Generator txt created conanbuildinfo.txt
PROJECT: Generated conaninfo.txt
PROJECT imports(): Copied 5 '.dll' files
(conan)
```
Note that it first downloads the recipes and then the binary packages. When everything goes well, conan finds
the recipes in the remotes, and it also finds packages for our configuration (msvc2017, Release, MD).
However, if you use another configuration for which there are no packages in the remotes, you will get an error such as:
```bash
PROJECT: WARN: Can't find a 'zlib/1.2.11@conan/stable' package for the specified options and settings:
- Settings: arch=x86_64, build_type=Release, compiler=clang, compiler.version=3.9, os=Macos
- Options: shared=False
ERROR: Missing prebuilt package for 'zlib/1.2.11@conan/stable'
Try to build it from sources with "--build zlib"
Or read "http://docs.conan.io/en/latest/faq/troubleshooting.html#error-missing-prebuilt-package"
```
In that case, we can tell conan to build the library:
```bash
$ conan install .. --profile MyEsotericProfile --build missing
```
Once the command succeeds, we will have the libraries in our system (you can find the recipes and packages in
`$HOME/.conan/data`). When you execute the command `conan install` with the same profile, the following output is typical:
```bash
$ conan install ..
PROJECT: Installing D:\Dev\Windows\projects\exiv2\conanfile.py
Requirements
Expat/2.2.5@pix4d/stable from 'conan-center'
OpenSSL/1.0.2n@conan/stable from 'conan-center'
gtest/1.8.0@bincrafters/stable from 'conan-center'
libcurl/7.56.1@bincrafters/stable from 'bincrafters'
zlib/1.2.11@conan/stable from 'conan-center'
Packages
Expat/2.2.5@pix4d/stable:6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7
OpenSSL/1.0.2n@conan/stable:606fdb601e335c2001bdf31d478826b644747077
gtest/1.8.0@bincrafters/stable:a35f8fa327837a5f1466eaf165e1b6347f6e1e51
libcurl/7.56.1@bincrafters/stable:e37838f02fd790447943465f1c9317fd1c59b95c
zlib/1.2.11@conan/stable:6cc50b139b9c3d27b3e9042d5f5372d327b3a9f7
Expat/2.2.5@pix4d/stable: Already installed!
gtest/1.8.0@bincrafters/stable: Already installed!
zlib/1.2.11@conan/stable: Already installed!
OpenSSL/1.0.2n@conan/stable: Already installed!
libcurl/7.56.1@bincrafters/stable: Already installed!
PROJECT: Generator cmake created conanbuildinfo.cmake
PROJECT: Generator txt created conanbuildinfo.txt
PROJECT: Generated conaninfo.txt
PROJECT imports(): Copied 5 '.dll' files
```
This indicates that the packages were found in the local cache.
[TOC](#TOC)
<name id="4">
## 4 Building Exiv2 with Adobe XMPsdk 2016
With Exiv2 v0.27, you can build Exiv2 with Adobe XMPsdk 2016 on Linux/GCC, Mac/clang and Visual Studio 2017.
Other platforms such as Cygwin are not supported by Adobe. Adobe/XMPsdk is built as an external library.
Applications which wish to use the Adobe XMPsdk directly should build Exiv2 in this configuration, so that the
library can be used by both the application and Exiv2. The Adobe XMPsdk can be built as a static or shared library (.DLL).
To build Exiv2 with Adobe XMPsdk 2016, perform steps 1.1, 1.2 and 1.3 described above, then perform the following:
<name id="4-1">
##### 4.1) Add a remote directory to conan's recipe search path
By default, conan knows about several public conan repositories. Exiv2 requires
the **piponazo** repository to find the XmpSdk dependency, which is not available from the **conan-center** repository.
```bash
$ conan remote add conan-piponazo https://api.bintray.com/conan/piponazo/piponazo
```
<name id="4-2">
##### 4.2) Build dependencies and install conan artefacts in your build directory
```bash
$ conan install .. --options xmp=True --build missing
```
<name id="4-3">
##### 4.3) Execute cmake to generate build files for your environment:
You must tell CMake to link Adobe's library:
```bash
$ cmake .. -DEXIV2_ENABLE_EXTERNAL_XMP=On # -G "Visual Studio 15 2017 Win64" -DCMAKE_BUILD_TYPE=Release
```
**macOS** users should use the cmake _**Xcode**_ Generator
```bash
$ cmake .. -DEXIV2_ENABLE_EXTERNAL_XMP=On -G Xcode
```
<name id="4-4">
##### 4.4) Build Exiv2 and link Adobe XMPsdk library
```bash
$ cmake --build . --config Release
```
[TOC](#TOC)
<name id="5">
## 5 Webready Support
Exiv2 can perform I/O using internet protocols such as http, https and ftp.
The feature is disabled by default. You will need to instruct conan to build/download necessary libraries (curl, openssl and libssh) and tell CMake to link to the libraries.
```bash
$ conan install .. --options webready=True
$ cmake -DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_ENABLE_CURL=ON -DEXIV2_ENABLE_SSH=ON ..
```
[TOC](#TOC)
Written by Robin Mills<br>robin@clanmills.com<br>Updated: 2021-12-19

659
README-SAMPLES.md Normal file
View File

@ -0,0 +1,659 @@
| Travis | AppVeyor | GitLab| Codecov| Repology| Chat |
|:-------------:|:-------------:|:-----:|:------:|:-------:|:----:|
| [![Build Status](https://travis-ci.org/Exiv2/exiv2.svg?branch=0.27-maintenance)](https://travis-ci.org/Exiv2/exiv2) | [![Build status](https://ci.appveyor.com/api/projects/status/d6vxf2n0cp3v88al/branch/0.27-maintenance?svg=true)](https://ci.appveyor.com/project/piponazo/exiv2-wutfp/branch/0.27-maintenance) | [![pipeline status](https://gitlab.com/D4N/exiv2/badges/0.27-maintenance/pipeline.svg)](https://gitlab.com/D4N/exiv2/commits/0.27-maintenance) | [![codecov](https://codecov.io/gh/Exiv2/exiv2/branch/0.27-maintenance/graph/badge.svg)](https://codecov.io/gh/Exiv2/exiv2) | [![Packaging status](https://repology.org/badge/tiny-repos/exiv2.svg)](https://repology.org/metapackage/exiv2/versions) | [![#exiv2-chat on matrix.org](matrix-standard-vector-logo-xs.png)](https://matrix.to/#/#exiv2-chat:matrix.org) |
![Exiv2](exiv2.png)
# Exiv2 Sample Applications
Exiv2 is a C++ library and a command line utility to read, write, delete and modify Exif, IPTC, XMP and ICC image metadata. Exiv2 also features a collection of sample and test command-line programs. Please be aware that while the program _**exiv2**_ enjoys full support from Team Exiv2, the other programs have been written for test, documentation or development purposes. You are expected to read the code to discover the specification of programs other than _**exiv2**_.
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="TOC1">
### Sample Programs
The following programs are built and installed in /usr/local/bin (see the build sketch after the table).
| Name | Purpose | More information | Code |
|:--- |:--- |:--- |:-- |
| _**addmoddel**_ | Demonstrates Exiv2 library APIs to add, modify or delete metadata | [addmoddel](#addmoddel) | [addmoddel.cpp](samples/addmoddel.cpp) |
| _**exifcomment**_ | Set Exif.Photo.UserComment in an image | [exifcomment](#exifcomment) | [exifcomment.cpp](samples/exifcomment.cpp) |
| _**exifdata**_ | Prints _**Exif**_ metadata in different formats in an image | [exifdata](#exifdata) | [exifdata.cpp](samples/exifdata.cpp) |
| _**exifprint**_ | Print _**Exif**_ metadata in images<br>Miscellaneous other features | [exifprint](#exifprint)| [exifprint.cpp](samples/exifprint.cpp) |
| _**exifvalue**_ | Prints the value of a single _**Exif**_ tag in a file | [exifvalue](#exifvalue) | [exifvalue.cpp](samples/exifvalue.cpp) |
| _**exiv2**_ | Command line utility to read, write, delete and modify Exif, IPTC, XMP and ICC image metadata.<br>This is the primary test tool used by Team Exiv2 and can exercise almost all code in the library. Due to the extensive capability of this utility, the APIs used are usually less obvious for casual code inspection. | [https://exiv2.org/manpage.html](https://exiv2.org/manpage.html)<br>[https://exiv2.org/sample.html](https://exiv2.org/sample.html) | |
| _**exiv2json**_ | Extracts data from image in JSON format.<br>This program also contains a parser to recursively parse Xmp metadata into vectors and objects. | [exiv2json](#exiv2json) | [exiv2json.cpp](samples/exiv2json.cpp) |
| _**geotag**_ | Reads GPX data and updates images with GPS Tags | [geotag](#geotag) | [geotag.cpp](samples/geotag.cpp) |
| _**iptceasy**_ | Demonstrates read, set or modify IPTC metadata | [iptceasy](#iptceasy) | [iptceasy.cpp](samples/iptceasy.cpp) |
| _**iptcprint**_ | Demonstrates Exiv2 library APIs to print Iptc data | [iptceasy](#iptceasy) | [iptcprint.cpp](samples/iptcprint.cpp) |
| _**metacopy**_ | Demonstrates copying metadata from one image to another | [metacopy](#metacopy) | [metacopy.cpp](samples/metacopy.cpp) |
| _**mrwthumb**_ | Sample program to extract a Minolta thumbnail from the makernote | [mrwthumb](#mrwthumb) | [mrwthumb.cpp](samples/mrwthumb.cpp) |
| _**taglist**_ | Print a simple comma separated list of tags defined in Exiv2 | [taglist](#taglist) |
| _**xmpdump**_ | Sample program to dump the XMP packet of an image | [xmpdump](#xmpdump) |
| _**xmpparse**_ | Read an XMP packet from a file, parse it and print all (known) properties. | [xmpparse](#xmpparse) | [xmpparse.cpp](samples/xmpparse.cpp) |
| _**xmpprint**_ | Read an XMP packet from a file, parse it and print all (known) properties. | [xmpprint](#xmpprint) | [xmpprint.cpp](samples/xmpprint.cpp) |
| _**xmpsample**_ | Demonstrates Exiv2 library high level XMP classes | [xmpsample](#xmpsample) | [xmpsample.cpp](samples/xmpsample.cpp) |
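The samples are produced by the normal CMake build. A minimal sketch (this assumes the `EXIV2_BUILD_SAMPLES` CMake option, which may already be enabled by default in your version):
```bash
$ cmake -DEXIV2_BUILD_SAMPLES=ON ..
$ cmake --build .
$ sudo cmake --build . --target install   # installs the programs listed above into /usr/local/bin
```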
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="TOC2">
### Test Programs
As Exiv2 is open source, we publish all our materials. The following programs are actively used in our test harness. Some were written during the development of features and their ongoing use may be limited, or even obsolete. In general these programs are published as source and Team Exiv2 will not provide support to users.
| Name | Kind | More information |
|:--- |:--- |:--- |
| _**conntest**_ | Test http/https/ftp/ssh/sftp connection | [conntest](#conntest) |
| _**convert-test**_ | Conversion test driver | [convert-test](#convert-test) |
| _**easyaccess-test**_ | Sample program using high-level metadata access functions | [easyaccess-test](#easyaccess-test) |
| _**getopt-test**_ | Sample program to test getopt() | [getopt-test](#getopt-test) |
| _**ini-test**_ | Shows simple usage of the INIReader class | [ini-test](#ini-test) |
| _**iotest**_ | Test programs for BasicIo functions. | [iotest](#iotest) |
| _**iptctest**_ | Sample program to test Iptc reading and writing. | [iptctest](#iptctest) |
| _**key-test**_ | Key unit tests | [key-test](#key-test) |
| _**largeiptc-test**_ | Test for large (>65535 bytes) IPTC buffer | [largeiptc-test](#largeiptc-test) |
| _**mmap-test**_ | Simple mmap tests | [mmap-test](#mmap-test) |
| _**path-test**_ | Test path IO | [path-test](#path-test) |
| _**prevtest**_ | Test access to preview images | [prevtest](#prevtest) |
| _**remotetest**_ | Tester application for testing remote i/o. | [remotetest](#remotetest) |
| _**stringto-test**_ | Test conversions from string to long, float and Rational types. | [stringto-test](#stringto-test) |
| _**tiff-test**_ | Simple TIFF write test | [tiff-test](#tiff-test) |
| _**werror-test**_ | Simple tests for the wide-string error class WError | [werror-test](#werror-test) |
| _**write-test**_ | ExifData write unit tests | [write-test](#write-test) |
| _**write2-test**_ | ExifData write unit tests for Exif data created from scratch | [write2-test](#write2-test) |
| _**xmpparser-test**_ | Read an XMP packet from a file, parse and re-serialize it. | [xmpparser-test](#xmpparser-test)|
[Sample](#TOC1) Programs [Test](#TOC2) Programs
## 2 Sample Program Descriptions
<div id="addmoddel">
#### addmoddel
```
Usage: addmoddel file
```
Demonstrates Exiv2 library APIs to add, modify or delete metadata. _Code: [addmoddel.cpp](samples/addmoddel.cpp)_
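The kind of calls the sample demonstrates look like this (a minimal sketch, not the sample source; error handling omitted and the path is illustrative):
```cpp
#include <exiv2/exiv2.hpp>
#include <cstdint>

int main()
{
    auto image = Exiv2::ImageFactory::open("test.jpg");
    image->readMetadata();
    Exiv2::ExifData& exifData = image->exifData();

    exifData["Exif.Image.Artist"] = "Team Exiv2";                   // add (or overwrite) a tag
    exifData["Exif.Image.Orientation"] = static_cast<uint16_t>(1);  // modify a tag

    // delete a tag, checking the iterator before using it
    auto pos = exifData.findKey(Exiv2::ExifKey("Exif.Image.Software"));
    if (pos != exifData.end()) exifData.erase(pos);

    image->writeMetadata();
    return 0;
}
```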
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="exifcomment">
#### exifcomment
```
Usage: exifcomment file
```
This is a simple program that demonstrates how to set _**Exif.Photo.UserComment**_ in an image. _Code: [exifcomment.cpp](samples/exifcomment.cpp)_
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="exifdata">
#### exifdata
```
Usage: exifdata file format
formats: csv | json | wolf | xml
```
This is a simple program to demonstrate dumping _**Exif**_ metadata in common formats. _Code: [exifdata.cpp](samples/exifdata.cpp)_
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="exifprint">
#### exifprint
```
Usage: exifprint [ path | --version | --version-test ]
```
| Arguments | Description |
|:-- |:--- |
| path | Path to image |
| --version | Print version information from build |
| --version-test | Tests Exiv2 VERSION API |
This program demonstrates how to print _**Exif**_ metadata in an image. This program is also discussed in the platform ReadMe.txt file included in a build bundle. The option **--version** was added to enable the user to build a test application which dumps the build information. The option **--version-test** was added to test the macro EXIV2\_TEST\_VERSION() in **include/exiv2/version.hpp**.
There is one other unique feature of this program. It is the only test/sample program which can use the EXV\_UNICODE\_PATH build feature of Exiv2 on Windows.
_Code: [exifprint.cpp](samples/exifprint.cpp)_
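At its core, exifprint is a loop over the ExifData of an image. A minimal sketch (not the sample source; error handling omitted):
```cpp
#include <exiv2/exiv2.hpp>
#include <iostream>

int main(int argc, char* argv[])
{
    if (argc != 2) {
        std::cerr << "Usage: " << argv[0] << " path\n";
        return 1;
    }
    auto image = Exiv2::ImageFactory::open(argv[1]);
    image->readMetadata();
    for (const auto& md : image->exifData()) {
        std::cout << md.key() << " " << md.typeName() << " "
                  << md.count() << " " << md.toString() << "\n";
    }
    return 0;
}
```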
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="exifvalue">
#### exifvalue
```
Usage: exifvalue file tag
```
Prints the value of a single _**Exif**_ tag in a file. _Code: [exifvalue.cpp](samples/exifvalue.cpp)_
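The essential API call is ExifData::findKey(); the returned iterator must be compared against end() before it is dereferenced. A minimal sketch (not the sample source; error handling omitted):
```cpp
#include <exiv2/exiv2.hpp>
#include <iostream>

int main(int argc, char* argv[])
{
    if (argc != 3) {
        std::cerr << "Usage: " << argv[0] << " file tag\n";
        return 1;
    }
    auto image = Exiv2::ImageFactory::open(argv[1]);
    image->readMetadata();
    Exiv2::ExifData& exifData = image->exifData();

    auto pos = exifData.findKey(Exiv2::ExifKey(argv[2]));
    if (pos == exifData.end()) {   // always check before dereferencing
        std::cerr << "Tag not found\n";
        return 1;
    }
    std::cout << pos->value() << "\n";
    return 0;
}
```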
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="exiv2json">
#### exiv2json
```
Usage: exiv2json [-option] file
Option: all | exif | iptc | xmp | filesystem
```
| Arguments | Description |
|:-- |:--- |
| all | All metadata |
| filesystem | Filesystem metadata |
| exif | Exif metadata |
| iptc | Iptc metadata |
| xmp | Xmp metadata |
| file | path to image |
This program dumps metadata from an image in JSON format. _Code: [exiv2json.cpp](samples/exiv2json.cpp)_
exiv2json has a recursive parser to encode XMP into Vectors and Objects. XMP data is XML and can contain XMP `Bag` and `Seq` structures, which are converted to JSON Objects and Arrays. Exiv2 presents data in the format: Family.Group.Tag. For XMP, this results in "flat" output such as:
```
$ exiv2 -px ~/Stonehenge.jpg
Xmp.xmp.Rating XmpText 1 0
Xmp.xmp.ModifyDate XmpText 25 2015-07-16T20:25:28+01:00
Xmp.dc.description LangAlt 1 lang="x-default" Classic View
```
exiv2json parses the Exiv2 'Family.Group.Tag' data and restores the structure of the original data in JSON. _Code: [exiv2json.cpp](samples/exiv2json.cpp)_
```
$ exiv2json -xmp http://clanmills.com/Stonehenge.jpg
{
"Xmp": {
"xmp": {
"Rating": "0",
"ModifyDate": "2015-07-16T20:25:28+01:00"
},
"dc": {
"description": {
"lang": {
"x-default": "Classic View"
}
}
},
"xmlns": {
"dc": "http:\/\/purl.org\/dc\/elements\/1.1\/",
"xmp": "http:\/\/ns.adobe.com\/xap\/1.0\/"
}
}
}
$
```
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="geotag">
#### geotag
```
Usage: geotag {-help|-version|-dst|-dryrun|-ascii|-verbose|-adjust value|-tz value|-delta value}+ path+
```
Geotag reads one or more GPX files and adds GPS Tags to images. _Code: [geotag.cpp](samples/geotag.cpp)_
If the path is a directory, geotag will read all the files in the directory. It constructs a time dictionary of position data, then updates every image with GPS Tags.
| Arguments | Description |
|:-- |:--- |
| -help | print usage statement |
| -version | Prints the date and time at which geotag.cpp was compiled. |
| -dst | Apply 1 hour adjustment for daylight saving time. |
| -dryrun | Read arguments and print report. Does not modify images. |
| -ascii | Output in ascii (not UTF8). Prints `deg` instead of &deg;. |
| -verbose | Report progress. |
| -adjust value | Add/subtract time from image data. |
| -tz value | Specify time zone. For example PST = -8:00 |
| -delta value | Correction between Image DateTime and GPS time. |
| path+ | One or more directories, image paths or gpx paths. Directories are searched for gpx and images |
I use this program frequently. I have a little Canon camera which I take when I run. My Samsung Galaxy Watch uploads my runs to Strava and I download the GPX. If I'm in another time-zone and have forgotten to change the time setting in the camera, I use `-adjust` to alter the images. The GPX time is always correct, however the camera is normally off by seconds or minutes. This option enables you to correct for inaccuracy in the setting of the camera time.
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="iptceasy">
#### iptceasy
```
Usage: iptceasy file
Reads and writes raw metadata. Use -h option for help.
```
Demonstrates read, set or modify IPTC metadata. _Code: [iptceasy.cpp](samples/iptceasy.cpp)_
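The Iptc API mirrors the Exif API; a minimal sketch of setting IPTC records (not the sample source; error handling omitted and the path/values are illustrative):
```cpp
#include <exiv2/exiv2.hpp>

int main()
{
    auto image = Exiv2::ImageFactory::open("test.jpg");
    image->readMetadata();
    Exiv2::IptcData& iptcData = image->iptcData();

    iptcData["Iptc.Application2.Headline"] = "Stonehenge at dusk";
    iptcData["Iptc.Application2.Keywords"] = "monument";

    image->writeMetadata();
    return 0;
}
```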
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="iptcprint">
#### iptcprint
```
Usage: iptcprint file
Reads and writes raw metadata. Use -h option for help.
```
Demonstrates Exiv2 library APIs to print Iptc data. _Code: [iptcprint.cpp](samples/iptcprint.cpp)_
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="metacopy">
#### metacopy
```
Usage: metacopy [-iecxaph] readfile writefile
Reads and writes raw metadata. Use -h option for help.
```
Metacopy is used to copy a complete metadata block from one file to another. _Code: [metacopy.cpp](samples/metacopy.cpp)_
Please note that some metadata such as Exif.Photo.PixelXDimension is considered to be part of the image and will not be copied.
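In API terms the copy boils down to reading one image and assigning its metadata blocks to another (a sketch; the real program adds option handling for the individual -iecxaph switches, and the paths are illustrative):
```cpp
#include <exiv2/exiv2.hpp>

int main()
{
    auto reader = Exiv2::ImageFactory::open("readfile.jpg");
    reader->readMetadata();

    auto writer = Exiv2::ImageFactory::open("writefile.jpg");
    writer->setExifData(reader->exifData());
    writer->setIptcData(reader->iptcData());
    writer->setXmpData(reader->xmpData());
    writer->writeMetadata();
    return 0;
}
```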
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="mrwthumb">
#### mrwthumb
```
Usage: mrwthumb file
```
Sample program to extract a Minolta thumbnail from the makernote. _Code: [mrwthumb.cpp](samples/mrwthumb.cpp)_
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="xmpparse">
#### xmpparse
```
Usage: xmpparse file
```
Read an XMP packet from a file, parse it and print all (known) properties. _Code: [xmpparse.cpp](samples/xmpparse.cpp)_
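The core API is Exiv2::XmpParser::decode(), which turns a raw packet into an XmpData container. A minimal sketch (not the sample source; error handling omitted):
```cpp
#include <exiv2/exiv2.hpp>
#include <fstream>
#include <iostream>
#include <sstream>

int main(int argc, char* argv[])
{
    if (argc != 2) return 1;
    std::ifstream in(argv[1]);
    std::stringstream packet;
    packet << in.rdbuf();                                  // read the raw XMP packet

    Exiv2::XmpData xmpData;
    if (Exiv2::XmpParser::decode(xmpData, packet.str()) != 0) {
        std::cerr << "Failed to parse XMP packet\n";
        return 1;
    }
    for (const auto& prop : xmpData) {
        std::cout << prop.key() << " " << prop.toString() << "\n";
    }
    Exiv2::XmpParser::terminate();
    return 0;
}
```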
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="xmpprint">
#### xmpprint
```
Usage: xmpprint file
```
Read an XMP from a file, parse it and print all (known) properties. _Code: [xmpprint.cpp](samples/xmpprint.cpp)_
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="xmpsample">
#### xmpsample
```
Usage: xmpsample file
```
Demonstrates Exiv2 library high level XMP classes. _Code: [xmpsample.cpp](samples/xmpsample.cpp)_
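The high level classes let you build an XMP packet without touching an image at all. A minimal sketch (not the sample source; property names are illustrative):
```cpp
#include <exiv2/exiv2.hpp>
#include <iostream>

int main()
{
    Exiv2::XmpData xmpData;
    xmpData["Xmp.dc.source"]  = "xmpsample-sketch.cpp";   // simple text property
    xmpData["Xmp.xmp.Rating"] = "3";

    std::string packet;
    Exiv2::XmpParser::encode(packet, xmpData);            // serialize to an XMP packet
    std::cout << packet << std::endl;
    Exiv2::XmpParser::terminate();
    return 0;
}
```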
[Sample](#TOC1) Programs [Test](#TOC2) Programs
## 3 Test Program Descriptions
<div id="conntest">
#### conntest
```
Usage: conntest url {-http1_0}
```
Test http/https/ftp/ssh/sftp connection
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="convert-test">
#### convert-test
```
Usage: convert-test file
```
Conversion test driver
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="easyaccess-test">
#### easyaccess-test
```
Usage: easyaccess-test file
```
Sample program using high-level metadata access functions
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="getopt-test">
#### getopt-test
```
Usage: getopt-test
```
This program is used to test the function **getopt()**. Prior to Exiv2 v0.27, the sample programs used the platform's C Runtime Library function **getopt()**. Visual Studio builds used code in src/getopt.cpp. Due to differences between the platform implementations of **getopt()**, the code in src/getopt.cpp was modified and adopted on every platform. This test program was added for test and debug purposes. Please note that src/getopt.cpp is compiled and linked into the sample application and is not part of the Exiv2 library.
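For readers unfamiliar with **getopt()**, the pattern being exercised is the classic option-parsing loop (a generic sketch, not the test program itself):
```cpp
#include <iostream>
#include <unistd.h>   // on Windows builds, the bundled src/getopt.cpp provides getopt()

int main(int argc, char* argv[])
{
    int c;
    while ((c = getopt(argc, argv, "hv:")) != -1) {
        switch (c) {
            case 'h': std::cout << "help requested\n";           break;
            case 'v': std::cout << "value = " << optarg << "\n"; break;
            default:  std::cerr << "unknown option\n";           return 1;
        }
    }
    return 0;
}
```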
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="ini-test">
#### ini-test
```
Usage: ini-test
```
This program is used to test reading the file ini-test. This program was added in Exiv2 v0.26 when the ~/.exiv2 file was added to the Exiv2 architecture.
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="iotest">
#### iotest
```
Usage: iotest filein fileout1 fileout2 [remote [blocksize]]
copy filein to fileout1 and copy filein to fileout2
fileout1 and fileout2 are overwritten and should match filein exactly
You may optionally provide the URL of a remote file to be copied to filein
If you use `remote`, you may optionally provide a blocksize for the copy buffer (default 10k)
```
Test programs for BasicIo functions.
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="iptctest">
#### iptctest
```
Usage: iptctest image
Commands read from stdin.
```
Sample program to test Iptc reading and writing.
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="key-test">
#### key-test
```
Usage: key-test
```
Key unit tests
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="largeiptc-test">
#### largeiptc-test
```
Usage: largeiptc-test image datafile
```
Test for large (>65535 bytes) IPTC buffer
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="mmap-test">
#### mmap-test
```
Usage: mmap-test file
```
Simple mmap tests
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="path-test">
#### path-test
```
Usage: path-test file
```
Test path IO
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="prevtest">
#### prevtest
```
Usage: prevtest file
```
Test access to preview images
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="remotetest">
#### remotetest
```
Usage: remotetest remotetest file {--nocurl | --curl}
```
Tester application for testing remote i/o.
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="stringto-test">
#### stringto-test
```
Usage: stringto-test
```
Test conversions from string to long, float and Rational types.
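The conversions under test are the parse helpers declared in include/exiv2/types.hpp. A minimal sketch of how they are called (assuming the parseLong/parseFloat/parseRational helpers, which is what the test exercises):
```cpp
#include <exiv2/exiv2.hpp>
#include <iostream>

int main()
{
    bool ok = false;
    long l = Exiv2::parseLong("42", ok);
    std::cout << "long:     " << l << " ok=" << ok << "\n";

    float f = Exiv2::parseFloat("3.5", ok);
    std::cout << "float:    " << f << " ok=" << ok << "\n";

    Exiv2::Rational r = Exiv2::parseRational("4/3", ok);
    std::cout << "rational: " << r.first << "/" << r.second << " ok=" << ok << "\n";
    return 0;
}
```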
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="taglist">
#### taglist
```
$ taglist --help
Usage: taglist [--help]
[--group name|
Groups|Exif|Canon|CanonCs|CanonSi|CanonCf|Fujifilm|Minolta|Nikon1|Nikon2|Nikon3|Olympus|
Panasonic|Pentax|Sigma|Sony|Iptc|
dc|xmp|xmpRights|xmpMM|xmpBJ|xmpTPg|xmpDM|pdf|photoshop|crs|tiff|exif|aux|iptc|all|ALL
]
Print Exif tags, MakerNote tags, or Iptc datasets
```
Print a simple comma separated list of tags defined in Exiv2
This program encodes the library's tag definitions in ascii.
The data from this program is formatted as HTML on the web-site: https://exiv2.org/metadata.html
For example, to show the binary definition of Group `Nikon3`:
```
$ taglist Nikon3
Version, 1, 0x0001, Nikon3, Exif.Nikon3.Version, Undefined, Nikon Makernote version
ISOSpeed, 2, 0x0002, Nikon3, Exif.Nikon3.ISOSpeed, Short, ISO speed setting
ColorMode, 3, 0x0003, Nikon3, Exif.Nikon3.ColorMode, Ascii, Color mode
Quality, 4, 0x0004, Nikon3, Exif.Nikon3.Quality, Ascii, Image quality setting
WhiteBalance, 5, 0x0005, Nikon3, Exif.Nikon3.WhiteBalance, Ascii, White balance
Sharpening, 6, 0x0006, Nikon3, Exif.Nikon3.Sharpening, Ascii, Image sharpening setting
Focus, 7, 0x0007, Nikon3, Exif.Nikon3.Focus, Ascii, Focus mode
FlashSetting, 8, 0x0008, Nikon3, Exif.Nikon3.FlashSetting, Ascii, Flash setting
FlashDevice, 9, 0x0009, Nikon3, Exif.Nikon3.FlashDevice, Ascii, Flash device
...
```
We can see those tags being used:
```
$ exiv2 -pa --grep Nikon3 http://clanmills.com/Stonehenge.jpg
Exif.Nikon3.Version Undefined 4 2.11
Exif.Nikon3.ISOSpeed Short 2 200
...
```
This information is formatted on the web-site (search for "Nikon (format 3) MakerNote Tags"): [https://exiv2.org/tags-nikon.html](https://exiv2.org/tags-nikon.html)
#### taglist all
These options are provided to list every Exif tag known to Exiv2. The option `all` prints Group.Name for every tag. The option `ALL` prints Group.Name followed by the TagInfo for that tag. For example:
```bash
$ taglist all | grep ISOSpeed$
Photo.ISOSpeed
PanasonicRaw.ISOSpeed
CanonCs.ISOSpeed
CanonSi.ISOSpeed
Casio2.ISOSpeed
MinoltaCs5D.ISOSpeed
MinoltaCs7D.ISOSpeed
Nikon1.ISOSpeed
Nikon2.ISOSpeed
Nikon3.ISOSpeed
Olympus.ISOSpeed
Olympus2.ISOSpeed
```
```bash
$ taglist ALL | grep ISOSpeed,
Photo.ISOSpeed, 34867, 0x8833, Photo, Exif.Photo.ISOSpeed, Long, This tag indicates the ISO speed value of a camera or input device that is defined in ISO 12232. When recording this tag, the PhotographicSensitivity and SensitivityType tags shall also be recorded.
PanasonicRaw.ISOSpeed, 23, 0x0017, PanasonicRaw, Exif.PanasonicRaw.ISOSpeed, Short, ISO speed setting
CanonCs.ISOSpeed, 16, 0x0010, CanonCs, Exif.CanonCs.ISOSpeed, SShort, ISO speed setting
CanonSi.ISOSpeed, 2, 0x0002, CanonSi, Exif.CanonSi.ISOSpeed, Short, ISO speed used
Casio2.ISOSpeed, 20, 0x0014, Casio2, Exif.Casio2.ISOSpeed, Short, ISO Speed
MinoltaCs5D.ISOSpeed, 38, 0x0026, MinoltaCs5D, Exif.MinoltaCs5D.ISOSpeed, Short, ISO speed setting
MinoltaCs7D.ISOSpeed, 28, 0x001c, MinoltaCs7D, Exif.MinoltaCs7D.ISOSpeed, Short, ISO speed setting
Nikon1.ISOSpeed, 2, 0x0002, Nikon1, Exif.Nikon1.ISOSpeed, Short, ISO speed setting
Nikon2.ISOSpeed, 6, 0x0006, Nikon2, Exif.Nikon2.ISOSpeed, Short, ISO speed setting
Nikon3.ISOSpeed, 2, 0x0002, Nikon3, Exif.Nikon3.ISOSpeed, Short, ISO speed setting
Olympus.ISOSpeed, 4097, 0x1001, Olympus, Exif.Olympus.ISOSpeed, SRational, ISO speed value
Olympus2.ISOSpeed, 4097, 0x1001, Olympus, Exif.Olympus.ISOSpeed, SRational, ISO speed value
Sony1MltCs7D.ISOSpeed, 28, 0x001c, MinoltaCs7D, Exif.MinoltaCs7D.ISOSpeed, Short, ISO speed setting
```
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="tiff-test">
#### tiff-test
```
Usage: tiff-test file
```
Simple TIFF write test
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="werror-test">
#### werror-test
```
Usage: werror-test
```
Simple tests for the wide-string error class WError
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="write-test">
#### write-test
```
Usage: write-test file case
where case is an integer between 1 and 11
```
ExifData write unit tests
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="write2-test">
#### write2-test
```
Usage: write2-test file
```
ExifData write unit tests for Exif data created from scratch
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="xmpdump">
#### xmpdump
```
Usage: xmpdump file
```
Sample program to dump the XMP packet of an image
[Sample](#TOC1) Programs [Test](#TOC2) Programs
<div id="xmpparser-test">
#### xmpparser-test
```
Usage: xmpparser-test file
```
Read an XMP packet from a file, parse and re-serialize it.
[Sample](#TOC1) Programs [Test](#TOC2) Programs
Robin Mills<br>
robin@clanmills.com<br>
Revised: 2020-11-20

1306
README.md Normal file

File diff suppressed because it is too large Load Diff

39
SECURITY.md Normal file
View File

@ -0,0 +1,39 @@
# Security Policy
## Supported Versions
| Exiv2 Version | Branch | _Dot_ or _Security_ Releases |
|:-- |:-- |:-- |
| v0.27 | 0.27-maintenance | v0.27.1<br>v0.27.2<br>v0.27.3 |
| v0.26 | Branch 0.26 | None |
| v0.25 | Branch 0.25 | None |
## Security Process
Security alerts are published here: https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=Exiv2. We open an issue with the label "Security" on GitHub and fix it. It doesn't get special treatment and will be included in the next release of the branch.
Team Exiv2 does not back-port security (or any other) fixes to earlier releases of the code. An engineer at SUSE has patched and fixed some security issues for Exiv2 v0.26 and Exiv2 v0.25 in branches 0.26 and 0.25. Exiv2 has provided several _**Dot Releases**_ for v0.27. Exiv2 has never issued a _**Security Release**_.
The version numbering scheme is explained below. The design includes provision for a security release. A _**Dot Release**_ is an updated version of the library with security PRs and other changes. A _**Dot Release**_ offers the same API as its parent. A _**Security Release**_ is an existing release PLUS one or more security PRs. Nothing else is changed from its parent.
Users can register on GitHub.com to receive release notices for RC and GM Releases. Additionally, we inform users when we begin a project to create a new release on Facebook (https://facebook.com/exiv2) and Discuss Pixls (https://discuss.pixls.us). The announcement of a new release project has a preliminary specification and schedule.
## Version Numbering Scheme
| Version | Name | Status | Purpose |
|:-- |:-- |:-- |:-- |
| v0.27.7.3 | Exiv2 v0.27.3 | GM | Golden Master. This is the final and official release. |
| v0.27.3.2 | Exiv2 v0.27.3.2 | RC2 | Release Candidate 2. |
| v0.27.3.20 | Exiv2 v0.27.3.2 | RC2 Preview | Dry-run for release candidate. For team review. |
| v0.27.3.81 | Exiv2 v0.27.3 | Security Fix | Security Release |
| v0.27.3.29 | Exiv2 v0.27.3.29 | Development | Should never be installed for production. |
| v0.27.4.9 | Exiv2 v0.27.4.9 | Development | Should never be installed for production. |
| v0.27.99 | Exiv2 v0.28 | Development | Should never be installed for production. |
## Reported CVEs
| CVE | Description | Solution | PR |
|:-- |:-- |:-- |:-- |
| [CVE-2019-9144](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-9144) | Crash in BigTiffImage::printIFD | Remove src/bigtiffimage.cpp | [#1331](https://github.com/Exiv2/exiv2/pull/1331) |
| to be continued | | | |

345
WORK-IN-PROGRESS Normal file
View File

@ -0,0 +1,345 @@
T A B L E o f C O N T E N T S
---------------------------------
1 Building Adobe XMPsdk and Samples in Terminal with the ./Generate_XXX_mac.sh scripts
1.1 Amazing Discovery 1 DumpFile is linked to libstdc++.6.dylib
1.2 Amazing Discovery 2 Millions of "weak symbol/visibility" messages
4 Build design for v0.26.1
4.8 Support for MinGW
5 Refactoring the Tiff Code
5.1 Background
5.2 How does Exiv2 decode the ExifData in a JPEG?
5.3 How is metadata organized in Exiv2
5.4 Where are the tags defined?
5.5 How do the MakerNotes get decoded?
5.6 How do the encoders work?
6 Using external XMP SDK via Conan
==========================================================================
4 Build design for v0.26.1
Added : 2017-08-18
Modified: 2017-08-23
The purpose of v0.26.1 is to release bug fixes and
experimental new features which may become defaults with v0.27
4.8 Support for MinGW
MinGW msys/1.0 was deprecated when v0.26 was released.
No support for MinGW msys/1.0 will be provided.
It's very likely that the MinGW msys/1.0 will build.
I will not provide any user support for MinGW msys/1.0 in future.
MinGW msys/2.0 might be supported as "experimental" in Exiv2 v0.26.2
==========================================================================
5 Refactoring the Tiff Code
Added : 2017-09-24
Modified: 2017-09-24
5.1 Background
Tiff parsing is the root code of a metadata engine.
The Tiff parsing code in Exiv2 is very difficult to understand and has major architectural shortcomings:
1) It requires the Tiff file to be totally in memory
2) It cannot handle BigTiff
3) The parser doesn't know the source of the in memory tiff image
4) It uses memory mapping on the tiff file
- if the network connection is lost, horrible things happen
- it requires a lot of VM to map the complete file
- BigTiff file can be 100GB+
- The memory mapping causes problems with Virus Detection software on Windows
5) The parser cannot deal with multi-page tiff files
6) It requires the total file to be in contiguous memory and defeats 'webready'.
The Tiff parsing code in Exiv2 is ingenious. It's also very robust. It works well. It can:
1) Handle 32-bit Tiff and Many Raw formats (which are derived from Tiff)
2) It can read and write Manufacturer's MakerNotes which are (mostly) in Tiff format
3) It probably has other great features that I haven't discovered
- because the code is so hard to understand, I can't simply browse and read it.
4) It separates file navigation from data analysis.
The code in image::printStructure was originally written to understand "what is a tiff?"
It has problems:
1) It was intended to be a single threaded debugging function and has security issues.
2) It doesn't handle BigTiff
3) It's messy. It's reading and processing metadata simultaneously.
The aim of this project is to
1) Reconsider the Tiff Code.
2) Keep everything good in the code and address known deficiencies
3) Establish a Team Exiv2 "Tiff Expert" who knows the code intimately.
5.2 How does Exiv2 decode the ExifData in a JPEG?
You can get my test file from http://clanmills.com/Stonehenge.jpg
808 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/build $ exiv2 -pS ~/Stonehenge.jpg
STRUCTURE OF JPEG FILE: /Users/rmills/Stonehenge.jpg
address | marker | length | data
0 | 0xffd8 SOI
2 | 0xffe1 APP1 | 15288 | Exif..II*......................
15292 | 0xffe1 APP1 | 2610 | http://ns.adobe.com/xap/1.0/.<?x
17904 | 0xffed APP13 | 96 | Photoshop 3.0.8BIM.......'.....
18002 | 0xffe2 APP2 | 4094 | MPF.II*...............0100.....
22098 | 0xffdb DQT | 132
22232 | 0xffc0 SOF0 | 17
22251 | 0xffc4 DHT | 418
22671 | 0xffda SOS
809 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/build $
Exiv2 calls JpegBase::readMetadata which locates the APP1/Exif segment.
It invokes the ExifParser:
ExifParser::decode(exifData_, rawExif.pData_, rawExif.size_);
This is thin wrapper over:
TiffParserWorker::decode(....) in tiffimage.cpp
What happens then? I don't know. The metadata is decoded in:
tiffvisitor.cpp TiffDecoder::visitEntry()
The design of the TiffMumble classes is the "Visitor" pattern
described in "Design Patterns" by Addison & Wesley. The aim of the pattern
is to separate parsing from dealing with the data.
The data is being stored in ExifData which is a vector.
Order is important and preserved.
As the data values are recovered they are stored as Exifdatum in the vector.
How does the tiff visitor work? I think the reader and processor
are connected by this line in TiffParser::
rootDir->accept(reader);
The class tree for the decoder is:
class TiffDecoder : public TiffFinder {
class TiffReader ,
class TiffFinder : public TiffVisitor {
class TiffVisitor {
public:
//! Events for the stop/go flag. See setGo().
enum GoEvent {
geTraverse = 0,
geKnownMakernote = 1
};
void setGo(GoEvent event, bool go);
virtual void visitEntry(TiffEntry* object) =0;
virtual void visitDataEntry(TiffDataEntry* object) =0;
virtual void visitImageEntry(TiffImageEntry* object) =0;
virtual void visitSizeEntry(TiffSizeEntry* object) =0;
virtual void visitDirectory(TiffDirectory* object) =0;
virtual void visitSubIfd(TiffSubIfd* object) =0;
virtual void visitMnEntry(TiffMnEntry* object) =0;
virtual void visitIfdMakernote(TiffIfdMakernote* object) =0;
virtual void visitIfdMakernoteEnd(TiffIfdMakernote* object);
virtual void visitBinaryArray(TiffBinaryArray* object) =0;
virtual void visitBinaryArrayEnd(TiffBinaryArray* object);
//! Operation to perform for an element of a binary array
virtual void visitBinaryElement(TiffBinaryElement* object) =0;
//! Check if stop flag for \em event is clear, return true if it's clear.
bool go(GoEvent event) const;
}
}
}
The reader works by stepping along the Tiff directory and calls the visitor's
"callbacks" as it reads.
There are 2000 lines of code in tiffcomposite.cpp and, to be honest,
I don't know what most of it does!
Set a breakpoint in src/exif.cpp#571.
That's where he adds the key/value to the exifData vector.
Exactly how did he get here? That's a puzzle.
void ExifData::add(const ExifKey& key, const Value* pValue)
{
add(Exifdatum(key, pValue));
}
5.3 How is metadata organized in Exiv2
section.group.tag
section: Exif | IPTC | Xmp
group: Photo | Image | MakerNote | Nikon3 ....
tag: YResolution etc ...
820 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/src $ exiv2 -pa ~/Stonehenge.jpg | cut -d' ' -f 1 | cut -d. -f 1 | sort | uniq
Exif
Iptc
Xmp
821 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/src $ exiv2 -pa --grep Exif ~/Stonehenge.jpg | cut -d'.' -f 2 | sort | uniq
GPSInfo
Image
Iop
MakerNote
Nikon3
NikonAf2
NikonCb2b
NikonFi
NikonIi
NikonLd3
NikonMe
NikonPc
NikonVr
NikonWt
Photo
Thumbnail
822 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/src $ 533 rmills@rmillsmbp:~/Downloads $ exiv2 -pa --grep Exif ~/Stonehenge.jpg | cut -d'.' -f 3 | cut -d' ' -f 1 | sort | uniq
AFAperture
AFAreaHeight
AFAreaMode
...
XResolution
YCbCrPositioning
YResolution
534 rmills@rmillsmbp:~/Downloads $
823 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/src $
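The same split is visible from the API via ExifKey (a small sketch, not code from the library):
    #include <exiv2/exiv2.hpp>
    #include <iostream>
    int main()
    {
        Exiv2::ExifKey key("Exif.Nikon3.ISOSpeed");
        std::cout << key.familyName() << "."    // Exif
                  << key.groupName()  << "."    // Nikon3
                  << key.tagName()    << "\n";  // ISOSpeed
        return 0;
    }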
The data in IFD0 is Exif.Image:
826 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/src $ exiv2 -pR ~/Stonehenge.jpg | head -20
STRUCTURE OF JPEG FILE: /Users/rmills/Stonehenge.jpg
address | marker | length | data
0 | 0xffd8 SOI
2 | 0xffe1 APP1 | 15288 | Exif..II*......................
STRUCTURE OF TIFF FILE (II): MemIo
address | tag | type | count | offset | value
10 | 0x010f Make | ASCII | 18 | 146 | NIKON CORPORATION
22 | 0x0110 Model | ASCII | 12 | 164 | NIKON D5300
34 | 0x0112 Orientation | SHORT | 1 | | 1
46 | 0x011a XResolution | RATIONAL | 1 | 176 | 300/1
58 | 0x011b YResolution | RATIONAL | 1 | 184 | 300/1
70 | 0x0128 ResolutionUnit | SHORT | 1 | | 2
82 | 0x0131 Software | ASCII | 10 | 192 | Ver.1.00
94 | 0x0132 DateTime | ASCII | 20 | 202 | 2015:07:16 20:25:28
106 | 0x0213 YCbCrPositioning | SHORT | 1 | | 1
118 | 0x8769 ExifTag | LONG | 1 | | 222
STRUCTURE OF TIFF FILE (II): MemIo
address | tag | type | count | offset | value
224 | 0x829a ExposureTime | RATIONAL | 1 | 732 | 10/4000
236 | 0x829d FNumber | RATIONAL | 1 | 740 | 100/10
827 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/src $ exiv2 -pa --grep Image ~/Stonehenge.jpg
Exif.Image.Make Ascii 18 NIKON CORPORATION
Exif.Image.Model Ascii 12 NIKON D5300
Exif.Image.Orientation Short 1 top, left
Exif.Image.XResolution Rational 1 300
Exif.Image.YResolution Rational 1 300
Exif.Image.ResolutionUnit Short 1 inch
Exif.Image.Software Ascii 10 Ver.1.00
Exif.Image.DateTime Ascii 20 2015:07:16 20:25:28
Exif.Image.YCbCrPositioning Short 1 Centered
Exif.Image.ExifTag Long 1 222
Exif.Nikon3.ImageBoundary Short 4 0 0 6000 4000
Exif.Nikon3.ImageDataSize Long 1 6173648
Exif.NikonAf2.AFImageWidth Short 1 0
Exif.NikonAf2.AFImageHeight Short 1 0
Exif.Photo.ImageUniqueID Ascii 33 090caaf2c085f3e102513b24750041aa
Exif.Image.GPSTag Long 1 4060
828 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/src $
The data in IFD1 is Exiv2.Photo
The data in the MakerNote is another embedded TIFF (with more embedded tiffs)
829 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/src $ exiv2 -pa --grep MakerNote ~/Stonehenge.jpg
Exif.Photo.MakerNote Undefined 3152 (Binary value suppressed)
Exif.MakerNote.Offset Long 1 914
Exif.MakerNote.ByteOrder Ascii 3 II
830 rmills@rmillsmbp:~/gnu/github/exiv2/exiv2/src $
The MakerNote decodes them into:
Exif.Nikon1, Exif.NikonAf2 and so on. I don't know exactly how it achieves this.
However it means that tag-numbers can be reused in different IFDs.
Tag 0x0016 = Nikon GPSSpeed and can mean something different elsewhere.
5.4 Where are the tags defined?
There's an array of "TagInfo" data structures in each of the makernote decoders.
These define the tag (a number) and the tag name, the groupID (eg canonId) and the default type.
There's also a callback to print the value of the tag. This does the "interpretation"
that is performed by the -pt option of the exiv2 command-line program.
TagInfo(0x4001, "ColorData", N_("Color Data"), N_("Color data"), canonId, makerTags, unsignedShort, -1, printValue),
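A print callback in those tables has this general shape (an illustrative sketch only; the real
callbacks live in the *mn_int.cpp files and interpret the raw value for "exiv2 -pt" output):
    std::ostream& printMyTag(std::ostream& os, const Exiv2::Value& value, const Exiv2::ExifData*)
    {
        return os << (value.toString() == "0" ? "Off" : "On");
    }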
5.5 How do the MakerNotes get decoded?
I don't know. It has something to do with this code in tiffcomposite.cpp#936
TiffMnEntry::doAccept(TiffVisitor& visitor) { ... }
Most makernotes are TiffStructures. So the TiffXXX classes are invoked recursively to decode the maker note.
#0 0x000000010058b4b0 in Exiv2::Internal::TiffDirectory::doAccept(Exiv2::Internal::TiffVisitor&) at /Users/rmills/gnu/github/exiv2/exiv2/src/tiffcomposite.cpp:916
This function iterates the array of entries
#1 0x000000010058b3c6 in Exiv2::Internal::TiffComponent::accept(Exiv2::Internal::TiffVisitor&) at /Users/rmills/gnu/github/exiv2/exiv2/src/tiffcomposite.cpp:891
#2 0x00000001005b5357 in Exiv2::Internal::TiffParserWorker::parse(unsigned char const*, unsigned int, unsigned int, Exiv2::Internal::TiffHeaderBase*) at /Users/rmills/gnu/github/exiv2/exiv2/src/tiffimage.cpp:2006
This function creates an array of TiffEntries
#3 0x00000001005a2a60 in Exiv2::Internal::TiffParserWorker::decode(Exiv2::ExifData&, Exiv2::IptcData&, Exiv2::XmpData&, unsigned char const*, unsigned int, unsigned int, void (Exiv2::Internal::TiffDecoder::* (*)(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, unsigned int, Exiv2::Internal::IfdId))(Exiv2::Internal::TiffEntryBase const*), Exiv2::Internal::TiffHeaderBase*) at /Users/rmills/gnu/github/exiv2/exiv2/src/tiffimage.cpp:1900
#4 0x00000001005a1ae9 in Exiv2::TiffParser::decode(Exiv2::ExifData&, Exiv2::IptcData&, Exiv2::XmpData&, unsigned char const*, unsigned int) at /Users/rmills/gnu/github/exiv2/exiv2/src/tiffimage.cpp:260
#5 0x000000010044d956 in Exiv2::ExifParser::decode(Exiv2::ExifData&, unsigned char const*, unsigned int) at /Users/rmills/gnu/github/exiv2/exiv2/src/exif.cpp:625
#6 0x0000000100498fd7 in Exiv2::JpegBase::readMetadata() at /Users/rmills/gnu/github/exiv2/exiv2/src/jpgimage.cpp:386
#7 0x000000010000bc59 in Action::Print::printList() at /Users/rmills/gnu/github/exiv2/exiv2/src/actions.cpp:530
#8 0x0000000100005835 in Action::Print::run(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) at /Users/rmills/gnu/github/exiv2/exiv2/src/actions.cpp:245
5.6 How do the encoders work?
I understand writeMetadata() and will document that soon.
I still have to study how the TiffVisitor writes metadata.
6 Using external XMP SDK via Conan
Section 1 describes how to compile the newer versions of XMP SDK with a bash script. This
approach had a few limitations:
1) We had to include sources from other projects into the Exiv2 repository: Check the folder
xmpsdk/third-party.
2) Different scripts for compiling XMP SDK on Linux, Mac OSX and Windows.
3) Lots of configuration/compilation issues, depending on the system configuration.
Taking into account that during the last few months we have made a big effort to migrate the
management of 3rd party dependencies to Conan, we have decided to do the same here. A conan recipe
has been written for XmpSdk at:
https://github.com/piponazo/conan-xmpsdk
And the recipe and package binaries can be found in the piponazo's bintray repository:
https://bintray.com/piponazo/piponazo
This conan recipe provides a custom CMake finder that will be used by our CMake code to properly
find XMP SDK in the conan cache and then be able to use the CMake variables: ${XMPSDK_LIBRARY} and
${XMPSDK_INCLUDE_DIR}.
These are the steps you will need to follow to configure the project with the external XMP support:
# Add the conan-piponazo remote to your conan configuration (only once)
conan remote add conan-piponazo https://api.bintray.com/conan/piponazo/piponazo
mkdir build && cd build
# Run conan to bring the dependencies. Note that the XMPSDK is not enabled by default and you will
# need to enable the xmp option to bring it.
conan install .. --options xmp=True
# Configure the project with support for the external XMP version. Disable the normal XMP version
cmake -DCMAKE_BUILD_TYPE=Release -DEXIV2_ENABLE_XMP=OFF -DEXIV2_ENABLE_EXTERNAL_XMP=ON -DBUILD_SHARED_LIBS=ON ..
Note that the usage of the newer versions of XMP is experimental and it was included in Exiv2
because a few users have requested it.

83
ci/install_dependencies.sh Executable file
View File

@ -0,0 +1,83 @@
#!/bin/sh -e
# Debian & derivatives don't provide binary packages of googletest
# => have to build them ourselves
#
# This script builds a shared library of googletest (not googlemock!) inside
# gtest_build and copies it to /usr/lib/
debian_build_gtest() {
[ -d gtest_build ] || mkdir gtest_build
cd gtest_build
cmake -DBUILD_SHARED_LIBS=1 /usr/src/googletest/googletest
make
if [ -f "lib/libgtest.so" ]; then
# Ubuntu 20.04 with gtest 1.10
cp lib/libgtest* /usr/lib/
else
# Debian 9 with gtest 1.8
cp libgtest* /usr/lib/
fi
cd ..
}
# workaround for really bare-bones Archlinux containers:
if [ -x "$(command -v pacman)" ]; then
pacman --noconfirm -Sy
pacman --noconfirm -S grep gawk sed
fi
distro_id=$(grep '^ID=' /etc/os-release|awk -F = '{print $2}'|sed 's/\"//g')
case "$distro_id" in
'fedora')
dnf -y --refresh install gcc-c++ clang cmake make ccache expat-devel zlib-devel libssh-devel libcurl-devel gtest-devel which dos2unix glibc-langpack-en diffutils
;;
'debian')
apt-get update
apt-get install -y cmake g++ clang make ccache python3 libexpat1-dev zlib1g-dev libssh-dev libcurl4-openssl-dev libgtest-dev libxml2-utils
debian_build_gtest
;;
'arch')
pacman --noconfirm -Syu
pacman --noconfirm -S gcc clang cmake make ccache expat zlib libssh curl gtest python dos2unix which diffutils
;;
'ubuntu')
apt-get update
apt-get install -y cmake g++ clang make ccache python3 libexpat1-dev zlib1g-dev libssh-dev libcurl4-openssl-dev libgtest-dev google-mock libxml2-utils
debian_build_gtest
;;
'alpine')
apk update
apk add gcc g++ clang cmake make ccache expat-dev zlib-dev libssh-dev curl-dev gtest gtest-dev gmock libintl gettext-dev which dos2unix bash libxml2-utils diffutils python3
;;
'centos'|'rhel')
yum -y update libarchive # workaround for https://bugs.centos.org/view.php?id=18212
yum -y install epel-release
# enable copr for gtest
curl https://copr.fedorainfracloud.org/coprs/defolos/devel/repo/epel-7/defolos-devel-epel-7.repo > /etc/yum.repos.d/_copr_defolos-devel.repo
yum clean all
yum -y install gcc-c++ clang cmake make ccache expat-devel zlib-devel libssh-devel libcurl-devel gtest-devel which python3 dos2unix
;;
'opensuse-tumbleweed')
zypper --non-interactive refresh
zypper --non-interactive install gcc-c++ clang cmake make ccache libexpat-devel zlib-devel libssh-devel curl libcurl-devel git which dos2unix libxml2-tools
pushd /tmp
curl -LO https://github.com/google/googletest/archive/release-1.8.0.tar.gz
tar xzf release-1.8.0.tar.gz
mkdir -p googletest-release-1.8.0/build
pushd googletest-release-1.8.0/build
cmake .. ; make ; make install
popd
popd
;;
*)
echo "Sorry, no predefined dependencies for your distribution $distro_id exist yet"
exit 1
;;
esac

133
ci/test_build.py Normal file
View File

@ -0,0 +1,133 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-
import itertools
import multiprocessing
import os
import shlex
import subprocess
import sys
def call_wrapper(*args, **kwargs):
"""
Wrapper around subprocess.call which terminates the program on non-zero
return value.
"""
return_code = subprocess.call(*args, **kwargs)
if return_code != 0:
sys.exit(return_code)
def matrix_build(shared_libs, ccs, build_types, cmake_bin, cmake_options,
tests=True):
NCPUS = multiprocessing.cpu_count()
os.mkdir("build")
for params in itertools.product(shared_libs, ccs, build_types):
lib_type, cc, build_type = params
cwd = os.path.abspath(
os.path.join(
"build",
"_".join(
map(lambda p: str(p) if p is not None else "", params)
)
)
)
os.mkdir(cwd)
cmake = "{cmake_bin} {!s} -DCMAKE_BUILD_TYPE={build_type} -DCMAKE_CXX_FLAGS=-Wno-deprecated " \
"-DBUILD_SHARED_LIBS={lib_type} -DEXIV2_BUILD_UNIT_TESTS={tests} "\
"../..".format(
cmake_options, cmake_bin=cmake_bin, build_type=build_type,
lib_type=lib_type, tests="ON" if tests else "OFF"
)
make = "make -j " + str(NCPUS)
make_tests = "make tests"
unit_test_binary = os.path.join(cwd, "bin", "unit_tests")
# set compiler via environment only when requested
env_copy = os.environ.copy()
if cc is not None:
cxx = {"gcc": "g++", "clang": "clang++"}[cc]
env_copy["CC"] = cc
env_copy["CXX"] = cxx
# location of the binaries for the new test suite:
env_copy["EXIV2_BINDIR"] = os.path.join(cwd, "bin")
kwargs = {"env": env_copy, "cwd": cwd}
def run(cmd):
call_wrapper(shlex.split(cmd), **kwargs)
run(cmake)
run(make)
if tests:
run(make_tests)
run(unit_test_binary)
if __name__ == '__main__':
import argparse
parser = argparse.ArgumentParser(
description="Build and test exiv2 using a matrix of build switches")
parser.add_argument(
"--compilers",
help="Compilers to be used to build exiv2 (when none ore specified, "
"then the default compiler will be used)",
nargs='*',
default=["gcc", "clang"],
type=str
)
parser.add_argument(
"--shared-libs",
help="Values for the -DBUILD_SHARED_LIBS option",
nargs='+',
default=["ON", "OFF"],
type=str
)
parser.add_argument(
"--build-types",
help="Values for the -DCMAKE_BUILD_TYPE option",
nargs='+',
default=["Debug", "Release"],
type=str
)
parser.add_argument(
"--cmake-executable",
help="alternative name or path for the cmake executable",
nargs=1,
default=['cmake'],
type=str
)
parser.add_argument(
"--without-tests",
help="Skip building and running tests",
action='store_true'
)
parser.add_argument(
"--cmake-options",
help="Additional flags for cmake",
type=str,
nargs='?',
default="-DEXIV2_TEAM_EXTRA_WARNINGS=ON -DEXIV2_ENABLE_VIDEO=ON "
"-DEXIV2_ENABLE_WEBREADY=ON -DEXIV2_BUILD_UNIT_TESTS=ON -DEXIV2_ENABLE_BMFF=ON "
"-DBUILD_WITH_CCACHE=ON -DEXIV2_ENABLE_CURL=ON"
)
args = parser.parse_args()
if len(args.compilers) == 0:
args.compilers = [None]
matrix_build(
args.shared_libs, args.compilers, args.build_types,
args.cmake_executable[0], args.cmake_options,
not args.without_tests
)

2455
cmake/Doxyfile.in Normal file

File diff suppressed because it is too large Load Diff

132
cmake/FindIconv.cmake Normal file
View File

@ -0,0 +1,132 @@
# Distributed under the OSI-approved BSD 3-Clause License. See accompanying
# file Copyright.txt or https://cmake.org/licensing for details.
# This code is disabled for Visual Studio as explained in README.md
if ( NOT MSVC )
#[=======================================================================[.rst:
FindIconv
---------
This module finds the ``iconv()`` POSIX.1 functions on the system.
These functions might be provided in the regular C library or externally
in the form of an additional library.
The following variables are provided to indicate iconv support:
.. variable:: Iconv_FOUND
Variable indicating if the iconv support was found.
.. variable:: Iconv_INCLUDE_DIRS
The directories containing the iconv headers.
.. variable:: Iconv_LIBRARIES
The iconv libraries to be linked.
.. variable:: Iconv_IS_BUILT_IN
A variable indicating whether iconv support is stemming from the
C library or not. Even if the C library provides `iconv()`, the presence of
an external `libiconv` implementation might lead to this being false.
Additionally, the following :prop_tgt:`IMPORTED` target is being provided:
.. variable:: Iconv::Iconv
Imported target for using iconv.
The following cache variables may also be set:
.. variable:: Iconv_INCLUDE_DIR
The directory containing the iconv headers.
.. variable:: Iconv_LIBRARY
The iconv library (if not implicitly given in the C library).
.. note::
On POSIX platforms, iconv might be part of the C library and the cache
variables ``Iconv_INCLUDE_DIR`` and ``Iconv_LIBRARY`` might be empty.
#]=======================================================================]
if (WIN32)
# If neither C nor CXX are loaded, implicit iconv makes no sense.
set(Iconv_IS_BUILT_IN FALSE)
endif()
# iconv can only be provided in libc on a POSIX system.
# If any cache variable is already set, we'll skip this test.
if(NOT DEFINED Iconv_IS_BUILT_IN)
if(UNIX AND NOT DEFINED Iconv_INCLUDE_DIR AND NOT DEFINED Iconv_LIBRARY)
include(CheckCSourceCompiles)
# We always suppress the message here: Otherwise on supported systems
# not having iconv in their C library (e.g. those using libiconv)
# would always display a confusing "Looking for iconv - not found" message
set(CMAKE_FIND_QUIETLY TRUE)
# The following code will not work, but it's sufficient to see if it compiles.
# Note: libiconv will define the iconv functions as macros, so CheckSymbolExists
# will not yield correct results.
set(Iconv_IMPLICIT_TEST_CODE
"
#include <stddef.h>
#include <iconv.h>
int main() {
char *a, *b;
size_t i, j;
iconv_t ic;
ic = iconv_open(\"to\", \"from\");
iconv(ic, &a, &i, &b, &j);
iconv_close(ic);
}
"
)
if(CMAKE_C_COMPILER_LOADED)
check_c_source_compiles("${Iconv_IMPLICIT_TEST_CODE}" Iconv_IS_BUILT_IN)
else()
check_cxx_source_compiles("${Iconv_IMPLICIT_TEST_CODE}" Iconv_IS_BUILT_IN)
endif()
else()
set(Iconv_IS_BUILT_IN FALSE)
endif()
endif()
if(NOT Iconv_IS_BUILT_IN)
find_path(Iconv_INCLUDE_DIR
NAMES "iconv.h"
DOC "iconv include directory")
set(Iconv_LIBRARY_NAMES "iconv" "libiconv")
else()
set(Iconv_INCLUDE_DIR "" CACHE FILEPATH "iconv include directory")
set(Iconv_LIBRARY_NAMES "c")
endif()
find_library(Iconv_LIBRARY
NAMES ${Iconv_LIBRARY_NAMES}
DOC "iconv library (potentially the C library)")
mark_as_advanced(Iconv_INCLUDE_DIR)
mark_as_advanced(Iconv_LIBRARY)
include(FindPackageHandleStandardArgs)
if(NOT Iconv_IS_BUILT_IN)
find_package_handle_standard_args(Iconv REQUIRED_VARS Iconv_LIBRARY Iconv_INCLUDE_DIR)
else()
find_package_handle_standard_args(Iconv REQUIRED_VARS Iconv_LIBRARY)
endif()
if(Iconv_FOUND)
set(Iconv_INCLUDE_DIRS "${Iconv_INCLUDE_DIR}")
set(Iconv_LIBRARIES "${Iconv_LIBRARY}")
if(NOT TARGET Iconv::Iconv)
add_library(Iconv::Iconv INTERFACE IMPORTED)
endif()
set_property(TARGET Iconv::Iconv PROPERTY INTERFACE_INCLUDE_DIRECTORIES "${Iconv_INCLUDE_DIRS}")
set_property(TARGET Iconv::Iconv PROPERTY INTERFACE_LINK_LIBRARIES "${Iconv_LIBRARIES}")
endif()
endif()

50
cmake/FindRegex.cmake Normal file
View File

@ -0,0 +1,50 @@
# - Try to find the Regex library
#
# Once done this will define
#
# REGEX_FOUND - system has libregex
# REGEX_INCLUDE_DIR - the libregex include directory
# REGEX_LIBRARIES - Link these to use libregex
#
# Copyright (c) 2018, Gilles Caulier, <caulier dot gilles at gmail dot com>
#
# Redistribution and use is allowed according to the terms of the BSD license.
# For details see the accompanying COPYING-CMAKE-SCRIPTS file.
if ( NOT MSVC AND NOT MINGW AND NOT MSYS )
find_path(Regex_INCLUDE_DIR
NAMES regex.h
DOC "libregex include directory"
)
mark_as_advanced(Regex_INCLUDE_DIR)
find_library(Regex_LIBRARY "regex"
DOC "libregex libraries"
)
mark_as_advanced(Regex_LIBRARY)
find_package_handle_standard_args(Regex
FOUND_VAR Regex_FOUND
REQUIRED_VARS Regex_INCLUDE_DIR
FAIL_MESSAGE "Failed to find libregex"
)
if(REGEX_FOUND)
set(REGEX_INCLUDE_DIRS ${Regex_INCLUDE_DIRS})
if(Regex_LIBRARY)
set(REGEX_LIBRARIES ${Regex_LIBRARY})
else()
unset(REGEX_LIBRARIES)
endif()
endif()
endif()

24
cmake/JoinPaths.cmake Normal file
View File

@ -0,0 +1,24 @@
# This module provides function for joining paths
# known from most languages
#
# Original license:
# SPDX-License-Identifier: (MIT OR CC0-1.0)
# Copyright 2020 Jan Tojnar
# https://github.com/jtojnar/cmake-snips
#
# Modelled after Pythons os.path.join
# https://docs.python.org/3.7/library/os.path.html#os.path.join
# Windows not supported
function(join_paths joined_path first_path_segment)
set(temp_path "${first_path_segment}")
foreach(current_segment IN LISTS ARGN)
if(NOT ("${current_segment}" STREQUAL ""))
if(IS_ABSOLUTE "${current_segment}")
set(temp_path "${current_segment}")
else()
set(temp_path "${temp_path}/${current_segment}")
endif()
endif()
endforeach()
set(${joined_path} "${temp_path}" PARENT_SCOPE)
endfunction()

153
cmake/compilerFlags.cmake Normal file
View File

@ -0,0 +1,153 @@
# These flags apply to exiv2lib, the applications, and to the xmp code
include(CheckCXXCompilerFlag)
if ( MINGW OR UNIX OR MSYS ) # MINGW, Linux, APPLE, CYGWIN
if (${CMAKE_CXX_COMPILER_ID} STREQUAL GNU)
set(COMPILER_IS_GCC ON)
elseif (${CMAKE_CXX_COMPILER_ID} MATCHES "Clang")
set(COMPILER_IS_CLANG ON)
endif()
set (CMAKE_CXX_FLAGS_DEBUG "-g3 -gstrict-dwarf -O0")
if (CMAKE_GENERATOR MATCHES "Xcode")
set(CMAKE_XCODE_ATTRIBUTE_GCC_VERSION "com.apple.compilers.llvm.clang.1_0")
if (EXIV2_ENABLE_EXTERNAL_XMP)
# XMP SDK 2016 uses libstdc++ even when it is deprecated in modern versions of the OSX SDK.
# The only way to make Exiv2 work with the external XMP SDK is to use the same standard library.
set(CMAKE_XCODE_ATTRIBUTE_CLANG_CXX_LIBRARY "libstdc++")
else()
set(CMAKE_XCODE_ATTRIBUTE_CLANG_CXX_LIBRARY "libc++")
endif()
endif()
if (COMPILER_IS_GCC OR COMPILER_IS_CLANG)
# This fails under Fedora - MinGW - Gcc 8.3
if (NOT (MINGW OR CYGWIN OR CMAKE_HOST_SOLARIS))
if (NOT APPLE) # Don't know why this isn't working correctly on Apple with M1 processor
check_cxx_compiler_flag(-fstack-clash-protection HAS_FSTACK_CLASH_PROTECTION)
endif()
check_cxx_compiler_flag(-fcf-protection HAS_FCF_PROTECTION)
check_cxx_compiler_flag(-fstack-protector-strong HAS_FSTACK_PROTECTOR_STRONG)
if(HAS_FSTACK_CLASH_PROTECTION)
add_compile_options(-fstack-clash-protection)
endif()
if(HAS_FCF_PROTECTION)
add_compile_options(-fcf-protection)
endif()
if(HAS_FSTACK_PROTECTOR_STRONG)
add_compile_options(-fstack-protector-strong)
endif()
endif()
add_compile_options(-Wp,-D_GLIBCXX_ASSERTIONS)
if (CMAKE_BUILD_TYPE STREQUAL Release AND NOT (APPLE OR MINGW OR MSYS))
add_compile_options(-Wp,-D_FORTIFY_SOURCE=2) # Requires to compile with -O2
endif()
if(BUILD_WITH_COVERAGE)
add_compile_options(--coverage)
# TODO: From CMake 3.13 we could use add_link_options instead these 2 lines
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} --coverage")
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} --coverage")
endif()
add_compile_options(-Wall -Wcast-align -Wpointer-arith -Wformat-security -Wmissing-format-attribute -Woverloaded-virtual -W)
# This seems to be causing issues in the Fedora_MinGW GitLab job
#add_compile_options(-fasynchronous-unwind-tables)
if ( EXIV2_TEAM_USE_SANITIZERS )
# ASAN is available in gcc from 4.8 and UBSAN from 4.9
# ASAN is available in clang from 3.1 and UBSAN from 3.3
# UBSAN is not fatal by default, instead it only prints runtime errors to stderr
# => make it fatal with -fno-sanitize-recover (gcc) or -fno-sanitize-recover=all (clang)
# add -fno-omit-frame-pointer for better stack traces
if ( COMPILER_IS_GCC )
if ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 )
set(SANITIZER_FLAGS "-fno-omit-frame-pointer -fsanitize=address,undefined -fno-sanitize-recover")
elseif( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.8 )
set(SANITIZER_FLAGS "-fno-omit-frame-pointer -fsanitize=address")
endif()
elseif( COMPILER_IS_CLANG )
if ( EXIV2_BUILD_FUZZ_TESTS )
set(SANITIZER_FLAGS "-fsanitize=fuzzer-no-link,address,undefined")
elseif ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.9 )
set(SANITIZER_FLAGS "-fno-omit-frame-pointer -fsanitize=address,undefined -fno-sanitize-recover=all")
elseif ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 3.4 )
set(SANITIZER_FLAGS "-fno-omit-frame-pointer -fsanitize=address,undefined")
elseif( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 3.1 )
set(SANITIZER_FLAGS "-fno-omit-frame-pointer -fsanitize=address")
endif()
endif()
# sorry, ASAN does not work on Windows
if ( NOT CYGWIN AND NOT MINGW AND NOT MSYS )
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${SANITIZER_FLAGS}")
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} ${SANITIZER_FLAGS}")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} ${SANITIZER_FLAGS}")
set(CMAKE_MODULE_LINKER_FLAGS "${CMAKE_MODULE_LINKER_FLAGS} ${SANITIZER_FLAGS}")
endif()
endif()
endif()
endif ()
# http://stackoverflow.com/questions/10113017/setting-the-msvc-runtime-in-cmake
if(MSVC)
find_program(CLCACHE name clcache.exe
PATHS ENV CLCACHE_PATH
PATH_SUFFIXES Scripts clcache-4.1.0
)
if (CLCACHE)
message(STATUS "clcache found in ${CLCACHE}")
if (CMAKE_BUILD_TYPE STREQUAL "Debug")
message(WARNING "clcache only works for Release builds")
else()
set(CMAKE_CXX_COMPILER ${CLCACHE})
endif()
endif()
set(variables
CMAKE_CXX_FLAGS_DEBUG
CMAKE_CXX_FLAGS_MINSIZEREL
CMAKE_CXX_FLAGS_RELEASE
CMAKE_CXX_FLAGS_RELWITHDEBINFO
)
if (NOT BUILD_SHARED_LIBS AND NOT EXIV2_ENABLE_DYNAMIC_RUNTIME)
message(STATUS "MSVC -> forcing use of statically-linked runtime." )
foreach(variable ${variables})
if(${variable} MATCHES "/MD")
string(REGEX REPLACE "/MD" "/MT" ${variable} "${${variable}}")
endif()
endforeach()
endif()
# remove /Ob2 and /Ob1 - they cause linker issues
set(obs /Ob2 /Ob1)
foreach(ob ${obs})
foreach(variable ${variables})
if(${variable} MATCHES ${ob} )
string(REGEX REPLACE ${ob} "" ${variable} "${${variable}}")
endif()
endforeach()
endforeach()
if ( EXIV2_EXTRA_WARNINGS )
string(REGEX REPLACE "/W[0-4]" "/W4" CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS}")
endif ()
# Object Level Parallelism
add_compile_options(/MP)
add_definitions(-DNOMINMAX) # This definition is not only needed for Exiv2 but also for xmpsdk
# https://devblogs.microsoft.com/cppblog/msvc-now-correctly-reports-__cplusplus/
if (MSVC_VERSION GREATER_EQUAL "1910") # VS2017 and up
add_compile_options("/Zc:__cplusplus")
endif()
endif()

View File

@ -0,0 +1,92 @@
# These flags only apply to exiv2lib and the applications, but not to the xmp code
include(CheckCXXCompilerFlag)
if (COMPILER_IS_GCC OR COMPILER_IS_CLANG) # MINGW, Linux, APPLE, CYGWIN
if ( EXIV2_TEAM_WARNINGS_AS_ERRORS )
add_compile_options(-Werror -Wno-error=deprecated-declarations)
check_cxx_compiler_flag(-Wno-error=deprecated-copy DEPRECATED_COPY)
if ( DEPRECATED_COPY)
add_compile_options(-Wno-error=deprecated-copy)
endif ()
endif ()
if ( EXIV2_TEAM_EXTRA_WARNINGS )
# Note that this is intended to be used only by Exiv2 developers/contributors.
if ( COMPILER_IS_GCC )
if ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 4.0 )
string(CONCAT EXTRA_COMPILE_FLAGS ${EXTRA_COMPILE_FLAGS}
" -Wextra"
" -Wlogical-op"
" -Wdouble-promotion"
" -Wshadow"
" -Wuseless-cast"
" -Wpointer-arith" # This warning is also enabled by -Wpedantic
" -Wformat=2"
#" -Wold-style-cast"
)
endif ()
if ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 5.0 )
string(CONCAT EXTRA_COMPILE_FLAGS ${EXTRA_COMPILE_FLAGS}
" -Warray-bounds=2"
)
endif ()
if ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 6.0 )
string(CONCAT EXTRA_COMPILE_FLAGS ${EXTRA_COMPILE_FLAGS}
" -Wduplicated-cond"
)
endif ()
if ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 7.0 )
string(CONCAT EXTRA_COMPILE_FLAGS ${EXTRA_COMPILE_FLAGS}
" -Wduplicated-branches"
" -Wrestrict"
)
endif ()
endif ()
if ( COMPILER_IS_CLANG )
# https://clang.llvm.org/docs/DiagnosticsReference.html
# These variables are at least available since clang 3.9.1
string(CONCAT EXTRA_COMPILE_FLAGS "-Wextra"
" -Wshadow"
" -Wassign-enum"
" -Wmicrosoft"
" -Wcomments"
" -Wconditional-uninitialized"
" -Wdirect-ivar-access"
" -Weffc++"
" -Wpointer-arith"
" -Wformat=2"
#" -Warray-bounds" # Enabled by default
# These two raises lot of warnings. Use them wisely
#" -Wconversion"
#" -Wold-style-cast"
)
# -Wdouble-promotion flag is not available in clang 3.4.2
if ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 3.4.2 )
string(CONCAT EXTRA_COMPILE_FLAGS ${EXTRA_COMPILE_FLAGS}
" -Wdouble-promotion"
)
endif ()
# -Wcomma flag is not available in clang 3.8.1
if ( CMAKE_CXX_COMPILER_VERSION VERSION_GREATER 3.8.1 )
string(CONCAT EXTRA_COMPILE_FLAGS ${EXTRA_COMPILE_FLAGS}
" -Wcomma"
)
endif ()
endif ()
endif ()
endif()
if (MSVC)
if ( EXIV2_TEAM_WARNINGS_AS_ERRORS )
add_compile_options(/WX)
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} /WX")
set(CMAKE_MODULE_LINKER_FLAGS "${CMAKE_MODULE_LINKER_FLAGS} /WX")
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} /WX")
endif ()
endif ()

106
cmake/config.h.cmake Normal file
View File

@ -0,0 +1,106 @@
// File generated by cmake from cmake/config.h.cmake.
#ifndef _EXV_CONF_H_
#define _EXV_CONF_H_
// Defined if you want to use libssh for SshIO.
#cmakedefine EXV_USE_SSH
// Define to 1 if you want to use libcurl in httpIO.
#cmakedefine EXV_USE_CURL
// Define if you require webready support.
#cmakedefine EXV_ENABLE_WEBREADY
// Define if you have the <libintl.h> header file.
#cmakedefine EXV_HAVE_LIBINTL_H
// Define if you want translation of program messages to the user's native language
#cmakedefine EXV_ENABLE_NLS
// Define if you want BMFF support.
#cmakedefine EXV_ENABLE_BMFF
// Define if you want video support.
#cmakedefine EXV_ENABLE_VIDEO
// Define if you have the strerror_r function.
#cmakedefine EXV_HAVE_STRERROR_R
// Define if the strerror_r function returns char*.
#cmakedefine EXV_STRERROR_R_CHAR_P
// Define to enable the Windows unicode path support.
#cmakedefine EXV_UNICODE_PATH
/* Define to `const' or to empty, depending on the second argument of `iconv'. */
#cmakedefine ICONV_ACCEPTS_CONST_INPUT
#if defined(ICONV_ACCEPTS_CONST_INPUT) || defined(__NetBSD__)
#define EXV_ICONV_CONST const
#else
#define EXV_ICONV_CONST
#endif
// Define if you have the <regex.h> header file.
#cmakedefine EXV_HAVE_REGEX_H
// Define if you have the <memory.h> header file.
#cmakedefine EXV_HAVE_MEMORY_H
// Define if stdbool.h conforms to C99.
#cmakedefine EXV_HAVE_STDBOOL_H
// Define if you have the <strings.h> header file.
#cmakedefine EXV_HAVE_STRINGS_H
// Define if you have the mmap function.
#cmakedefine EXV_HAVE_MMAP
// Define if you have the munmap function.
#cmakedefine EXV_HAVE_MUNMAP
// Define if you have <sys/stat.h> header file.
#cmakedefine EXV_HAVE_SYS_STAT_H
// Define if you have the <sys/types.h> header file.
#cmakedefine EXV_HAVE_SYS_TYPES_H
/* Define if you have the <unistd.h> header file. */
#cmakedefine EXV_HAVE_UNISTD_H
// Define if you have the <sys/mman.h> header file.
#cmakedefine EXV_HAVE_SYS_MMAN_H
// Define if you are using the zlib library.
#cmakedefine EXV_HAVE_LIBZ
// Define if you have the <process.h> header file.
#cmakedefine EXV_HAVE_PROCESS_H
/* Define if you have (Exiv2/xmpsdk) Adobe XMP Toolkit. */
#cmakedefine EXV_HAVE_XMP_TOOLKIT
/* Define to the full name of this package. */
#cmakedefine EXV_PACKAGE_NAME "@EXV_PACKAGE_NAME@"
/* Define to the full name and version of this package. */
#cmakedefine EXV_PACKAGE_STRING "@EXV_PACKAGE_STRING@"
/* Define to the version of this package. */
#cmakedefine EXV_PACKAGE_VERSION "@PROJECT_VERSION@"
#define EXIV2_MAJOR_VERSION (@PROJECT_VERSION_MAJOR@)
#define EXIV2_MINOR_VERSION (@PROJECT_VERSION_MINOR@)
#define EXIV2_PATCH_VERSION (@PROJECT_VERSION_PATCH@)
#define EXIV2_TWEAK_VERSION (@PROJECT_VERSION_TWEAK@)
// Definition to enable translation of Nikon lens names.
#cmakedefine EXV_HAVE_LENSDATA
// Define if you have the iconv function.
#cmakedefine EXV_HAVE_ICONV
// Definition to enable conversion of UCS2 encoded Windows tags to UTF-8.
#cmakedefine EXV_HAVE_PRINTUCS2
#endif /* !_EXV_CONF_H_ */

11
cmake/exiv2.pc.in Normal file
View File

@ -0,0 +1,11 @@
prefix=@CMAKE_INSTALL_PREFIX@
exec_prefix=${prefix}
libdir=@libdir_for_pc_file@
includedir=@includedir_for_pc_file@
Name: exiv2
Description: @PROJECT_DESCRIPTION@
Version: @PROJECT_VERSION@
URL: @PACKAGE_URL@
Libs: -L${libdir} -lexiv2
Cflags: -I${includedir}

View File

@ -0,0 +1,22 @@
IF(NOT EXISTS "${CMAKE_BINARY_DIR}/install_manifest.txt")
MESSAGE(FATAL_ERROR "Cannot find install manifest: ${CMAKE_BINARY_DIR}/install_manifest.txt")
ENDIF(NOT EXISTS "${CMAKE_BINARY_DIR}/install_manifest.txt")
FILE(READ "${CMAKE_BINARY_DIR}/install_manifest.txt" files)
STRING(REGEX REPLACE "\n" ";" files "${files}")
FOREACH(file ${files})
MESSAGE(STATUS "Uninstalling: ${file}")
IF(IS_SYMLINK "${file}" OR EXISTS "${file}")
EXEC_PROGRAM(
"${CMAKE_COMMAND}" ARGS "-E remove \"${file}\""
OUTPUT_VARIABLE rm_out
RETURN_VALUE rm_retval
)
IF("${rm_retval}" STREQUAL 0)
ELSE("${rm_retval}" STREQUAL 0)
MESSAGE(FATAL_ERROR "Problem when removing \"${file}\"")
ENDIF("${rm_retval}" STREQUAL 0)
ELSE(IS_SYMLINK "${file}" OR EXISTS "${file}")
MESSAGE(STATUS "File \"${file}\" does not exist.")
ENDIF(IS_SYMLINK "${file}" OR EXISTS "${file}")
ENDFOREACH(file)

View File

@ -0,0 +1,75 @@
# set include path for FindXXX.cmake files
set(CMAKE_MODULE_PATH ${CMAKE_MODULE_PATH} "${CMAKE_SOURCE_DIR}/cmake/")
if (APPLE)
# On Apple, we use the conan cmake_paths generator
if (EXISTS ${CMAKE_BINARY_DIR}/conan_paths.cmake)
include(${CMAKE_BINARY_DIR}/conan_paths.cmake)
endif()
else()
# Otherwise, we rely on the conan cmake_find_package generator
list(APPEND CMAKE_MODULE_PATH ${CMAKE_BINARY_DIR})
list(APPEND CMAKE_PREFIX_PATH ${CMAKE_BINARY_DIR})
endif()
# don't use Frameworks on the Mac (#966)
if (APPLE)
set(CMAKE_FIND_FRAMEWORK NEVER)
endif()
find_package (Python3 COMPONENTS Interpreter)
find_package(Threads REQUIRED)
if( EXIV2_ENABLE_PNG )
find_package( ZLIB REQUIRED )
endif( )
if( EXIV2_ENABLE_WEBREADY )
if( EXIV2_ENABLE_CURL )
find_package(CURL REQUIRED)
endif()
if( EXIV2_ENABLE_SSH )
find_package(libssh CONFIG REQUIRED)
# Define an imported target to have compatibility with <=libssh-0.9.0
# libssh-0.9.1 is broken regardless.
if(NOT TARGET ssh)
add_library(ssh SHARED IMPORTED)
set_target_properties(ssh PROPERTIES
IMPORTED_LOCATION "${LIBSSH_LIBRARIES}"
INTERFACE_INCLUDE_DIRECTORIES "${LIBSSH_INCLUDE_DIR}"
)
endif()
endif()
endif()
if (EXIV2_ENABLE_XMP AND EXIV2_ENABLE_EXTERNAL_XMP)
message(FATAL_ERROR "EXIV2_ENABLE_XMP AND EXIV2_ENABLE_EXTERNAL_XMP are mutually exclusive. You can only choose one of them")
else()
if (EXIV2_ENABLE_XMP)
find_package(EXPAT REQUIRED)
elseif (EXIV2_ENABLE_EXTERNAL_XMP)
find_package(XmpSdk REQUIRED)
endif ()
endif()
if (EXIV2_ENABLE_NLS)
find_package(Intl REQUIRED)
endif( )
find_package(Iconv)
if( ICONV_FOUND )
message ( "-- ICONV_INCLUDE_DIR : " ${Iconv_INCLUDE_DIR} )
message ( "-- ICONV_LIBRARIES : " ${Iconv_LIBRARY} )
endif()
if( BUILD_WITH_CCACHE )
find_program(CCACHE_FOUND ccache)
if(CCACHE_FOUND)
message(STATUS "Program ccache found")
set_property(GLOBAL PROPERTY RULE_LAUNCH_COMPILE ccache)
set_property(GLOBAL PROPERTY RULE_LAUNCH_LINK ccache)
endif()
endif()

View File

@ -0,0 +1,57 @@
include(CheckIncludeFileCXX)
include(CheckCXXSourceCompiles)
include(CheckCXXSymbolExists)
# Note that the scope of the EXV_ variables is local
if (${EXIV2_ENABLE_WEBREADY})
set(EXV_USE_SSH ${EXIV2_ENABLE_SSH})
set(EXV_USE_CURL ${EXIV2_ENABLE_CURL})
endif()
set(EXV_ENABLE_BMFF ${EXIV2_ENABLE_BMFF})
set(EXV_ENABLE_VIDEO ${EXIV2_ENABLE_VIDEO})
set(EXV_ENABLE_WEBREADY ${EXIV2_ENABLE_WEBREADY})
set(EXV_HAVE_LENSDATA ${EXIV2_ENABLE_LENSDATA})
set(EXV_HAVE_PRINTUCS2 ${EXIV2_ENABLE_PRINTUCS2})
set(EXV_PACKAGE_NAME ${PROJECT_NAME})
set(EXV_PACKAGE_VERSION ${PROJECT_VERSION})
set(EXV_PACKAGE_STRING "${PROJECT_NAME} ${PROJECT_VERSION}")
if (${EXIV2_ENABLE_XMP} OR ${EXIV2_ENABLE_EXTERNAL_XMP})
set(EXV_HAVE_XMP_TOOLKIT ON)
else()
set(EXV_HAVE_XMP_TOOLKIT OFF)
endif()
set(EXV_HAVE_ICONV ${ICONV_FOUND})
set(EXV_HAVE_LIBZ ${ZLIB_FOUND})
set(EXV_UNICODE_PATH ${EXIV2_ENABLE_WIN_UNICODE})
check_cxx_symbol_exists(gmtime_r time.h EXV_HAVE_GMTIME_R)
check_cxx_symbol_exists(mmap sys/mman.h EXV_HAVE_MMAP )
check_cxx_symbol_exists(munmap sys/mman.h EXV_HAVE_MUNMAP )
check_cxx_symbol_exists(strerror_r string.h EXV_HAVE_STRERROR_R )
check_cxx_source_compiles( "
#include <string.h>
int main() {
char buff[100];
const char* c = strerror_r(0,buff,100);
(void)c; // ignore unused-variable warning
return 0;
}" EXV_STRERROR_R_CHAR_P )
check_include_file_cxx( "memory.h" EXV_HAVE_MEMORY_H )
check_include_file_cxx( "process.h" EXV_HAVE_PROCESS_H )
check_include_file_cxx( "stdbool.h" EXV_HAVE_STDBOOL_H )
check_include_file_cxx( "strings.h" EXV_HAVE_STRINGS_H )
check_include_file_cxx( "sys/stat.h" EXV_HAVE_SYS_STAT_H )
check_include_file_cxx( "sys/types.h" EXV_HAVE_SYS_TYPES_H )
check_include_file_cxx( "inttypes.h" EXV_HAVE_INTTYPES_H )
check_include_file_cxx( "unistd.h" EXV_HAVE_UNISTD_H )
check_include_file_cxx( "sys/mman.h" EXV_HAVE_SYS_MMAN_H )
if ( NOT MINGW AND NOT MSYS AND NOT MSVC )
check_include_file_cxx( "regex.h" EXV_HAVE_REGEX_H )
endif()
set(EXV_ENABLE_NLS ${EXIV2_ENABLE_NLS})
configure_file(cmake/config.h.cmake ${CMAKE_BINARY_DIR}/exv_conf.h @ONLY)
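
The check_cxx_source_compiles test above distinguishes the GNU strerror_r (which returns char*) from the POSIX variant (which returns int). A minimal sketch of how the resulting EXV_STRERROR_R_CHAR_P macro can be consumed (illustrative; the helper name `errnoToString` is an assumption, not Exiv2 API):

```cpp
// Sketch: one wrapper over the two strerror_r variants, driven by the
// EXV_HAVE_STRERROR_R / EXV_STRERROR_R_CHAR_P results computed above.
#include <string.h>
#include <string>
#include "exv_conf.h"

std::string errnoToString(int err) {
    char buf[256] = "";
#if defined(EXV_HAVE_STRERROR_R)
# if defined(EXV_STRERROR_R_CHAR_P)
    // GNU variant: the return value may point at buf or at a static string.
    return strerror_r(err, buf, sizeof(buf));
# else
    // POSIX variant: returns 0 on success and fills buf.
    if (strerror_r(err, buf, sizeof(buf)) != 0)
        return "unknown error";
    return buf;
# endif
#else
    return strerror(err);   // fallback; not thread-safe
#endif
}
```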

30
cmake/generateDoc.cmake Normal file
View File

@ -0,0 +1,30 @@
# Helper macro to add a "doc" target with the CMake build system
# and configure doxy.config.in to doxy.config
#
# target "doc" allows building the documentation with doxygen/dot on WIN32, Linux and Mac
#
find_package(Doxygen REQUIRED dot)
macro(generate_documentation DOX_CONFIG_FILE)
if(NOT EXISTS "${DOX_CONFIG_FILE}")
message(FATAL_ERROR "Configuration file for doxygen not found")
endif()
#Define variables
set(INCDIR "${PROJECT_SOURCE_DIR}/include/exiv2")
set(SRCDIR "${PROJECT_SOURCE_DIR}/src")
set(ROOTDIR "${PROJECT_SOURCE_DIR}")
set(BINDIR "${PROJECT_BINARY_DIR}")
#set(TESTSDIR "${PROJECT_SOURCE_DIR}/tests")
configure_file(${DOX_CONFIG_FILE} ${CMAKE_CURRENT_BINARY_DIR}/doxy.config @ONLY) #OUT-OF-PLACE LOCATION
configure_file(${PROJECT_SOURCE_DIR}/src/doxygen.hpp.in ${PROJECT_BINARY_DIR}/doxygen.hpp @ONLY)
set(DOXY_CONFIG "${CMAKE_CURRENT_BINARY_DIR}/doxy.config")
add_custom_target(doc ${DOXYGEN_EXECUTABLE} ${DOXY_CONFIG})
install(DIRECTORY "${PROJECT_BINARY_DIR}/doc/html/" DESTINATION ${CMAKE_INSTALL_DOCDIR})
set_property(DIRECTORY APPEND PROPERTY ADDITIONAL_MAKE_CLEAN_FILES doc)
endmacro()

40
cmake/mainSetup.cmake Normal file
View File

@ -0,0 +1,40 @@
# In this file we configure some CMake settings we do not want to make visible directly in the main
# CMakeLists.txt file.
include(GNUInstallDirs)
include(CheckFunctionExists)
include(GenerateExportHeader)
include(CMakeDependentOption)
include(cmake/JoinPaths.cmake)
include(CTest)
set(CMAKE_ARCHIVE_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/lib)
set(CMAKE_LIBRARY_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/lib)
set(CMAKE_RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/bin)
set(CMAKE_RUNTIME_OUTPUT_DIRECTORY_DEBUG ${CMAKE_RUNTIME_OUTPUT_DIRECTORY})
set(CMAKE_RUNTIME_OUTPUT_DIRECTORY_RELEASE ${CMAKE_RUNTIME_OUTPUT_DIRECTORY})
if (NOT CMAKE_CXX_COMPILER_ID STREQUAL "AppleClang")
set(CMAKE_CXX_VISIBILITY_PRESET hidden)
set(CMAKE_VISIBILITY_INLINES_HIDDEN 1)
endif()
if (NOT CMAKE_CXX_STANDARD)
set (CMAKE_CXX_STANDARD 98)
endif()
if (UNIX)
if (APPLE)
set(CMAKE_MACOSX_RPATH ON)
set(CMAKE_INSTALL_RPATH "@loader_path")
else()
join_paths(CMAKE_INSTALL_RPATH "$ORIGIN" ".." "${CMAKE_INSTALL_LIBDIR}")
endif()
endif()
# Prevent conflicts when exiv2 is consumed in multiple-subdirectory projects.
if (NOT TARGET uninstall)
configure_file(cmake/exiv2_uninstall.cmake ${CMAKE_BINARY_DIR}/cmake_uninstall.cmake COPYONLY)
add_custom_target(uninstall "${CMAKE_COMMAND}" -P "${CMAKE_BINARY_DIR}/cmake_uninstall.cmake")
endif()

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=9
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=9
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=9
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=9
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=10
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=10
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=10
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=10
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=11
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=11
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=11
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=11
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=12
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=12
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=12
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=12
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=14
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=14
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=14
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=14
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=15
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=15
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=15
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=15
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=16
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=16
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=16
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=16
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=16
os=Windows
arch_build=x86
os_build=Windows
[options]
[env]

View File

@ -0,0 +1,12 @@
[build_requires]
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=16
os=Windows
arch_build=x86_64
os_build=Windows
[options]
[env]

160
cmake/packaging.cmake Normal file
View File

@ -0,0 +1,160 @@
set(CPACK_PACKAGE_NAME "${PROJECT_NAME}")
set(CPACK_PACKAGE_CONTACT "Luis Díaz Más <piponazo@gmail.com>")
set(CPACK_PACKAGE_VERSION ${PROJECT_VERSION})
set(CPACK_SOURCE_GENERATOR TGZ)
# https://libwebsockets.org/git/libwebsockets/commit/minimal-examples?id=3e25edf1ee7ea8127e941fd7b664e0e962cfeb85
set(CPACK_SOURCE_IGNORE_FILES ${CPACK_SOURCE_IGNORE_FILES} "/.git/" "/build/" "\\\\.tgz$" "\\\\.tar\\\\.gz$" "\\\\.zip$" "/test/tmp/" )
if ( MSVC )
set(CPACK_GENERATOR ZIP) # use .zip - less likely to damage bin/exiv2.dll permissions
else()
set(CPACK_GENERATOR TGZ) # MinGW/Cygwin/Linux/macOS etc use .tar.gz
endif()
set (BS "") # Bit Size
if ( NOT APPLE )
if ( CMAKE_SIZEOF_VOID_P EQUAL 8 )
set (BS 64)
else()
set (BS 32)
endif()
endif()
set (LT "") # Library Type
if ( NOT BUILD_SHARED_LIBS )
set (LT Static)
endif()
set (BT "") # Build Type
if ( NOT ${CMAKE_BUILD_TYPE} STREQUAL Release )
set (BT ${CMAKE_BUILD_TYPE})
endif()
if ( MINGW OR MSYS )
set (PACKDIR MinGW)
elseif ( MSVC )
set (PACKDIR msvc)
elseif ( CYGWIN )
set (PACKDIR CYGWIN)
elseif ( APPLE )
set (PACKDIR Darwin)
elseif ( LINUX )
set (PACKDIR Linux)
elseif ( CMAKE_SYSTEM_NAME STREQUAL "NetBSD" OR CMAKE_SYSTEM_NAME STREQUAL "FreeBSD" OR CMAKE_HOST_SOLARIS)
set (PACKDIR Unix)
else()
set (PACKDIR Linux) # Linux and unsupported systems
endif()
set (BUNDLE_NAME ${PACKDIR})
if ( CMAKE_SYSTEM_NAME STREQUAL "NetBSD" OR CMAKE_SYSTEM_NAME STREQUAL "FreeBSD" OR CMAKE_HOST_SOLARIS )
set (BUNDLE_NAME ${CMAKE_SYSTEM_NAME})
endif()
set (CC "") # Compiler
if ( NOT APPLE AND NOT CMAKE_SYSTEM_NAME STREQUAL "FreeBSD" )
if (${CMAKE_CXX_COMPILER_ID} MATCHES "Clang")
set (CC Clang)
endif()
endif()
set (VI "") # Video
if ( EXIV2_ENABLE_VIDEO )
set (VI Video)
endif()
set (WR "") # WebReady
if ( EXIV2_ENABLE_WEBREADY )
set (WR Webready)
endif()
set (VS "") # VisualStudio
if ( MSVC )
# VS2015 >= 1900, VS2017 >= 1910, VS2019 >= 1920
if ( MSVC_VERSION GREATER 1919 )
set(VS 2019)
elseif ( MSVC_VERSION GREATER 1909 )
set(VS 2017)
elseif ( MSVC_VERSION GREATER 1899 )
set(VS 2015)
elseif ( MSVC_VERSION STREQUAL 1800 )
set(VS 2013)
elseif ( MSVC_VERSION STREQUAL 1700 )
set(VS 2012)
elseif ( MSVC_VERSION STREQUAL 1600 )
set(VS 2010)
elseif ( MSVC_VERSION STREQUAL 1500 )
set(VS 2008)
endif()
endif()
# Set RC = Release Candidate from TWEAK
if ( PROJECT_VERSION_TWEAK STREQUAL "" )
set(RC "GM For Release")
else()
string(FIND "${PROJECT_VERSION_TWEAK}" 0 PREVIEW_RELEASE ) # 0.27.3.10 => RC1 Preview
string(FIND "${PROJECT_VERSION_TWEAK}" 9 NOT_FOR_RELEASE ) # 0.27.3.19 => RC1 Not for release
string(SUBSTRING ${PROJECT_VERSION_TWEAK} 0 1 RC)
if ( RC STREQUAL "0" )
set(RC "")
else()
set (RC "RC${RC}")
endif()
if ( PREVIEW_RELEASE STREQUAL "1" )
set (RC "${RC} Preview")
set (PREVIEW_RELEASE 1)
else()
set (PREVIEW_RELEASE 0)
endif()
if ( NOT_FOR_RELEASE STREQUAL "1" )
set (RC "${RC} Not for release")
set (NOT_FOR_RELEASE 1)
else()
set (NOT_FOR_RELEASE 0)
endif()
endif()
# Set RV = Release Version
set(RV "Exiv2 v${PROJECT_VERSION_MAJOR}.${PROJECT_VERSION_MINOR}.${PROJECT_VERSION_PATCH}")
set(CPACK_PACKAGE_FILE_NAME ${CPACK_PACKAGE_NAME}-${CPACK_PACKAGE_VERSION}-${VS}${BUNDLE_NAME}${BS}${CC}${LT}${BT}${VI}${WR})
# https://stackoverflow.com/questions/17495906/copying-files-and-including-them-in-a-cpack-archive
install(FILES "${PROJECT_SOURCE_DIR}/samples/exifprint.cpp" DESTINATION "samples")
# Copy top level documents (eg README.md)
# https://stackoverflow.com/questions/21541707/cpack-embed-text-files
set( DOCS
README.md
README-CONAN.md
README-SAMPLES.md
COPYING
exiv2.png
matrix-standard-vector-logo-xs.png
)
foreach(doc ${DOCS})
install(FILES ${CMAKE_CURRENT_SOURCE_DIR}/${doc} DESTINATION .)
endforeach()
# copy build/logs/build.txt which is present if built by build.sh
if(EXISTS ${PROJECT_SOURCE_DIR}/build/logs/build.txt)
install(FILES ${PROJECT_SOURCE_DIR}/build/logs/build.txt DESTINATION "logs")
endif()
# Copy releasenotes.txt and appropriate ReadMe.txt (eg releasenotes/${PACKDIR}/ReadMe.txt)
set(VM ${PROJECT_VERSION_MAJOR}) # Version Major 0
set(VN ${PROJECT_VERSION_MINOR}) # Version Minor 27
set(VD ${PROJECT_VERSION_PATCH}) # Version Dot 3
set(VR .${PROJECT_VERSION_TWEAK}) # Version RC .1
if ( PREVIEW_RELEASE )
set(VR " Preview")
elseif ( NOT_FOR_RELEASE )
set(VR " Not for release")
endif()
configure_file(${CMAKE_CURRENT_SOURCE_DIR}/releasenotes/${PACKDIR}/ReadMe.txt ReadMe.txt @ONLY)
configure_file(${CMAKE_CURRENT_SOURCE_DIR}/releasenotes/releasenotes.txt releasenotes.txt @ONLY)
install (FILES ${CMAKE_CURRENT_BINARY_DIR}/ReadMe.txt ${CMAKE_CURRENT_BINARY_DIR}/releasenotes.txt DESTINATION .)
include (CPack)

79
cmake/printSummary.cmake Normal file
View File

@ -0,0 +1,79 @@
# output chosen build options
macro( OptionOutput _outputstring )
if( ${ARGN} )
set( _var "YES" )
else( ${ARGN} )
set( _var "NO" )
endif( ${ARGN} )
message( STATUS "${_outputstring}${_var}" )
endmacro( OptionOutput _outputstring )
function(printList items)
foreach (item ${items})
message("\t ${item}")
endforeach()
endfunction()
get_property(COMPILER_OPTIONS DIRECTORY ${CMAKE_SOURCE_DIR} PROPERTY COMPILE_OPTIONS)
message( STATUS "Install prefix: ${CMAKE_INSTALL_PREFIX}")
message( STATUS "------------------------------------------------------------------" )
message( STATUS "CMake Generator: ${CMAKE_GENERATOR}" )
message( STATUS "CMAKE_BUILD_TYPE: ${CMAKE_BUILD_TYPE}" )
message( STATUS "Compiler info: ${CMAKE_CXX_COMPILER_ID} (${CMAKE_CXX_COMPILER}) ; version: ${CMAKE_CXX_COMPILER_VERSION}")
message( STATUS "CMAKE_CXX_STANDARD:${CMAKE_CXX_STANDARD}" )
message( STATUS " --- Compiler flags --- ")
message( STATUS "General: ${CMAKE_CXX_FLAGS}" )
printList("${COMPILER_OPTIONS}")
message( STATUS "Extra: ${EXTRA_COMPILE_FLAGS}" )
message( STATUS "Debug: ${CMAKE_CXX_FLAGS_DEBUG}" )
message( STATUS "Release: ${CMAKE_CXX_FLAGS_RELEASE}" )
message( STATUS "RelWithDebInfo: ${CMAKE_CXX_FLAGS_RELWITHDEBINFO}" )
message( STATUS "MinSizeRel: ${CMAKE_CXX_FLAGS_MINSIZEREL}" )
message( STATUS " --- Linker flags --- ")
message( STATUS "General: ${CMAKE_EXE_LINKER_FLAGS}" )
message( STATUS "Debug: ${CMAKE_EXE_LINKER_FLAGS_DEBUG}" )
message( STATUS "Release: ${CMAKE_EXE_LINKER_FLAGS_RELEASE}" )
message( STATUS "RelWithDebInfo: ${CMAKE_EXE_LINKER_FLAGS_RELWITHDEBINFO}" )
message( STATUS "MinSizeRel: ${CMAKE_EXE_LINKER_FLAGS_MINSIZEREL}" )
message( STATUS "" )
message( STATUS "Compiler Options")
OptionOutput( "Warnings as errors: " EXIV2_WARNINGS_AS_ERRORS )
OptionOutput( "Use extra compiler warning flags: " EXIV2_EXTRA_WARNINGS )
message( STATUS "" )
message( STATUS "------------------------------------------------------------------" )
OptionOutput( "Building shared library: " BUILD_SHARED_LIBS )
OptionOutput( "Building PNG support: " EXIV2_ENABLE_PNG AND ZLIB_FOUND )
if ( EXIV2_ENABLE_EXTERNAL_XMP )
OptionOutput( "XMP metadata support (EXTERNAL): " EXIV2_ENABLE_EXTERNAL_XMP )
else()
OptionOutput( "XMP metadata support: " EXIV2_ENABLE_XMP )
endif()
OptionOutput( "Building BMFF support: " EXIV2_ENABLE_BMFF )
OptionOutput( "Native language support: " EXIV2_ENABLE_NLS )
OptionOutput( "Conversion of Windows XP tags: " EXIV2_ENABLE_PRINTUCS2 )
OptionOutput( "Nikon lens database: " EXIV2_ENABLE_LENSDATA )
OptionOutput( "Building video support: " EXIV2_ENABLE_VIDEO )
OptionOutput( "Building webready support: " EXIV2_ENABLE_WEBREADY )
if ( EXIV2_ENABLE_WEBREADY )
OptionOutput( "USE Libcurl for HttpIo: " EXIV2_ENABLE_CURL )
OptionOutput( "USE Libssh for SshIo: " EXIV2_ENABLE_SSH )
endif ( EXIV2_ENABLE_WEBREADY )
if (WIN32)
OptionOutput( "Dynamic runtime override: " EXIV2_ENABLE_DYNAMIC_RUNTIME)
OptionOutput( "Unicode paths (wstring): " EXIV2_ENABLE_WIN_UNICODE )
endif()
OptionOutput( "Building exiv2 command: " EXIV2_BUILD_EXIV2_COMMAND )
OptionOutput( "Building samples: " EXIV2_BUILD_SAMPLES )
OptionOutput( "Building unit tests: " EXIV2_BUILD_UNIT_TESTS )
OptionOutput( "Building doc: " EXIV2_BUILD_DOC )
OptionOutput( "Building with coverage flags: " BUILD_WITH_COVERAGE )
OptionOutput( "Using ccache: " BUILD_WITH_CCACHE )
message( STATUS "------------------------------------------------------------------" )
message(STATUS " WARNING: Deprecated features: EPS, Video, Ssh")
message( STATUS "------------------------------------------------------------------" )

2
codecov.yml Normal file
View File

@ -0,0 +1,2 @@
ignore:
- "xmpsdk" # Not interested about the coverage of XMKSDK

44
conanfile.py Normal file
View File

@ -0,0 +1,44 @@
from conans import ConanFile
from conans.tools import os_info
from conans.model.version import Version
class Exiv2Conan(ConanFile):
settings = 'os', 'compiler', 'build_type', 'arch'
generators = 'cmake_find_package', 'cmake_paths'
options = {'unitTests': [True, False],
'xmp': [True, False],
'iconv': [True, False],
'webready': [True, False],
}
default_options = ('unitTests=True',
'xmp=False',
'iconv=False',
'webready=False',
)
def configure(self):
self.options['libcurl'].shared = True
self.options['gtest'].shared = False
def requirements(self):
self.requires('zlib/1.2.12')
if os_info.is_windows and self.options.iconv:
self.requires('libiconv/1.16')
if self.options.unitTests:
self.requires('gtest/1.8.1')
if self.settings.build_type == "Debug":
self.options['gtest'].debug_postfix = ''
if self.options.webready:
self.requires('libcurl/7.80.0')
if self.options.xmp:
self.requires('XmpSdk/2016.7@piponazo/stable') # from conan-piponazo
else:
self.requires('expat/2.4.8')
def imports(self):
self.copy('*.dll', dst='conanDlls', src='bin')
self.copy('*.dylib', dst='bin', src='lib')

47
contrib/Qt/ReadMe.txt Normal file
View File

@ -0,0 +1,47 @@
contrib/Qt/ReadMe.txt
---------------------
Exiv2 works well with Qt.
Qt requires C++11 libraries which are the default for Exiv2 v0.28 and later.
The default build of Exiv2 v0.27 (and the pre-built binaries) is for C++98.
You will have to build Exiv2 v0.27 from source with C++11 for Qt.
To build and run commandLineTool
--------------------------------
1) Windows Users should install MinGW/msys2 as documented in README.md
2) All users should build Exiv2 with C++11 support as documented in README.md
3) Generate Makefile
Caution: You will have to modify commandLineTool.pro to fit your environment.
$ cd <exiv2dir>
$ cd contrib/Qt
$ qmake commandLineTool.pro
4) Build commandLineTool.cpp
$ make
5) Run commandLineTool.exe
$ commandLineTool.exe
UNICODE_PATH on Windows
-----------------------
Windows users may prefer to build Exiv2 to support UNICODE_PATH.
The sample application samples/exifprint.cpp works with UNICODE_PATH.
The cmake option -DEXIV2_ENABLE_WIN_UNICODE=ON is documented in README.md
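A minimal sketch of what UNICODE_PATH enables (assuming an Exiv2 v0.27 build configured with -DEXIV2_ENABLE_WIN_UNICODE=ON; the std::wstring overload of ImageFactory::open and the sample path are assumptions to be checked against your build):

```cpp
// Sketch: opening an image via a wide-character path on Windows.
#include <exiv2/exiv2.hpp>
#include <iostream>

int main() {
#ifdef EXV_UNICODE_PATH
    const std::wstring wpath = L"C:\\Pictures\\Übung.jpg";   // hypothetical path with non-ASCII characters
    auto image = Exiv2::ImageFactory::open(wpath);           // wstring overload expected in UNICODE_PATH builds
    image->readMetadata();
    std::cout << "Exif entries: " << image->exifData().count() << std::endl;
#else
    std::cout << "This build was not configured with EXV_UNICODE_PATH" << std::endl;
#endif
    return 0;
}
```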
Searching for more information about Qt, MinGW and UNICODE_PATH
---------------------------------------------------------------
These matters are occasionally discussed on the forum. Please search to read discussions.
https://github.com/Exiv2/exiv2/issues/1101#issuecomment-623141576
http://dev.exiv2.org/boards/3/topics/2311?r=2312#message-2312
http://dev.exiv2.org/issues/1169
http://dev.exiv2.org/boards/3/topics/2705
Robin Mills
http://clanmills.com
2020-05-04

View File

@ -0,0 +1,15 @@
QT += core
QT -= gui
TARGET = commandLineTool
CONFIG += console
CONFIG -= app_bundle
TEMPLATE = app
SOURCES += main.cpp
win32 {
INCLUDEPATH += $$quote(c:/Qt/5.14.2/mingw73_64/include)
INCLUDEPATH += /usr/local/include
LIBS += -L$$quote(c:/Qt/5.14.2/mingw73_64/lib) -L/usr/local/lib -lexiv2
}

35
contrib/Qt/main.cpp Normal file
View File

@ -0,0 +1,35 @@
// ***************************************************************** -*- C++ -*-
/*
* Copyright (C) 2004-2021 Exiv2 authors
* This program is part of the Exiv2 distribution.
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 2
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, 5th Floor, Boston, MA 02110-1301 USA.
*/
#include <QCoreApplication>
#include <exiv2/exiv2.hpp>
#include <iostream>
int main(int argc, char *argv[])
{
QCoreApplication a(argc, argv);
exv_grep_keys_t keys;
Exiv2::dumpLibraryInfo(std::cout,keys);
return 0;
// return a.exec();
}

62
contrib/coverity.sh Executable file
View File

@ -0,0 +1,62 @@
#!/bin/sh
# Copyright (c) 2013-2015, Gilles Caulier, <caulier dot gilles at gmail dot com>
#
# Redistribution and use is allowed according to the terms of the BSD license.
# For details see the accompanying COPYING-CMAKE-SCRIPTS file.
#
# Before running this script you must set these shell variables:
# $EXIVCoverityToken with the token of the Exiv2 project given by Coverity SCAN
# $EXIVCoverityEmail with the email address to which the SCAN result is sent.
#
# Coverity Scan bin dir must be appended to PATH variable.
#
# See this URL for how to prepare your computer with the Coverity SCAN tool:
# http://scan.coverity.com/projects/297/upload_form
cd ..
# Manage build sub-dir
if [ -d "build.cmake" ]; then
rm -rfv ./build.cmake
fi
if [[ "$OSTYPE" == "linux-gnu" ]]; then
./bootstrap.linux
elif [[ "$OSTYPE" == "darwin"* ]]; then
./bootstrap.macports
else
echo "Unsupported platform..."
exit -1
fi
# Get active svn branch path to create SCAN import description string
svn info | grep "URL" | sed '/svn/{s/.*\(svn.*\)/\1/};' > ./build.cmake/svn_branch.txt
desc=$(<build.cmake/svn_branch.txt)
cd ./build.cmake
cov-build --dir cov-int --tmpdir ~/tmp make -j8
tar czvf myproject.tgz cov-int
echo "-- SCAN Import description --"
echo $desc
echo "-----------------------------"
echo "Coverity Scan tarball 'myproject.tgz' uploading in progress..."
nslookup scan5.coverity.com
SECONDS=0
curl -# \
--form token=$EXIVCoverityToken \
--form email=$EXIVCoverityEmail \
--form file=@myproject.tgz \
--form version=svn-trunk \
--form description="$desc" \
https://scan.coverity.com/builds?project=Exiv2 \
> /dev/null
echo "Done. Coverity Scan tarball 'myproject.tgz' is uploaded and ready for analyse."
echo "That took approximately $SECONDS seconds to upload."

235
contrib/organize/MD5.cpp Normal file
View File

@ -0,0 +1,235 @@
/*
* This code implements the MD5 message-digest algorithm.
* The algorithm is due to Ron Rivest. This code was
* written by Colin Plumb in 1993, no copyright is claimed.
* This code is in the public domain; do with it what you wish.
*
* Equivalent code is available from RSA Data Security, Inc.
* This code has been tested against that, and is equivalent,
* except that you don't need to include two pages of legalese
* with every copy.
*
* To compute the message digest of a chunk of bytes, declare an
* MD5_CTX structure, pass it to MD5Init, call MD5Update as
* needed on buffers full of bytes, and then call MD5Final, which
* will fill a supplied 16-byte array with the digest.
*
* Changed so as no longer to depend on Colin Plumb's `usual.h' header
* definitions; now uses stuff from dpkg's config.h.
* - Ian Jackson <ian@chiark.greenend.org.uk>.
* Still in the public domain.
*/
#include <cstring>
#include "MD5.h"
using namespace std;
static void
byteSwap(UWORD32 *buf, unsigned words)
{
const uint32_t byteOrderTest = 0x1;
if (((char *)&byteOrderTest)[0] == 0) {
md5byte *p = (md5byte *)buf;
do {
*buf++ = (UWORD32)((unsigned)p[3] << 8 | p[2]) << 16 |
((unsigned)p[1] << 8 | p[0]);
p += 4;
} while (--words);
}
}
/*
* Start MD5 accumulation. Set bit count to 0 and buffer to mysterious
* initialization constants.
*/
void
MD5Init(struct MD5_CTX *ctx)
{
ctx->buf[0] = 0x67452301;
ctx->buf[1] = 0xefcdab89;
ctx->buf[2] = 0x98badcfe;
ctx->buf[3] = 0x10325476;
ctx->bytes[0] = 0;
ctx->bytes[1] = 0;
}
/*
* Update context to reflect the concatenation of another buffer full
* of bytes.
*/
void
MD5Update(struct MD5_CTX *ctx, md5byte const *buf, unsigned len)
{
UWORD32 t;
/* Update byte count */
t = ctx->bytes[0];
if ((ctx->bytes[0] = t + len) < t)
ctx->bytes[1]++; /* Carry from low to high */
t = 64 - (t & 0x3f); /* Space available in ctx->in (at least 1) */
if (t > len) {
memcpy((md5byte *)ctx->in + 64 - t, buf, len);
return;
}
/* First chunk is an odd size */
memcpy((md5byte *)ctx->in + 64 - t, buf, t);
byteSwap(ctx->in, 16);
MD5Transform(ctx->buf, ctx->in);
buf += t;
len -= t;
/* Process data in 64-byte chunks */
while (len >= 64) {
memcpy(ctx->in, buf, 64);
byteSwap(ctx->in, 16);
MD5Transform(ctx->buf, ctx->in);
buf += 64;
len -= 64;
}
/* Handle any remaining bytes of data. */
memcpy(ctx->in, buf, len);
}
/*
* Final wrapup - pad to 64-byte boundary with the bit pattern
* 1 0* (64-bit count of bits processed, MSB-first)
*/
void
MD5Final(md5byte digest[16], struct MD5_CTX *ctx)
{
int count = ctx->bytes[0] & 0x3f; /* Number of bytes in ctx->in */
md5byte *p = (md5byte *)ctx->in + count;
/* Set the first char of padding to 0x80. There is always room. */
*p++ = 0x80;
/* Bytes of padding needed to make 56 bytes (-8..55) */
count = 56 - 1 - count;
if (count < 0) { /* Padding forces an extra block */
memset(p, 0, count + 8);
byteSwap(ctx->in, 16);
MD5Transform(ctx->buf, ctx->in);
p = (md5byte *)ctx->in;
count = 56;
}
memset(p, 0, count);
byteSwap(ctx->in, 14);
/* Append length in bits and transform */
ctx->in[14] = ctx->bytes[0] << 3;
ctx->in[15] = ctx->bytes[1] << 3 | ctx->bytes[0] >> 29;
MD5Transform(ctx->buf, ctx->in);
byteSwap(ctx->buf, 4);
memcpy(digest, ctx->buf, 16);
memset(ctx, 0, sizeof(*ctx)); /* In case it's sensitive */
}
/* The four core functions - F1 is optimized somewhat */
/* #define F1(x, y, z) (x & y | ~x & z) */
#define F1(x, y, z) (z ^ (x & (y ^ z)))
#define F2(x, y, z) F1(z, x, y)
#define F3(x, y, z) (x ^ y ^ z)
#define F4(x, y, z) (y ^ (x | ~z))
/* This is the central step in the MD5 algorithm. */
#define MD5STEP(f,w,x,y,z,in,s) \
(w += f(x,y,z) + in, w = (w<<s | w>>(32-s)) + x)
/*
* The core of the MD5 algorithm, this alters an existing MD5 hash to
* reflect the addition of 16 longwords of new data. MD5Update blocks
* the data and converts bytes into longwords for this routine.
*/
void
MD5Transform(UWORD32 buf[4], UWORD32 const in[16])
{
register UWORD32 a, b, c, d;
a = buf[0];
b = buf[1];
c = buf[2];
d = buf[3];
MD5STEP(F1, a, b, c, d, in[0] + 0xd76aa478, 7);
MD5STEP(F1, d, a, b, c, in[1] + 0xe8c7b756, 12);
MD5STEP(F1, c, d, a, b, in[2] + 0x242070db, 17);
MD5STEP(F1, b, c, d, a, in[3] + 0xc1bdceee, 22);
MD5STEP(F1, a, b, c, d, in[4] + 0xf57c0faf, 7);
MD5STEP(F1, d, a, b, c, in[5] + 0x4787c62a, 12);
MD5STEP(F1, c, d, a, b, in[6] + 0xa8304613, 17);
MD5STEP(F1, b, c, d, a, in[7] + 0xfd469501, 22);
MD5STEP(F1, a, b, c, d, in[8] + 0x698098d8, 7);
MD5STEP(F1, d, a, b, c, in[9] + 0x8b44f7af, 12);
MD5STEP(F1, c, d, a, b, in[10] + 0xffff5bb1, 17);
MD5STEP(F1, b, c, d, a, in[11] + 0x895cd7be, 22);
MD5STEP(F1, a, b, c, d, in[12] + 0x6b901122, 7);
MD5STEP(F1, d, a, b, c, in[13] + 0xfd987193, 12);
MD5STEP(F1, c, d, a, b, in[14] + 0xa679438e, 17);
MD5STEP(F1, b, c, d, a, in[15] + 0x49b40821, 22);
MD5STEP(F2, a, b, c, d, in[1] + 0xf61e2562, 5);
MD5STEP(F2, d, a, b, c, in[6] + 0xc040b340, 9);
MD5STEP(F2, c, d, a, b, in[11] + 0x265e5a51, 14);
MD5STEP(F2, b, c, d, a, in[0] + 0xe9b6c7aa, 20);
MD5STEP(F2, a, b, c, d, in[5] + 0xd62f105d, 5);
MD5STEP(F2, d, a, b, c, in[10] + 0x02441453, 9);
MD5STEP(F2, c, d, a, b, in[15] + 0xd8a1e681, 14);
MD5STEP(F2, b, c, d, a, in[4] + 0xe7d3fbc8, 20);
MD5STEP(F2, a, b, c, d, in[9] + 0x21e1cde6, 5);
MD5STEP(F2, d, a, b, c, in[14] + 0xc33707d6, 9);
MD5STEP(F2, c, d, a, b, in[3] + 0xf4d50d87, 14);
MD5STEP(F2, b, c, d, a, in[8] + 0x455a14ed, 20);
MD5STEP(F2, a, b, c, d, in[13] + 0xa9e3e905, 5);
MD5STEP(F2, d, a, b, c, in[2] + 0xfcefa3f8, 9);
MD5STEP(F2, c, d, a, b, in[7] + 0x676f02d9, 14);
MD5STEP(F2, b, c, d, a, in[12] + 0x8d2a4c8a, 20);
MD5STEP(F3, a, b, c, d, in[5] + 0xfffa3942, 4);
MD5STEP(F3, d, a, b, c, in[8] + 0x8771f681, 11);
MD5STEP(F3, c, d, a, b, in[11] + 0x6d9d6122, 16);
MD5STEP(F3, b, c, d, a, in[14] + 0xfde5380c, 23);
MD5STEP(F3, a, b, c, d, in[1] + 0xa4beea44, 4);
MD5STEP(F3, d, a, b, c, in[4] + 0x4bdecfa9, 11);
MD5STEP(F3, c, d, a, b, in[7] + 0xf6bb4b60, 16);
MD5STEP(F3, b, c, d, a, in[10] + 0xbebfbc70, 23);
MD5STEP(F3, a, b, c, d, in[13] + 0x289b7ec6, 4);
MD5STEP(F3, d, a, b, c, in[0] + 0xeaa127fa, 11);
MD5STEP(F3, c, d, a, b, in[3] + 0xd4ef3085, 16);
MD5STEP(F3, b, c, d, a, in[6] + 0x04881d05, 23);
MD5STEP(F3, a, b, c, d, in[9] + 0xd9d4d039, 4);
MD5STEP(F3, d, a, b, c, in[12] + 0xe6db99e5, 11);
MD5STEP(F3, c, d, a, b, in[15] + 0x1fa27cf8, 16);
MD5STEP(F3, b, c, d, a, in[2] + 0xc4ac5665, 23);
MD5STEP(F4, a, b, c, d, in[0] + 0xf4292244, 6);
MD5STEP(F4, d, a, b, c, in[7] + 0x432aff97, 10);
MD5STEP(F4, c, d, a, b, in[14] + 0xab9423a7, 15);
MD5STEP(F4, b, c, d, a, in[5] + 0xfc93a039, 21);
MD5STEP(F4, a, b, c, d, in[12] + 0x655b59c3, 6);
MD5STEP(F4, d, a, b, c, in[3] + 0x8f0ccc92, 10);
MD5STEP(F4, c, d, a, b, in[10] + 0xffeff47d, 15);
MD5STEP(F4, b, c, d, a, in[1] + 0x85845dd1, 21);
MD5STEP(F4, a, b, c, d, in[8] + 0x6fa87e4f, 6);
MD5STEP(F4, d, a, b, c, in[15] + 0xfe2ce6e0, 10);
MD5STEP(F4, c, d, a, b, in[6] + 0xa3014314, 15);
MD5STEP(F4, b, c, d, a, in[13] + 0x4e0811a1, 21);
MD5STEP(F4, a, b, c, d, in[4] + 0xf7537e82, 6);
MD5STEP(F4, d, a, b, c, in[11] + 0xbd3af235, 10);
MD5STEP(F4, c, d, a, b, in[2] + 0x2ad7d2bb, 15);
MD5STEP(F4, b, c, d, a, in[9] + 0xeb86d391, 21);
buf[0] += a;
buf[1] += b;
buf[2] += c;
buf[3] += d;
}

50
contrib/organize/MD5.h Normal file
View File

@ -0,0 +1,50 @@
#ifndef __MD5_h__
#define __MD5_h__
/*
* This is the header file for the MD5 message-digest algorithm.
* The algorithm is due to Ron Rivest. This code was
* written by Colin Plumb in 1993, no copyright is claimed.
* This code is in the public domain; do with it what you wish.
*
* Equivalent code is available from RSA Data Security, Inc.
* This code has been tested against that, and is equivalent,
* except that you don't need to include two pages of legalese
* with every copy.
*
* To compute the message digest of a chunk of bytes, declare an
* MD5_CTX structure, pass it to MD5Init, call MD5Update as
* needed on buffers full of bytes, and then call MD5Final, which
* will fill a supplied 16-byte array with the digest.
*
* Changed so as no longer to depend on Colin Plumb's `usual.h'
* header definitions; now uses stuff from dpkg's config.h
* - Ian Jackson <ian@chiark.greenend.org.uk>.
* Still in the public domain.
*/
#include <sys/types.h>
#ifdef EXV_HAVE_STDINT_H
# include <stdint.h>
#endif
/* MSVC doesn't provide C99 types, but it has MS specific variants */
#ifdef _MSC_VER
typedef unsigned __int32 uint32_t;
#endif
typedef unsigned char md5byte;
typedef uint32_t UWORD32;
struct MD5_CTX {
UWORD32 buf[4];
UWORD32 bytes[2];
UWORD32 in[16];
};
extern void MD5Init(struct MD5_CTX *context);
extern void MD5Update(struct MD5_CTX *context, md5byte const *buf, unsigned len);
extern void MD5Final(unsigned char digest[16], struct MD5_CTX *context);
extern void MD5Transform(UWORD32 buf[4], UWORD32 const in[16]);
#endif
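
The header comments in MD5.cpp and MD5.h above describe the intended calling sequence: declare an MD5_CTX, call MD5Init, feed data with MD5Update, then call MD5Final for the 16-byte digest. A minimal sketch of that sequence (the message and file name are illustrative only):

```cpp
// md5_demo.cpp - sketch of the MD5Init/MD5Update/MD5Final sequence.
// Assumes MD5.h is on the include path and MD5.cpp is compiled and linked in.
#include <cstdio>
#include <cstring>
#include "MD5.h"

int main() {
    const char* msg = "hello exiv2";
    MD5_CTX ctx;
    md5byte digest[16];

    MD5Init(&ctx);                                           // initialise the four chaining variables
    MD5Update(&ctx,
              reinterpret_cast<md5byte const*>(msg),
              static_cast<unsigned>(std::strlen(msg)));      // may be called repeatedly for streamed data
    MD5Final(digest, &ctx);                                  // pad, append the bit count, emit the digest

    for (int i = 0; i < 16; ++i)
        std::printf("%02x", digest[i]);                      // print the digest as 32 hex characters
    std::printf("\n");
    return 0;
}
```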

144
contrib/organize/Makefile Normal file
View File

@ -0,0 +1,144 @@
# ************************************************************* -*- Makefile -*-
#
# Copyright (C) 2004-2015 Andreas Huggel <ahuggel@gmx.net>
#
# This Makefile is part of the Exiv2 distribution.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions
# are met:
#
# 1. Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# 2. Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following
# disclaimer in the documentation and/or other materials provided
# with the distribution.
# 3. The name of the author may not be used to endorse or promote
# products derived from this software without specific prior
# written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR
# IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY
# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
# GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER
# IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR
# OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
# IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
# File: Makefile
# Author(s): Andreas Huggel (ahu) <ahuggel@gmx.net>
# History: 31-Jan-09, ahu: created
#
# Description:
# Simple Makefile to build the organize application. Requires installed
# exiv2 library and headers. Adapted from samples/Makefile.
#
# Restrictions:
# Requires GNU make.
#
# ******************************************************************************
# Default make target
all: ozbin
# Include system configuration
top_srcdir = ../..
include $(top_srcdir)/config/config.mk
include boost.mk
# ******************************************************************************
# Source files
# Source files for the organize application
OZMAIN = organize.cpp
OZSRC = helpers.cpp MD5.cpp
# ******************************************************************************
# Initialisations
SHELL = /bin/sh
.SUFFIXES:
.SUFFIXES: .c .cpp .o .so
.PRECIOUS: %.cpp
CPPFLAGS := -I$(BOOST_INC_DIR) `pkg-config exiv2 --cflags`
ifdef HAVE_STDINT
CPPFLAGS += -DEXV_HAVE_STDINT_H=1
endif
LDFLAGS := $(BOOST_LIBS) `pkg-config exiv2 --libs`
OZOBJ = $(OZSRC:.cpp=.o) $(OZMAIN:.cpp=.o)
OZBIN = $(OZMAIN:.cpp=)
OZEXE = $(OZMAIN:.cpp=$(EXEEXT))
ifdef DEP_TRACKING
DEP = $(OZMAIN:%.cpp=$(DEPDIR)/%.d) $(OZSRC:%.cpp=$(DEPDIR)/%.d)
endif
# ******************************************************************************
# Rules
ozbin: $(OZBIN)
$(OZOBJ): %.o: %.cpp
$(COMPILE.cc) -o $@ $<
@$(MAKEDEPEND)
@$(POSTDEPEND)
%.ii: %.cpp
set -e; \
$(CXXCPP) -E $(CPPFLAGS) $< | sed '/^[ ]*$$/d' > $@
# ******************************************************************************
# Targets
.PHONY: all ozbin relink binclean install uninstall mostlyclean clean distclean maintainer-clean
ifdef DEP_TRACKING
# Include targets from dependency files
-include $(DEP)
endif
$(OZBIN): $(OZOBJ)
$(LIBTOOL) --mode=link $(LINK.cc) -o $@ $(OZOBJ)
relink: binclean organize
install:
$(INSTALL_DIRS) $(DESTDIR)$(bindir)
@$(LIBTOOL) --mode=install $(INSTALL_PROGRAM) $(OZEXE) $(DESTDIR)$(bindir)/$(OZEXE)
uninstall:
@$(LIBTOOL) --mode=uninstall $(RM) $(DESTDIR)$(bindir)/$(OZEXE)
-rmdir $(DESTDIR)$(bindir)
# Remove binaries, e.g., to relink them
binclean:
$(RM) $(OZEXE)
mostlyclean:
$(RM) core
$(RM) $(OZMAIN:.cpp=.ii) $(OZSRC:.cpp=.ii)
$(RM) $(OZMAIN:%.cpp=.libs/%.d) $(OZSRC:%.cpp=.libs/%.d)
-rmdir .libs
$(RM) $(OZOBJ)
clean: binclean mostlyclean
# Run `make distclean' from the top source directory to also remove
# files created by configuring the program.
distclean: clean
ifdef DEP_TRACKING
$(RM) $(DEP)
-rmdir $(DEPDIR)
endif
$(RM) *~ *.bak *#
# This command is intended for maintainers to use; it deletes files
# that may need special tools to rebuild.
maintainer-clean: uninstall distclean

3
contrib/organize/README Normal file
View File

@ -0,0 +1,3 @@
organize uses the Boost library (http://www.boost.org).
Configuration settings for Boost are in the file boost.mk
in this directory and should be changed as required.

View File

@ -0,0 +1,3 @@
# Boost configuration for organize - change paths and library names as needed
BOOST_INC_DIR = /usr/local/include/boost-1_37
BOOST_LIBS = /usr/local/lib/libboost_system-gcc43-mt-1_37.a /usr/local/lib/libboost_filesystem-gcc43-mt-1_37.a /usr/local/lib/libboost_regex-gcc43-mt-1_37.a /usr/local/lib/libboost_program_options-gcc43-mt-1_37.a

View File

@ -0,0 +1,635 @@
// ***************************************************************** -*- C++ -*-
/*
* Copyright (C) 2009 Brad Schick <schickb@gmail.com>
*
* This file is part of the organize tool.
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 2
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, 5th Floor, Boston, MA 02110-1301 USA.
*/
// *****************************************************************************
#include <boost/algorithm/string.hpp>
#include <boost/regex.hpp>
#include <boost/format.hpp>
#include <boost/lexical_cast.hpp>
#include <exiv2/image.hpp>
#include <exiv2/easyaccess.hpp>
#include <exiv2/exif.hpp>
#include <exiv2/iptc.hpp>
#include <exiv2/tags.hpp>
//#include <exiv2/xmp.hpp>
#include <cassert>
#include <sstream>
#include <ctime>
#include "helpers.hpp"
#define BOOST_FILESYSTEM_NO_DEPRECATED
namespace fs = boost::filesystem;
typedef Exiv2::ExifData::const_iterator (*EasyAccessFct)(const Exiv2::ExifData& ed);
std::string scrub(const std::string &dirty, bool strip_space = false)
{
std::string scrub = boost::trim_copy(dirty);
if(strip_space) {
boost::regex space("\\s");
scrub = boost::regex_replace(scrub, space, "");
}
boost::regex dash("[:/\\\\|<>]");
boost::regex under("[\"'\\[\\]\\{\\}#=%\\$\\?,\\+\\*]");
scrub = boost::regex_replace(scrub, dash, "-");
return boost::regex_replace(scrub, under, "_");
}
bool exif_data(const Exiv2::Image *image, const char *key, Exiv2::ExifData::const_iterator &md)
{
assert(image && key);
bool ok = false;
try {
const Exiv2::ExifData &exifData = image->exifData();
Exiv2::ExifKey exifKey(key);
md = exifData.findKey(exifKey);
if(md != exifData.end() && md->typeId() != Exiv2::undefined)
ok = true;
}
catch(const Exiv2::AnyError&) {
}
return ok;
}
bool exif_data_easy(const Exiv2::Image *image, EasyAccessFct easy, Exiv2::ExifData::const_iterator &md)
{
assert(image && easy);
bool ok = false;
try {
const Exiv2::ExifData &exifData = image->exifData();
md = easy(exifData);
if(md != exifData.end() && md->typeId() != Exiv2::undefined)
ok = true;
}
catch(const Exiv2::AnyError&) {
}
return ok;
}
bool iptc_data(const Exiv2::Image *image, const char *key, Exiv2::IptcData::const_iterator &md)
{
bool ok = false;
assert(image && key);
try {
const Exiv2::IptcData &iptcData = image->iptcData();
Exiv2::IptcKey iptcKey(key);
md = iptcData.findKey(iptcKey);
if(md != iptcData.end() && md->typeId() != Exiv2::undefined)
ok = true;
}
catch(const Exiv2::AnyError&) {
}
return ok;
}
std::string exif_date(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Photo.DateTimeDigitized", md);
if(!done)
done = exif_data(image, "Exif.Photo.DateTimeOriginal", md);
if(!done)
return "";
std::string date = scrub(md->print().substr(0,10));
// Some files have zeros for dates, just fail in that case
if(boost::lexical_cast<int>(date.substr(0,4))==0)
return "";
return date;
}
std::string exif_year(const Exiv2::Image *image, const fs::path &path)
{
std::string date = exif_date(image, path);
if(date.length())
return date.substr(0,4);
else
return date;
}
std::string exif_month(const Exiv2::Image *image, const fs::path &path)
{
std::string date = exif_date(image, path);
if(date.length())
return date.substr(5,2);
else
return date;
}
std::string exif_day(const Exiv2::Image *image, const fs::path &path)
{
std::string date = exif_date(image, path);
if(date.length())
return date.substr(8,2);
else
return date;
}
bool iptc_get_date(const Exiv2::Image *image, Exiv2::DateValue::Date &date)
{
Exiv2::IptcData::const_iterator md;
bool done = iptc_data(image, "Iptc.Application2.DigitizationDate", md);
if(!done)
done = iptc_data(image, "Iptc.Application2.DateCreated", md);
if(!done)
return false;
date = ((Exiv2::DateValue*)md->getValue().get())->getDate();
return date.year > 0;
}
std::string iptc_date(const Exiv2::Image *image, const fs::path &)
{
Exiv2::DateValue::Date date;
if(iptc_get_date(image, date))
return str(boost::format("%4d-%02d-%02d") % date.year % date.month % date.day);
else
return "";
}
std::string iptc_year(const Exiv2::Image *image, const fs::path &)
{
Exiv2::DateValue::Date date;
if(iptc_get_date(image, date))
return str(boost::format("%4d") % date.year);
else
return "";
}
std::string iptc_month(const Exiv2::Image *image, const fs::path &)
{
Exiv2::DateValue::Date date;
if(iptc_get_date(image, date))
return str(boost::format("%02d") % date.month);
else
return "";
}
std::string iptc_day(const Exiv2::Image *image, const fs::path &)
{
Exiv2::DateValue::Date date;
if(iptc_get_date(image, date))
return str(boost::format("%02d") % date.day);
else
return "";
}
bool file_get_tm(const fs::path &path, std::tm &tm)
{
std::time_t timer = fs::last_write_time(path);
if(timer > 0) {
tm = *localtime(&timer);
return true;
}
else {
return false;
}
}
std::string file_date(const Exiv2::Image *, const fs::path &path)
{
std::tm tm;
if(file_get_tm(path, tm))
return str(boost::format("%4d-%02d-%02d") % (tm.tm_year + 1900) % (tm.tm_mon + 1) % tm.tm_mday);
else
return "";
}
std::string file_year(const Exiv2::Image *, const fs::path &path)
{
std::tm tm;
if(file_get_tm(path, tm))
return str(boost::format("%4d") % (tm.tm_year + 1900));
else
return "";
}
std::string file_month(const Exiv2::Image *, const fs::path &path)
{
std::tm tm;
if(file_get_tm(path, tm))
return str(boost::format("%02d") % (tm.tm_mon + 1));
else
return "";
}
std::string file_day(const Exiv2::Image *, const fs::path &path)
{
std::tm tm;
if(file_get_tm(path, tm))
return str(boost::format("%02d") % tm.tm_mday);
else
return "";
}
/*
std::string xmp_date(const Exiv2::Image *image, const fs::path &)
{
return "";
}
std::string xmp_year(const Exiv2::Image *image, const fs::path &)
{
return "";
}
std::string xmp_month(const Exiv2::Image *image, const fs::path &)
{
return "";
}
std::string xmp_day(const Exiv2::Image *image, const fs::path &)
{
return "";
}*/
std::string exif_time(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Photo.DateTimeDigitized", md);
if(!done)
done = exif_data(image, "Exif.Photo.DateTimeOriginal", md);
if(!done)
return "";
std::string datetime = md->print();
// Some files have zeros for dates, just fail in that case
if(boost::lexical_cast<int>(datetime.substr(0,4)) == 0)
return "";
return scrub(datetime.substr(11));
}
std::string exif_hour(const Exiv2::Image *image, const fs::path &path)
{
std::string time = exif_time(image, path);
if(time.length())
return time.substr(0,2);
else
return time;
}
std::string exif_minute(const Exiv2::Image *image, const fs::path &path)
{
std::string time = exif_time(image, path);
if(time.length())
return time.substr(3,2);
else
return time;
}
std::string exif_second(const Exiv2::Image *image, const fs::path &path)
{
std::string time = exif_time(image, path);
if(time.length())
return time.substr(6,2);
else
return time;
}
bool iptc_get_time(const Exiv2::Image *image, Exiv2::TimeValue::Time &time)
{
Exiv2::IptcData::const_iterator md;
bool done = iptc_data(image, "Iptc.Application2.DigitizationTime", md);
if(!done)
done = iptc_data(image, "Iptc.Application2.TimeCreated", md);
if(!done)
return false;
time = ((Exiv2::TimeValue*)md->getValue().get())->getTime();
// Zero is a valid time, so this one is hard to check.
return true;
}
std::string iptc_time(const Exiv2::Image *image, const fs::path &)
{
Exiv2::TimeValue::Time time;
if(iptc_get_time(image, time))
return str(boost::format("%02d-%02d-%02d") % time.hour % time.minute % time.second);
else
return "";
}
std::string iptc_hour(const Exiv2::Image *image, const fs::path &)
{
Exiv2::TimeValue::Time time;
if(iptc_get_time(image, time))
return str(boost::format("%02d") % time.hour);
else
return "";
}
std::string iptc_minute(const Exiv2::Image *image, const fs::path &)
{
Exiv2::TimeValue::Time time;
if(iptc_get_time(image, time))
return str(boost::format("%02d") % time.minute);
else
return "";
}
std::string iptc_second(const Exiv2::Image *image, const fs::path &)
{
Exiv2::TimeValue::Time time;
if(iptc_get_time(image, time))
return str(boost::format("%02d") % time.second);
else
return "";
}
std::string file_time(const Exiv2::Image *, const fs::path &path)
{
std::tm tm;
if(file_get_tm(path, tm))
return str(boost::format("%02d-%02d-%02d") % tm.tm_hour % tm.tm_min % tm.tm_sec);
else
return "";
}
std::string file_hour(const Exiv2::Image *, const fs::path &path)
{
std::tm tm;
if(file_get_tm(path, tm))
return str(boost::format("%02d") % tm.tm_hour);
else
return "";
}
std::string file_minute(const Exiv2::Image *, const fs::path &path)
{
std::tm tm;
if(file_get_tm(path, tm))
return str(boost::format("%02d") % tm.tm_min);
else
return "";
}
std::string file_second(const Exiv2::Image *, const fs::path &path)
{
std::tm tm;
if(file_get_tm(path, tm))
return str(boost::format("%02d") % tm.tm_sec);
else
return "";
}
/*std::string xmp_time(const Exiv2::Image *image, const fs::path &)
{
return "";
}
std::string xmp_hour(const Exiv2::Image *image, const fs::path &)
{
return "";
}
std::string xmp_minute(const Exiv2::Image *image, const fs::path &)
{
return "";
}
std::string xmp_second(const Exiv2::Image *image, const fs::path &)
{
return "";
}*/
std::string exif_dimension(const Exiv2::Image *image, const fs::path &path)
{
return exif_width(image, path) + "-" + exif_height(image, path);
}
std::string exif_width(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Photo.PixelXDimension", md);
if(!done)
return "";
return scrub(md->print());
}
std::string exif_height(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Photo.PixelYDimension", md);
if(!done)
return "";
return scrub(md->print());
}
std::string file_dimension(const Exiv2::Image *image, const fs::path &path)
{
if(image)
return file_width(image, path) + "-" + file_height(image, path);
else
return "";
}
std::string file_width(const Exiv2::Image *image, const fs::path &)
{
if(image)
return str(boost::format("%02d") % image->pixelWidth());
else
return "";
}
std::string file_height(const Exiv2::Image *image, const fs::path &)
{
if(image)
return str(boost::format("%02d") % image->pixelHeight());
else
return "";
}
/*
std::string xmp_dimension(const Exiv2::Image *image, const fs::path &)
{
return ""
}
std::string xmp_width(const Exiv2::Image *image, const fs::path &)
{
return "";
}
std::string xmp_height(const Exiv2::Image *image, const fs::path &)
{
return "";
}*/
std::string exif_model(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Image.Model", md);
if(!done)
return "";
return scrub(md->print());
}
std::string exif_make(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Image.Make", md);
if(!done)
return "";
return scrub(md->print());
}
/*std::string xmp_model(const Exiv2::Image *image, const fs::path &)
{
return "";
}*/
std::string exif_speed(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Photo.ShutterSpeedValue", md);
if(!done)
done = exif_data(image, "Exif.Photo.ExposureTime", md);
if(!done)
return "";
return scrub(md->print());
}
/*std::string xmp_speed(const Exiv2::Image *image, const fs::path &)
{
return "";
}*/
std::string exif_aperture(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Photo.ApertureValue", md);
if(!done)
done = exif_data(image, "Exif.Photo.FNumber", md);
if(!done)
return "";
return scrub(md->print());
}
/*std::string xmp_aperture(const Exiv2::Image *image, const fs::path &)
{
return "";
}*/
std::string exif_focal(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Photo.FocalLength", md);
if(!done)
return "";
return scrub(md->print());
}
/*std::string xmp_focal(const Exiv2::Image *image, const fs::path &)
{
return "";
}*/
std::string exif_distance(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Photo.SubjectDistance", md);
if(!done)
return "";
return scrub(md->print());
}
/*std::string xmp_distance(const Exiv2::Image *image, const fs::path &)
{
return "";
}*/
std::string exif_meter(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Photo.MeteringMode", md);
if(!done)
return "";
return scrub(md->print());
}
std::string exif_macro(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data_easy(image, Exiv2::macroMode, md);
if(!done)
return "";
return scrub(md->print());
}
std::string exif_orientation(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data_easy(image, Exiv2::orientation, md);
if(!done)
return "";
return scrub(md->print(), true);
}
std::string exif_lens(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data_easy(image, Exiv2::lensName, md);
if(!done)
return "";
return scrub(md->print());
}
std::string exif_iso(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data_easy(image, Exiv2::isoSpeed, md);
if(!done)
return "";
return scrub(md->print());
}
/*std::string xmp_meter(const Exiv2::Image *image, const fs::path &)
{
return "";
}*/
std::string exif_keyword(const Exiv2::Image *image, const fs::path &)
{
Exiv2::ExifData::const_iterator md;
bool done = exif_data(image, "Exif.Photo.UserComment", md);
if(!done)
return "";
return scrub(md->print());
}
std::string iptc_keyword(const Exiv2::Image *image, const fs::path &)
{
Exiv2::IptcData::const_iterator md;
bool done = iptc_data(image, "Iptc.Application2.Keywords", md);
if(!done)
return "";
return scrub(md->print());
}
/*std::string xmp_keyword(const Exiv2::Image *image, const fs::path &)
{
return "";
}*/

View File

@ -0,0 +1,101 @@
// ***************************************************************** -*- C++ -*-
/*
* Copyright (C) 2004-2021 Exiv2 authors
* This program is part of the Exiv2 distribution.
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 2
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, 5th Floor, Boston, MA 02110-1301 USA.
*/
// *****************************************************************************
#ifndef HELPERS_HPP_
#define HELPERS_HPP_
#include <boost/filesystem.hpp>
#define BOOST_FILESYSTEM_NO_DEPRECATED
namespace fs = boost::filesystem;
typedef std::string (*pfunc)(const Exiv2::Image *image, const fs::path &path);
// This would be a lot smaller if Exiv2 had support
// for unified metadata
std::string exif_date(const Exiv2::Image *image, const fs::path &path);
std::string exif_year(const Exiv2::Image *image, const fs::path &path);
std::string exif_month(const Exiv2::Image *image, const fs::path &path);
std::string exif_day(const Exiv2::Image *image, const fs::path &path);
std::string iptc_date(const Exiv2::Image *image, const fs::path &path);
std::string iptc_year(const Exiv2::Image *image, const fs::path &path);
std::string iptc_month(const Exiv2::Image *image, const fs::path &path);
std::string iptc_day(const Exiv2::Image *image, const fs::path &path);
std::string file_date(const Exiv2::Image *image, const fs::path &path);
std::string file_year(const Exiv2::Image *image, const fs::path &path);
std::string file_month(const Exiv2::Image *image, const fs::path &path);
std::string file_day(const Exiv2::Image *image, const fs::path &path);
/*std::string xmp_date(const Exiv2::Image *image, const fs::path &path);
std::string xmp_year(const Exiv2::Image *image, const fs::path &path);
std::string xmp_month(const Exiv2::Image *image, const fs::path &path);
std::string xmp_day(const Exiv2::Image *image, const fs::path &path);*/
std::string exif_time(const Exiv2::Image *image, const fs::path &path);
std::string exif_hour(const Exiv2::Image *image, const fs::path &path);
std::string exif_minute(const Exiv2::Image *image, const fs::path &path);
std::string exif_second(const Exiv2::Image *image, const fs::path &path);
std::string iptc_time(const Exiv2::Image *image, const fs::path &path);
std::string iptc_hour(const Exiv2::Image *image, const fs::path &path);
std::string iptc_minute(const Exiv2::Image *image, const fs::path &path);
std::string iptc_second(const Exiv2::Image *image, const fs::path &path);
std::string file_time(const Exiv2::Image *image, const fs::path &path);
std::string file_hour(const Exiv2::Image *image, const fs::path &path);
std::string file_minute(const Exiv2::Image *image, const fs::path &path);
std::string file_second(const Exiv2::Image *image, const fs::path &path);
/*std::string xmp_time(const Exiv2::Image *image, const fs::path &path);
std::string xmp_hour(const Exiv2::Image *image, const fs::path &path);
std::string xmp_minute(const Exiv2::Image *image, const fs::path &path);
std::string xmp_second(const Exiv2::Image *image, const fs::path &path);*/
std::string exif_dimension(const Exiv2::Image *image, const fs::path &path);
std::string exif_width(const Exiv2::Image *image, const fs::path &path);
std::string exif_height(const Exiv2::Image *image, const fs::path &path);
std::string file_dimension(const Exiv2::Image *image, const fs::path &path);
std::string file_width(const Exiv2::Image *image, const fs::path &path);
std::string file_height(const Exiv2::Image *image, const fs::path &path);
/*std::string xmp_dimension(const Exiv2::Image *image, const fs::path &path);
std::string xmp_width(const Exiv2::Image *image, const fs::path &path);
std::string xmp_height(const Exiv2::Image *image, const fs::path &path);*/
std::string exif_model(const Exiv2::Image *image, const fs::path &path);
std::string exif_make(const Exiv2::Image *image, const fs::path &path);
/*std::string xmp_model(const Exiv2::Image *image, const fs::path &path);
std::string xmp_make(const Exiv2::Image *image, const fs::path &path);*/
std::string exif_speed(const Exiv2::Image *image, const fs::path &path);
//std::string xmp_speed(const Exiv2::Image *image, const fs::path &path);
std::string exif_aperture(const Exiv2::Image *image, const fs::path &path);
//std::string xmp_aperture(const Exiv2::Image *image, const fs::path &path);
std::string exif_focal(const Exiv2::Image *image, const fs::path &path);
//std::string xmp_focal(const Exiv2::Image *image, const fs::path &path);
std::string exif_distance(const Exiv2::Image *image, const fs::path &path);
//std::string xmp_distance(const Exiv2::Image *image, const fs::path &path);
std::string exif_meter(const Exiv2::Image *image, const fs::path &path);
//std::string xmp_meter(const Exiv2::Image *image, const fs::path &path);
std::string exif_macro(const Exiv2::Image *image, const fs::path &path);
std::string exif_orientation(const Exiv2::Image *image, const fs::path &path);
std::string exif_lens(const Exiv2::Image *image, const fs::path &path);
std::string exif_keyword(const Exiv2::Image *image, const fs::path &path);
std::string iptc_keyword(const Exiv2::Image *image, const fs::path &path);
//std::string xmp_keyword(const Exiv2::Image *image, const fs::path &path);
std::string exif_iso(const Exiv2::Image *image, const fs::path &path);
#endif //HELPERS_HPP_


@ -0,0 +1,759 @@
// ***************************************************************** -*- C++ -*-
/*
* Copyright (C) 2004-2021 Exiv2 authors
* This program is part of the Exiv2 distribution.
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 2
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, 5th Floor, Boston, MA 02110-1301 USA.
*/
// *****************************************************************************
#include <boost/program_options.hpp>
#include <boost/regex.hpp>
#include <boost/array.hpp>
#include <boost/algorithm/string.hpp>
#include <boost/lexical_cast.hpp>
#include <exiv2/image.hpp>
#include <exiv2/error.hpp>
#include <exiv2/basicio.hpp>
#include <iostream>
#include <iomanip>
#include <cassert>
#include <limits>
#include "MD5.h"
#include "helpers.hpp"
typedef Exiv2::byte md5digest[16];
namespace po = boost::program_options;
bool g_verbose = false;
bool g_neednewline = false;
// Array size should match number of SLOTs
boost::array<int,4> g_run_order = {{-1, -1, -1, -1}};
const int EXIF_SLOT = 0;
const int IPTC_SLOT = 1;
const int XMP_SLOT = 2;
const int FILE_SLOT = 3;
const unsigned DOT_EVERY = 55;
struct Pattern {
std::string pat;
std::string desc;
pfunc funcs[4]; // order should always be exif, iptc, xmp, file
};
struct PathPart {
std::string pre;
const Pattern *pat;
std::string post;
PathPart(std::string pre_, const Pattern *pat_, std::string post_)
: pre(pre_), pat(pat_), post(post_) {}
};
std::vector<PathPart> g_path_parts;
// Instead of making these all global
struct ProcessParams {
const fs::path &dest_dir;
const bool dry_run;
const bool ignore_dups;
const bool ignore_unsorted;
const bool force;
const bool rename;
const bool symlink;
const bool verify;
const bool move;
const long limit_depth;
const fs::path &dups_dir;
const fs::path &unsorted_dir;
const std::vector<std::string> &excludes;
unsigned dups_count;
unsigned unsorted_count;
unsigned dir_err_count;
unsigned file_err_count;
unsigned ok_count;
unsigned dups_ignored_count;
unsigned unsorted_ignored_count;
unsigned dir_ex_count;
unsigned file_ex_count;
};
void process_directory(const fs::path &directory, const long depth,
ProcessParams &params);
const Pattern g_patterns[] = {
{"@date", "date captured (2009-01-19)",
{exif_date, iptc_date, NULL, file_date} },
{"@year", "year captured (2009)",
{exif_year, iptc_year, NULL, file_year} },
{"@month", "month captured (01)",
{exif_month, iptc_month, NULL, file_month} },
{"@day", "day captured (19)",
{exif_day, iptc_day, NULL, file_day} },
{"@time", "time captured (14-35-27)",
{exif_time, iptc_time, NULL, file_time} },
{"@hour", "hour captured (14)",
{exif_hour, iptc_hour, NULL, file_hour} },
{"@min", "minute captured (35)",
{exif_minute, iptc_minute, NULL, file_minute} },
{"@sec", "second captured (27)",
{exif_second, iptc_second, NULL, file_second} },
{"@dim", "pixel dimension (2272-1704)",
{exif_dimension, NULL, NULL, file_dimension} },
{"@x", "pixel width (2272)",
{exif_width, NULL, NULL, file_width} },
{"@y", "pixel height (1704)",
{exif_height, NULL, NULL, file_height} },
{"@make", "device make (Canon)",
{exif_make, NULL, NULL, NULL} },
{"@model", "device model (Canon PowerShot S40)",
{exif_model, NULL, NULL, NULL} },
{"@speed", "shutter speed (1-60)",
{exif_speed, NULL, NULL, NULL} },
{"@aper", "aperture (F3.2)",
{exif_aperture, NULL, NULL, NULL} },
{"@iso", "iso speed (400)",
{exif_iso, NULL, NULL, NULL} },
{"@focal", "focal length (8.6 mm)",
{exif_focal, NULL, NULL, NULL} },
{"@dist", "subject distance (1.03 m)",
{exif_distance, NULL, NULL, NULL} },
{"@meter", "meter mode (multi-segment)",
{exif_meter, NULL, NULL, NULL} },
{"@macro", "macro mode (Off)",
{exif_macro, NULL, NULL, NULL} },
{"@orient", "orientation (top_left)",
{exif_orientation, NULL, NULL, NULL} },
{"@lens", "lens name (Tamron 90mm f-2.8)",
{exif_lens, NULL, NULL, NULL} },
{"@key", "first keyword (Family)",
{exif_keyword, iptc_keyword, NULL, NULL} },
{"", "", {NULL, NULL, NULL, NULL} }
};
// Check that 'opt1' and 'opt2' are not specified at the same time.
void conflicting(const po::variables_map& vm,
const char* opt1, const char* opt2)
{
if (vm.count(opt1) && !vm[opt1].defaulted()
&& vm.count(opt2) && !vm[opt2].defaulted()) {
throw std::logic_error(std::string("conflicting options '")
+ opt1 + "' and '" + opt2 + "'");
}
}
// Check that 'required' is present
void required(const po::variables_map& vm, const char* required)
{
if (!vm.count(required) || vm[required].defaulted()) {
throw std::logic_error(std::string("required parameter '") + required
+ "' is missing");
}
}
void info(const std::string &msg)
{
if(g_verbose) {
std::cout << msg << "\n";
g_neednewline = false;
}
}
void error(const std::exception &e, const std::string &msg)
{
if(g_neednewline) {
std::cout << "\n";
g_neednewline = false;
}
std::cerr << e.what() << "\n";
std::cerr << msg << std::endl;
}
void usage_header(const char* exname)
{
std::cout << "Usage: " << exname << " [options] source-dir dest-dir pattern\n";
}
void usage_full(const po::options_description &options, const char* exname)
{
usage_header(exname);
std::cout << "\n Creates groups of files in new directories defined by a metadata 'pattern'.\n" <<
" Files are copied, moved, or linked from 'source-dir' to 'dest-dir'.\n" <<
" The destination directory should not be within the source directory.\n\n";
std::cout << options;
std::cout << "\nPattern values:\n";
for( const Pattern *pattern = g_patterns; pattern->pat.length(); ++pattern) {
std::cout << " " << std::setw(8) << std::left << pattern->pat;
std::cout << pattern->desc << "\n";
}
std::cout << "\nExamples:\n";
std::cout << " `" << exname << " -m mess clean @year-@month'\n";
std::cout << " Moves files from 'mess' into directories of 'clean' according to\n" <<
" year-month the file was captured (clean/2006-11/...)\n\n";
std::cout << " `" << exname << " -o ie source find width-@x/height-@y'\n";
std::cout << " Copies files into directories according first to pixel width then pixel\n" <<
" height. Check iptc then exif metadata (find/width-2272/height-1704/...)\n\n";
std::cout << " `" << exname << " -lf source find @aper/@hour'\n";
std::cout << " Force create symlinks in directories according first to aperture then\n" <<
" hour captured (find/F3.2/15/...)\n";
std::cout << std::endl;
}
void version()
{
std::cout << "organized 0.1\n" <<
"Copyright (C) 2009 Brad Schick. <schickb@gmail.com>\n\n" <<
"This program is free software; you can redistribute it and/or\n"
"modify it under the terms of the GNU General Public License\n"
"as published by the Free Software Foundation; either version 2\n"
"of the License, or (at your option) any later version.\n"
"\n"
"This program is distributed in the hope that it will be useful,\n"
"but WITHOUT ANY WARRANTY; without even the implied warranty of\n"
"MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n"
"GNU General Public License for more details.\n"
"\n"
"You should have received a copy of the GNU General Public\n"
"License along with this program; if not, write to the Free\n"
"Software Foundation, Inc., 51 Franklin Street, Fifth Floor,\n"
"Boston, MA 02110-1301 USA" << std::endl;
}
// Returns empty string if the destination subdirectory could not be determined
// for the supplied source file.
std::string build_dest(const fs::path &source_file)
{
std::string dest;
Exiv2::Image::AutoPtr image;
try {
image = Exiv2::ImageFactory::open(source_file.string());
image->readMetadata();
}
catch(const Exiv2::AnyError&) {
// No metadata, let things continue to try file info
}
std::vector<PathPart>::iterator iter = g_path_parts.begin();
std::vector<PathPart>::iterator end = g_path_parts.end();
for( ; iter != end; ++iter) {
dest += iter->pre;
std::string result;
const Pattern *pat = iter->pat;
for(unsigned fx = 0; fx < g_run_order.size(); ++fx) {
if(g_run_order[fx] != -1 && pat->funcs[g_run_order[fx]]) {
if(g_run_order[fx] == FILE_SLOT) {
// Always run file operations
result = pat->funcs[g_run_order[fx]](image.get(), source_file);
}
else if(image.get()) {
// No point in running metadata operations without an image
result = pat->funcs[g_run_order[fx]](image.get(), source_file);
}
if(result.length())
break;
}
}
// If we found no data, even for part of pattern, give up and
// return no destination
if(!result.length())
return result;
dest += (result + iter->post);
}
return dest;
}
bool md5sum(const fs::path &path, md5digest &digest)
{
try {
Exiv2::FileIo io(path.string());
if (io.open() != 0)
return false;
Exiv2::IoCloser closer(io);
Exiv2::byte buff[4096];
MD5_CTX context;
MD5Init(&context);
long read_count = io.read(buff, 4096);
while(read_count) {
MD5Update(&context, buff, read_count);
read_count = io.read(buff, 4096);
}
MD5Final(digest, &context);
return true;
}
catch (std::exception& ) {
return false;
}
}
int main(int argc, char* argv[])
{
po::options_description options("Options");
// Don't use default values because the help prints them in an ugly, overly wide layout
options.add_options()
("move,m", "move files rather than copy")
("symlink,s", "symlink files rather than copy (posix only)")
("order,o", po::value<std::string>(),
"order and types of metadata to read\ne=exif, i=iptc, f=file (default: eif)")
("unsorted,u", po::value<std::string>(),
"special directory to store unsorted files (default: unsorted)")
("dups,d", po::value<std::string>(),
"special directory to store files with duplicate names (default: duplicates)")
("force,f", "overwrite duplicate files instead of using special directory")
("rename,r", "rename duplicate files instead of using special directory")
("ignore,i", "ignore both unsorted and duplicate files instead of using special directories")
("ignore-unsorted", "ignore unsorted files instead of using special directory")
("ignore-dups", "ignore duplicate files instead of using special directory")
("verify", "verify copied or moved files and exit if incorrect")
("exclude,x", po::value< std::vector<std::string> >(),
"exclude directories and files that contain arg (case sensitive on all platforms)")
("limit-depth,l", po::value<long>(),
"limit recursion to specified depth (0 disables recursion)")
("verbose,v", "prints operations as they happen")
("dry-run,n", "do not make actual changes (implies verbose)")
("help,h", "show this help message then exit")
("version,V", "show program version then exit")
;
po::options_description hidden("Hidden Options");
hidden.add_options()
("source-dir", po::value< std::string >(), "directory of files to organize, may end in file wildcard")
("dest-dir", po::value< std::string >(), "designation directory for files, may not be within source-dir")
("pattern", po::value< std::string >(), "subdirectory pattern for grouping files within dest-dir")
;
po::options_description cmdline;
cmdline.add(options).add(hidden);
po::positional_options_description positional;
positional.add("source-dir", 1);
positional.add("dest-dir", 1);
positional.add("pattern", 1);
try {
po::variables_map vm;
po::store(po::command_line_parser(argc, argv).
options(cmdline).positional(positional).run(), vm);
po::notify(vm);
if (vm.count("help")) {
usage_full(options, argv[0]);
return 0;
}
if (vm.count("version")) {
version();
return 0;
}
conflicting(vm, "verify", "symlink");
conflicting(vm, "move", "symlink");
conflicting(vm, "unsorted", "ignore");
conflicting(vm, "unsorted", "ignore-unsorted");
conflicting(vm, "dups", "ignore");
conflicting(vm, "dups", "ignore-dups");
conflicting(vm, "force", "ignore");
conflicting(vm, "force", "ignore-dups");
conflicting(vm, "force", "rename");
conflicting(vm, "rename", "ignore");
conflicting(vm, "rename", "ignore-dups");
required(vm, "source-dir");
required(vm, "dest-dir");
required(vm, "pattern");
const bool dry_run = vm.count("dry-run") != 0;
g_verbose = (vm.count("verbose") != 0 || dry_run);
std::string order = "eif";
if(vm.count("order")) {
order = vm["order"].as<std::string>();
boost::to_lower(order);
if(order.length() > 3) {
throw std::logic_error(std::string("order is longer than 4 characters"));
}
}
unsigned i = 0;
std::string::iterator end = order.end();
for(std::string::iterator iter = order.begin(); iter != end && i < 4; ++iter, ++i) {
switch(*iter) {
case 'e':
g_run_order[i] = EXIF_SLOT;
break;
case 'i':
g_run_order[i] = IPTC_SLOT;
break;
case 'x':
throw std::logic_error(std::string("xmp not implemented yet '") +
*iter + "'");
break;
case 'f':
g_run_order[i] = FILE_SLOT;
break;
default:
throw std::logic_error(std::string("unknown order character '") +
*iter + "'");
}
}
const fs::path source_dir( vm["source-dir"].as<std::string>() );
if( !exists(source_dir) || !is_directory(source_dir) ) {
throw std::logic_error(std::string("source '") +
source_dir.string() + "' must exist and be a directory");
}
const fs::path dest_dir( vm["dest-dir"].as<std::string>() );
if( exists(dest_dir) && !is_directory(dest_dir) ) {
throw std::logic_error(std::string("destination '") +
dest_dir.string() + "' must be a directory");
}
// Boost doesn't seem to have a way to get a canonical path, so this
// simple test is easy to confuse with some ../../'s in the paths. Oh
// well, this is good enough for now.
fs::path test_dest(dest_dir);
for(; !test_dest.empty(); test_dest = test_dest.parent_path()) {
if(fs::equivalent(source_dir, test_dest)) {
throw std::logic_error(std::string("dest-dir must not be within source-dir"));
}
}
// Dissect the pattern
std::string pattern = vm["pattern"].as<std::string>();
boost::regex regex( "([^@]*)(@[[:alpha:]]+)([^@]*)");
boost::sregex_iterator m_iter = make_regex_iterator(pattern, regex);
boost::sregex_iterator m_end;
for( ; m_iter != m_end; ++m_iter) {
const boost::smatch &match = *m_iter;
const std::string &pre = match[1];
const std::string &pat = match[2];
const std::string &post = match[3];
// Should put this in a map, but there aren't that many options now
bool found = false;
for( const Pattern *pattern = g_patterns; pattern->pat.length(); ++pattern) {
if(pattern->pat == pat) {
PathPart part(pre, pattern, post);
g_path_parts.push_back(part);
found = true;
break;
}
}
if(!found) {
throw std::logic_error(std::string("unknown pattern '") + pat + "'");
}
}
// Assign defaults to params that need them
const bool ignore = vm.count("ignore") != 0;
std::vector<std::string> excludes;
if(vm.count("exclude"))
excludes = vm["exclude"].as< std::vector<std::string> >();
long limit_depth = LONG_MAX;
if(vm.count("limit-depth")) {
limit_depth = vm["limit-depth"].as<long>();
// Boost program_options doesn't work with unsigned, so do it manually
if( limit_depth < 0 )
throw std::logic_error(std::string("recursion depth limit must be positive"));
}
std::string dups = "duplicates";
if(vm.count("dups"))
dups = vm["dups"].as<std::string>();
const fs::path dups_dir = dest_dir / dups;
std::string unsorted = "unsorted";
if(vm.count("unsorted"))
unsorted = vm["unsorted"].as<std::string>();
const fs::path unsorted_dir = dest_dir / unsorted;
ProcessParams params = {
dest_dir,
dry_run,
(vm.count("ignore-dups") != 0 || ignore),
(vm.count("ignore-unsorted") != 0 || ignore),
vm.count("force") != 0,
vm.count("rename") != 0,
vm.count("symlink") != 0,
vm.count("verify") != 0,
vm.count("move") != 0,
limit_depth,
dups_dir,
unsorted_dir,
excludes,
0, 0, 0, 0, 0, 0, 0, 0, 0
};
process_directory(source_dir, 0, params);
std::string op = "copied";
if(params.symlink)
op = "linked";
else if(params.move)
op = "moved";
if(dry_run)
op = std::string("would be ") + op;
if(g_neednewline)
std::cout << "\n";
std::cout << "\n" << params.ok_count << " files " << op << "\n";
std::cout << " " << params.dups_count << " duplicates\n";
std::cout << " " << params.unsorted_count << " unsorted\n";
if(params.dups_ignored_count)
std::cout << params.dups_ignored_count << " duplicates ignored\n";
if(params.unsorted_ignored_count)
std::cout << params.unsorted_ignored_count << " unsorted ignored\n";
if(params.dir_ex_count)
std::cout << params.dir_ex_count << " directories excluded\n";
if(params.file_ex_count)
std::cout << params.file_ex_count << " files excluded\n";
if(params.dir_err_count)
std::cout << params.dir_err_count << " directory errors\n";
if(params.file_err_count)
std::cout << params.file_err_count << " file errors\n";
return 0;
}
catch (Exiv2::AnyError& e) {
error(e, std::string("Aborting"));
return -1;
}
catch(std::logic_error& e) {
error(e, "");
usage_header(argv[0]);
std::cout << argv[0] << " -h for more help" << std::endl;
return -2;
}
catch(std::exception& e) {
error(e, "Aborting");
return -3;
}
}
boost::regex uregex("(.*?)\\(([[:digit:]]{1,2})\\)$");
fs::path uniquify(const fs::path &dest)
{
std::string ext = dest.extension().string();
std::string fname = dest.stem().string();
fs::path parent = dest.parent_path();
unsigned number = 1;
std::string newfname;
fs::path newdest;
boost::smatch match;
if(boost::regex_search(fname, match, uregex)) {
// Matches are indexes into fname, so don't change it while reading values
newfname = match[1];
number = boost::lexical_cast<short>(match[2]);
fname = newfname;
}
do {
newfname = fname + "(" + boost::lexical_cast<std::string>(++number) + ")" + ext;
newdest = parent / newfname;
} while(fs::exists(newdest));
return newdest;
}
void process_directory(const fs::path &directory, const long depth,
ProcessParams &params)
{
// Exclude entire directories
bool exclude = false;
std::vector<std::string>::const_iterator x_iter = params.excludes.begin();
std::vector<std::string>::const_iterator x_end = params.excludes.end();
for( ; x_iter != x_end; ++x_iter ) {
if(boost::contains(directory.string(), *x_iter)) {
exclude = true;
break;
}
}
if(exclude) {
info(std::string("excluding directory: ") + directory.string() +
" matched: " + *x_iter);
++params.dir_ex_count;
return;
}
try {
fs::directory_iterator p_iter(directory), p_end;
for( ; p_iter != p_end; ++p_iter) {
if( is_directory(*p_iter) ) {
// recurse if we haven't hit the limit
if(depth < params.limit_depth)
process_directory(p_iter->path(), depth + 1, params);
else {
info(std::string("depth reached, skipping: ") +
p_iter->path().string());
}
}
else if( is_regular_file(*p_iter) ) {
// Check again for excluding file names
exclude = false;
x_iter = params.excludes.begin();
for( ; x_iter != x_end; ++x_iter ) {
if(boost::contains(p_iter->path().string(), *x_iter)) {
exclude = true;
break;
}
}
if(exclude) {
info(std::string("excluding file: ") + p_iter->path().string() +
" matched: " + *x_iter);
++params.file_ex_count;
continue;
}
try {
const fs::path dest_subdir = build_dest(*p_iter);
fs::path dest_file;
if(!dest_subdir.empty())
dest_file = params.dest_dir / dest_subdir;
else if(params.ignore_unsorted) {
info(std::string("ignoring unsorted: ") + p_iter->path().string());
++params.unsorted_ignored_count;
continue;
}
else {
info(std::string("unsorted file (missing metadata): ") + p_iter->path().string());
dest_file = params.unsorted_dir;
++params.unsorted_count;
}
dest_file /= p_iter->path().filename();
if(fs::exists(dest_file)) {
if(params.ignore_dups) {
info(std::string("ignoring: ") + p_iter->path().string() +
" duplicates: " + dest_file.string());
++params.dups_ignored_count;
continue;
}
else {
if(params.force) {
info(std::string("force removing: ") + dest_file.string() + " for: "
+ p_iter->path().string());
if(!params.dry_run)
fs::remove(dest_file);
}
else if(params.rename) {
info(std::string("renaming: ") + p_iter->path().string() +
" duplicates: " + dest_file.string());
dest_file = uniquify(dest_file);
}
else {
info(std::string("duplicate file: ") + p_iter->path().string() +
" of: " + dest_file.string());
dest_file = params.dups_dir / dest_subdir / p_iter->path().filename();
// Ugh, more dup possibilities
if(fs::exists(dest_file)) {
info(std::string("renaming: ") + p_iter->path().string() +
" duplicates: " + dest_file.string());
dest_file = uniquify(dest_file);
}
}
++params.dups_count;
}
}
if(!params.dry_run)
fs::create_directories(dest_file.parent_path());
if(params.symlink) {
info(std::string("linking from: ") + p_iter->path().string() +
" to: " + dest_file.string());
if(!params.dry_run) {
// The target of a symlink must be either absolute (aka complete) or
// relative to the location of the link. Easiest solution is to make
// a complete path.
fs::path target;
if(p_iter->path().is_complete())
target = p_iter->path();
else
target = fs::initial_path() / p_iter->path();
fs::create_symlink(target, dest_file);
}
}
else {
info(std::string("copying from: ") + p_iter->path().string() +
" to: " + dest_file.string());
if(!params.dry_run) {
// Copy the file and restore its write time (needed for posix)
std::time_t time = fs::last_write_time(*p_iter);
fs::copy_file(*p_iter, dest_file);
fs::last_write_time(dest_file, time);
if(params.verify) {
md5digest src_digest, dst_digest;
bool ok = md5sum(p_iter->path(), src_digest);
if(ok)
ok = md5sum(dest_file, dst_digest);
if(ok)
ok = (memcmp(src_digest,dst_digest, sizeof(md5digest))==0);
if(!ok) {
// Should probably find a more appropriate exception for this
throw std::runtime_error(std::string("File verification failed: '")
+ p_iter->path().string() + "' differs from '" +
dest_file.string() + "'");
}
else {
info(std::string("verification passed"));
}
}
}
}
if(params.move) {
info(std::string("removing: ") + p_iter->path().string());
if(!params.dry_run)
fs::remove(*p_iter);
}
if(!g_verbose && (params.ok_count % DOT_EVERY)==0) {
std::cout << "." << std::flush;
g_neednewline = true;
}
++params.ok_count;
}
catch(fs::filesystem_error& e) {
error(e, std::string("skipping file: " + p_iter->path().string()));
++params.file_err_count;
}
}
}
}
catch(fs::filesystem_error& e) {
error(e, std::string("skipping directory: " + directory.string()));
++params.dir_err_count;
}
}

97
contrib/vms/README.md Normal file

@ -0,0 +1,97 @@
# Vagrant development boxes
This directory contains a `Vagrantfile` which can be used to automatically
create virtual machines for testing purposes. The virtual machines are
automatically provisioned with all required dependencies for building & testing
of exiv2 (the provisioning is shared with the GitLab CI).
The following Linux distributions are provided (the name in the brackets is the
name of the Vagrant VM):
- Fedora 28 ("Fedora")
- Debian 9 aka Stretch ("Debian")
- Archlinux ("Archlinux")
- Ubuntu 18.04 aka Bionic Beaver ("Ubuntu")
- CentOS 7 ("CentOS")
- OpenSUSE Tumbleweed ("OpenSUSE")
The Fedora, Archlinux and OpenSUSE boxes are the 'vanilla' distribution with
some additional packages installed.
For Debian and Ubuntu, we build gtest manually from source and install the
resulting library to /usr/lib/.
On CentOS, we have to install `cmake3` and `python36` (the default cmake is
too old and a default python3 does not exist) which we symlink to
`/usr/bin/cmake` & `/usr/bin/python3` to retain a similar workflow to the other
distributions.
For further details, consult the shell scripts `setup.sh` and
`ci/install_dependencies.sh`.
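In rough outline, the CentOS workaround amounts to something like the following
(illustrative only; the authoritative commands live in `ci/install_dependencies.sh`,
and the install step below is an assumption):
``` shell
# install the newer tools (exact package sources are an assumption here)
yum -y install cmake3 python36
# expose them under the names the build expects (binary paths may differ)
ln -sf /usr/bin/cmake3 /usr/bin/cmake
ln -sf /usr/bin/python3.6 /usr/bin/python3
```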
All boxes come with `conan` installed via pip in the `vagrant` user's home
directory and the `exiv2` git repository cloned.
Please note that these VMs are not continuously tested and the provisioning can
break. Please open an issue on GitHub if you happen to encounter a problem.
## Usage
Please install [Vagrant](https://www.vagrantup.com/) and a supported provider
(e.g. libvirt, VirtualBox).
Choose a box from the above list and run in the directory where the
`Vagrantfile` resides:
``` shell
vagrant up $name
```
where `$name` is the name in the brackets in the above list, e.g. `OpenSUSE` or
`Archlinux`. Depending on your default provider you may have to set the provider
manually via `vagrant up $name --provider $provider_name` (the Ubuntu image only
supports VirtualBox, which is not the default on Linux and will result in an
error unless you explicitly set the provider to `virtualbox`).
This will download a box from the vagrantcloud and set it up. Once the whole
process is finished, you can ssh into the machine via:
``` shell
vagrant ssh $name
```
Don't forget to turn it off via `vagrant halt $name` or the VM will keep
running! A VM can be discarded when it is no longer required via `vagrant
destroy $name` (Vagrant will keep the base box around in `~/.vagrant.d/boxes`
and libvirt sometimes leaves images around in `/var/lib/libvirt/` or
`/var/libvirt`, so check these folders too).
You can also setup & start all VMs at once via `vagrant up`, but keep in mind
that it will start 6 VMs and occupy between 10 and 20 GB of disk space.
## Notes for OpenSUSE Tumbleweed
Unfortunately the OpenSUSE Tumbleweed box cannot be provisioned easily with
Vagrant as it must perform a system upgrade first, which cannot be done
non-interactively. To get the OpenSUSE box up and running, follow these steps:
``` shell
$ vagrant up OpenSUSE
# you'll get a failure in the first provisioning script
$ vagrant ssh OpenSUSE
vagrant@opensuse-exiv2:~> su - # the root password is vagrant
Password:
opensuse-exiv2:~ # zypper refresh
opensuse-exiv2:~ # zypper dup
# zypper will now perform a system upgrade
# you'll probably get a few file conflicts, confirm the overwrite with 'yes'
# once the upgrade is done, exit the ssh session
$ vagrant halt OpenSUSE
$ vagrant up OpenSUSE
$ vagrant provision OpenSUSE
```
Provided the system upgrade went fine, you should now have an OpenSUSE
Tumbleweed virtual machine ready to go.

52
contrib/vms/Vagrantfile vendored Normal file

@ -0,0 +1,52 @@
Vagrant.configure("2") do |config|
config.vm.define "Fedora" do |fedora|
fedora.vm.box = "fedora/30-cloud-base"
fedora.vm.hostname = "fedora-exiv2"
end
config.vm.define "Debian" do |debian|
debian.vm.box = "generic/debian10"
debian.vm.hostname = "debian-exiv2"
end
config.vm.define "Archlinux" do |archlinux|
archlinux.vm.box = "archlinux/archlinux"
archlinux.vm.hostname = "archlinux-exiv2"
end
config.vm.define "Ubuntu" do |ubuntu|
ubuntu.vm.box = "ubuntu/bionic64"
ubuntu.vm.hostname = "ubuntu-exiv2"
end
config.vm.define "CentOS" do |centos|
centos.vm.box = "centos/7"
centos.vm.hostname = "centos-exiv2"
end
config.vm.define "openSUSE" do |opensuse|
opensuse.vm.box = "opensuse/openSUSE-Tumbleweed-Vagrant.x86_64"
opensuse.vm.hostname = "opensuse-exiv2"
end
config.vm.synced_folder ".", "/vagrant", owner: "vagrant", group: "vagrant",
disabled: false, type: "rsync"
# use the CI script from gitlab to setup all dependencies
config.vm.provision "install_dependencies", type: "shell" do |shell|
shell.path = "../../ci/install_dependencies.sh"
end
# install additional dependencies for development
config.vm.provision "install_devel_dependencies", type: "shell" do |shell|
shell.path = "setup.sh"
end
# install conan & clone the exiv2 repo
config.vm.provision "setup_repository", type: "shell" do |shell|
shell.path = "setup_user.sh"
shell.privileged = false
end
end

34
contrib/vms/setup.sh Normal file

@ -0,0 +1,34 @@
#!/bin/bash
set -e
distro_id=$(grep '^ID=' /etc/os-release|awk -F = '{print $2}'|sed 's/\"//g')
case "$distro_id" in
'fedora')
dnf -y --refresh install python3-pip git
;;
'debian' | 'ubuntu')
apt-get install -y python3-pip git
;;
'arch')
pacman --noconfirm -S python-pip git
;;
'centos' | 'rhel')
yum -y install centos-release-scl-rh
yum clean all
yum -y install rh-python36-python-pip git
;;
'opensuse' | 'opensuse-tumbleweed')
zypper --non-interactive install python3-pip git
;;
*)
echo "Sorry, no predefined dependencies for your distribution exist yet"
exit 1
;;
esac

44
contrib/vms/setup_user.sh Normal file

@ -0,0 +1,44 @@
#!/bin/bash
set -e
function clone_exiv2() {
git clone https://github.com/Exiv2/exiv2.git
cd exiv2
sed -i '/fetch = +refs\/heads\/\*:refs\/remotes\/origin\//a \ \ \ \ \ \ \ \ fetch = +refs\/pull\/\*\/head:refs\/remotes\/origin\/pr\/*' .git/config
cd ..
}
distro_id=$(grep '^ID=' /etc/os-release|awk -F = '{print $2}'|sed 's/\"//g')
case "$distro_id" in
'debian' | 'ubuntu' | 'fedora' | 'opensuse' | 'opensuse-tumbleweed')
PIP=pip3
;;
'arch')
PIP=pip
;;
'centos' | 'rhel')
PIP=/opt/rh/rh-python36/root/usr/bin/pip3
;;
*)
echo "Sorry, no predefined dependencies for your distribution exist yet"
exit 1
;;
esac
$PIP install conan --user --upgrade
CONAN_PROFILE=~/.conan/profiles/default
# create a new conan profile & set the used libstdc++ to use the C++11 ABI
[ -e $CONAN_PROFILE ] || ~/.local/bin/conan profile new --detect default
sed -i 's/compiler.libcxx=libstdc++/compiler.libcxx=libstdc++11/' $CONAN_PROFILE
[ -d exiv2 ] || clone_exiv2
cd exiv2 && git fetch && cd ..

370
contrib/vs2019/README.md Normal file

@ -0,0 +1,370 @@
# exiv2-x86_x64
Visual Studio project files to compile exiv2 on Windows (x86/x64/Release/Debug)
## Background ##
One of the main building blocks for a personal product that I was
developing was exiv2. When I started this project, support for
building on Windows using Visual Studio was quite good and in
particular, [Robin Mills](https://github.com/clanmills), the then
maintainer, was excellent in helping resolve questions.
However, as time progressed, the maintainers changed, their primary
focus turned to Linux, and Windows compilation became an afterthought
accomplished by the Swiss Army knife, CMake. In reality, CMake produces
horrible Visual Studio solutions/projects that are just wrappers around
cmake commands, tries to enforce the Linux custom of installing all
dependencies in a single location like /usr/local, and does not retain
the flexibility of targeting separate x86/x64/Release/Debug builds
while using the same solution/projects.
The native solution/project files were removed from the exiv2
repository and not even added to a contrib folder for others to use, a
restriction that I found particularly galling. The fact that the main
audience didn't squeal much at these steps is no justification, as they
may not care about compiling on Windows using VS. This was a
retrograde step, but they believed that they were going forward.
One can never predict what the world wants. If one is not open to
possibilities and instead insists that it's my way or the highway, the
utility of the product is diminished. This project is to help people
who would like to build exiv2 using Visual Studio on Windows, to do
so.
## Building Philosophy ##
Unlike the linux-style philosophy of installing everything in, and then
linking from, /usr/local-type folders, source trees on Windows are
typically not organized that way. At least, mine aren't.
I prefer to git clone from the target repository into my workspace. I
also prefer to get the latest git versions of exiv2's dependencies as
well. This is an important distinction. The usual philosophy would
install released packages (say into C:\Program Files\...) and have the
compiled package link against them.
The advantage with my approach is that I have a transparent solution
with all the dependencies visible explicitly instead of needing to be
added in areas like Additional Libraries. The debugger is able to step
into any area of the code, including the dependencies easily instead
of worrying whether the dependency is release/debug compiled and is
consistent with all the choices that I make for the application (like
static/runtime CRT or x86/x64 or Release/Debug).
This involves creating one solution that includes multiple projects
(exiv2's own as well as those of the dependencies) and linking the
dependencies through references instead of identifying them explicitly
through path/folder names.
For example, exiv2 will be git-cloned into E:\Projects\exiv2. libexpat (a
dependency that is required for compiling exiv2) will be git-cloned to
E:\Projects\libexpat. The exiv2 solution will have the exiv2 project
(it has many more, but mentioning just one simplifies the
exposition) and the expat-static project (from libexpat). The
expat-static project is added as a reference to the exiv2
project. This makes Visual Studio generate the correct dependency
hierarchy and compile correctly.
# Folder Structure #
Because these are project files that are pre-generated, they expect
exiv2 and its dependencies to be placed in a certain hierarchy.
Each of the dependencies will have to be placed at the same level as
the exiv2 source tree.
For example:
```
C:\Sources\exiv2
C:\Sources\zlib
C:\Sources\openssl
C:\Sources\libssh2
```
and so on. The requirement that it be at the same level as the exiv2
source tree is only mandated in a relative sense. That is, you could
have your exiv2 sources in C:\Sources or E:\Projects or
C:\Users\Sridhar\Documents\VS\exiv2. There are problems with spaces in
directory names, because there are some commands executed through
utilities like perl which may not handle them well. Please don't use
spaces.
## VS solution/projects for dependencies ##
Just like the solution/project file to compile exiv2 in VS (the one
that you are reading about now), most dependencies also have similar
VS solution/project files, because their VS-build infrastructure is
deficient. Specific instructions on fetching these project files
separately and copying them to the source tree are given below for each
dependency. I fetch these *-x86_x64 repositories into a separate
folder from these other source folders. For example, I keep these in
E:\Projects\github, whereas the sources are in E:\Projects.
Theoretically, just fetching these dependency sources, placing them in
the correct folder hierarchies, fetching their associated VS project
files and placing them in the source tree in specific folders
(detailed below) should be enough to compile exiv2 directly. exiv2
links against static libraries of these dependencies. However, these
dependencies are full-fledged distributions that contain, apart from
those libraries that exiv2 needs, other executables and tests. These
can be compiled, optionally, if one needs to test the integrity of
those projects.
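As a rough illustration of this fetch-and-overlay workflow, here is a hedged
sketch for zlib (repository names are taken from the Dependencies list below;
the destination folder differs per dependency, so follow each entry's note):
``` shell
:: run from the workspace root, e.g. E:\Projects (paths here are examples)
git clone https://github.com/madler/zlib.git
git clone https://github.com/sridharb1/zlib-x86_x64.git
:: overlay the Visual Studio project files, per the zlib entry below:
:: contrib\vc14 from zlib-x86_x64 overwrites the same folder in zlib
xcopy zlib-x86_x64\contrib\vc14 zlib\contrib\vc14 /E /I /Y
```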
## Dependency Order ##
Some of these dependencies depend on other dependencies. Thus, it is
best if they are executed in the order shown below.
# Dependencies #
* [Strawberry Perl, used v5.30.1.1 x64](https://strawberryperl.com)
* [NASM, used 2.14.02 x64](https://www.nasm.us/)
* [Visual Studio, used Community 2019 edition, v16.5.4](https://visualstudio.microsoft.com/)
* [zlib, tested w/ v1.2.11](https://github.com/madler/zlib.git)
* Optionally, switch the git tree to the 1.2.11 branch.
* Use my [zlib-x86_x64](https://github.com/sridharb1/zlib-x86_x64)
to compile on Windows using Visual Studio
* Copy the contents of this folder into the contrib folder
of the zlib repository. Thus, the contrib/vc14 folder of
zlib-x86_x64 should overwrite the same in zlib.
* Optionally, use zlibvc.sln to compile.
* [libexpat, tested w/ v2.2.9](https://github.com/libexpat/libexpat)
* Optionally, switch the git tree to the R_2_2_9 branch.
* Use my
[libexpat-x86_x64](https://github.com/sridharb1/libexpat-x86_x64)
to compile on Windows using Visual Studio
* Copy the build folder of libexpat-x86_x64 into the libexpat
source tree.
* Optionally, use expat.sln to compile.
* [OpenSSL, tested w/ v1.1.1g-DEV](https://github.com/openssl/openssl)
* Optionally, switch the git tree to the OpenSSL_1_1_1-stable branch.
* Use my
[openssl-x86_x64](https://github.com/sridharb1/openssl-x86_x64)
to compile on Windows using Visual Studio
* Copy the build folder of openssl-x86_x64 into the openssl tree.
* Optionally, use openssl1_1.sln to compile.
* [libssh, tested w/ v0.9.3](https://git.libssh.org/projects/libssh.git/)
* Optionally, switch the git tree to the stable-0.9 branch.
* Use my
[libssh-x86_x64](https://github.com/sridharb1/libssh-x86_x64.git)
to compile on Windows using Visual Studio
* Copy the build folder of libssh-x86_x64 into the libssh source tree.
* Optionally, use libssh.sln to compile.
* [libssh2, tested w/ v1.9.0](https://github.com/libssh2/libssh2.git)
* Optionally, switch the git tree to the libssh2-1.9.0 branch.
* Use my
[libssh2-x86_x64](https://github.com/sridharb1/libssh2-x86_x64.git)
to compile on Windows using Visual Studio
* Copy the build folder of libssh2-x86_x64 into the libssh2 source tree.
* Optionally, use libssh2.sln to compile.
* [brotli, tested w/ v1.0.7+](https://github.com/google/brotli)
* Use my
[brotli-x86_x64](https://github.com/sridharb1/brotli-x86_x64.git)
to compile on Windows using Visual Studio
* Copy the build_folder folder of brotli-x86_x64 into the brotli source tree.
* Optionally, use brotli.sln to compile.
* [curl, tested w/ v7.69.1](https://github.com/curl/curl.git)
* Optionally, switch the git tree to the 7_69_1 branch.
* Use my
[curl-x86_x64](https://github.com/sridharb1/curl-x86_x64.git)
to compile on Windows using Visual Studio
* Copy the contents of this folder into the **projects/Windows/VC15**
folder of the curl source tree. *Note: instructions different
from other projects*
* Optionally, use curl_all.sln to compile.
* [googletest, tested w/ v1.10.x](https://github.com/google/googletest.git)
* Optionally, switch the git tree to the v1.10.x branch.
* Use my
[googletest-x86_x64](https://github.com/sridharb1/googletest-x86_x64.git)
to compile on Windows using Visual Studio
* Copy the build folder of googletest-x86_x64 into the googletest source tree.
* Optionally, use googletest-distribution.sln to compile.
* [libintl (aka gettext), tested with v0.20.1](https://git.savannah.gnu.org/git/gettext.git)
* When you clone gettext, you might also clone a submodule called
gnulib. This is not necessary. You can turn off the recursive
flag while cloning.
* Optionally, switch the git tree to the 0.20.1 branch.
* Use my
[gettext-x86_x64](https://github.com/sridharb1/gettext-x86_x64)
to compile on Windows using Visual Studio
* Copy the build folder into the root folder of gettext source tree.
* Optionally, use gettext.sln to compile.
* [libiconv, tested w/ v1.16](https://github.com/sridharb1/libiconv-x86_x64)
* This dependency is a little different from the others in the
sense that in the others, you fetch the source and the VS
project files separately. In this case, this repository provides
both the source and the VS project files.
* Optionally, use libiconv.sln to compile.
# Sources #
* [Exiv2, tested w/ v0.27.2](https://github.com/Exiv2/exiv2)
* In v0.27.3, my solution/project files can be found in contrib/vs2019/solution
* For other versions, use my
[exiv2-x86_x64](https://github.com/sridharb1/exiv2-x86_x64) to
compile on Windows
* It may, in particular, not work for the HEAD or 0.28+ branches as
there have been incompatible changes made that have not been
incorporated in these project files.
* Place the contents of exiv2-x86_x64 in a folder called
contrib/vs2019/solution in the exiv2 repository and build using
the provided solution. Please note that the dependencies listed
above are needed.
* `exiv2 -vV` (output of generated exiv2.exe on my machine for reference)
``` shell
exiv2 0.27.2
This program is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
as published by the Free Software Foundation; either version 2
of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public
License along with this program; if not, write to the Free
Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
Boston, MA 02110-1301 USA
exiv2=0.27.2
platform=windows
compiler=MSVC
bits=64
dll=0
debug=0
cplusplus=199711
version=13.25 (2017/x64)
date=Apr 6 2020
time=11:28:24
processpath=E:\My Projects\exiv2\contrib\vs2019\solution\src\x64\Release
localedir=/../share/locale
package_name=exiv2
curlprotocols=dict file ftp ftps gopher http https imap imaps ldap ldaps pop3 pop3s rtsp scp sftp smb smbs smtp smtps telnet tftp
curl=1
executable=E:\Projects\exiv2\contrib\vs2019\solution\src\x64\Release\exiv2.exe
library=C:\WINDOWS\SYSTEM32\ntdll.dll
library=C:\Program Files\AVAST Software\Avast\aswhook.dll
library=C:\WINDOWS\System32\KERNEL32.DLL
library=C:\WINDOWS\System32\KERNELBASE.dll
library=C:\WINDOWS\System32\ucrtbase.dll
library=C:\WINDOWS\System32\PSAPI.DLL
library=C:\WINDOWS\System32\WS2_32.dll
library=C:\WINDOWS\System32\RPCRT4.dll
library=C:\WINDOWS\System32\SHELL32.dll
library=C:\WINDOWS\System32\cfgmgr32.dll
library=C:\WINDOWS\System32\shcore.dll
library=C:\WINDOWS\System32\msvcrt.dll
library=C:\WINDOWS\System32\combase.dll
library=C:\WINDOWS\System32\bcryptPrimitives.dll
library=C:\WINDOWS\System32\windows.storage.dll
library=C:\WINDOWS\System32\msvcp_win.dll
library=C:\WINDOWS\System32\sechost.dll
library=C:\WINDOWS\System32\advapi32.dll
library=C:\WINDOWS\System32\profapi.dll
library=C:\WINDOWS\System32\powrprof.dll
library=C:\WINDOWS\System32\UMPDC.dll
library=C:\WINDOWS\System32\shlwapi.dll
library=C:\WINDOWS\System32\GDI32.dll
library=C:\WINDOWS\System32\win32u.dll
have_strerror_r=0
have_gmtime_r=0
have_inttypes=0
have_libintl=0
have_lensdata=1
have_iconv=1
have_memory=1
have_lstat=0
have_regex=0
have_regex_h=0
have_stdbool=1
have_stdint=1
have_stdlib=0
have_strlib=0
have_strerror_r=0
have_strings_h=0
have_mmap=0
have_munmap=0
have_sys_stat=1
have_unistd_h=0
have_sys_mman=0
have_libz=1
have_xmptoolkit=1
adobe_xmpsdk=0
have_bool=0
have_strings=0
have_sys_types=1
have_unistd=0
have_unicode_path=1
enable_video=1
enable_webready=1
enable_nls=1
use_curl=1
use_ssh=1
config_path=C:\Users\Sridhar\exiv2.ini
xmlns=DICOM:http://ns.adobe.com/DICOM/
xmlns=GPano:http://ns.google.com/photos/1.0/panorama/
xmlns=Iptc4xmpCore:http://iptc.org/std/Iptc4xmpCore/1.0/xmlns/
xmlns=Iptc4xmpExt:http://iptc.org/std/Iptc4xmpExt/2008-02-29/
xmlns=MP:http://ns.microsoft.com/photo/1.2/
xmlns=MPRI:http://ns.microsoft.com/photo/1.2/t/RegionInfo#
xmlns=MPReg:http://ns.microsoft.com/photo/1.2/t/Region#
xmlns=MicrosoftPhoto:http://ns.microsoft.com/photo/1.0/
xmlns=acdsee:http://ns.acdsee.com/iptc/1.0/
xmlns=album:http://ns.adobe.com/album/1.0/
xmlns=asf:http://ns.adobe.com/asf/1.0/
xmlns=audio:http://www.audio/
xmlns=aux:http://ns.adobe.com/exif/1.0/aux/
xmlns=bmsp:http://ns.adobe.com/StockPhoto/1.0/
xmlns=creatorAtom:http://ns.adobe.com/creatorAtom/1.0/
xmlns=crs:http://ns.adobe.com/camera-raw-settings/1.0/
xmlns=crss:http://ns.adobe.com/camera-raw-saved-settings/1.0/
xmlns=dc:http://purl.org/dc/elements/1.1/
xmlns=dcterms:http://purl.org/dc/terms/
xmlns=digiKam:http://www.digikam.org/ns/1.0/
xmlns=dwc:http://rs.tdwg.org/dwc/index.htm
xmlns=exif:http://ns.adobe.com/exif/1.0/
xmlns=exifEX:http://cipa.jp/exif/1.0/
xmlns=expressionmedia:http://ns.microsoft.com/expressionmedia/1.0/
xmlns=iX:http://ns.adobe.com/iX/1.0/
xmlns=jp2k:http://ns.adobe.com/jp2k/1.0/
xmlns=jpeg:http://ns.adobe.com/jpeg/1.0/
xmlns=kipi:http://www.digikam.org/ns/kipi/1.0/
xmlns=lr:http://ns.adobe.com/lightroom/1.0/
xmlns=mediapro:http://ns.iview-multimedia.com/mediapro/1.0/
xmlns=mwg-kw:http://www.metadataworkinggroup.com/schemas/keywords/
xmlns=mwg-rs:http://www.metadataworkinggroup.com/schemas/regions/
xmlns=pdf:http://ns.adobe.com/pdf/1.3/
xmlns=pdfaExtension:http://www.aiim.org/pdfa/ns/extension/
xmlns=pdfaField:http://www.aiim.org/pdfa/ns/field#
xmlns=pdfaProperty:http://www.aiim.org/pdfa/ns/property#
xmlns=pdfaSchema:http://www.aiim.org/pdfa/ns/schema#
xmlns=pdfaType:http://www.aiim.org/pdfa/ns/type#
xmlns=pdfaid:http://www.aiim.org/pdfa/ns/id/
xmlns=pdfx:http://ns.adobe.com/pdfx/1.3/
xmlns=pdfxid:http://www.npes.org/pdfx/ns/id/
xmlns=photoshop:http://ns.adobe.com/photoshop/1.0/
xmlns=plus:http://ns.useplus.org/ldf/xmp/1.0/
xmlns=png:http://ns.adobe.com/png/1.0/
xmlns=rdf:http://www.w3.org/1999/02/22-rdf-syntax-ns#
xmlns=stArea:http://ns.adobe.com/xmp/sType/Area#
xmlns=stDim:http://ns.adobe.com/xap/1.0/sType/Dimensions#
xmlns=stEvt:http://ns.adobe.com/xap/1.0/sType/ResourceEvent#
xmlns=stFnt:http://ns.adobe.com/xap/1.0/sType/Font#
xmlns=stJob:http://ns.adobe.com/xap/1.0/sType/Job#
xmlns=stMfs:http://ns.adobe.com/xap/1.0/sType/ManifestItem#
xmlns=stRef:http://ns.adobe.com/xap/1.0/sType/ResourceRef#
xmlns=stVer:http://ns.adobe.com/xap/1.0/sType/Version#
xmlns=tiff:http://ns.adobe.com/tiff/1.0/
xmlns=video:http://www.video/
xmlns=wav:http://ns.adobe.com/xmp/wav/1.0/
xmlns=xml:http://www.w3.org/XML/1998/namespace
xmlns=xmp:http://ns.adobe.com/xap/1.0/
xmlns=xmpBJ:http://ns.adobe.com/xap/1.0/bj/
xmlns=xmpDM:http://ns.adobe.com/xmp/1.0/DynamicMedia/
xmlns=xmpG:http://ns.adobe.com/xap/1.0/g/
xmlns=xmpGImg:http://ns.adobe.com/xap/1.0/g/img/
xmlns=xmpMM:http://ns.adobe.com/xap/1.0/mm/
xmlns=xmpNote:http://ns.adobe.com/xmp/note/
xmlns=xmpRights:http://ns.adobe.com/xap/1.0/rights/
xmlns=xmpT:http://ns.adobe.com/xap/1.0/t/
xmlns=xmpTPg:http://ns.adobe.com/xap/1.0/t/pg/
xmlns=xmpidq:http://ns.adobe.com/xmp/Identifier/qual/1.0/
```

File diff suppressed because it is too large


@ -0,0 +1,42 @@
#ifndef EXIV2API_H
#define EXIV2API_H
#ifdef exiv2lib_STATIC
# define EXIV2API
# define EXIV2LIB_NO_EXPORT
#else
# ifndef EXIV2API
# ifdef exiv2lib_EXPORTS
/* We are building this library */
# define EXIV2API
# else
/* We are using this library */
# define EXIV2API
# endif
# endif
# ifndef EXIV2LIB_NO_EXPORT
# define EXIV2LIB_NO_EXPORT
# endif
#endif
#ifndef EXIV2LIB_DEPRECATED
# define EXIV2LIB_DEPRECATED __declspec(deprecated)
#endif
#ifndef EXIV2LIB_DEPRECATED_EXPORT
# define EXIV2LIB_DEPRECATED_EXPORT EXIV2API EXIV2LIB_DEPRECATED
#endif
#ifndef EXIV2LIB_DEPRECATED_NO_EXPORT
# define EXIV2LIB_DEPRECATED_NO_EXPORT EXIV2LIB_NO_EXPORT EXIV2LIB_DEPRECATED
#endif
#if 0 /* DEFINE_NO_DEPRECATED */
# ifndef EXIV2LIB_NO_DEPRECATED
# define EXIV2LIB_NO_DEPRECATED
# endif
#endif
#endif /* EXIV2API_H */

Some files were not shown because too many files have changed in this diff