I ran into some trouble using the Mandrill PHP API:
SSL certificate problem: unable to get local issuer certificate
I dug up some information on the net about configuring a cacert file for cURL. So I have the current certificate extract from http://curl.haxx.se/docs/caextract.html, and I also configured a (valid) path to that file in php.ini:
curl.cainfo=some\local\path\\certs\cacert.pem
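As a quick sanity check that the bundle itself is usable (independent of the php.ini setting), it can also be passed per request via CURLOPT_CAINFO. A minimal sketch; the bundle path and target URL below are just placeholders:
<?php
// Point a single cURL handle at the CA bundle, overriding curl.cainfo for this handle only.
// The bundle path is a placeholder - adjust it to your own location.
$caBundle = 'C:\\local\\path\\certs\\cacert.pem';

$ch = curl_init('https://mandrillapp.com/api/1.0/users/ping.json');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_SSL_VERIFYPEER => true,    // verify the peer certificate against the bundle
    CURLOPT_SSL_VERIFYHOST => 2,       // verify that the certificate matches the host name
    CURLOPT_CAINFO         => $caBundle,
));

$body = curl_exec($ch);
if ($body === false) {
    echo 'cURL error: ' . curl_error($ch) . PHP_EOL;
} else {
    echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . PHP_EOL;
}
curl_close($ch);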
Now I am testing multiple HTTPS websites (src of the test here; a rough sketch of the test helper is included after the calls below):
$this->nxs_cURLTest("https://mandrillapp.com/api/1.0/users/ping.json", "TestApiCALL", "Mandrill API CERT");
$this->nxs_cURLTest("https://www.google.com/intl/en/contact/", "HTTPS to Google", "Mountain View, CA");
$this->nxs_cURLTest("http://www.google.com/intl/en/contact/", "HTTP to Google", "Mountain View, CA");
$this->nxs_cURLTest("https://www.facebook.com/", "HTTPS to Facebook", 'id="facebook"');
$this->nxs_cURLTest("https://www.linkedin.com/", "HTTPS to LinkedIn", 'link rel="canonical" href="https://www.linkedin.com/"');
$this->nxs_cURLTest("https://twitter.com/", "HTTPS to Twitter", 'link rel="canonical" href="https://twitter.com/"');
$this->nxs_cURLTest("https://www.pinterest.com/", "HTTPS to Pinterest", 'content="Pinterest"');
$this->nxs_cURLTest("https://www.tumblr.com/", "HTTPS to Tumblr", 'content="Tumblr"');
and got inconsistent results like:
Testing ... https://mandrillapp.com/api/1.0/users/ping.json - https://mandrillapp.com/api/1.0/users/ping.json
....TestApiCALL - Problem
SSL certificate problem: unable to get local issuer certificate
Array
(
[url] => https://mandrillapp.com/api/1.0/users/ping.json
[content_type] =>
[http_code] => 0
[header_size] => 0
[request_size] => 0
[filetime] => -1
[ssl_verify_result] => 0
[redirect_count] => 0
[total_time] => 0.14
[namelookup_time] => 0
[connect_time] => 0.062
[pretransfer_time] => 0
[size_upload] => 0
[size_download] => 0
[speed_download] => 0
[speed_upload] => 0
[download_content_length] => -1
[upload_content_length] => -1
[starttransfer_time] => 0
[redirect_time] => 0
[certinfo] => Array
(
)
[primary_ip] => 54.195.231.78
[primary_port] => 443
[local_ip] => 192.168.2.142
[local_port] => 63719
[redirect_url] =>
)
There is a problem with cURL. You need to contact your server admin or hosting provider.
Testing ... https://www.google.com/intl/en/contact/ - https://www.google.com/intl/en/contact/
....HTTPS to Google - OK
Testing ... http://www.google.com/intl/en/contact/ - http://www.google.com/intl/en/contact/
....HTTP to Google - OK
Testing ... https://www.facebook.com/ - https://www.facebook.com/
....HTTPS to Facebook - OK
Testing ... https://www.linkedin.com/ - https://www.linkedin.com/
....HTTPS to LinkedIn - OK
Testing ... https://twitter.com/ - https://twitter.com/
....HTTPS to Twitter - OK
Testing ... https://www.pinterest.com/ - https://www.pinterest.com/
....HTTPS to Pinterest - Problem
SSL certificate problem: unable to get local issuer certificate
Array
(
[url] => https://www.pinterest.com/
[content_type] =>
[http_code] => 0
[header_size] => 0
[request_size] => 0
[filetime] => -1
[ssl_verify_result] => 0
[redirect_count] => 0
[total_time] => 0.078
[namelookup_time] => 0
[connect_time] => 0.016
[pretransfer_time] => 0
[size_upload] => 0
[size_download] => 0
[speed_download] => 0
[speed_upload] => 0
[download_content_length] => -1
[upload_content_length] => -1
[starttransfer_time] => 0
[redirect_time] => 0
[certinfo] => Array
(
)
[primary_ip] => 23.65.117.124
[primary_port] => 443
[local_ip] => 192.168.2.142
[local_port] => 63726
[redirect_url] =>
)
There is a problem with cURL. You need to contact your server admin or hosting provider.
Testing ... https://www.tumblr.com/ - https://www.tumblr.com/
....HTTPS to Tumblr - OK
As you can see, overall the SSL configuration is working, but for some reason two of the calls fail with the same error. The links above open just fine in a browser, and their certificates with CA chains are valid. What could be the reason here?
EDIT:
I spent about 6 hours trying to fix this, and found a clue about what is going on roughly 2 minutes after posting the question on SO. I read one more time the info on http://curl.haxx.se/docs/caextract.html about the extracts provided there. What caught my eye (now, but not the 100 times I had read it before):
RSA-1024 removed
Around early September 2014, Mozilla removed the trust bits from the certs in their CA bundle that were still using RSA 1024 bit keys. This may lead to TLS libraries having a hard time to verify some sites if the library in question doesn't properly support "path discovery" as per RFC 4158. (That includes OpenSSL and GnuTLS.)
The last CA bundle we converted from before that cleanup: an older ca-bundle from github.
So I took a shot and tried the bundle from "before cleanup" - all tests are passing now! So another question is - is it about out-of-date software on my machine, like OpenSSL, PHP, cURL etc., or is it that the sites that were failing the tests have an out-of-date certificate format according to RFC 4158, and that is what is causing the trouble?
So another question is - is it about out-of-date software on my machine, like OpenSSL, PHP, cURL etc., or is it that the sites that were failing the tests have an out-of-date certificate format according to RFC 4158, and that is what is causing the trouble?
Probably none of these. The removed certificates were old root CAs with only a 1024-bit key. These certificates got replaced with newer certificates, but not at the same place in the hierarchy, i.e. you often have multiple possible trust paths:
host-cert -> intermediate.1 -> 2048bit intermediate.2 -> 1024bit root-CA
host-cert -> intermediate.1 -> 2048bit new root
The public key of the 2048bit new root is the same as the one of the 2048bit intermediate.2, so the signature on intermediate.1 still matches and chain validation can succeed. But while most TLS stacks try to find the best chain, OpenSSL insists on the longest chain. This means that if the server sends the chain
host-cert -> intermediate.1 -> 2048bit intermediate.2
then OpenSSL will insist on finding a root CA that signed intermediate.2, even if it already has a root CA that signed intermediate.1 (i.e. the 2048bit new root). If the old 1024bit root-CA is no longer in the trust store, the validation will fail. If instead the server sends only
host-cert -> intermediate.1
then the validation will succeed with the 2048bit new root. But lots of servers still send the longer chain to maintain compatibility with older clients which don't have the 2048bit new root.
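To see which chain a particular server actually sends, the peer certificate chain can be captured and printed, for example with PHP's SSL stream wrapper. A rough sketch; the host is just one of the failing examples, and verification is deliberately disabled because we only want to inspect the chain:
<?php
// Capture and print the certificate chain as sent by the server.
$host = 'www.pinterest.com';

$ctx = stream_context_create(array(
    'ssl' => array(
        'capture_peer_cert_chain' => true,
        'verify_peer'             => false, // inspection only - do NOT do this for real requests
        'verify_peer_name'        => false,
    ),
));

$sock = stream_socket_client("ssl://{$host}:443", $errno, $errstr, 10,
                             STREAM_CLIENT_CONNECT, $ctx);
if ($sock === false) {
    die("Connect failed: {$errstr}\n");
}

$params = stream_context_get_params($sock);
foreach ($params['options']['ssl']['peer_certificate_chain'] as $i => $cert) {
    $info = openssl_x509_parse($cert);
    printf("#%d subject CN: %s / issuer CN: %s\n",
           $i, $info['subject']['CN'], $info['issuer']['CN']);
}
fclose($sock);
If the last certificate printed is an old intermediate whose issuer is the removed 1024-bit root, the server is sending the long chain described above.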
All very ugly, and the bug was reported in 2012 and again in 2015. OpenSSL 1.0.2 (freshly released) at least has an option, X509_V_FLAG_TRUSTED_FIRST, to work around the problem, and there are changes in the OpenSSL git repository which seem to fix the issue, but it is not clear whether they will ever get backported to 1.0.2 or lower :(
For now you'd better just keep the old 1024-bit certificates in the trust store.