Recently I had some problems connecting with DataGrip to my Microsoft SQL Server 2012 instance, provided as part of a purchased web-hosting plan.

For some reason my connection was always rejected with the following error message:

The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target".

Microsoft’s documentation (available here) turned out to be very helpful. According to it, the JDBC login handshake is always encrypted, and I should focus on the encrypt and trustServerCertificate parameters, setting both to ‘true’. In this configuration the client side (my tool) expects SSL traffic and skips any validation of the certificate itself.
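For reference, a connection URL with both flags set might look like this (the host name and database are placeholders):

```
jdbc:sqlserver://sql.example-hosting.com:1433;databaseName=mydb;encrypt=true;trustServerCertificate=true
```

In DataGrip the same parameters can also be set as driver properties in the Advanced tab of the data-source settings.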

Partial success; the error message changed to:

The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "java.security.cert.CertificateException: Certificates do not conform to algorithm constraints".

This led to the conclusion that the certificate was invalid in more ways than one. A little help from stackoverflow.com revealed it might be so old that the latest Java 8 OpenJDK had blacklisted it due to the weaknesses and vulnerabilities of its MD5 signature. A brutal but effective patch is then to whitelist it back ;-)

Edit the file: %ProgramFiles%\JetBrains\DataGrip 2017.1.5\jre64\lib\security\java.security

and remove MD5 and MD5withRSA from these two properties: jdk.certpath.disabledAlgorithms and jdk.tls.disabledAlgorithms.
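For example, if the entries look roughly like this (the exact contents vary between JDK updates, so edit what is actually there):

```
jdk.certpath.disabledAlgorithms=MD2, MD5, RSA keySize < 1024
jdk.tls.disabledAlgorithms=SSLv3, MD5withRSA, DH keySize < 768
```

they should end up as:

```
jdk.certpath.disabledAlgorithms=MD2, RSA keySize < 1024
jdk.tls.disabledAlgorithms=SSLv3, DH keySize < 768
```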

Now the connection succeeded and we are ready to play with the database.

You are right. At this point I should stop, let the hosting provider know about the issue, and ask for a certificate upgrade. Thanks.


It’s been quite a while since I upgraded my machine for daily development. Until four months ago I was still using my 2011 MacBook Pro with some additional upgrades done along the way (described here and here). Nevertheless, it was showing its age more and more, and the speed and comfort of work were no longer sufficient.

Once the decision was made, for the first time in my life I wanted to buy my next laptop wisely. So I wrote down requirements:

  1. it should be small – 13’’ would be preferable, but 15’’ would be fine too
  2. it should have a brand new processor and 32 GB RAM or more
  3. the display should support at least Full HD on an IPS panel; 4K resolution is probably unnecessary at 13’’
  4. it should allow plugging in 2 external monitors simultaneously
  5. it must have a 1 Gbit Ethernet socket (as there is no WiFi inside the office due to security reasons)
  6. it should support 2 SSD drives, at least one of them PCIe NVMe
  7. a good discrete graphics card will be a huge advantage (as the MacBook had only the built-in Intel HD Graphics 3000 – slow as a kid’s tricycle)
  8. it doesn’t need a DVD drive
  9. it should support Windows 10 Pro x64
  10. BT 4.0, a TPM module, and an SD-card reader will be an advantage
  11. it shouldn’t be too heavy
And then I started looking around for anything matching my wishes. Unfortunately, and I cried about it a lot, Apple was out almost immediately. Since I wanted to max out the spec on day one and never worry about an upgrade in the future, MacBooks turned out to be extremely expensive: 2, sometimes even 3, times more than the competition with similar components. That is insane. I like them, but I am not that much of a fanboy.

Lenovo also dropped out of the competition, but mostly due to my personal preferences. A few years back I had two business-edition T51s and was using them happily until both died almost at the same time, a few months after the warranty period. I don’t want to say anything bad about Lenovo’s quality, as they were really good hardware. It just kept nagging at the back of my head.

My final choice was the MSI GE62VR Apache Pro (i7-7700HQ, 32 GB RAM, 1 TB HDD, GTX 1060) with a Samsung 512 GB 960 Pro M.2 2280 NVMe as the main drive. After those four months of usage I am really satisfied with this laptop.


Pros:

  • it’s fast (~9 seconds from a cold boot to being logged in to Windows, including typing the PIN)
  • the color quality of the built-in display is really impressive
  • it supports 3 displays simultaneously (2 external plus the built-in one), which gives a developer lots of space
  • the SteelSeries keyboard is stunning, with key reprogramming (I changed Pause/Break into Delete) and customizable colors
  • the touchpad can be disabled with a shortcut (Fn+F3)
  • the GTX 1060 lets me play the newest games with really good quality and framerate
  • it is VR-ready


Cons:

  • it almost can’t work without the power supply (1.5 h when giving a presentation with just an HDMI projector attached; with Visual Studio or a similar IDE running it goes down to less than 1 h)
  • the supplied HDD is extremely noisy, I mean it!
  • playing the newest games on ultra settings can make the center of the keyboard really hot (65°C or more)
  • the bottom of the chassis is plastic, not aluminum

I hope you find it useful.


I really like developing in Visual Studio, even when I have a feature to implement for other platforms. And whenever I can, I try to keep that experience. Period.

This time I had the pleasure of wrapping several HTTPS calls using libcurl. But how to make it run on my x64 Windows machine? There is some info on StackOverflow.com, which can be adapted to VS2017 in the following steps:

  1. Start the Visual Studio command prompt: "%ProgramFiles(x86)%\Microsoft Visual Studio 14.0\VC\vcvarsall.bat" amd64
  2. Download the curl-7.53.1.zip source-code and unzip it
  3. enter the “winbuild” folder
  4. compile with built-in Windows SSL support: nmake /f Makefile.vc mode=static VC=14 ENABLE_IPV6=no MACHINE=AMD64 (optionally, mode can be set to ‘dll’, at the cost of one more DLL to deal with in the project later)
  5. grab the output from the “builds/libcurl-vc14-AMD64-release-static-sspi-winssl” folder
  6. set up the new project’s ‘include’ and ‘lib’ folders (add libcurl_a.lib or libcurl.lib to the linker inputs)
  7. and remember to take a look at samples!



Last time I showed how to enforce the encoding of strings in a DBF table by setting the code page inside its header. I also mentioned it was the easiest way. That’s still true. But sometimes there is no room to be polite and things need to be done a little messy in the code (for example when the DBF file is often recreated by a 3rd-party tool and can be altered in any way). So each time a string value is loaded, try to recover it with the following steps.

First, get back the original bytes from the loaded *text* (assuming the system inappropriately returned a Windows-1250-encoded string):

var bytes = Encoding.GetEncoding("Windows-1250").GetBytes(text);

Second, convert them from the correct encoding (the text was natively stored as Latin-2, aka CP 852) to UTF-8:

var convertedBytes = Encoding.Convert(Encoding.GetEncoding(852), Encoding.UTF8, bytes);
return Encoding.UTF8.GetString(convertedBytes);

Of course encoding objects can be cached to increase performance.
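To see the recovery in action, here is the same round-trip sketched in Python, which ships the same code pages (the word “ręka” stored as CP 852 comes back from the driver as “r©ka”):

```python
# Simulate what the driver hands us: CP 852 bytes mis-decoded as Windows-1250.
text = "ręka".encode("cp852").decode("cp1250")
assert text == "r©ka"  # garbled, as seen in the application

# Step 1: get the original bytes back by re-encoding with the wrong code page.
raw = text.encode("cp1250")

# Step 2: decode them with the code page the file really uses.
fixed = raw.decode("cp852")
assert fixed == "ręka"
```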


Recently I had a problem importing data from a 10-year-old set of DBF tables. All was fine until it came to reading texts with Polish diacritic marks. It worked fine on 9 out of 10 machines, all with identical configurations (or at least I had hoped they were identical and couldn’t find any differences – Windows 7 x64 PL, .NET 4.5.2, the same regional options). On that single one all special letters got converted into some eye-hurting characters and looked plainly wrong.

As it turned out, the OleDbConnection class I used to connect (with the “Microsoft.Jet.OLEDB.4.0” provider) magically treated strings as Windows-1250 encoded, even though they were CP852 Latin-2. Thanks to this site for helping me find out about it.

I tried to enforce the encoding by updating the byte at offset 0x1D of the DBF header with the proper code page ID. Below is the list of all possible values (I used 0x64), but it still didn’t help much.




0x00 – No codepage defined
0x01 – Codepage 437 (US MS-DOS)
0x02 – Codepage 850 (International MS-DOS)
0x03 – Codepage 1252 (Windows ANSI)
0x04 – Codepage 10000 (Standard Macintosh)
0x64 – Codepage 852 (Eastern European MS-DOS)
0x65 – Codepage 866 (Russian MS-DOS)
0x66 – Codepage 865 (Nordic MS-DOS)
0x67 – Codepage 861 (Icelandic MS-DOS)
0x68 – Codepage 895 (Kamenicky (Czech) MS-DOS)
0x69 – Codepage 620 (Mazovia (Polish) MS-DOS)
0x6A – Codepage 737 (Greek MS-DOS (437G))
0x6B – Codepage 857 (Turkish MS-DOS)
0x78 – Codepage 950 (Chinese (Hong Kong SAR, Taiwan) Windows)
0x79 – Codepage 949 (Korean Windows)
0x7A – Codepage 936 (Chinese (PRC, Singapore) Windows)
0x7B – Codepage 932 (Japanese Windows)
0x7C – Codepage 874 (Thai Windows)
0x7D – Codepage 1255 (Hebrew Windows)
0x7E – Codepage 1256 (Arabic Windows)
0x96 – Codepage 10007 (Russian Macintosh)
0x97 – Codepage 10029 (Macintosh EE)
0x98 – Codepage 10006 (Greek Macintosh)
0xC8 – Codepage 1250 (Eastern European Windows)
0xC9 – Codepage 1251 (Russian Windows)
0xCA – Codepage 1254 (Turkish Windows)
0xCB – Codepage 1253 (Greek Windows)
all others – Unknown / invalid
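Patching that header byte can be scripted in a few lines. A sketch in Python (the helper name and the code-page default are mine, not part of any API):

```python
def set_dbf_codepage(path, codepage_id=0x64):
    """Overwrite the code-page ID byte at offset 0x1D of a DBF header in place.

    0x64 = codepage 852 (Eastern European MS-DOS), per the table above.
    """
    with open(path, "r+b") as f:
        f.seek(0x1D)
        f.write(bytes([codepage_id]))
```

For example, `set_dbf_codepage("table.dbf")` before opening the connection.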


Ultimately, the very old Visual FoxPro driver did the trick (after switching the provider to “VFPOLEDB.1”): it respected the encoding, saving me from manual string transcoding in my C# application.
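A VFPOLEDB connection string points at the directory containing the DBF files rather than at a single file; it looks roughly like this (the path is a placeholder):

```
Provider=VFPOLEDB.1;Data Source=C:\data\dbf-tables;
```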


Now you have seen everything!