Here is some sample code (kept deliberately minimal so it is easy to follow: no error handling, no closing of handles, and so on):
SC_HANDLE hSCManager = ::OpenSCManager(nullptr, nullptr, 0);
DWORD buffSize = 0;
// First call with a null buffer: ask for the required size, in characters.
::GetServiceDisplayName(hSCManager, m_serviceName, nullptr, &buffSize);
// Allocate one extra character for the null terminator, then fetch the display name.
LPTSTR buff = new TCHAR[++buffSize];
VERIFY(::GetServiceDisplayName(hSCManager, m_serviceName, buff, &buffSize));
My sample service has the display name "notepad starter" (15 characters).
Switching between build configurations, GetServiceDisplayName() reports a required buffer size of 30 under ANSI (GetServiceDisplayNameA) and 15 under UNICODE (GetServiceDisplayNameW).
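A minimal sketch of how the discrepancy can be observed directly, calling both entry points explicitly (assuming a placeholder service name "MyService" and, again, no error handling):

SC_HANDLE hSCM = ::OpenSCManager(nullptr, nullptr, 0);
DWORD sizeA = 0, sizeW = 0;
// Ask each variant how big the buffer must be (in characters).
::GetServiceDisplayNameA(hSCM, "MyService", nullptr, &sizeA);
::GetServiceDisplayNameW(hSCM, L"MyService", nullptr, &sizeW);
// For a 15-character display name, sizeA comes back as 30 and sizeW as 15.
::CloseServiceHandle(hSCM);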
The documentation for this API says it reports the required buffer size in characters, excluding the null terminator (it is not spelled out very clearly, but I expect the size returned by the second call to include the null terminator).
Why is it returning different buffer sizes in different build configurations?
I think the correct answer came six months later (I only saw it three years after that), from Raymond Chen:
Why is it reporting a required buffer size larger than what it actually needs?
Because character set conversion is hard.
When you call the GetServiceDisplayNameA function (ANSI version), it forwards the call to GetServiceDisplayNameW function (Unicode version). If the Unicode version says, “Sorry, that buffer is too small; it needs to be big enough to hold N Unicode characters,” the ANSI version doesn’t know how many ANSI characters that translates to. A single Unicode character could expand to as many as two ANSI characters in the case where the ANSI code page is DBCS. The GetServiceDisplayNameA function plays it safe and takes the worst-case scenario that the service display name consists completely of Unicode characters which require two ANSI characters to represent.
That’s why it over-reports the buffer size.
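In code terms, the worst-case reasoning Chen describes amounts to doubling the character count reported by the Unicode call. The following is only an illustrative sketch of that logic, not the actual GetServiceDisplayNameA internals:

// Illustrative sketch only: on a DBCS code page every Unicode character may
// need two ANSI characters, so the ANSI layer reports the worst case.
DWORD WorstCaseAnsiChars(DWORD unicodeChars)
{
    return unicodeChars * 2;  // 15 Unicode characters -> 30 reported ANSI characters
}

For the caller, the practical consequence is to treat the reported size as an upper bound: allocate that many characters (plus one for the terminator) and accept that the buffer may be larger than the string actually written into it.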