one character of data

  • pebbleonthebeach (6/8/2009)


    http://msdn.microsoft.com/en-us/library/ms176089.aspx states "Use char when the sizes of the column data entries are consistent."

    So char(1) is the recommended answer, although nchar(1) would be safer.

    ...and http://msdn.microsoft.com/en-us/library/ms186939.aspx states "Use nchar when the sizes of the column data entries are probably going to be similar." - same argument supporting nchar(1) in this case.

    The issue isn't the choice between char and varchar (or nchar and nvarchar), it's char vs. nchar. The "correct" answer depends on whether you code for all cases allowed by the specification (in which case you have no choice but nchar) or code for the narrowest interpretation you can put on the specification - the "developer knows best" approach, as in "they said this, but I reckon they probably meant that" - (in which case you'd probably choose char).

    I come here to learn (and I have plenty to learn!) but in this case I cannot see any defence for using char. There is insufficient information in the question to come to any other conclusion. If we had more information (such as knowing the column was intended to carry a status value of 'A', 'B' or 'C') then the given answer could be appropriate, but with the total information as given it has to be nchar(1)... and I want my point back! 😉

  • Scott Coleman (6/8/2009)


    I appreciate the need for NCHAR, but in over fifteen years of database programming and administration I have yet to need to support any foreign language characters.

    The application I work on needs to support any language, so I guess we're approaching the question from different directions 🙂

  • From msdn...

    char [ ( n ) ]

    Fixed-length, non-Unicode character data with a length of n bytes. n must be a value from 1 through 8,000. The storage size is n bytes. The ISO synonym for char is character.

    varchar [ ( n | max ) ]

    Variable-length, non-Unicode character data. n can be a value from 1 through 8,000. max indicates that the maximum storage size is 2^31-1 bytes. The storage size is the actual length of data entered + 2 bytes. The data entered can be 0 characters in length. The ISO synonyms for varchar are char varying or character varying.

    The answer is definitely not varchar(1).
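
    To make the storage difference concrete, here is a small sketch (the temp table and column names are made up purely for illustration). DATALENGTH shows the bytes stored for each value; the extra 2 bytes per varchar value live in the row's variable-length section, so they don't show up in the result:

    CREATE TABLE #TypeDemo(
        c  char(1),     -- fixed: always 1 byte
        vc varchar(1),  -- variable: 1 byte of data plus 2 bytes of length overhead in the row
        nc nchar(1))    -- fixed Unicode: always 2 bytes (UCS-2)

    INSERT INTO #TypeDemo VALUES('A', 'A', N'A')

    SELECT DATALENGTH(c)  AS char_bytes,     -- 1
           DATALENGTH(vc) AS varchar_bytes,  -- 1 (the 2-byte length overhead is not counted here)
           DATALENGTH(nc) AS nchar_bytes     -- 2
    FROM #TypeDemo

    DROP TABLE #TypeDemo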

  • dave.farmer (6/8/2009)


    ... and I want my point back! 😉

    Dave (and others),

    I believe you got a point simply by posting your response. And hey! I think I get one for this one! Woo-hoo!!!:-P

  • john.arnott (6/8/2009)


    dave.farmer (6/8/2009)


    ... and I want my point back! 😉

    Dave (and others),

    I believe you got a point simply by posting your response. And hey! I think I get one for this one! Woo-hoo!!!:-P

    Hmm... could this be why so many people post to the thread without (apparently) reading anything that has gone before? Could/should the site implement a "negative points if you are just saying the same thing that has been said more than 5 times in the thread" policy?

    (or are others not as frustrated by the effort of separating the new or developed arguments from the mindless "XXX is best!" posts?)

    http://poorsql.com for T-SQL formatting: free as in speech, free as in beer, free to run in SSMS or on your version control server - free however you want it.

  • Tao Klerks (6/8/2009)


    john.arnott (6/8/2009)

    Hmm... could this be why so many people post to the thread without (apparently) reading anything that has gone before? Could/should the site implement a "negative points if you are just saying the same thing that has been said more than 5 times in the thread" policy?

    (or are others not as frustrated by the effort of separating the new or developed arguments from the mindless "XXX is best!" posts?)

    Unfortunately I don't think there's a technical solution, but yeah I am. Particularly since they don't even seem to be saying "me too" just posting "XXX is best!".

  • I still think CHAR(01) is the better answer - the column will not be NULL or contain an empty string. VARCHAR(01) adds additional overhead to manage variable-length strings, tracking both the max size of the column and the size of the data stored within it.

  • May as well pile on. This absolutely should have been CHAR(1) or NCHAR(1). VARCHAR or NVARCHAR was the worst choice under any fair interpretation of the definition. The answer stated would work, but the question asked what was best.

    ----------------------------------------------------
    The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood... Theodore Roosevelt
    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software

  • john.moreno (6/8/2009)


    Tao Klerks (6/8/2009)


    john.arnott (6/8/2009)

    Hmm... could this be why so many people post to the thread without (apparently) reading anything that has gone before? Could/should the site implement a "negative points if you are just saying the same thing that has been said more than 5 times in the thread" policy?

    (or are others not as frustrated by the effort of separating the new or developed arguments from the mindless "XXX is best!" posts?)

    Unfortunately I don't think there's a technical solution, but yeah I am. Particularly since they don't even seem to be saying "me too" just posting "XXX is best!".

    Some of us care about our % correct more than our actual points. I can earn a ton of points by posting about how wrong a question is, but that doesn't help to move my % correct up at all.



    --Mark Tassin
    MCITP - SQL Server DBA
    Proud member of the Anti-RBAR alliance.
    For help with Performance click this link
    For tips on how to post your problems

  • Apparently working with SQL 2000 makes the Char(1) and Varchar(1) answer look correct when considering only non-Unicode characters, as the SQL 2000 documentation doesn't mention anything about a 2-byte overhead.

    http://msdn.microsoft.com/en-us/library/aa258242(SQL.80).aspx

  • nchar(1) was the best choice, and not just because I chose it 😀 If the question was really asking for a data type that uses only one byte of storage for each character, they should have been more specific... a byte does not always a character make!

    Seriously though, Microsoft breezes by the Unicode topic on a regular basis in their products as well as their documentation by saying "Unicode" instead of being specific about the encoding. See the DTS "File type" drop-down on their file source list. ANSI, Unicode or OEM, really??? They did a better job in SSIS by introducing the "Code Page" property in the "Flat File Source" component but they were late to the party...besides, SSIS has many other fundamental flaws that more than outweigh the progress they made in terms of allowing us to specify a code page. Their treatment of "Unicode" has hurt their acceptance in certain communities where most actually know what Unicode and code pages are...the differences between Windows-1252, UCS-2LE and UTF-8 are important and should be explained better in their documentation, and represented better in their products.

    The explanation of "Using Unicode Data" in SQL 2000 http://msdn.microsoft.com/en-us/library/aa223981(SQL.80).aspx makes no mention of UCS-2LE, just "Unicode", and generically and incorrectly states that Unicode only supports 65,536 different patterns.

    http://support.microsoft.com/kb/232580 applies to both SQL 2005 and SQL 2000 and talks about UTF-8 vs. UCS-2, but still makes no mention of the fact that SQL Server is in fact UCS-2 little-endian (UCS-2LE).
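
    For anyone who wants to see the UCS-2 storage in action, here is a quick sketch (the Euro sign is just an example character); DATALENGTH counts the bytes actually stored and UNICODE returns the code point:

    SELECT DATALENGTH('A')  AS char_bytes,        -- 1 byte in a single-byte code page
           DATALENGTH(N'A') AS nchar_bytes,       -- 2 bytes: UCS-2 always uses two bytes per character
           DATALENGTH(N'€') AS nchar_euro_bytes,  -- still 2 bytes
           UNICODE(N'€')    AS euro_code_point    -- 8364 (U+20AC)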

    Treat us better Redmond, don't worry, we can take it!

    There are no special teachers of virtue, because virtue is taught by the whole community.
    --Plato

  • MarkusB (6/5/2009)


    I chose nchar(1) because the question only says that the value won't be NULL or empty; there is no limitation on the characters used. Also, depending on your collation, a character may be covered by char under one collation but require nchar under another.

    Try this:

    CREATE TABLE #Chars(
        c1 char(1)  COLLATE SQL_Latin1_General_Cp850_BIN,
        c2 nchar(1) COLLATE SQL_Latin1_General_Cp850_BIN,
        c3 char(1)  COLLATE SQL_Latin1_General_CP1_CI_AS,
        c4 nchar(1) COLLATE SQL_Latin1_General_CP1_CI_AS)

    INSERT INTO #Chars
    VALUES(CHAR(0128), CHAR(0128), CHAR(0128), CHAR(0128))

    SELECT * FROM #Chars

    DROP TABLE #Chars

    As you can see the nchar value is consistent, while the char value depends on the collation.

    I agree, I think the answer is wrong.


    /Håkan Winther
    MCITP:Database Developer 2008
    MCTS: SQL Server 2008, Implementation and Maintenance
    MCSE: Data Platform

  • This is one of the worst QotD's of this year. Not because of the question (which is fair [*]), but because of available answer options, the answer considered correct, and the explanation.

    [*] (though in all fairness, I must say that no such thing as an "empty space" exists - the author appears to confuse the empty string with the space character - and that I dislike the implied suggestion that NULL and '' (empty string) are in the same category).

    The only correct answer based on the question as stated would be "either char(1) or nchar(1), depending on the range of characters to be stored". This one was not available, leaving either one of the two options char(1) or nchar(1) as the next best.

    The explanation contains so many false statements that it actually hurts me to quote it here, so I won't. I'll just debunk them in their order of appearance.

    1) The best data type is not always the one that uses the fewest bytes - though I admit that in 99% of all cases, size is the most important factor when choosing among data types that all fit the logical model and the constraints.

    2) Varchar(1) would not allocate 1 byte. All varchar data allocates 2 bytes for the length indication plus 1 byte per character actually used. That would be 3 bytes in this specific case.

    3) Even if there's no data in the variable, varchar is still not a better option. You'd still have the 2 bytes for the length, as opposed to the single byte for char(1).

    4) A char(1) is not padded with space when a NULL is stored. It remains NULL, which is distinct from both '' (empty string) and ' ' (single space). It is true that an empty string will be padded (and thus transformed into a single space) when stored in char(1), but neither empty string nor single space should ever be used to represent "no data"; that's what NULLs were invented for.
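
    A minimal repro of point 4, assuming nothing beyond a temp table created for the test:

    CREATE TABLE #PadTest(c char(1) NULL)

    INSERT INTO #PadTest VALUES(NULL)  -- stays NULL, no padding happens
    INSERT INTO #PadTest VALUES('')    -- the empty string is padded to a single space

    SELECT c,
           DATALENGTH(c) AS bytes_stored  -- NULL for the first row, 1 for the padded space
    FROM #PadTest

    DROP TABLE #PadTest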

    I must applaud VM for submitting a lot of questions (it appears as if he/she is the only QotD author of late), but this one is really way below the bar.

    Steve, please change the question and the explanation. Many people will visit the site or come here through Google and take the explanation here as correct. They won't check its correctness by wading through 11 pages of forum discussion, but will deem it correct since it's published on this high-profile site.


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • Cliff Jones (6/5/2009)


    I don't think any of the answers that include varchar or nvarchar can be correct.

    But the question provoked an interesting discussion.

    I agree, there is an overhead of 2 bytes for variable-length types, and depending on whether you need Unicode or not, the choice is between char(1) and nchar(1).

    /Håkan Winther
    MCITP:Database Developer 2008
    MCTS: SQL Server 2008, Implementation and Maintenance
    MCSE: Data Platform

  • My answer is Nchar(1).

    I suspect the author of the question of speaking (and thinking) in English. We who use accented letters prefer Unicode 🙂
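
    A small illustration (the collation and the Cyrillic character are chosen only as examples): forcing a character that isn't in the column's code page into char silently replaces it, while nchar keeps it intact:

    SELECT CAST(N'Д' COLLATE SQL_Latin1_General_CP1_CI_AS AS char(1)) AS char_result,  -- '?'
           CAST(N'Д' AS nchar(1))                                     AS nchar_result  -- 'Д'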



    See, understand, learn, try, use efficient
    © Dr.Plch

