
So I am generating a JSON file from SQL Server 2016 using FOR JSON.

I have wrapped the queries in JSON_QUERY to prevent escape characters from appearing before the generated double quotes ("). This works correctly, except that the escapes still show up before the forward slashes (/) in the formatted dates.

One thing to note is that I am converting the datetime values in SQL using CONVERT(VARCHAR, [dateEntity], 101).

An example (this is a subquery):

JSON_QUERY((
    SELECT [LegacyContactID]
          ,[NameType]
          ,[LastName]
          ,[FirstName]
          ,[Active]
          ,[Primary]
          ,CONVERT(VARCHAR, [StartDate], 101) AS [StartDate]
          ,CONVERT(VARCHAR, [EndDate], 101) AS [EndDate]
    FROM [LTSS].[ConsumerFile_02_ContactName]
    WHERE [LegacyContactID] = ContactList.[LegacyContactID]
    FOR JSON AUTO, WITHOUT_ARRAY_WRAPPER
)) AS ContactName

And the result will be

"ContactName": {
    "LegacyContactID": "123456789",
    "NameType": "Name",
    "LastName": "Jack",
    "FirstName": "Apple",
    "Active": true,
    "Primary": true,
    "StartDate": "04\/01\/2016",
    "EndDate": "04\/30\/2016"
}

I have the whole query wrapped in JSON_QUERY to eliminate the escaping, but it still escapes the forward slashes in the dates.

I have also passed the dates as strings, without the conversion, and still get the same results.
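For what it's worth, here is a minimal repro (with a hypothetical literal, not my real data) showing that the escaping happens for any string containing a slash, not just converted dates:

SELECT (SELECT '04/01/2016' AS [StartDate]
        FOR JSON PATH, WITHOUT_ARRAY_WRAPPER) AS EscapedJson;
-- Returns: {"StartDate":"04\/01\/2016"}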

Any insight?

3 Answers


One solution is to avoid the "/" in the dates altogether by using the "right" JSON date format:

SELECT JSON_QUERY((
    SELECT TOP 1 object_id, create_date
    FROM sys.tables
    FOR JSON AUTO, WITHOUT_ARRAY_WRAPPER
))

Result

{"object_id":18099105,"create_date":"2017-08-14T11:19:22.670"}

UPDATED: Ah, yes, escape and CRLF characters.

Unless your environment shows the offending characters, you will be forced to manually copy and paste from the result sets and replace the strings from there.

Now, what you mention in your recent update got me wondering why you feel the need to transform your data in the first place. DATES do not have formatting by default, so unless JSON is incompatible with handling SQL dates, there is really no need to transform this data inside JSON if your target tables enforce the correct format.

So unless there is still a concern for the truncation of data, from an ETL perspective there are two ways you can accomplish this:

1 - USE STAGING TABLES

  • Staging tables can either be temporary tables, CTEs, or actual empty tables you use to extract, cleanse, and transform your data (a minimal sketch follows this list).
  • Advantages: You are only dealing with the rows being inserted, do not have to be concerned with constraints, and can easily fix any corrupt or unstructured aspects of your data OUTSIDE of JSON.
  • Disadvantages: Staging tables may mean more objects in your database, depending on how repetitive the need for them is, so finding better, consistently structured source data is preferable.
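A minimal sketch of the staging approach (the staging table and its columns are hypothetical, the target table is trimmed to three columns for brevity, and TRY_CONVERT assumes SQL Server 2012+):

CREATE TABLE #ContactNameStage
(
    LegacyContactID VARCHAR(20),
    StartDate       VARCHAR(20), -- raw text, possibly dirty
    EndDate         VARCHAR(20)
);

-- ...bulk load the extract into #ContactNameStage here...

INSERT INTO [LTSS].[ConsumerFile_02_ContactName] ([LegacyContactID], [StartDate], [EndDate])
SELECT LegacyContactID
      ,TRY_CONVERT(DATE, StartDate, 101) -- NULL when the string is not a valid MM/DD/YYYY date
      ,TRY_CONVERT(DATE, EndDate, 101)
FROM #ContactNameStage
WHERE TRY_CONVERT(DATE, StartDate, 101) IS NOT NULL; -- reject rows that fail to parse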

2 - ALTER YOUR TABLE TO USE STRINGS

  • Here you enforce the business rules by cleansing the data AFTER insertion into the persistent table.
  • Advantages: You save on space, simplify the cleansing process, and can still use indexes. SQL Server is pretty efficient at parsing DATE strings, and you can still take advantage of EXISTS() and possibly SARGable predicates to check for non-dates when running your insert (see the validation sketch after the code below).
  • Disadvantages: You lose a primary integrity check on your table while the dates are stored as strings, opening up the possibility of dirty data being exposed. Your UPDATE statements will be forced to scan the entire table, which can drag on performance.
JSON_QUERY((
    SELECT [LegacyContactID]
          ,[NameType]
          ,[LastName]
          ,[FirstName]
          ,[Active]
          ,[Primary]
          ,[StartDate] -- already a date type, so no CONVERT is needed
          ,[EndDate]
    FROM [LTSS].[ConsumerFile_02_ContactName]
    WHERE [LegacyContactID] = ContactList.[LegacyContactID]
    FOR JSON AUTO, WITHOUT_ARRAY_WRAPPER
)) AS ContactName
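If you do keep the dates as strings, a quick validation query can flag values that will not parse, so they can be cleansed after insertion. A hedged sketch (it assumes the columns were altered to strings, which is not the case in the question as posted):

SELECT [LegacyContactID], [StartDate]
FROM [LTSS].[ConsumerFile_02_ContactName]
WHERE [StartDate] IS NOT NULL
  AND TRY_CONVERT(DATE, [StartDate], 101) IS NULL; -- strings that do not parse as MM/DD/YYYY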
  • This does not work because the CONVERT function generates a date in the format 'MM/DD/YYYY', and then FOR JSON AUTO, WITHOUT_ARRAY_WRAPPER adds the escape sequences to the slashes. I've tried applying JSON_QUERY around the select statements, but it does not help with the dates.
    – Chris
    Commented Jul 1, 2016 at 14:48

I have run into some similar issues. Without going into a ton of detail, I believe this is part of the reason the new JSON functionality isn't getting a ton of adoption yet, from what I can see.

I've added a couple of comments on MSDN about this, and a tweet:

"Why can't the auto-escaping of ALL strings be turned off with a flag???" - https://msdn.microsoft.com/en-us/library/dn921889.aspx

"Almost there, but not quite yet..." - https://msdn.microsoft.com/en-us/library/dn921882.aspx

"Anyone else frustrated with forced auto-escaping of all JSON in @SQLServer / @AzureSQLDB? (see link for my comments) msdn.microso…" - https://twitter.com/brian_jorden/status/844621512711831552

If you come across a method or way to deal with this, I would love to hear about it in this or any of those threads. Good luck...

  • The complication came from a third party needing data from me. I would send over the data, which included escaped characters, but the file upload on their part was failing. They blamed it on the escape characters, but come to find out it had nothing to do with that; rather, their parsing of the file was incorrect. So I didn't need to worry about removing the escape characters.
    – Chris
    Commented Mar 29, 2017 at 15:19
  • Does this link help?
    – Fawad Raza
    Commented Apr 7, 2017 at 9:56
  • @Fwd079 that actually gets a bit closer, although I would argue it is a terribly convoluted way to deal with something already identified in their "solve common issues" section. The super hacky workaround I've been using is to wrap my JSON result in replace(some_json, '\/', '/') (sketched below). I've updated my comments on the now-replaced documentation page, because my old comment is gone after they moved things over to here: learn.microsoft.com/en-us/sql/relational-databases/json/… Commented Aug 4, 2017 at 20:16
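For reference, a sketch of that REPLACE workaround applied to the question's table (hedged: the column list is trimmed, and T-SQL string literals do not treat the backslash as an escape character, so '\/' is the literal two-character sequence FOR JSON emits):

SELECT REPLACE(
    (SELECT [LegacyContactID]
           ,CONVERT(VARCHAR(10), [StartDate], 101) AS [StartDate]
     FROM [LTSS].[ConsumerFile_02_ContactName]
     FOR JSON AUTO),
    '\/', '/') AS CleanJson; -- undoes only the forced \/ escaping; "/" is legal unescaped in JSON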
