I am working on a legacy ASP.NET WebForms app and I need to retrofit a new feature into it. I am using a generated DataSet (created with VS 2013) to bridge the gap between ReportViewer and SQL Server (local reports, .rdlc).
Everything works nicely except one thing: float conversions. On two Windows 8.1 en-US systems, -10.5 (one of the values in a column) is shown on the report as -10.5, but on the server (Win 7 SP1, es-CO) it displays as -105, even though the query returns -10.5 on the server's local SQL instance.
I've checked the generated code for the dataset and it casts an object from the data rows straight to double, so I am assuming SQL Server already handles the conversion (via a CAST instruction on each column).
Is there anything I can do about it? It is worth mentioning that all requests to the server (the Win 7 machine) came from one Win 8.1 en-US machine.
Status update: I suspect (not completely sure) that the fault is in the conversion from SQL to CLR types, since marking the report column as String yields the same result.
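For context, the symptom matches a culture-sensitive number round-trip: in the es-CO culture, "," is the decimal separator and "." is the thousands separator, so the string "-10.5" parses as -105. Here is a minimal sketch of that behavior (shown in Java for a self-contained repro; the same separators apply to .NET's es-CO CultureInfo, so a `double.Parse("-10.5", esCoCulture)` would behave the same way; the class and variable names are just for illustration):

```java
import java.text.NumberFormat;
import java.util.Locale;

public class LocaleParseDemo {
    public static void main(String[] args) throws Exception {
        // en-US: "." is the decimal separator, so "-10.5" parses as -10.5
        NumberFormat us = NumberFormat.getInstance(Locale.US);
        System.out.println(us.parse("-10.5")); // -10.5

        // es-CO: "," is the decimal separator and "." groups thousands,
        // so "-10.5" is read as the integer -105
        NumberFormat co = NumberFormat.getInstance(new Locale("es", "CO"));
        System.out.println(co.parse("-10.5")); // -105
    }
}
```

If this is indeed the cause, the value is being formatted to a string under one culture and parsed back under another somewhere between SQL Server and the report.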