I'm creating a data aggregator/report generator. My data collectors will output to a .csv file, so my ultimate goal is to be able to dump the outputs to a specific location, then open my report generator, hit data refresh, and have the only manual work in the report be documenting the collection process. I have to build this to be able to flex from 5 instruments recording to anywhere from two to five collectors.
The headache I'm running into is that the data collectors record data every second, but when I export to .csv the timestamp drops the seconds portion. Is there a way to use a formula inside Power Query to look at the minute and fake in a second value, or is there some other approach I'm overlooking?
Another aspect I'm trying to work through is that the instrument that runs for the least amount of time controls the length of the study.
Thanks all for the posts. My original post came at the end of a Friday spent trying to troubleshoot and figure out why my report generator was crashing, when I had started the day expecting a smooth process and being able to finish writing and validating the SOP that this report would be a part of. After taking the weekend to get a clear head and a fresh perspective, and digesting some information from a vendor, I've got everything working correctly this morning.
The core problem was that the .csv files had a timestamp format of m/d/yyyy h:mm, and that was the information Power Query was digesting, so the :ss field was populated with 00 when it changed the type to date/time. Once I modified the .csv to show the full timestamp including seconds, everything functioned as expected.
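For anyone hitting the same thing, here is a minimal sketch of the relevant query steps (the file path and the "Timestamp" column name are assumptions, not my actual setup). With source text like 4/25/2025 9:15 the type change produces a value ending in :00; once the export writes 4/25/2025 9:15:37, the same step keeps the seconds.

let
    // Hypothetical path - point this at wherever the collectors drop their files
    Source = Csv.Document(File.Contents("C:\Exports\collector1.csv"), [Delimiter=","]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
    // If the text in "Timestamp" carries no seconds, this step fills :ss with 00;
    // if the .csv already includes seconds, they come through unchanged
    Typed = Table.TransformColumnTypes(Promoted, {{"Timestamp", type datetime}}, "en-US")
in
    Typed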
Please show an example of the raw csv data and the M code you are having an issue with.
Make sure you also show (a screenshot of) the step where you first see the timestamp coming in, i.e. right after the call to Csv.Document().
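At that step every column is still plain text, so you can see exactly what string the csv delivered before any type conversion. Something along these lines (file path and delimiter are assumptions on my part):

let
    Source = Csv.Document(File.Contents("C:\Exports\collector1.csv"), [Delimiter=",", Encoding=65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars=true])
    // The timestamp column is still a raw string here - if the seconds are
    // already missing at this point, the loss happened in the export,
    // not in Power Query
in
    Promoted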
Hi @JwEvans ,
You're on the right track with your report generator setup using Power Query and CSV inputs, especially for scaling between different numbers of data collectors. Regarding the timestamp issue, since your CSV export drops the seconds from the timestamps, Power Query won’t be able to retrieve that lost data unless it’s recorded elsewhere.
Unfortunately, Power Query can’t "fake" or recreate the exact second values unless there’s a consistent pattern or an identifiable sequence in the data (like a known sampling rate). If each row represents one second of data and the data is sorted chronologically, you might be able to add an index column and calculate the missing seconds by assuming uniform intervals.
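As a rough sketch of that idea, assuming a strict one-reading-per-second rate with no gaps, chronological row order, and a truncated "Timestamp" column (all assumptions on my part), you could number the rows within each minute and add that count back as seconds:

let
    Source = Csv.Document(File.Contents("C:\Exports\collector1.csv"), [Delimiter=","]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
    Typed = Table.TransformColumnTypes(Promoted, {{"Timestamp", type datetime}}),
    // Group the rows that share the same truncated (seconds = 00) timestamp
    // and give each row its position within that minute
    Grouped = Table.Group(
        Typed,
        {"Timestamp"},
        {{"Rows", each Table.AddIndexColumn(_, "SecondOffset", 0, 1, Int64.Type), type table}}
    ),
    // Expand your measurement columns here as well, alongside "SecondOffset"
    Expanded = Table.ExpandTableColumn(Grouped, "Rows", {"SecondOffset"}),
    // Rebuild the full timestamp on the assumption of one reading per second
    WithSeconds = Table.AddColumn(
        Expanded,
        "FullTimestamp",
        each [Timestamp] + #duration(0, 0, 0, [SecondOffset]),
        type datetime
    )
in
    WithSeconds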
However, this approach would only work reliably if no data points are missing or delayed.
As for the study length being dictated by the shortest-running instrument, you'll likely want to build logic in Power Query to filter or trim the data from all collectors to match the duration of the shortest one. This could involve identifying the earliest end time across all instruments and filtering accordingly, as sketched below.
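For the trimming, a hedged sketch along these lines could work (Collector1..Collector3 and the FullTimestamp column are placeholder names for your collector queries):

let
    // Placeholder query names - substitute the queries that load each collector
    CollectorTables = {Collector1, Collector2, Collector3},
    // The earliest "last reading" across all collectors marks the study end
    StudyEnd = List.Min(
        List.Transform(CollectorTables, each List.Max(Table.Column(_, "FullTimestamp")))
    ),
    // Keep only the rows each collector recorded up to that common end point
    Trimmed = List.Transform(
        CollectorTables,
        each Table.SelectRows(_, (row) => row[FullTimestamp] <= StudyEnd)
    )
in
    Trimmed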
but when I export to .csv the timestamp drops the second portion.
That should not happen. Is the timestamp in UTC, and is it in ISO-8601 format?