I am trying to automate the import of a CSV file into a SQL database. I looked online for help, but all I could find were SQL queries that do the import itself. The problem I am facing is that none of them do the validation I need on a few fields, e.g. a date column: if the date is greater than GETDATE(), throw an error.
When searching, of course, you more than likely will not find everything you need in one place. You have to break that search into pieces to find the other hooks.
Complete validation and error handling are often not covered in the links / resources you'd hit. That part is left to you, since the authors are being as generic as possible to allow for the broadest use cases. Many of the hits do have a Q&A section, though, where you can ask the author to elaborate on a specific thing you have in mind.
Even here, you don't clarify what you mean by this...
> The problem I am facing is that none of them do the validation I need on a few fields, e.g. a date column: if the date is greater than GETDATE(), throw an error.
- Where are these CSVs coming from?
- Are they always the same format?
- Are you saying a new CSV could have new columns / fields that are not part of the target, and thus you have to alter that target DB to host the new data columns?
- Are the fields in the CSVs always the same, and the data in them (e.g. are the dates always formatted the same way)?
- Are they always in the same language?
- You don't show a sample of the data you are working with.
- You don't show what the DB looks like.
- You don't show any code you've tried and where it fails or is not working at all.
- What can you show about the validation / error handling you have in place thus far?
These are just a few things that come to mind as I read this.
What is the reason you believe SQL queries would not work for you (validation / error handling aside)?
You can leverage SQL queries in PoSH directly using sqlcmd or SQLPS.
Many of MS's own docs point to this, as do many SQL / PowerShell MVPs.
A simple web search using your search engine of choice shows this:
'powershell import csv data into sql database'
Resources (just in case in your search you did not come across them):
Four Easy Ways to Import CSV Files to SQL Server with PowerShell
Importing CSV files into SQL Server
Windows PowerShell has built in support for creating CSV files by using the Export-CSV cmdlet. However, the creation of a CSV file is usually only a short stop in an overall process that includes loading the file into another system. In this post, we'll look at a few scripted-based approaches to import CSV data into SQL Server. Note: SQL Server includes a component specifically for data migration called SQL Server Integration Services (SSIS), which is beyond the scope of this article.
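To make the scripted approach the excerpt describes concrete, here is a minimal sketch of the read-CSV-then-bulk-insert shape. It is in Python with the stdlib sqlite3 module standing in for SQL Server, and the table/column names (Employees, EmpID, Name, HireDate) are made up for illustration; against a real SQL Server instance you would use a proper driver and connection string instead.

```python
import csv
import io
import sqlite3

# Hypothetical sample standing in for a fixed-format CSV file.
sample = """EmpID,Name,HireDate
101,Alice,2020-01-15
102,Bob,2021-06-01
"""

conn = sqlite3.connect(":memory:")  # stand-in for a SQL Server connection
conn.execute(
    "CREATE TABLE Employees (EmpID INTEGER PRIMARY KEY, Name TEXT, HireDate TEXT)"
)

# Read the CSV into typed row tuples.
reader = csv.DictReader(io.StringIO(sample))
rows = [(int(r["EmpID"]), r["Name"], r["HireDate"]) for r in reader]

# executemany gives a set-based insert, analogous to a bulk load.
conn.executemany("INSERT INTO Employees VALUES (?, ?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM Employees").fetchone()[0]
print(count)  # 2
```

The same two steps (parse the file into rows, hand the rows to a set-based insert) are what the PowerShell articles below do with Import-Csv and SqlBulkCopy.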
Fast CSV Import in PowerShell to SQL Server
I recently had to write some simple scripts for a proof of concept. These scripts use PowerShell to import and export CSV files from SQL Server. The scripts were simple to write and maintain but were slow. It turns out there are a few ways to improve CSV import and export performance in PowerShell.
Before I landed on PowerShell I considered a few other approaches to get this done.
- SSIS. This gave me all the performance and flexibility around CSV formats that I wanted. But this would have been their first SSIS package, and much of the functionality would have been embedded in those SSIS packages. I was looking for something that was more explicit to read.
- SQL Server BULK INSERT or BCP. This was more scriptable, but getting the format file right proved to be a challenge. The import file included quotes around the values, but only if there was a comma inside the string.
High-Performance Techniques for Importing CSV to SQL Server using PowerShell
Importing into SQL Server
Two of the fastest ways to import data are to use bcp or SqlBulkCopy. BCP is finicky and requires a separate download. My intention is to create a script that can be used on any machine running PowerShell v3 and above, so these scripts will be using SqlBulkCopy. The catch is that SqlBulkCopy.WriteToServer() requires a DataTable, DataRow, or IDataReader as input. This means your plain-text CSV data must be transformed into one of these objects prior to being passed to WriteToServer().
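The transformation step that excerpt describes (plain-text CSV reshaped into an object the bulk writer can consume, row by row rather than all at once) has the same shape in any language. A hedged Python sketch, using a generator so rows stream to the consumer instead of being buffered; the column names are made up for illustration:

```python
import csv
import io

def csv_rows(handle):
    """Yield one typed tuple per CSV record, so a bulk-writing
    consumer can stream rows instead of buffering the whole file."""
    for row in csv.DictReader(handle):
        yield (int(row["EmpID"]), row["Name"], row["HireDate"])

sample = io.StringIO(
    "EmpID,Name,HireDate\n"
    "101,Alice,2020-01-15\n"
    "102,Bob,2021-06-01\n"
)
rows = list(csv_rows(sample))
print(rows[0])  # (101, 'Alice', '2020-01-15')
```

This is the Python analog of feeding SqlBulkCopy.WriteToServer() an IDataReader: the writer pulls rows as it needs them, which is what keeps memory flat on large files.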
Import Large CSVs into SQL Server Using PowerShell
I wrote this easily reusable PowerShell script which allows for the import of an unlimited number of rows. For the data I’m working with, this script takes about 31 seconds to insert 1 million rows on an indexed table!
Thank you so much for taking the time in responding to my query. Much appreciated.
Currently I have a CSV file which has a fixed format.
I have created the SQL table in the same format as the CSV file, and I am currently using SQL jobs to upload the CSV into the database; that works for me.
I understand I have not shown any code I have worked on, as I do not have anything in place yet. But frankly, I am looking for options to do some validation, such as date validation and checking whether the data is already present in the table (EmpID). Some advice towards achieving that would greatly help.
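For the two checks named here (reject a date later than today, reject an EmpID already present), here is a minimal sketch of the logic in Python with only the stdlib. The column names EmpID/HireDate and the ISO date format are assumptions; in practice the set of existing IDs would come from a query against the target table rather than being passed in, and the same structure translates directly to a PowerShell loop over Import-Csv:

```python
import csv
import io
from datetime import date

def validate(handle, existing_ids):
    """Split CSV records into (good_rows, errors):
    reject future dates and duplicate EmpIDs."""
    good, errors, seen = [], [], set(existing_ids)
    for n, row in enumerate(csv.DictReader(handle), start=2):  # line 1 is the header
        emp_id = row["EmpID"]
        hire = date.fromisoformat(row["HireDate"])
        if hire > date.today():          # the "greater than GETDATE()" check
            errors.append(f"line {n}: HireDate {hire} is in the future")
        elif emp_id in seen:             # EmpID already present in the table
            errors.append(f"line {n}: EmpID {emp_id} already exists")
        else:
            seen.add(emp_id)
            good.append(row)
    return good, errors

sample = io.StringIO(
    "EmpID,Name,HireDate\n"
    "101,Alice,2020-01-15\n"
    "102,Bob,2999-01-01\n"
    "101,Alice,2020-01-15\n"
)
good, errors = validate(sample, existing_ids=set())
print(len(good), len(errors))  # 1 2
```

Validating before the insert, and collecting errors instead of stopping at the first one, lets you load the clean rows and report the bad ones in a single pass.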