Those who have read my previous posts will know I was planning to use the SSIS SharePoint connectors to extract data from SharePoint to SQL Server. All was going swimmingly until I deployed them onto the server, where for some reason (and I spent a day scratching my head over it) they simply did not work. Unfortunately I have limited debugging resources on that server, so I had to make a risk-based decision: do I add debugging, find out why it fails and fix it, or do I try another solution? I have never struggled with a problem for a whole day like that before, and at 5 in the morning (whilst asleep) I had a blinding idea!
The basic idea is to use the ADO.NET SqlBulkCopy class to insert the data. The main challenge is making this configurable, so that any SharePoint list can be inserted into any SQL Server table. To do so I created a number of classes that store the required mapping data. They are described below; at a very basic level they mimic the SharePoint hierarchy.
| Class | Properties |
| --- | --- |
| SharepointSite | string SiteURL, string AuditList, List<SharepointList> Lists |
| SharepointList | string ListName, string SQLTableName, List<SharepointField> Fields |
| SharepointField | string FieldName, string SQLFieldName |
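Sketched out in C#, the hierarchy might look like the following. The class and property names are taken from the table above; the use of public auto-properties is my assumption (XmlSerializer needs public types with parameterless constructors and public read/write members):

```csharp
using System.Collections.Generic;

// Mapping classes mirroring the SharePoint hierarchy: a site holds lists,
// a list holds fields, and each level carries its SQL counterpart's name.
public class SharepointSite
{
    public string SiteURL { get; set; }
    public string AuditList { get; set; }
    public List<SharepointList> Lists { get; set; }
}

public class SharepointList
{
    public string ListName { get; set; }
    public string SQLTableName { get; set; }
    public List<SharepointField> Fields { get; set; }
}

public class SharepointField
{
    public string FieldName { get; set; }
    public string SQLFieldName { get; set; }
}
```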
Then, using the method below, we can insert the data into the database. SharepointList is the mapping object from the model above, the DataTable holds the SharePoint list data (as a table from a DataSet), and the connection string is self-explanatory.
private static void InsertIntoDatabase(SharepointList sharepointList, DataTable data, string connectionString)
{
    using (var connection = new SqlConnection(connectionString))
    using (var bulkCopy = new SqlBulkCopy(connection)
    {
        DestinationTableName = sharepointList.SQLTableName
    })
    {
        // Map each SharePoint field name to its SQL column name.
        foreach (var field in sharepointList.Fields)
        {
            bulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping
            {
                SourceColumn = field.FieldName,
                DestinationColumn = field.SQLFieldName
            });
        }
        connection.Open();
        bulkCopy.WriteToServer(data);
    }
}
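The post does not show how the DataTable itself is obtained. One possibility (my assumption, not the original code): inside a SharePoint timer job the server object model is available, and SPListItemCollection.GetDataTable() flattens a list's items into a DataTable suitable for the method above:

```csharp
// Sketch: fetch a list's data via the SharePoint server object model.
// Assumes this runs on the SharePoint server (e.g. in a timer job).
using (var site = new SPSite(sharepointSite.SiteURL))
using (var web = site.OpenWeb())
{
    SPList list = web.Lists[sharepointList.ListName];

    // Flatten the list items into a DataTable whose columns are the
    // internal field names, then bulk-copy it across.
    DataTable data = list.Items.GetDataTable();
    InsertIntoDatabase(sharepointList, data, connectionString);
}
```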
The clever bit is using serialization for the configuration. By serializing and deserializing the mappings to and from XML, we can now use any combination of SharePoint lists and SQL tables. The code below shows how to do this. You will need to mark the classes as serializable; see here.
private static SharepointSite GetSharepointSitefromSettings()
{
    // The mapping file path comes from app settings. ConfigurationManager
    // supersedes the obsolete ConfigurationSettings class.
    var mappingFile = ConfigurationManager.AppSettings["MappingFile"];
    var ser = new XmlSerializer(typeof(SharepointSite));
    using (var xmlReader = new XmlTextReader(mappingFile))
    {
        return (SharepointSite)ser.Deserialize(xmlReader);
    }
}
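Going the other way is just as easy, and is a convenient way to generate the mapping file in the first place rather than hand-writing the XML. This helper (my addition, not from the original post) serializes a populated SharepointSite to disk:

```csharp
// Sketch: write a SharepointSite object model out as the mapping XML file.
private static void SaveMappingFile(SharepointSite site, string path)
{
    var ser = new XmlSerializer(typeof(SharepointSite));
    using (var writer = new StreamWriter(path))
    {
        ser.Serialize(writer, site);
    }
}
```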
And this is a snippet of the mapping XML:
<SharepointSite xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <SiteURL>http://srvJupiter</SiteURL>
  <Lists>
    <SharepointList>
      <ListName>Projects</ListName>
      <SQLTableName>ProjectsStaging</SQLTableName>
      <Fields>
        <SharepointField>
          <FieldName>Title</FieldName>
          <SQLFieldName>Title</SQLFieldName>
        </SharepointField>
        <SharepointField>
          <FieldName>ID</FieldName>
          <SQLFieldName>ID</SQLFieldName>
        </SharepointField>
      </Fields>
    </SharepointList>
  </Lists>
</SharepointSite>
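Tying the pieces together, the overall flow could look something like this. GetSharepointSitefromSettings and InsertIntoDatabase are the methods shown earlier; GetListData is a hypothetical helper standing in for however the list data is fetched (e.g. via SPList.Items.GetDataTable()), and the connection-string name is an assumption:

```csharp
// Sketch of the driving loop: load the mapping file once, then copy
// each configured SharePoint list into its SQL staging table.
private static void ImportAllLists()
{
    // "Staging" is a hypothetical connection-string name.
    var connectionString =
        ConfigurationManager.ConnectionStrings["Staging"].ConnectionString;

    var site = GetSharepointSitefromSettings();
    foreach (var list in site.Lists)
    {
        // GetListData is a placeholder for fetching the list's rows.
        DataTable data = GetListData(site.SiteURL, list.ListName);
        InsertIntoDatabase(list, data, connectionString);
    }
}
```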
Conclusion
I hope this gives an insight into how I solved my problem.
Obviously this is not the complete solution, as that is too large to cover in detail:
- I made this a SharePoint timer job, which requires an OWSTimer.exe.config file for the settings. I stored the mappings in a separate file, which lets you keep the XML format.
- The timer job also runs stored procedures before and after the import, which allows data transformations to happen.
- Audit data is written to a SharePoint list as well.
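The pre- and post-import stored procedures mentioned above can be invoked with a few lines of plain ADO.NET. The procedure names here are hypothetical examples:

```csharp
// Sketch: execute a stored procedure by name, e.g. before or after the import.
private static void RunProcedure(string connectionString, string procedureName)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(procedureName, connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        connection.Open();
        command.ExecuteNonQuery();
    }
}

// Usage (hypothetical procedure names):
//   RunProcedure(connectionString, "usp_PreImport");
//   ... bulk copy the lists ...
//   RunProcedure(connectionString, "usp_PostImport");
```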
But there is enough code here to create a proof of concept.