SQL Agent running SSIS package oddity

Encountered a weird one today and I blog in the hope that some whizz out there will know the reason why.

I have a fairly simple SSIS package that takes a flat file and imports it into a database table.  The control flow looks like this:

1. Find all files (pipe delimited) that need to be processed. Files are located on a network folder.

2. Call a SP that creates the staging table or, if it already exists, removes all rows.

3. Call a DFT that imports the file into the staging table (see image below).

4. Call a SP that performs an UPSERT from the staging table to the main table.

5. File system task that creates an archive folder.

6. Place the processed file into the archive folder.
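The UPSERT in step 4 could be sketched roughly like this – a minimal MERGE, assuming SQL Server 2008 or later and entirely hypothetical table and column names (not the ones from the actual package):

```sql
-- Hypothetical sketch of a staging-to-main UPSERT using MERGE (SQL Server 2008+).
-- Table and column names are illustrative only.
MERGE dbo.MainTable AS target
USING dbo.StagingTable AS source
    ON target.BusinessKey = source.BusinessKey
WHEN MATCHED THEN
    UPDATE SET target.SomeValue = source.SomeValue,
               target.UpdatedOn = GETDATE()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (BusinessKey, SomeValue, UpdatedOn)
    VALUES (source.BusinessKey, source.SomeValue, GETDATE());
```

On SQL Server 2005 the same thing would be an UPDATE ... FROM followed by an INSERT ... WHERE NOT EXISTS.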

For testing purposes I have readied a single flat file with one row of data.

Now the package executes fine on my local machine within BIDS and also via Integration Services up on my SQL server – all steps execute, data is placed in the database and the file is archived.

The weirdness occurs when I call the package as a step within a SQL agent job. The job runs successfully (SP called and the file is archived) but no data is placed into the staging table.

Strange!  Must be a permission thing…

The user assigned to the SQL Agent service (which the job runs under) has all the relevant permissions, as proved by the fact that the database is called and the file is moved.  It just seems to skip retrieving the data from the file!

The package logs its progress, and I can see the following when it is run interactively:

component "OLE DB Destination" (285) wrote 1 rows.

and this when it is run under the agent:

component "OLE DB Destination" (285) wrote 0 rows.

This tells me that the DFT is being executed all the way to the “OLE DB Destination” task.

I have logged into the server as the user assigned to the SQL Agent service and run the package again from IS and via the agent job, but the outcome was still the same!!!
The fact that running the package interactively works under this profile suggests that the account is OK but that there is a subtle issue with this user under SQL Agent.

Any ideas out there?


Update: 2010-01-20

Finally managed to resolve this.  (Many thanks to bluedog67 for some handy diagnosis tips.)

Turns out that the issue was with the protection level of the package and the use of XML package configuration.

Early in my investigation into the problem I looked at the ProtectionLevel and it was set to “DontSaveSensitive”.  I also set the property to this when I imported the package up to the server (just to be sure!).
What I didn’t realise was that when I created the XML package configuration (to store a connection string and a folder location), the ProtectionLevel must still have been set to the default “EncryptSensitiveWithUserKey”, as there were specifics denoting this when I looked at the XML (Author, Key etc.).

When I removed the package configuration and deployed the package again, it all worked!!!
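If you hit something similar, the protection level can also be forced at deployment time with the dtutil utility – a sketch only (the file name is hypothetical; the trailing 0 is the DontSaveSensitive encryption level):

```
rem Re-encrypt a file-based package in place with ProtectionLevel = DontSaveSensitive (level 0)
dtutil /file MyPackage.dtsx /encrypt file;MyPackage.dtsx;0
```

That way the package on disk definitely matches what you think the designer saved.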

A strange issue but one to look out for in the future.

This article gives some pointers on the issue.


My fav mince pie recipe….

With Christmas in mind…

Jamie Oliver’s Mince Pies


SQLBits Session Review: Designing for simplification. Rob Farley

I turned up expecting a talk on SQL versus SQLCLR (I thought this might unearth some good arguments on why I should ever consider using the latter) but this had been replaced by Rob’s Designing for simplification, and it turned out to be one of my favourite sessions from SQLBits 09.

After starting the talk by demonstrating a true gift for languages by speaking a little, but near perfect, Welsh (he must have had some home tutoring!), Rob took us through a number of techniques that allow the query optimiser to do its job and magically remove those pesky expensive plans.

Rob discusses some of these on his blog:-

Redundant joins

Understanding unique indexes and group by
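To give a flavour of the redundant joins idea: given a trusted foreign key on a NOT NULL column, the optimiser can eliminate a join completely when no columns from the joined table are actually used.  A hypothetical example:

```sql
-- Hypothetical schema: Orders.CustomerId is NOT NULL with a trusted FK
-- to Customers.CustomerId.  Because the FK guarantees exactly one
-- matching customer per order, and no Customers columns are selected,
-- the optimiser can remove the join from the plan entirely.
SELECT o.OrderId, o.OrderDate
FROM dbo.Orders AS o
INNER JOIN dbo.Customers AS c
    ON c.CustomerId = o.CustomerId;
-- The plan ends up the same as: SELECT OrderId, OrderDate FROM dbo.Orders;
```

This matters in layered designs because a generic view may join in tables that a particular consuming query never touches.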

I have recently been involved in a project where the design of a reporting database consisted of a layered structure of queries (SPs, views, functions).  His design tips challenged a number of our ideas and meant I took away many valuable techniques from this quick, hour-long session.

Those of you who know Rob Farley will know he’s a funny guy with an enormous armoury of shockingly bad jokes.  I quite like bad jokes, so found that this complemented his laid-back presentation style and made for a very entertaining and engaging session.

Thanks Rob.

Remove time from datetime in SQL Server

a few ways…
select convert(datetime, convert(char, getdate(), 106))

without string conversion
select cast(floor(cast(getdate() as float)) as datetime)

avoiding float to date conversion
select DATEADD(day, DATEDIFF(day, '20000101', getdate()), '20000101')
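and, if you are on SQL Server 2008 or later, the new date type makes it direct (2008+ only):

```sql
-- the date type simply drops the time portion
select cast(getdate() as date)
```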

SQLBits 2009: Keynote by Donald Farmer – PowerPivot

First time attending www.sqlbits.com (thanks to my kiwi bud Don for sorting out tickets) and have to say, very impressed.

The conference was held in the glorious surroundings of the Celtic Manor Resort in Newport, which is famed for its hosting of the Ryder Cup and, given such sporting pedigree, a more than apt location to entertain a few hundred SQL DBA and developer types!

The conference is held over three days and I was able to attend Friday and Saturday.  There were 4 time slots each day where you had the choice of 5 different topics.  Deciding which session to attend was a nightmare as all looked fantastic, but I focused on BI and performance tuning topics.

As I sieve through the notes I took at each session I will knock up a supporting blog but first up is the keynote speech:-

Keynote by Donald Farmer (Microsoft BI Guru) SQL Server 2008 R2 and Power Pivot.

An excellent, entertaining lecture where Donald introduced PowerPivot (formerly known as Project Gemini).  This is Microsoft’s future of BI and provides the end user with access to potentially enormous amounts of data.  The tool is delivered as an Excel 2010 add-on, and Donald demoed working with a worksheet of over 101 million rows, browsing the data, filtering and sorting in real time with no performance impact – all from a laptop!  Pretty impressive.  The secret, we were told, is in its compression mechanics, which focus on column-based compression.

The time taken to get the data and perform the compression was not covered so a big question mark there.

To work with the data MS have provided a load of functions, taken from the cube function space.  This provides the end user with yet more analysis power.

Once the user has consumed, analysed and produced something great, they can then share this via SharePoint 2010.  Again, this comes as an add-on.

Donald then moved the talk on to show us how data can be consumed via RSS.  He very quickly targeted an existing SharePoint report, exported the data to Excel PowerPivot via Excel Services and played with the data in PowerPivot.  He then demonstrated how this would be deployed back to SharePoint.  The hosted reports were navigated with some funky-looking screens (carousel animation in abundance).  Not knowing SharePoint that well, I’m not sure whether this is already built-in functionality, something new with 2010, or comes with PowerPivot.

Donald kicked off the conference in superb style.  He’s a funny guy and his presentation theme (comparing modern-day data issues with those of a bygone, pre-computer age) was very entertaining – some great slides, so I hope the presentation is made available.

Working in the finance sector, where Excel huggers thrive and getting the data to them is always an issue, I find this new technology of great interest and one I plan to keep an eye on.

Find out more here:- www.powerpivot.com

Other initial conference highlights…   15 pens, 2 t-shirts, 4 usb pens and a SQLBits mug!

Hello world!

Just back from the SQLBits conference in not-so-sunny South Wales (glorious location at the Celtic Manor Resort in Newport – the Welsh home of the 2010 Ryder Cup!).  The event is run by a great bunch of people who put on a superb conference with enthusiastic, knowledgeable speakers.

This inspired me to finally set up this blog (something I’ve been meaning to get around to for ages) and start logging all the things I get up to that I want to record.

Here goes…