
Server Management Studio 2005 insufficient memory issues???

  • 07-05-2011 3:00pm
    #1
    Closed Accounts Posts: 3,912 ✭✭✭


    Hi folks,

    I've an SQL file that I'm trying to run. It's fairly big in fairness: around 80K lines. It creates two separate tables, and 99% of the code is inserting a small amount of data into each row of each table (small data per record, such as product name, description, cost price, sale price, supplier, etc.)...

    The script is taking around 5-10 minutes to run, and it's coming back with errors saying it cannot run parts of the code.

    I recently had a much smaller file and it ran grand, but since I added to it, it seems to be having serious issues running what are just standard additional INSERT queries.

    One of the problems I'm having is that I'm getting an error in SQL Server Management Studio 2005 saying insufficient memory. I've nothing running in the background, just MS SQL Server Management Studio... I don't think I should be getting this amount of hassle for what are, after all, just 80K lines of poxy code; everything ran fine when the file was approximately half its size.

    I'm wondering could my wireless "broadband" :rolleyes: Vodafone dongle maybe be an issue here, in terms of a stable internet connection...

    ???

    :confused::confused::confused:


Comments

  • Moderators, Technology & Internet Moderators Posts: 1,336 Mod ✭✭✭✭croo


    Most of the overhead with a big insert like this will be transaction overhead. But it sounds like this might be an initial load?
    In "the old days", when memory was measured in KB :), if I had a big load to do I would back up the DB, disable transaction management, then do the load. If it didn't complete cleanly for any reason, I would just restore the backup and try again. The other option might be to break the file up and commit it in parts... or just add a few "commit"s interspersed to achieve the same; see the sketch below.


  • Registered Users, Registered Users 2 Posts: 2,781 ✭✭✭amen


    There are a few things you can do here.

    As croo suggested, you can turn off transaction logging while importing the file (a bit nasty though, and I wouldn't really recommend it).

    You could look at SQL Server's BULK INSERT statement and use that instead; see the sketch just below.
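    A minimal sketch of loading the rows from a flat file with BULK INSERT instead of 80K individual INSERT statements; the table name, file path and delimiters are assumptions for illustration:

        -- Load product rows from a comma-separated file,
        -- committing every 5,000 rows instead of all at once.
        BULK INSERT dbo.Products
        FROM 'C:\import\products.csv'
        WITH (
            FIELDTERMINATOR = ',',
            ROWTERMINATOR = '\n',
            BATCHSIZE = 5000
        );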
    "and it's coming back with errors as in cannot run parts of the code"
    What type of errors?

    Do you have sufficient disk space?

    80K insert statements is a lot of insert statements to run all in one go.

    You should also look at the GO statement; see the sketch below.
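    GO is a batch separator recognised by Management Studio and sqlcmd rather than a T-SQL statement; sprinkling it through the script means each batch is parsed and executed on its own instead of the whole 80K-line file being held as one batch. A made-up illustration:

        INSERT INTO dbo.Products (ProductName) VALUES ('Widget A');
        INSERT INTO dbo.Products (ProductName) VALUES ('Widget B');
        GO  -- end of batch 1: parsed, run and released before batch 2 starts

        INSERT INTO dbo.Products (ProductName) VALUES ('Widget C');
        GO  -- end of batch 2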


  • Registered Users, Registered Users 2 Posts: 2,781 ✭✭✭amen


    Meant to ask: are there any messages in the NT event log?

    Could you post the SQL Server configuration values, such as memory allocated, NT priority, etc.? Something like the query below will list them.
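    One way to dump them, assuming you have permission to view advanced options:

        -- 'show advanced options' must be on to see settings such as
        -- 'max server memory (MB)' and 'priority boost'.
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure;  -- lists each option's config_value and run_value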


  • Closed Accounts Posts: 3,912 ✭✭✭HellFireClub


    Thanks to everyone for replying...

    I've had to abandon this little exercise for the moment, due to the absolute headache it has become over the weekend. It's an upgrade exercise for my own site, which involves increasing the DB size by approximately 2/5ths. As things stand, the original SQL script executes fine, but once it is extended, it runs into trouble from line 4xxx in the code, which I think is fairly early for a file with just under 100K lines of what are mainly simple enough INSERT instructions. I feel like I'm back in 1992, when 100K lines of anything caused problems...


  • Closed Accounts Posts: 1,759 ✭✭✭Dr.Silly


    Make sure your paging file is big enough.
    Make sure you allocate SQL Server up to 90% of the available memory; you can do this through sp_configure if you google it.
    Also, as mentioned, temporarily change the recovery model to bulk-logged and try a bulk insert. A rough sketch of both changes is below.
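    Something along these lines; the 1843 MB figure (roughly 90% of 2 GB) and the database name MyShopDb are assumptions, so substitute your own:

        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'max server memory (MB)', 1843;  -- ~90% of 2 GB RAM
        RECONFIGURE;

        ALTER DATABASE MyShopDb SET RECOVERY BULK_LOGGED;  -- before the load
        -- ... run the import ...
        ALTER DATABASE MyShopDb SET RECOVERY FULL;         -- switch back afterwards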

