Just FYI, I have very limited SQL training, and my new position requires that I learn, so please be gentle with me : )
First off, I saw this:
And that is exactly what I need in principle; however, my data structure is a little different. I have roughly 24 databases, each averaging around 1 GB of information, spanning 8 years across 420 institutions (4 databases per year, about 16 million observations in total). Every field is identical across databases.
I need to run analyses on all of this information, but MS Access databases are limited to 2 GB, so I'm trying to figure out a workaround. My idea is to link each table into a master database and then run queries using the selection query from the link above. My question is whether this will actually work. My computer has 32 GB of physical memory, so I feel like I should be able to hold all of it in active memory while running queries, but I still don't know. Is there a more efficient way?
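For what it's worth, linked tables plus a saved UNION query is the usual pattern here: the data stays in the source .accdb files, so the 2 GB limit applies to each file individually, not to the combined query result. A minimal sketch, with hypothetical table names (substitute whatever your linked tables are actually called):

```sql
-- Saved query, e.g. "qryAllYears": stacks every linked table into one
-- virtual table. UNION ALL keeps duplicates and is faster than UNION,
-- which would sort and de-duplicate 16 million rows.
SELECT * FROM Data_2010_A
UNION ALL
SELECT * FROM Data_2010_B
UNION ALL
SELECT * FROM Data_2010_C
UNION ALL
SELECT * FROM Data_2010_D
UNION ALL
SELECT * FROM Data_2011_A
-- ... repeat for the remaining linked tables ...
```

Once saved, other queries can reference `qryAllYears` as if it were a single table. One caveat: because this works only since every field is identical across databases, any schema drift between years will break the union.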
Basically, what I need is to be able to query institutions across years. Right now that's impossible, and it's causing issues. Because institutions aren't subjected to any scrutiny regarding the information they report, we need to understand how reporting trends evolved within and between them over time.
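To make the goal concrete, a per-institution, per-year trend query might look like the sketch below. It assumes a combined source named `qryAllYears` (e.g. a saved union over all the linked tables) and hypothetical field names `InstitutionID`, `ReportYear`, and `ReportedValue`; your real columns will differ:

```sql
-- Hypothetical: count of records and average reported value
-- for each institution in each year, to surface reporting drift.
SELECT InstitutionID,
       ReportYear,
       COUNT(*)           AS RecordCount,
       AVG(ReportedValue) AS AvgReported
FROM qryAllYears
GROUP BY InstitutionID, ReportYear
ORDER BY InstitutionID, ReportYear;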
I was given a list of about 40 questions that all involve different queries we need to run, so I'm hoping against hope that I can figure out a solution that doesn't involve me losing my mind.
Thank you all for your help!