Another Unique Tip when Leading Modernization
#CorporateExplorers, another unique tip when leading #modernization: explore whether you can automate your #dataMigration. 🤔 Can you refresh all test data every night?! Why or why not?
This one might seem like overkill. In short — better to tackle the hardest challenges up front, if you can. And #dataMigration most certainly represents a significant challenge in seamless systems modernization. So let’s do this! 🤼
Why would we need fresh data automatically every night? We’ll just cut over when we’re ready, right? — big bang 💥, turn key 🔑, waterfall 💦, Bob’s-your-Uncle. (Actually, I grew up with two Uncle Bobs and they rock!)
Like getting onto a fast-moving walkway 🚶, if you can first get your data legs moving, you’ll find that your organization stays in balance when your feet hit the floor. Having fresh test data at your fingertips, automatically blown away and refreshed every night, unlocks a more seamless migration.
Create simple scripts that find the records you need in your legacy files and import them into your new centralized data file. You or your team can typically write these scripts in either the old files or the new one — either push or pull. These scripts encapsulate and solidify your data transformation. Rather than endless documentation — as Nike would say: just do it. 👟 Prove it out, every night, even through changes during the process. Extract, transform, and load (#ETL).
We prefer to write them into the new, receiving data file so that they remain centralized and we can do things like log the migration performance.
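The pull-style, log-as-you-go approach above can be sketched in a few lines of Python — a minimal illustration, using SQLite as a stand-in for the legacy and new data files (the `contacts`/`person` schemas and the `migration_log` table are hypothetical, not FileMaker specifics):

```python
import sqlite3
import time

# Hypothetical demo: a legacy "contacts" file and a new centralized file,
# both modeled as SQLite databases purely for illustration.
legacy = sqlite3.connect(":memory:")
legacy.execute("CREATE TABLE contacts (id INTEGER, full_name TEXT, email TEXT)")
legacy.executemany("INSERT INTO contacts VALUES (?, ?, ?)",
                   [(1, "Ada Lovelace", "ada@example.com"),
                    (2, "Bob Jones", "bob@example.com")])

new = sqlite3.connect(":memory:")
new.execute("CREATE TABLE person (id INTEGER, name TEXT, email TEXT)")
new.execute("CREATE TABLE migration_log (table_name TEXT, rows INTEGER, seconds REAL)")

def migrate_contacts(src, dst):
    """Pull from the legacy file, rename fields, load, and log timing."""
    start = time.perf_counter()
    rows = src.execute("SELECT id, full_name, email FROM contacts").fetchall()
    dst.execute("DELETE FROM person")  # blow away last night's copy
    dst.executemany("INSERT INTO person VALUES (?, ?, ?)", rows)
    dst.execute("INSERT INTO migration_log VALUES (?, ?, ?)",
                ("person", len(rows), time.perf_counter() - start))
    dst.commit()
    return len(rows)

print(migrate_contacts(legacy, new))  # → 2
```

In FileMaker terms, the same shape applies: a script living in the receiving file loops over legacy records, maps old field names to new ones, and writes a log entry with row counts and timing — which is exactly why we like keeping the scripts centralized on the receiving side.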
With #Claris #FileMaker, you can also use a tool like #OttoFMS to perform these automated migrations.
Within each table, find only the data that you need to bring forward — ideally recent records or work in progress. Fight hard to leave the rest behind; remember — junk in, junk out. And note that your old system won’t go away immediately, so you can use it to look up older work and then put the files on deep freeze for reference.
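That filtering step — keep work in progress regardless of age, plus anything recent — might look like this. A sketch only, with SQLite standing in for the legacy file; the `jobs` table and the two-year cutoff are made-up examples:

```python
import sqlite3
from datetime import date, timedelta

# Hypothetical legacy "jobs" table, modeled in SQLite for illustration.
legacy = sqlite3.connect(":memory:")
legacy.execute("CREATE TABLE jobs (id INTEGER, status TEXT, modified TEXT)")
legacy.executemany("INSERT INTO jobs VALUES (?, ?, ?)",
    [(1, "closed", "2015-03-01"),          # old and closed: leave behind
     (2, "open",   "2015-03-01"),          # work in progress: keep regardless of age
     (3, "closed", str(date.today()))])    # recent: keep

# Arbitrary example cutoff: anything touched in the last two years comes along.
cutoff = str(date.today() - timedelta(days=365 * 2))
keep = legacy.execute(
    "SELECT id FROM jobs WHERE status = 'open' OR modified >= ? ORDER BY id",
    (cutoff,)).fetchall()
print([r[0] for r in keep])  # → [2, 3]
```

Everything the query skips stays behind in the legacy file — your deep-freeze reference copy.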
You’ll not only modernize; you’ll also create a great data archiving / retention precedent in the process.
Getting this logic down pat and automating it every night gives you plenty of runway to work out the kinks and smoke out data gremlins / anomalies. 😈
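A few cheap smoke tests run after each nightly refresh will flag most gremlins before a human ever opens the file. Here’s a sketch — the `person` table and the specific checks (row count, duplicate keys, missing emails) are illustrative, not prescriptive:

```python
import sqlite3

# Hypothetical freshly migrated "person" table, modeled in SQLite.
new = sqlite3.connect(":memory:")
new.execute("CREATE TABLE person (id INTEGER, name TEXT, email TEXT)")
new.executemany("INSERT INTO person VALUES (?, ?, ?)",
    [(1, "Ada Lovelace", "ada@example.com"),
     (2, "Bob Jones", "bob@example.com")])

def smoke_test(dst, expected_rows):
    """Return a list of problems found; an empty list means a clean migration."""
    problems = []
    count = dst.execute("SELECT COUNT(*) FROM person").fetchone()[0]
    if count != expected_rows:
        problems.append(f"person: expected {expected_rows} rows, got {count}")
    dupes = dst.execute(
        "SELECT id FROM person GROUP BY id HAVING COUNT(*) > 1").fetchall()
    if dupes:
        problems.append(f"person: duplicate ids {dupes}")
    missing = dst.execute(
        "SELECT COUNT(*) FROM person WHERE email IS NULL OR email = ''"
    ).fetchone()[0]
    if missing:
        problems.append(f"person: {missing} rows missing email")
    return problems

print(smoke_test(new, 2))  # → []
```

Because the refresh runs every night, a failing check surfaces the anomaly the morning after the change that caused it — long before cutover day.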
You’ll need to refresh data a handful of times even if you skip this step, so why not fully automate it up front and enjoy the benefits downstream?!