I am not sure what kind of data, or what kind of data management, you are referring to. But from my own perspective, data management is not always a technology problem. It should be addressed top-down, with the business owning data quality; technology's role is to help identify and correct data issues.
'Top-down' is actually why data management is still the way it is. Our handling of the primitives (think bottom-up) that build up into higher-level structures is still subpar at best, which is what produces the massive inconsistencies, scaling issues and data-integrity problems we face daily. Neurons (near-atomic components) from the biological sciences are still being compared analogously with far more complex data structures in computing, like comparing raw grapes to finished wine. As for it being a technology problem: the question was about which area/market segment needs a breath of fresh air, not about which particular technology should be deployed.
Top-down means the top is responsible when data quality fails. Engineering can only build what the business says it expects to get; in the end, a human has to drive the mandate. When you are looking at enterprise data management, with many data formats, a global business, and many legacy systems in place, data management is not possible without full commitment from the top. The head of each business division has to be responsible for the shit their department produces.

Whenever engineering shouts "data management", the business just leaves it to the engineering folks to build an automated system. But who is actually looking after the data? Machines don't know what they are doing; someone coded the expectation, that's all. And data management is dynamic. You can't just have an engineer shout "this stinks". The business needs to own the shout and the mandate.
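To make the "someone is coding the expectation" point concrete, here is a minimal Python sketch of the kind of validation rule an engineer typically ends up writing. Everything in it (the field names, the country list, the revenue cap) is hypothetical, a snapshot of what the business said once:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CustomerRecord:
    customer_id: str
    country: str
    signup_date: date
    annual_revenue: float

# The "expectation", frozen at coding time. If the business opens a new
# region or changes what counts as plausible revenue, this rule silently
# goes stale; the machine has no idea.
VALID_COUNTRIES = {"US", "GB", "DE", "SG"}  # a snapshot of the business, not the business

def validate(record: CustomerRecord) -> list[str]:
    """Return the data-quality issues found in one record."""
    issues = []
    if record.country not in VALID_COUNTRIES:
        issues.append(f"unknown country: {record.country}")
    if record.signup_date > date.today():
        issues.append("signup date is in the future")
    if not 0 <= record.annual_revenue < 1_000_000_000:  # who picked this cap? an engineer, once
        issues.append(f"implausible revenue: {record.annual_revenue}")
    return issues

# Example: a record that may be perfectly valid under this year's business rules
print(validate(CustomerRecord("c-001", "BR", date(2024, 5, 1), 2.5e9)))
# ['unknown country: BR', 'implausible revenue: 2500000000.0']
```

The check itself is not the problem. The problem is that the country list and the revenue cap are frozen at coding time; when the business changes, nothing in the code knows, which is exactly why the business, not the engineer, has to own the mandate.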