Data velocity is the speed at which data travels into a network and is processed. Data velocity matters because it enables businesses to make fast, strategic decisions, such as trading stocks through an online broker.
In part three of information governance expert Jeffrey Ritter's big data mining webcast with SearchCompliance.com, he explains his "Velocity Principle," and why transparency is essential to efficient information governance.
"Time is money, we've learned that so well in e-discovery," Ritter said. "It takes time to find the information, but it takes even more time to validate that the information is what it purports to be."
As in other areas of business, time is a valuable information governance commodity, he added, noting that increasing data transparency helps improve data velocity. As data moves through systems faster, businesses are better able to identify its provenance and verify business transactions more quickly, according to Ritter.
Ritter used the metaphor of a traveler encountering an immigration officer to illustrate his velocity principle and how data transparency benefits businesses by improving data velocity. Information moving across systems, he said, is not that different from a traveler arriving at an immigration checkpoint.
The immigration agent has to validate the traveler's provenance: Where did this traveler come from? Does the traveler's physical description match the person's government identification and the passport?
Ritter said that if access to the traveler's identification data is slow, the traveler is stopped and must be accurately identified before being allowed into the country.
"That's what's happening in business intelligence analytics -- the velocity of information is directly tied to how well we can prove its provenance," he said.
"The same thing is true when we're sucking big data out of a corporate system and looking to use it in analytics to create business intelligence -- if we have questions about its provenance, about its integrity, about the effectiveness of the security controls, about the identity of each of the actors that physically access the data as users, the momentum of the data slows," Ritter said.
To increase data velocity, Ritter suggests creating rules to track data and bind the provenance data to the primary content, which also increases the data's "potential utility in creating output from analytics engines that generate true business intelligence."
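The webcast does not prescribe an implementation, but the idea of binding provenance to primary content can be sketched in code. The following is a minimal illustration (the record fields, function names, and source/actor values are hypothetical, not from Ritter's talk): provenance metadata and an integrity hash travel with the content, so a downstream consumer can validate the data with a quick hash check instead of a slow investigation of its origin.

```python
import hashlib
from datetime import datetime, timezone

def bind_provenance(content: bytes, source: str, actor: str) -> dict:
    """Wrap raw content with a provenance record so its origin,
    accessing actor, and integrity hash travel with the data."""
    return {
        "content": content.decode("utf-8"),
        "provenance": {
            "source": source,    # hypothetical originating system
            "actor": actor,      # hypothetical user or service that accessed it
            "captured_at": datetime.now(timezone.utc).isoformat(),
            "sha256": hashlib.sha256(content).hexdigest(),
        },
    }

def verify_provenance(record: dict) -> bool:
    """Re-hash the content and compare it to the bound digest;
    a match lets an analytics pipeline accept the data immediately."""
    expected = record["provenance"]["sha256"]
    actual = hashlib.sha256(record["content"].encode("utf-8")).hexdigest()
    return actual == expected

record = bind_provenance(b"Q3 trade log", source="crm-db-01", actor="etl-service")
print(verify_provenance(record))  # True: content matches its bound hash
```

If the content is altered after capture, `verify_provenance` returns False, which is the moment Ritter describes: questions about provenance arise and the data's momentum slows.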
Watch part three of this webcast to learn more about how improving data management transparency and velocity can boost companies' bottom line. Then visit SearchCompliance to view parts one and two to learn more from Jeffrey Ritter about the business benefits of big data mining.