What is Flow-based programming?

Thoughts, comments and discussion from Baycastle staff and partners

Moderators: Tom, ian

What is Flow-based programming?

Postby ian » Thu Jan 09, 2014 10:03 am

The migration of data (for example data import and data export) naturally invites an approach built around packets of data. Think of each row of data as a packet moving along a path, or string, rather like a string of pearls. As packets encounter processes they are manipulated as required. Packets sit on the string waiting for the next process to be ready to carry out its task; each process wakes up when it has work to do and goes back to sleep when it has none.

Take a simple example:

  • Read in data from a text file, placing each row in a data packet
  • Pass each packet to a Transformation process that carries out some manipulation of the data
  • Pass each packet to a SQL write process responsible for writing the data in each packet to a SQL database

If the Read process is quicker than the downstream processes, it halts when its output string is full. Suppose the string is allowed to hold 10 packets of data: when it fills, the text read process is told to halt; as the downstream processes take packets off the string, the text read process wakes up again and reads more data.
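
To make this concrete, here is a minimal sketch of that three-step network in plain Python (an illustration of the pattern only, not DataSlave code). The bounded queues play the role of the strings: a put onto a full queue blocks, which gives the "halt until a packet is taken off" behaviour described above. The file name input.txt, the comma-separated transformation and the SQLite output are all hypothetical.

    import queue
    import sqlite3
    import threading

    SENTINEL = object()  # marks the end of the packet stream

    def read_text(path, out_q):
        """Read process: one packet (a dict) per row of the text file."""
        with open(path) as f:
            for line in f:
                # put() blocks once the string holds 10 packets,
                # which is the back-pressure described above
                out_q.put({"raw": line.rstrip("\n")})
        out_q.put(SENTINEL)

    def transform(in_q, out_q):
        """Transformation process: manipulate each packet as it arrives."""
        while (packet := in_q.get()) is not SENTINEL:
            name, value = packet["raw"].split(",", 1)
            out_q.put({"name": name.strip(), "value": value.strip()})
        out_q.put(SENTINEL)

    def sql_write(in_q, db_path):
        """SQL write process: write each packet to a database table."""
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS rows (name TEXT, value TEXT)")
        while (packet := in_q.get()) is not SENTINEL:
            con.execute("INSERT INTO rows VALUES (?, ?)",
                        (packet["name"], packet["value"]))
        con.commit()
        con.close()

    # Connections ("strings") that hold at most 10 packets each
    read_to_transform = queue.Queue(maxsize=10)
    transform_to_sql = queue.Queue(maxsize=10)

    workers = [
        threading.Thread(target=read_text, args=("input.txt", read_to_transform)),
        threading.Thread(target=transform, args=(read_to_transform, transform_to_sql)),
        threading.Thread(target=sql_write, args=(transform_to_sql, "output.db")),
    ]
    for w in workers:
        w.start()
    for w in workers:
        w.join()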

FBP can be deployed on a single thread or on multiple threads, and it scales out to multiple cores, processors and indeed multiple servers. The capacity of each connection is set to suit the needs of the network and to match the memory available on the machine. Where a process is slow, for example one that relies on an external database server, more than one instance of that process can be added to the network to divide the task and improve throughput. More processing power can be applied where available, and the whole network is self-managing and simple to understand.
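
A short, self-contained sketch of that last idea, again in plain Python rather than DataSlave itself: two copies of a deliberately slow process share one inbound connection, so whichever copy is free takes the next packet and the stage's throughput roughly doubles. The 0.05-second sleep stands in for a hypothetical round-trip to an external database server.

    import queue
    import threading
    import time

    work = queue.Queue(maxsize=10)   # one connection feeding the slow stage
    DONE = object()
    NUM_WORKERS = 2                  # two copies of the slow process

    def slow_stage(worker_id):
        """Stand-in for a process bound by an external server round-trip."""
        while (packet := work.get()) is not DONE:
            time.sleep(0.05)         # simulated external latency
            print(f"worker {worker_id} handled packet {packet}")

    workers = [threading.Thread(target=slow_stage, args=(i,))
               for i in range(NUM_WORKERS)]
    for w in workers:
        w.start()

    for row in range(100):           # upstream process feeding packets
        work.put(row)                # blocks whenever the connection is full
    for _ in range(NUM_WORKERS):     # one end-of-stream marker per worker
        work.put(DONE)
    for w in workers:
        w.join()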

DataSlave version 3 is based on this approach. Data Export, Data Import and Data Manipulation are carried out quickly and efficiently. DataSlave 3 is well suited to deployment on any machine, from a small desktop to a large enterprise-level server. ETL has never been smarter.

Baycastle’s DataSlave product offers a market-leading utility for data import, data validation and data export. DataSlave 3 enhances that offering, adding improved facilities for enterprise deployment of batch data migrations, along with features that allow third parties to integrate it into their own products, including re-branding.

“The DataSlave 3 approach to data migration, based on flow-based programming, allows the team to quickly build support for each new data source”, Tom, Lead Architect.

Contact us today to discuss your Data Import or Data Export projects.