Friday, May 25, 2018

Week 2

Implementation phase

In the preceding week, I was mainly studying sync2: how it works and how its codebase is put together. At the beginning of this week, I finalized the study and roughly made a design model for the MergePatientData module. I also came up with a number of properties the project must satisfy, i.e.:
  • Serialization and Encryption (Cryptography):
    Resource data will be serialized to JSON and written to a file, which will then be encrypted into a 'data.mpd' file. The module can reverse the process by decrypting it.
  • Persistence:
    The module should be able to retrieve/save data from/to the OpenMRS context; after all, the module merges data from one OpenMRS instance to another.
  • Auditing:
    Like in the sync2 module, auditing is required to keep track of what really happened during a data transfer process.
  • Configuration and Validation:
    We should be able to configure which types of data are to be merged.
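The serialization-plus-encryption idea could be sketched roughly as below. This is a minimal illustration only, assuming AES with a pre-shared 16-byte key; the `MpdCrypto` class and its method names are hypothetical, not the module's actual API:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical sketch: take already-serialized JSON, encrypt it with AES,
// write the ciphertext to a 'data.mpd' file, and reverse the process.
public class MpdCrypto {

    private static final String ALGORITHM = "AES";

    // Encrypt the JSON payload and write the ciphertext to data.mpd
    public static void encryptToFile(String json, byte[] key, Path out) throws Exception {
        Cipher cipher = Cipher.getInstance(ALGORITHM);
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, ALGORITHM));
        Files.write(out, cipher.doFinal(json.getBytes(StandardCharsets.UTF_8)));
    }

    // Read data.mpd back and decrypt it to the original JSON payload
    public static String decryptFromFile(Path in, byte[] key) throws Exception {
        Cipher cipher = Cipher.getInstance(ALGORITHM);
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key, ALGORITHM));
        return new String(cipher.doFinal(Files.readAllBytes(in)), StandardCharsets.UTF_8);
    }
}
```

A real implementation would also need key management and an authenticated cipher mode, but the round trip above captures the serialize-encrypt-decrypt property.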
Late this week, I started the coding phase. I'm writing code based on the design I came up with.

Work done so far.

  • Resource domain:
    I wrote code for almost all of the intended Resources.
  • Implemented a Service layer for some Resources, e.g. PatientResourceService. However, more work is still required here.
  • I came up with a Repository (MergeAbleBatchRepo) that stores all Resources intended to be shared in a format that is easy to serialize.
  • A lot of work has been done. (check it out)
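A minimal sketch of the idea behind MergeAbleBatchRepo: group every resource to be shared by its type, so the whole batch can be serialized in one go. The internals shown here are my illustration of that idea, not the actual module code:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch of a batch repository: resources are grouped by
// type name, which makes the container straightforward to serialize.
public class MergeAbleBatchRepo {

    // Map of resource type name -> resources of that type
    private final Map<String, List<Object>> batches = new HashMap<>();

    public void add(String resourceType, Object resource) {
        batches.computeIfAbsent(resourceType, k -> new ArrayList<>()).add(resource);
    }

    public List<Object> getBatch(String resourceType) {
        return batches.getOrDefault(resourceType, new ArrayList<>());
    }

    // Total number of resources across all batches
    public int size() {
        return batches.values().stream().mapToInt(List::size).sum();
    }
}
```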
Best Regards 

Samuel Male

Tuesday, May 22, 2018

Week 1

Design phase 

Currently, sync2 depends on the AtomFeed and FHIR/REST modules for synchronizing Patient and related metadata in OpenMRS. Since the MergePatientData module will be an improvement on the sync module, I was advised that it should depend on the sync2 module. What sync2 fails at is synchronizing data that existed before it started running in the OpenMRS context. Since we are depending on sync2, I spent the first week gathering information and going through the entire codebase of sync to see how it works. I came up with a simple design: MergePatientData should check out all existing synchronize-able data at a given node (which could be a child or the parent). If sync is running, it should get the data from the sync tables, filter it, and store it in a zip file with some encryption. The module should also be able to download the zip file.
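The filter-then-zip part of that design could look roughly like this. It is only an illustrative sketch: the class and method names are made up, and the filter predicate stands in for the real sync-table filtering:

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.stream.Collectors;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// Hypothetical sketch of the export workflow: filter the synchronize-able
// records at a node, then pack the result into a single zip archive.
public class ExportSketch {

    public static byte[] exportToZip(List<String> records) throws Exception {
        // 1. Filter: keep only records considered synchronize-able
        //    (modelled here as non-empty strings, purely for illustration)
        List<String> filtered = records.stream()
                .filter(r -> !r.isEmpty())
                .collect(Collectors.toList());

        // 2. Pack the filtered payload into one zip entry; encryption of
        //    the resulting bytes would happen as a separate step
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(bytes)) {
            zip.putNextEntry(new ZipEntry("patients.json"));
            zip.write(String.join("\n", filtered).getBytes(StandardCharsets.UTF_8));
            zip.closeEntry();
        }
        return bytes.toByteArray();
    }
}
```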
In a nutshell, I have encountered the following:
  • I started designing a workflow of how the project will be best implemented.
  • I closely studied and inspected the codebase of the sync module and came away with an average understanding of the entire logic.
  • I created a skeleton (an initialization) of the module (mergepatientdata).


I don't know whether it's quite trivial, but in the process of studying sync, all I know is that it uses the AtomFeed module to notify it of new events/feeds, though I have failed to understand how :(

Sunday, May 13, 2018

Google Summer of Code with OpenMRS


I received an email on April 23; it read something like, “You got selected for Google Summer of Code 2018”. This made me pretty excited; I called my friends and told them all about it. To me it was a great achievement because it was my first time applying and writing a project proposal. However, given the contributions I had made to OpenMRS before, I felt it was deserved.

About The Project

I made two project proposals, one for Merge Patient Data From Multiple Installation and the other for Location Based Access Control, and I got selected for Merge Patient Data From Multiple Installation. It's about creating an OpenMRS module that adds the functionality of merging Patient and related metadata from, say, a child-node database into the parent-node or central OpenMRS database. For some OpenMRS implementations, there is no guarantee of a stable Internet connection for the "sync module" to work efficiently.

I thank the OpenMRS community for entrusting me with this project. 

Happy coding and best regards.

Samuel Male.