
Commit ebba904

Authored by balanscott, Jonathan Myer, and balanscott-healthcaredataguy
Enhancement/nodeapis resolvingissues (#68)
* Updates:
  - dataattributegenerator.controller.js: made the names more understandable.
  - mgmt_automtd_dataattribute_generation.js: resolved a const variable issue causing runtime errors.
  - Resolved an API endpoint inconsistency and created a random data query API named randomdata.controller.js.
  - Created an operational random query node code set. This will help for those implementations that are more than API only.
* Added latest random query generated in node script with parameters.
* Updates: enhanced randomdata_queries.js to properly output single transactions.
* Updates: enhanced the existing data API; this involves all the dataexisting.controller.js code to ensure better JSON responses are provided. When no data is returned, a consistent message with some details for the requestor is returned.
* Updates: created a generatedata_industrystds.js file so data generation can be run from the command line as needed.
* Updates: revised the DataGenerated APIs in datagenerated.controller.js to better format the API output.
* Updates:
  - Enhancements to Node-APIs README.md.
  - Enhancements to Usage-Node-Assets.md based on testing and active implementation feedback.
  - Cleanup of the node code modules generatedata_dataattributes.js and generatedata_datastructures.js.
* Updates:
  - Enhancements to datamodel.controller.js, dataplatform.controller.js, implementationdata.controller.js, and randomdata.controller.js to enable better API responses.
  - Cleaned up implementationdata.controller.js queries to perform better and be more accurate.
* Updates:
  - Enhanced the existing data API; this involves all the referencedata.controller.js code to ensure better JSON responses are provided. When no data is returned, a consistent message with some details for the requestor is returned.
  - Enhanced the existing data API; this involves all the termsdata.controller.js code to ensure better JSON responses are provided. When no data is returned, a consistent message with some details for the requestor is returned.

Co-authored-by: Jonathan Myer <[email protected]>
Co-authored-by: Alan Scott <[email protected]>
1 parent 7e41407 commit ebba904

19 files changed: 943 additions (+943), 228 deletions (-228)
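Several bullets in the commit message above describe the existing-data controllers returning a consistent JSON message, with some details for the requestor, when no data is found. A minimal sketch of what such a response could look like follows; the helper name and field names are assumptions, not taken from this commit's diff.

```js
// Hypothetical sketch of a consistent "no data returned" response from an
// Express controller. The helper name and the status/message/details fields
// are illustrative assumptions, not the repository's actual shape.
function noDataResponse(res, entityName, query) {
    return res.status(200).json({
        status: "no-data",
        message: `No ${entityName} records were returned for the supplied criteria.`,
        details: { entity: entityName, query: query },
    });
}

// Example usage inside a route handler:
// if (!rows || rows.length === 0) return noDataResponse(res, "terms", req.query);
```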

DataTier-APIs/Node-APIs/README.md

Lines changed: 47 additions & 45 deletions
@@ -4,48 +4,52 @@ There is no specific plans to ONLY have one technology for APIs. Currently, we a
 the best way to address and keep feature parity because we want to ensure that we dont limit
 technology.
 
-For these assets you will want to ensure you have the needed versions of Node, npm and yarn installed and working for your environment.
+For these assets you will want to ensure you have the needed versions of Node, npm and yarn installed and working for
+your environment.
 
 # Settings
-The biggest thing to understand is that all settings for this are contained within a .env file. It is important to know
-that if you clone the repository the file WILL NOT be included or created. You must manually create a .env file and
-the settings used are defined below.
+The biggest thing to understand is that all settings for this solution are done through environment variable.
+It is important to know that if you clone the repository the file WILL NOT be included or created.
+
+Here is the real world example of the environment variables:
 
 ```
+# Platform Settings
+export httpPort=8001
+export runQuantity=7500
 # Auditing
-auditing=false
-auditingTopicName=kic_dataintgrtntransactions
-appintegrationauditingTopicName=kic_appintgrtntransactions
-# Output
-# values: kafka kafka-datapersistence file rdbms nosql
-outputAdapter=file
+export auditing=false
+export auditingTopicName=kic_appintgrtntransactions
+# Output values: kafka kafka-datapersistence file rdbms nosql
+export outputAdapter=kafka-datapersistence
 # Output Setting
-edi_location
-fhir_location
-hl7_location
+export edi_location=undefined
+export fhir_location=undefined
+export hl7_location=undefined
 # Kafka Settings
-kafka_server=localhost:9092
-kafka_group=""
-KAFKA_CONSUMER_TOPIC= ""
-KAFKA_PRODUCE_TOPIC=""
-kafka_client_id="1234"
+export kafka_server=localhost:9092
+export kafka_group=undefined
+export KAFKA_CONSUMER_TOPIC= undefined
+export KAFKA_PRODUCE_TOPIC=undefined
+export kafka_client_id="1234"
 # Database Tech
-rdbms=postgreSQL
+export rdbms=postgreSQL
 # Postgres Database Setting
-PostgreSQL_URL=postgres://postgres:Developer123@localhost:5432/datasynthesis
+export dbURL=postgres://postgres:Developer123@localhost:5432/datasynthesis
 # MySQL/MariaDB Database Setting
-#dbhost=127.0.0.1
-#dbuser=root
-#dbpassword=Developer123
-#db=datasynthesis
+export dbHost=127.0.0.1
+export dbPort=1234
+export dbUser=root
+export dbPassword=Developer123
+export dbName=datasynthesis
 # Vendor Centric Settings
 # iDaaS
-iDaaS_FHIR_Server_URI=""
-iDaaS_Cloud=true
-iDaaS_Cloud_Kafka=
+export iDaaS_FHIR_Server_URI=undefined
+export iDaaS_Cloud=true
+export iDaaS_DataSymthesis_Kafka=idaas_datasynthesis
 ```
 
-# Pre-Requisites
+# Pre-Requisites - Node v > 12
 This section is intended to help with any pre-requisites and we have tried to make them as
 specific to OS as we can.
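The settings hunk above shows configuration moving from a checked-in .env file to exported environment variables. As an illustration only, a small Node config module might read a few of these values as shown below; the variable names come from the example above, while the fallback defaults are assumptions.

```js
// Minimal sketch: read a few of the documented environment variables.
// httpPort, runQuantity, outputAdapter, and kafka_server come from the
// example above; the fallback defaults here are assumptions.
const config = {
    httpPort: parseInt(process.env.httpPort, 10) || 8001,
    runQuantity: parseInt(process.env.runQuantity, 10) || 7500,
    outputAdapter: process.env.outputAdapter || "file",
    kafkaServer: process.env.kafka_server || "localhost:9092",
};

module.exports = config;
```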

@@ -61,14 +65,14 @@ brew install npm <br/>
 brew install yarn <br/>
 brew upgrade <package> <br/>
 
-# Windows
+## Windows
 Find the download from https://nodejs.org/en/download/ and install it.
 
-# Linux
+## Linux
 Depending on your flavor of Linux you will find the needed downloads
 https://nodejs.org/en/download/ or within your Linux implementation.
 
-## Node
+# Node
 We always prefer to be very close to the latest Node and Project releases as their are constant performance and security
 enhancements occuring within the technology.
 
@@ -82,15 +86,10 @@ or
 yarn install
 ```
 
-# IDE or Command Line Experience
-If you are wanting to leverage the libraries and look at the code from a development experience perspective, then either
-having all the proper node
-
-## Running in IDE
-The following section is intended to cover generic IDE and platform usage. To date though as long as IDEs have been
-setup and are working with Node then we have seen no issues.
+# Command Line Experience
+From a command line you can follow the following common commands to use the Node APIs.
 
-### Starting the Solution
+## Installing/Updating Needed Packages
 Always make sure you have either install or updated the packages first:
 
 Install:
@@ -105,6 +104,9 @@ Upgrade:
 npm upgrade
 ```
 
+## Starting the Solution
+Always make sure you have either install or updated the packages first:
+
 To start the solution from the command line at the project level simply type:
 ```
 npm start
@@ -115,24 +117,24 @@ Or, if you want to work with it locally and potentially enhance it then from the
 nodemon app.js
 ```
 
+## Running in IDE
+The following section is intended to cover generic IDE and platform usage. To date though as long as IDEs have been
+setup and are working with Node then we have seen no issues.
+
 # Implementation and Usage
 The capabilities delivered through this code base are extensive, below is a series of links to help guide specific
 implementation needs and usage based scenarios. Within the capabilities provided by the developed Node-APIs.
 
-
-
-
 | Node Implementation Type | Description |
 |--------------------------|------------------------------------------------------------------------|
-|[Node APIs](Usage-Node-APIs.md) | APIs developed to provided DataSynthesis data access and functionality |
 |[Node Usage](Usage-Node-Assets.md)| Assets developed to provided DataSynthesis platform. |
+|[Node APIs](Usage-Node-APIs.md) | APIs developed to provided DataSynthesis data access and functionality |
 
 # Testing APIs
 To help enable resources to leverage the APIs we have pre-built and are continuing to enhance a set of PostMan APIs.
 The intent is to that anyone can see how the APIs can be leveraged simply and directly.
 
-https://www.postman.com/balanscott/workspace/datasynthesis/collection/16526170-6e45e3ca-8eaf-47c9-a0cb-0e024a852505
-
+https://go.postman.co/workspace/DataSynthesis~6a46c0cf-955b-49b4-b495-68940fde4c31/collection/16526170-6e45e3ca-8eaf-47c9-a0cb-0e024a852505?action=share&creator=16526170
 
 Happy Coding

DataTier-APIs/Node-APIs/Usage-Node-Assets.md

Lines changed: 77 additions & 1 deletion
@@ -1,5 +1,81 @@
 # Node-Assets
+Within the Node-API efforts there are a set of node assets that can be run from the command line on any
+machine where these are implemented.
 
 # Pre-Requisites
+- Node installed and configured to work from command line or IDE
+- Based on your OS the environment variables set. We have multiple ways we have seen these implemented through
+implementations.
+- The code repo cloned
+
+# Assets
+The following are the command line assets that can run and what they are designed for. These assets will automatically
+output to whatever is defined within the environment variable named outputAdapter.
+
+Values for outputAdapter are: kafka kafka-datapersistence file rdbms nosql. The most commonly used and established
+ones are kafka-persistence and file.
+
+| Node Implementation Type | Description |
+|------------------------------------------|---------------------------------------------------------------------|
+| generatedata_dataattributes.js | Ability to generate data attriubutes for platform |
+| generatedata_datastructures.js | Ability to generate data structures for platform |
+| generatedata_industrystds.js | Ability to generate industry standards data from platform |
+| mgmt_automtd_dataattribute_generation.js | Ability to leverage an automated data generator for data attributes |
+| randomdata_queries.js | Ability to generate data structures for platform |
+
+## Usage
+In this section we will provide some specific examples, these are not exhaustive as there are several
+hundred plus ways as these assets are very extensible.
+
+### Generate Data Attributes
+This provides the SAME capabilities as the API for generating data attributes found at:
+/api/generatedata/generate/<attributename>?limit=xxx
+
+There are two arguments, one is specific and required the second one if not included will be defaulted to the runQuantity
+environment variable.
+
+node generatedata_dataattributes.js <attributename> <quantity>
+
+1. Generate accountnumbers with the included regular expression. This will use the environment variable quantity
+defined within the runQuantity.
+
+node generatedata_dataattributes.js accountnumbers
+
+2. Generate accountnumbers with the included regular expression. This will generate 525 records.
+
+node generatedata_dataattributes.js accountnumber 525
+
+### Generate Data Structures
+This provides the SAME capabilities as the API for generating data attributes found at:
+/api/generatedata/generatedatastructures/namedstructure?count=3250&datastructurename=Person Demographics
+
+There is only argument, the quantity generated will be based on the runQuantity environment variable.
+
+node generatedata_datastructures.js <datastructure name>
+
+1. Generate Person Demographics
+
+node generatedata_datastructures.js "Person Demographics"
+
+### Generate Industry Standards
+This provides the SAME capabilities as the API for generating data attributes found at:
+/api/industrystds/generator-hl7?count=100
+
+There are two arguments, one is specific and required the second one if not included will be defaulted to the runQuantity
+environment variable.
+
+generatedata_industrystds.js <industrystd> <quantity>
+
+1. Generate 500 HL7 Messages
+
+generatedata_industrystds.js hl7 500
+
+### Automated Data Attribute Generation
+There is NO API that provides this capability overall, the functionality is available per data attribute within the developed
+APIs; however, this is intended to be run and as long as it is running creating data attirbutes as defined within the
+management subsystems. The definition also has the quantity, so it is intended to be an all encompassing record.
+
+FYI: as of this content creation this was in place but not fully developed!!
+
+node mgmt_automtd_dataattribute_generation.js
 
-# Implementation/Usage
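The new Usage-Node-Assets.md content above documents a common pattern for the command-line assets: a required first argument plus an optional quantity that falls back to the runQuantity environment variable. A rough sketch of that argument handling is shown below; the generator call at the end is a hypothetical placeholder, not the repository's actual function.

```js
// Sketch of the documented argument pattern for the command-line assets:
// the first argument is required, the second falls back to the runQuantity
// environment variable. generateAttribute() is a hypothetical placeholder.
const attributeName = process.argv[2];
const quantity =
    parseInt(process.argv[3], 10) || parseInt(process.env.runQuantity, 10) || 7500;

if (!attributeName) {
    console.error("Usage: node generatedata_dataattributes.js <attributename> [quantity]");
    process.exit(1);
}

console.log(`Generating ${quantity} values for ${attributeName}...`);
// generateAttribute(attributeName, quantity);  // hypothetical call
```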

DataTier-APIs/Node-APIs/api-dataoutput/industrystds-test.industrystds

Whitespace-only changes.

DataTier-APIs/Node-APIs/api/datagenerators/dataattributegenerator.controller.js

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ router.get("/addresses", async(req, res) => {
 
 });
 
-router.get("/phone-numbers", async(req, res) => {
+router.get("/phonenumbers-us", async(req, res) => {
     const number_of_phone_numbers = parseInt(req.query.count) || 1000;
     const country = req.query.country || "US";
     const results = dataattributesGenerator.generateUSPhoneNumbers(number_of_phone_numbers, country)
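This rename means callers now request /phonenumbers-us instead of /phone-numbers. A hedged example of invoking the renamed route with Node 18+'s built-in fetch is shown below; the port (8001, taken from the README example) and the /api/datagenerator mount prefix are assumptions, since only the route name and its count/country query parameters appear in this diff.

```js
// Example call to the renamed route. The port and the "/api/datagenerator"
// mount prefix are assumptions; only "/phonenumbers-us" and the count/country
// query parameters appear in the diff above.
const url = "http://localhost:8001/api/datagenerator/phonenumbers-us?count=25&country=US";

fetch(url)
    .then((response) => response.json())
    .then((data) => console.log(data))
    .catch((err) => console.error("Request failed:", err));
```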

0 commit comments
