This project is a demonstration of building a full-stack web application with the following components:
- Frontend: Developed using Angular 16.
- Backend: Implemented with Go and the Gin framework.
- Database: PostgreSQL for storing user and application data.
- Message Queue: Kafka for event-driven communication.
- Data Storage for Kafka messages: A separate Go module captures Kafka messages and stores them in MongoDB.
This project showcases the integration of these technologies to create a scalable, robust, and efficient web application, emphasizing proficiency in frontend, backend, and distributed systems.

Frontend:
- Angular 16
- Angular Material (for UI components like tables and forms)

Backend:
- Go
- Gin Web Framework
- PostgreSQL (as the main relational database)
- Kafka (for message brokering)

Data Storage:
- MongoDB (for storing Kafka messages)

Project Structure:
- Frontend (Angular): Located in the `/angular_go` directory.
- Backend (Go and PostgreSQL): Located in the `/go_userlist` directory.
- Kafka Consumer (Go and MongoDB): Located in the `/go_mongo_kafka` directory.

Prerequisites:
- Node.js (v16 or higher)
- Go (v1.20 or higher)
- PostgreSQL (v13 or higher)
- MongoDB (v5 or higher)
- Kafka (with Zookeeper)

Frontend Setup (Angular):

- Navigate to the `angular_go` directory.
  ```sh
  cd angular_go
  ```
- Install Angular dependencies.
  ```sh
  npm install
  ```
- Run the Angular development server.
  ```sh
  ng serve
  ```
- The application will be available at http://localhost:4200.

Backend Setup (Go and PostgreSQL):

- Ensure that PostgreSQL is installed and running. Create a database for the project.
- Update the PostgreSQL connection configuration in the backend directory (likely located in `config.go`).
  ```go
  const (
      Host     = "localhost"
      Port     = 5432
      User     = "your-username"
      Password = "your-password"
      Dbname   = "your-dbname"
  )
  ```
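For orientation, here is a minimal sketch of how constants like these could be assembled into a `database/sql` connection string. The keyword/value DSN shown is the `lib/pq` style; whether the project builds its connection string exactly this way is an assumption.

```go
package main

import "fmt"

// Connection constants as configured above (placeholder values).
const (
	Host     = "localhost"
	Port     = 5432
	User     = "your-username"
	Password = "your-password"
	Dbname   = "your-dbname"
)

// dsn builds a lib/pq-style keyword/value connection string, the kind
// that sql.Open("postgres", dsn()) accepts once a driver is imported.
func dsn() string {
	return fmt.Sprintf("host=%s port=%d user=%s password=%s dbname=%s sslmode=disable",
		Host, Port, User, Password, Dbname)
}

func main() {
	fmt.Println(dsn())
}
```

Keeping the DSN construction in one helper makes it easy to swap the placeholder constants for environment variables later.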
- Navigate to the `go_userlist` directory.
  ```sh
  cd go_userlist
  ```
- Install the required Go modules.
  ```sh
  go mod tidy
  ```
- Run database migrations (if applicable).
- Run the Go backend server.
  ```sh
  go run .
  ```
- The API server will be available at http://localhost:8080.

Kafka Setup:

- Ensure that Kafka and Zookeeper are installed and running.
- Start Zookeeper.
  ```sh
  zookeeper-server-start.sh config/zookeeper.properties
  ```
- Start Kafka.
  ```sh
  kafka-server-start.sh config/server.properties
  ```
- Create the Kafka topics.
  ```sh
  kafka-topics.sh --create --topic prism-user-create --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
  kafka-topics.sh --create --topic prism-user-delete --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
  kafka-topics.sh --create --topic prism-user-update --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
  ```

MongoDB Setup:

- Ensure that MongoDB is installed and running. Create three collections: `user-delete`, `user-new`, and `user-update`.
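One way to picture how the topics created above line up with these collections is a small routing helper. Note that the pairing below, in particular `prism-user-create` feeding `user-new`, is an assumption, not something taken from the project code.

```go
package main

import "fmt"

// collectionFor maps a Kafka topic to the MongoDB collection the
// consumer writes it to. The create→user-new pairing is assumed.
func collectionFor(topic string) string {
	switch topic {
	case "prism-user-create":
		return "user-new"
	case "prism-user-delete":
		return "user-delete"
	case "prism-user-update":
		return "user-update"
	default:
		return "" // unknown topic: nothing to store
	}
}

func main() {
	fmt.Println(collectionFor("prism-user-update")) // prints: user-update
}
```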

Kafka Consumer Setup (Go and MongoDB):

- Navigate to the `go_mongo_kafka` directory.
  ```sh
  cd go_mongo_kafka
  ```
- Update the MongoDB connection settings in `consumer.go`.
  ```go
  const (
      mongoURI   = "mongodb://localhost:27017"
      dbName     = "mydb"
      collection = "messages"
  )
  ```
- Install the required Go modules.
  ```sh
  go mod tidy
  ```
- Run the Kafka consumer to listen for messages and store them in MongoDB.
  ```sh
  go run .
  ```
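The consumer's per-message work can be sketched with the standard library alone: decode the raw Kafka message value as JSON into a generic document, which is the shape a MongoDB driver insert would take. The real module uses Kafka and MongoDB client libraries, and the payload fields here are hypothetical.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// parseEvent decodes a raw Kafka message value into a generic document.
// A map[string]interface{} preserves the message structure, which is
// what would then be inserted into MongoDB.
func parseEvent(value []byte) (map[string]interface{}, error) {
	var doc map[string]interface{}
	if err := json.Unmarshal(value, &doc); err != nil {
		return nil, err
	}
	return doc, nil
}

func main() {
	// A hypothetical user-update event payload.
	raw := []byte(`{"id": 7, "name": "Alice", "action": "update"}`)
	doc, err := parseEvent(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(doc["name"], doc["action"]) // prints: Alice update
}
```

Using a generic map rather than a fixed struct means the consumer keeps working even if the producers add fields, at the cost of type safety.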

Running the Application:

- Interact with the Angular frontend by performing actions that trigger API calls to the Go backend.
- The backend will publish events to Kafka.
- The Kafka consumer module will capture those messages and store them in MongoDB.

How the Kafka Consumer Works:

The Kafka consumer is a separate Go module that listens for messages on a specific topic (e.g., prism-user-update). Upon receiving a message, it parses the message and stores it in a MongoDB collection. This demonstrates an event-driven architecture and decouples message processing from the main application. The consumer:
- Connects to the Kafka broker and subscribes to a topic.
- Reads messages as they arrive.
- Processes each message and inserts it into MongoDB, preserving the message structure.
This separation allows for horizontal scalability where multiple consumers can be added to handle high message volumes, ensuring efficient processing and persistence of event data.
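That scaling idea can be shown in miniature with goroutines and a channel standing in for a partitioned topic; this is a toy model, not the project's consumer-group code.

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

// drain fans msgs out to n concurrent consumers and returns how many
// messages were processed in total. Each goroutine stands in for one
// instance of the Kafka consumer module.
func drain(msgs []string, n int) int64 {
	ch := make(chan string, len(msgs))
	for _, m := range msgs {
		ch <- m
	}
	close(ch)

	var processed int64
	var wg sync.WaitGroup
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for range ch {
				atomic.AddInt64(&processed, 1) // stand-in for "store in MongoDB"
			}
		}()
	}
	wg.Wait()
	return processed
}

func main() {
	msgs := make([]string, 100)
	for i := range msgs {
		msgs[i] = fmt.Sprintf("event-%d", i)
	}
	// Three consumers share the load; raising the count scales out.
	fmt.Println("processed:", drain(msgs, 3)) // prints: processed: 100
}
```

With real Kafka, the equivalent lever is running more consumer instances in the same consumer group, up to one per partition.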

Conclusion:

This project demonstrates the creation of a full-stack application leveraging Angular, Go, PostgreSQL, Kafka, and MongoDB, integrating both synchronous and asynchronous workflows. Kafka allows the application to scale, while MongoDB serves as a natural store for unstructured event data. The system can be expanded further with additional microservices and advanced features such as fault-tolerant processing and security hardening.