Article Information
Abstract
As the size of big data models grows, distributed training is emerging as an essential component of large-scale machine learning tasks. In this paper, we propose ParamHub for distributed data training. During training, each agent uses the provided data to adjust various conditions of the model's parameters, such as the model structure, learning algorithm, hyperparameters, and bias, aiming to minimize the error between the model's predictions and the actual values. Furthermore, the agents operate autonomously, collecting and updating data in a distributed environment, thereby reducing the load-balancing burden that arises in a centralized system. Through inter-agent communication, resource management and the learning process can be coordinated, enabling efficient management of distributed data and resources. This approach enhances the scalability and stability of distributed machine learning systems while providing the flexibility to be applied in various learning environments.
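The abstract's idea of autonomous agents that tune parameters locally and coordinate with peers instead of a central server can be illustrated with a minimal sketch. The class and method names below (`ParamHubAgent`, `local_step`, `synchronize`) are illustrative assumptions, not an API from the paper: each agent runs gradient steps on its own data, then averages parameters directly with its peers.

```python
import random

class ParamHubAgent:
    """Hypothetical sketch of a ParamHub-style agent (names are assumptions,
    not from the paper): each agent trains on local data and periodically
    averages its parameters with peers, with no central parameter server."""

    def __init__(self, agent_id, params, lr=0.1):
        self.agent_id = agent_id
        self.params = params   # model parameter (a single weight here)
        self.lr = lr           # a hyperparameter the agent could adjust
        self.peers = []        # other agents in the distributed system

    def local_step(self, x, y):
        # One gradient step for a linear model y ≈ w * x, minimizing the
        # squared error between the prediction and the actual value.
        pred = self.params * x
        grad = 2 * (pred - y) * x
        self.params -= self.lr * grad

    def synchronize(self):
        # Coordinate with peers by simple parameter averaging, standing in
        # for the inter-agent communication described in the abstract.
        total = self.params + sum(p.params for p in self.peers)
        self.params = total / (1 + len(self.peers))

# Two agents training on data drawn from y = 3x, then synchronizing.
a = ParamHubAgent("a", params=0.0)
b = ParamHubAgent("b", params=0.0)
a.peers, b.peers = [b], [a]
for _ in range(200):
    x = random.uniform(0.5, 1.5)
    a.local_step(x, 3 * x)
    b.local_step(x, 3 * x)
a.synchronize()
print(round(a.params, 1))  # → 3.0, the true weight
```

Averaging with peers rather than pushing updates to one coordinator is one simple way to realize the decentralized load-balancing benefit the abstract claims; the paper's actual protocol may differ.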
Table of Contents
1. INTRODUCTION
2. PROPOSED SYSTEM
2.1. System Component
2.2. Sequence Diagram
3. COMPARATIVE ANALYSIS
4. CONCLUSION
ACKNOWLEDGMENT
REFERENCES
