
Design of a ParamHub for Machine Learning in a Distributed Cloud Environment

Abstract

As big data models grow in size, distributed training is becoming essential for large-scale machine learning tasks. In this paper, we propose ParamHub for distributed data training. During training, the ParamHub agent uses the provided data to adjust the model's parameters and training conditions, such as the model structure, learning algorithm, hyperparameters, and bias, aiming to minimize the error between the model's predictions and the actual values. Furthermore, it operates autonomously, collecting and updating data in a distributed environment, thereby reducing the load-balancing burden that arises in a centralized system. Through communication between agents, resource management and learning processes can be coordinated, enabling efficient management of distributed data and resources. This approach enhances the scalability and stability of distributed machine learning systems while providing the flexibility to be applied in various learning environments.
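The abstract describes workers pushing parameter updates to hub agents rather than to one central server. A minimal sketch of that idea is given below; the class and method names (`ParamHub`, `submit_update`, `aggregate`) are illustrative assumptions, not the paper's actual implementation, and the aggregation shown is simple delta averaging.

```python
# Hypothetical ParamHub-style agent: workers push local parameter deltas to
# a hub, which aggregates them and applies the average to shared parameters.
# Names and structure are assumptions for illustration only.
from dataclasses import dataclass, field

@dataclass
class ParamHub:
    """Aggregates parameter updates from distributed worker agents."""
    params: dict = field(default_factory=dict)   # shared model parameters
    pending: list = field(default_factory=list)  # updates awaiting aggregation

    def submit_update(self, worker_id: str, update: dict) -> None:
        # Workers push deltas to their local hub instead of routing
        # everything through a single central server.
        self.pending.append((worker_id, update))

    def aggregate(self) -> dict:
        # Average the pending deltas per key and apply them to the
        # shared parameters, then clear the queue.
        if not self.pending:
            return self.params
        keys = self.pending[0][1].keys()
        for k in keys:
            delta = sum(u[k] for _, u in self.pending) / len(self.pending)
            self.params[k] = self.params.get(k, 0.0) + delta
        self.pending.clear()
        return self.params

hub = ParamHub()
hub.submit_update("worker-1", {"w": 0.25, "b": -0.5})
hub.submit_update("worker-2", {"w": 0.75, "b": 0.5})
print(hub.aggregate())  # averaged deltas applied to the shared parameters
```

In a real deployment, multiple such hubs would exchange aggregated state with each other, which is the agent-to-agent coordination the abstract refers to.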

Table of Contents

Abstract
1. INTRODUCTION
2. PROPOSED SYSTEM
2.1. System Component
2.2. Sequence Diagram
3. COMPARATIVE ANALYSIS
4. CONCLUSION
ACKNOWLEDGMENT
REFERENCES

Author Information

  • Su-Yeon Kim, Master's course student, Graduate School of Smart Convergence, Kwangwoon University, Seoul, Korea
  • Seok-Jae Moon, Professor, Graduate School of Smart Convergence, Kwangwoon University, Seoul, Korea
