Package: gradDescent
Maintainer: Lala Septem Riza <lala.s.riza@upi.edu>
Type: Package
Title: Gradient Descent for Regression Tasks
Version: 3.0
URL: https://github.com/drizzersilverberg/gradDescentR
Date: 2018-01-03
Author: Galih Praja Wijaya, Dendi Handian, Imam Fachmi Nasrulloh, Lala Septem Riza, Rani Megasari, Enjun Junaeti
Description: An implementation of various learning algorithms based on Gradient Descent for dealing with regression tasks. 
	The variants of the gradient descent algorithm implemented are:
	Mini-Batch Gradient Descent (MBGD), which computes each update on a small batch of the training data to reduce the computation load.
	Stochastic Gradient Descent (SGD), which computes each update on a single randomly chosen training example to reduce the computation load drastically.
	Stochastic Average Gradient (SAG), which is an SGD-based algorithm that steps along an average of stored per-example gradients to reduce the variance of the stochastic steps.
	Momentum Gradient Descent (MGD), which adds a momentum term to speed up gradient descent learning.
	Accelerated Gradient Descent (AGD), which is an optimization that accelerates gradient descent learning.
	Adagrad, which is a gradient-descent-based algorithm that accumulates past squared gradients to adapt the learning rate per parameter.
	Adadelta, which is a gradient-descent-based algorithm that uses a Hessian approximation (decaying averages of past squared gradients and updates) for adaptive learning.
	RMSprop, which is a gradient-descent-based algorithm that combines the adaptive learning abilities of Adagrad and Adadelta.
	Adam, which is a gradient-descent-based algorithm that uses estimates of the mean (first moment) and uncentered variance (second moment) of the gradients for adaptive learning.
	Stochastic Variance Reduced Gradient (SVRG), which is an SGD-based optimization that accelerates convergence by reducing the variance of the gradient estimates.
	Semi Stochastic Gradient Descent (SSGD), which is an SGD-based algorithm that combines GD and SGD, choosing one of the gradients at a time, to accelerate convergence.
	Stochastic Recursive Gradient Algorithm (SARAH), which is an optimization algorithm similar to SVRG that accelerates convergence by recursively accumulating stochastic gradient information.
	Stochastic Recursive Gradient Algorithm+ (SARAHPlus), which is a practical variant of SARAH that accelerates convergence and provides a possibility of early termination.
License: GPL (>= 2) | file LICENSE
RoxygenNote: 6.0.1
NeedsCompilation: no
Packaged: 2018-01-23 14:06:03 UTC; GalihPW
Repository: CRAN
Date/Publication: 2018-01-25 13:33:54 UTC
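
As a hedged illustration of the update these variants build on, the sketch below implements plain batch gradient descent and the Adam variant for linear regression in base R. All names here (gd_lm, adam_lm, alpha, maxIter) are assumptions made for this sketch, not part of the gradDescent API; see the package documentation for its exported functions.

# Minimal sketch: batch gradient descent for linear regression
# (illustrative only, not the gradDescent API).
gd_lm <- function(X, y, alpha = 0.5, maxIter = 2000) {
  X <- cbind(1, as.matrix(X))                           # add intercept column
  theta <- rep(0, ncol(X))                              # initialize coefficients
  n <- nrow(X)
  for (i in seq_len(maxIter)) {
    grad <- as.vector(t(X) %*% (X %*% theta - y)) / n   # gradient of mean squared error
    theta <- theta - alpha * grad                       # descent step
  }
  theta
}

# Adam variant: adapts each step using estimates of the mean (first moment)
# and uncentered variance (second moment) of the gradients.
adam_lm <- function(X, y, alpha = 0.01, beta1 = 0.9, beta2 = 0.999,
                    eps = 1e-8, maxIter = 5000) {
  X <- cbind(1, as.matrix(X))
  theta <- rep(0, ncol(X))
  n <- nrow(X)
  m <- v <- rep(0, ncol(X))
  for (t in seq_len(maxIter)) {
    grad <- as.vector(t(X) %*% (X %*% theta - y)) / n
    m <- beta1 * m + (1 - beta1) * grad                 # mean (first moment) estimate
    v <- beta2 * v + (1 - beta2) * grad^2               # variance (second moment) estimate
    mhat <- m / (1 - beta1^t)                           # bias-corrected moments
    vhat <- v / (1 - beta2^t)
    theta <- theta - alpha * mhat / (sqrt(vhat) + eps)  # adaptive step
  }
  theta
}

# Synthetic check: y ~ 2 + 3x plus noise; both should return roughly c(2, 3).
set.seed(1)
x <- runif(100)
y <- 2 + 3 * x + rnorm(100, sd = 0.1)
gd_lm(x, y)
adam_lm(x, y)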
