Energies 2018, 11, 2038
where $T$ is the number of leaves in the tree with leaf weights $w = (w_1, w_2, \ldots, w_T)$. Using the second-order Taylor expansion, (3) can be simplified to:

$$\tilde{L}^{(t)} = \sum_{i=1}^{n}\left[ g_i f_t(x_i) + \frac{1}{2} h_i f_t^2(x_i)\right] + \gamma T + \frac{1}{2}\lambda\sum_{j=1}^{T} w_j^2 \quad (4)$$

where $g_i = \partial_{\hat{y}^{(t-1)}}\, l\big(y_i, \hat{y}^{(t-1)}\big)$ and $h_i = \partial^2_{\hat{y}^{(t-1)}}\, l\big(y_i, \hat{y}^{(t-1)}\big)$.
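To make the quantities $g_i$ and $h_i$ in (4) concrete, the following sketch evaluates them for the illustrative choice of a squared-error loss $l(y, \hat{y}) = \frac{1}{2}(y - \hat{y})^2$, for which $g_i = \hat{y}_i^{(t-1)} - y_i$ and $h_i = 1$; the function name is hypothetical, not part of any library:

```python
# First- and second-order derivatives of the loss w.r.t. the previous
# prediction, as used in Eq. (4); shown for squared-error loss
# l(y, yhat) = 0.5 * (y - yhat)^2 (an illustrative choice).

def grad_hess_squared_error(y, y_pred_prev):
    """Return per-instance gradients g and Hessians h."""
    g = [yp - yt for yt, yp in zip(y, y_pred_prev)]  # g_i = yhat_i - y_i
    h = [1.0 for _ in y]                             # h_i = 1 for squared error
    return g, h

g, h = grad_hess_squared_error([1.0, 2.0, 3.0], [1.5, 1.5, 1.5])
print(g)  # [0.5, -0.5, -1.5]
print(h)  # [1.0, 1.0, 1.0]
```

Other differentiable losses plug into the same framework by supplying their own $g_i$ and $h_i$.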
Denoting by $I_j = \{i \mid q(x_i) = j\}$ the instance set of leaf $j$, we can rewrite (4) as follows:

$$\tilde{L}^{(t)} = \sum_{j=1}^{T}\left[\left(\sum_{i\in I_j} g_i\right) w_j + \frac{1}{2}\left(\sum_{i\in I_j} h_i + \lambda\right) w_j^2\right] + \gamma T \quad (5)$$
Therefore, the optimal weight is given by:

$$w_j^{*} = -\frac{\sum_{i\in I_j} g_i}{\sum_{i\in I_j} h_i + \lambda} \quad (6)$$
and the corresponding optimal objective by:

$$\tilde{L}^{(t)}(q) = -\frac{1}{2}\sum_{j=1}^{T}\frac{\left(\sum_{i\in I_j} g_i\right)^2}{\sum_{i\in I_j} h_i + \lambda} + \gamma T \quad (7)$$

where $q$ represents the optimal tree structure with $T$ leaves and leaf weights $w^{*} = \left(w_1^{*}, w_2^{*}, \ldots, w_T^{*}\right)$.
Due to the impossibility of enumerating all the possible tree structures $q$, a greedy algorithm is used (it starts with a single leaf and adds branches iteratively). Denoting by $I_L$ and $I_R$ the instance sets of the left and right nodes after the split, with $I = I_L \cup I_R$, the reduction in the objective after the split is given by:

$$L_{split} = \frac{1}{2}\left[\frac{\left(\sum_{i\in I_L} g_i\right)^2}{\sum_{i\in I_L} h_i + \lambda} + \frac{\left(\sum_{i\in I_R} g_i\right)^2}{\sum_{i\in I_R} h_i + \lambda} - \frac{\left(\sum_{i\in I} g_i\right)^2}{\sum_{i\in I} h_i + \lambda}\right] - \gamma \quad (8)$$
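The closed forms (6) and (8) reduce to simple arithmetic on the per-node sums $G = \sum g_i$ and $H = \sum h_i$. The following is a minimal sketch of these two formulas; the function names are hypothetical:

```python
# Optimal leaf weight (Eq. 6) and split gain (Eq. 8) from the per-node
# gradient/Hessian sums G = sum(g_i) and H = sum(h_i); lam and gamma are
# the regularisation parameters lambda and gamma of the objective.

def leaf_weight(G, H, lam):
    return -G / (H + lam)                     # Eq. (6)

def split_gain(GL, HL, GR, HR, lam, gamma):
    def score(G, H):
        return G * G / (H + lam)
    # Eq. (8): gain of splitting the parent node (GL+GR, HL+HR)
    # into a left child (GL, HL) and a right child (GR, HR).
    return 0.5 * (score(GL, HL) + score(GR, HR)
                  - score(GL + GR, HL + HR)) - gamma

print(leaf_weight(-2.0, 4.0, 1.0))           # -> 0.4
print(split_gain(2.0, 3.0, -2.0, 3.0, 1.0, 0.0))  # -> 1.0
```

A split is worth taking only when the gain (8) is positive, which is how the $\gamma$ term acts as a complexity penalty on adding leaves.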
The task of searching for the best split has been developed in two scenarios: an exact greedy algorithm (it enumerates all the possible splits on all the features, which is computationally demanding) and an approximate greedy algorithm for big datasets; see [37] for more details.
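The exact greedy scheme on a single feature amounts to sorting the instances by feature value and sweeping once over the sorted order, accumulating the left-hand sums and scoring every candidate threshold with the gain of Eq. (8). A minimal sketch under that reading (the real implementation in [37] adds sparsity handling and cache-aware blocking):

```python
# Exact greedy search for the best split on one feature: sort by feature
# value, sweep left to right accumulating (G_L, H_L), and score every
# split point with the gain of Eq. (8).

def best_split(x, g, h, lam=1.0, gamma=0.0):
    order = sorted(range(len(x)), key=lambda i: x[i])
    G, H = sum(g), sum(h)          # parent-node sums over I = I_L U I_R
    GL = HL = 0.0
    best = (float("-inf"), None)   # (gain, threshold)
    for rank, i in enumerate(order[:-1]):
        GL += g[i]
        HL += h[i]
        GR, HR = G - GL, H - HL
        gain = 0.5 * (GL * GL / (HL + lam) + GR * GR / (HR + lam)
                      - G * G / (H + lam)) - gamma
        # Candidate threshold halfway between consecutive sorted values.
        thr = 0.5 * (x[i] + x[order[rank + 1]])
        if gain > best[0]:
            best = (gain, thr)
    return best

gain, thr = best_split([1.0, 2.0, 3.0, 4.0], [1, 1, -1, -1], [1, 1, 1, 1])
print(thr)  # -> 2.5 (splits the positive gradients from the negative ones)
```

The exact algorithm repeats this sweep over every feature at every node, which is the source of the computational cost noted above.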
The main difference between random forest and boosting is that the former builds the base learners independently through bootstrap sampling on the training dataset, while the latter obtains them sequentially, focusing on the errors of the previous iteration and using gradient descent methods.
Some strengths of the XGBoost implementation compared to other methods are:
• An exact greedy algorithm is available.
• Approximate global and approximate local algorithms are available for big datasets.
• It performs parallel learning. In addition, an effective cache-aware block structure is available for out-of-core tree learning.
• It is efficient in the case of sparse input data (including the presence of missing values).
The extreme gradient boosting method (XGBoost) has been implemented by means of the R package "xgboost", see [38].
Apart from its high computational efficiency, XGBoost offers great flexibility, but it requires setting up more than ten parameters that cannot be learned from the data. Taking into account that the R package "xgboost" does not include any hyperparameter tuning, the parameter tuning can be
Short-Term Load Forecasting by Artificial Intelligent Technologies
- Title: Short-Term Load Forecasting by Artificial Intelligent Technologies
- Authors: Wei-Chiang Hong, Ming-Wei Li, Guo-Feng Fan
- Publisher: MDPI
- Place: Basel
- Date: 2019
- Language: English
- License: CC BY 4.0
- ISBN: 978-3-03897-583-0
- Dimensions: 17.0 x 24.4 cm
- Pages: 448
- Keywords: Scheduling Problems in Logistics, Transport, Timetabling, Sports, Healthcare, Engineering, Energy Management
- Category: Computer Science