Page 163 in Short-Term Load Forecasting by Artificial Intelligent Technologies

Energies 2018, 11, 2038

where T is the number of leaves in the tree with leaf weights w = (w_1, w_2, ..., w_T). Using the second-order Taylor expansion, (3) can be simplified to:

\tilde{L}^{(t)} = \sum_{i=1}^{n} \left[ g_i f_t(x_i) + \frac{1}{2} h_i f_t^2(x_i) \right] + \gamma T + \frac{1}{2} \lambda \sum_{j=1}^{T} w_j^2    (4)

where g_i = \partial_{\hat{y}^{(t-1)}} l(y_i, \hat{y}^{(t-1)}) and h_i = \partial^2_{\hat{y}^{(t-1)}} l(y_i, \hat{y}^{(t-1)}). Denoting by I_j = \{i \mid q(x_i) = j\} the instance set of leaf j, we can rewrite (4) as follows:

\tilde{L}^{(t)} = \sum_{j=1}^{T} \left[ \left( \sum_{i \in I_j} g_i \right) w_j + \frac{1}{2} \left( \sum_{i \in I_j} h_i + \lambda \right) w_j^2 \right] + \gamma T    (5)

Therefore, the optimal weight is given by:

w_j^* = - \frac{\sum_{i \in I_j} g_i}{\sum_{i \in I_j} h_i + \lambda}    (6)

and the corresponding optimal objective by:

\tilde{L}^{(t)}(q) = -\frac{1}{2} \sum_{j=1}^{T} \frac{\left( \sum_{i \in I_j} g_i \right)^2}{\sum_{i \in I_j} h_i + \lambda} + \gamma T    (7)

where q represents the optimal tree structure with T leaves and leaf weights w^* = (w_1^*, w_2^*, ..., w_T^*). Because it is impossible to enumerate all the possible tree structures q, a greedy algorithm is used (it starts with a single leaf and adds branches iteratively). Denoting by I_L and I_R the instance sets of the left and right nodes after the split, with I = I_L \cup I_R, the reduction in the objective after the split is given by:

L_{split} = \frac{1}{2} \left[ \frac{\left( \sum_{i \in I_L} g_i \right)^2}{\sum_{i \in I_L} h_i + \lambda} + \frac{\left( \sum_{i \in I_R} g_i \right)^2}{\sum_{i \in I_R} h_i + \lambda} - \frac{\left( \sum_{i \in I} g_i \right)^2}{\sum_{i \in I} h_i + \lambda} \right] - \gamma    (8)

The task of searching for the best split has been developed in two scenarios: an exact greedy algorithm (it enumerates all the possible splits on all the features, which is computationally demanding) and an approximate greedy algorithm for big datasets; see [37] for more details. The main difference between random forest and boosting is that the former builds the base learners independently through bootstrap sampling on the training dataset, while the latter obtains them sequentially, focusing on the errors of the previous iteration and using gradient descent methods. Some strengths of the XGBoost implementation compared to other methods are:

‱ An exact greedy algorithm is available.
‱ Approximate global and approximate local algorithms are available for big datasets.
‱ It performs parallel learning. Besides, an effective cache-aware block structure is available for out-of-core tree learning.
‱ It is efficient in the case of sparse input data (including the presence of missing values).

The extreme gradient boosting method (XGBoost) has been implemented by means of the R package "xgboost"; see [38]. Apart from its high computational efficiency, XGBoost offers great flexibility, but it requires setting more than ten parameters that cannot be learned from the data. Taking into account that the R package "xgboost" does not have any hyperparameter tuning, the parameter tuning can be
Title: Short-Term Load Forecasting by Artificial Intelligent Technologies
Authors: Wei-Chiang Hong, Ming-Wei Li, Guo-Feng Fan
Publisher: MDPI
Place: Basel
Date: 2019
Language: English
License: CC BY 4.0
ISBN: 978-3-03897-583-0
Dimensions: 17.0 x 24.4 cm
Pages: 448
Keywords: Scheduling Problems in Logistics, Transport, Timetabling, Sports, Healthcare, Engineering, Energy Management
Category: Computer Science