3.2.3. Estimating the Year-Ahead Consumption
The year-ahead load aims to utilize the trend of the annual electrical load by showing the power consumption of the same week of the previous year. However, the electrical load of the exact same week of the previous year is not always used, because the days of the week differ and popular Korean holidays are celebrated according to the lunar calendar. Every week of the year has a unique week number based on ISO-8601 [32]. As mentioned before, the average power consumption of all holidays or workdays of the week to which the prediction time belongs is calculated; depending on the year, one year comprises 52 or 53 weeks. If the prediction time belongs to the 53rd week, there is no corresponding week number in the previous year. To solve this problem, the power consumption of the 52nd week of the previous year is utilized, since the two weeks have similar external factors. In particular, electrical loads show very low consumption on special holidays such as the Lunar New Year holidays and Korean Thanksgiving days [35]. To capture this usage pattern, the average power consumption of the previous year's special holiday represents the year-ahead special holiday's load. Because the week number of a holiday can differ depending on the year, the year-ahead special-holiday power consumption cannot be represented directly using the week number of the holiday. This issue can be handled by exchanging the power consumption of the week with that of the week of the holiday in the previous year. Figure 3 shows an example of estimating the year-ahead consumption. If the current time is Monday of the 33rd week of 2016, we use the 33rd week's electrical load of the previous year. To estimate the year-ahead consumption of Sunday of the 33rd week, we use the average of the electrical loads of the holidays of the 33rd week of the previous year.
Figure 3. Example of estimating the year-ahead consumption.
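The week-alignment rule described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name year_ahead_week is hypothetical, and it assumes the standard Python ISO-calendar support, where 28 December always falls in the last ISO week of its year.

```python
import datetime


def year_ahead_week(target_date):
    """Return (year, ISO week number) of the previous year's week used
    as the year-ahead reference for the given date.

    If the target date falls in ISO week 53 and the previous year has
    only 52 weeks, week 52 of the previous year is used instead, since
    the two weeks share similar external factors.
    """
    iso_year, iso_week, _ = target_date.isocalendar()
    prev_year = iso_year - 1

    # Number of ISO weeks in the previous year (52 or 53):
    # 28 December always lies in the last ISO week of its year.
    weeks_in_prev_year = datetime.date(prev_year, 12, 28).isocalendar()[1]

    return prev_year, min(iso_week, weeks_in_prev_year)


# Example from Figure 3: Monday of the 33rd week of 2016 (15 August 2016)
# maps to week 33 of the previous year.
print(year_ahead_week(datetime.date(2016, 8, 15)))  # (2015, 33)
```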
3.2.4. Load Forecasting Based on LSTM Networks
A recurrent neural network (RNN) is a class of ANN where connections between units form a directed graph along a sequence. Unlike a feedforward neural network (FFNN), RNNs can use their internal state or memory to process input sequences [36]. RNNs can handle time series data in many applications, such as unsegmented, connected handwriting recognition or speech recognition [37]. However, RNNs have the problem that the gradient can become extremely small or large; these problems are called the vanishing gradient and exploding gradient problems. If the gradient is extremely small, RNNs cannot learn data with long-term dependencies. On the other hand, if the gradient is extremely large, it moves the RNN parameters far away and disrupts the learning process. To handle the vanishing gradient problem, previous studies [38,39] have proposed sophisticated RNN architectures. One successful model is long short-term memory (LSTM), which solves the RNN problem through a cell state and a unit called a cell with multiple gates. LSTM networks reflect the information learned from previous data onto later data as learning progresses with time. Therefore, they are suitable for time series data, such as electrical load data.
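As an illustration of the kind of LSTM network described here, the following is a minimal sketch rather than the authors' exact architecture. It assumes Keras (TensorFlow) and a hypothetical input layout of 24 past time steps with 5 features per step; the layer sizes and training settings are placeholders.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Hypothetical input shape: 24 past time steps, 5 features per step
# (e.g., past load plus day-ahead, week-ahead, and year-ahead estimates).
TIMESTEPS, FEATURES = 24, 5

model = Sequential([
    LSTM(64, input_shape=(TIMESTEPS, FEATURES)),  # cell state carries long-term dependencies
    Dense(1),                                     # next-step load forecast
])
model.compile(optimizer="adam", loss="mse")

# Dummy data standing in for the preprocessed load sequences.
X = np.random.rand(100, TIMESTEPS, FEATURES).astype("float32")
y = np.random.rand(100, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
```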