000 04180nam a22005535i 4500
001 978-981-19-8934-6
003 DE-He213
005 20240207153726.0
007 cr nn 008mamaa
008 230307s2023 si | s |||| 0|eng d
020 _a9789811989346
_9978-981-19-8934-6
050 4 _aQ336
072 7 _aUN
_2bicssc
072 7 _aCOM031000
_2bisacsh
072 7 _aUN
_2thema
082 0 4 _a005.7
_223
100 1 _aWu, Hao.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
245 1 0 _aDynamic Network Representation Based on Latent Factorization of Tensors
_h[electronic resource] /
_cby Hao Wu, Xuke Wu, Xin Luo.
250 _a1st ed. 2023.
264 1 _aSingapore :
_bSpringer Nature Singapore :
_bImprint: Springer,
_c2023.
300 _aVIII, 80 p. 20 illus., 16 illus. in color.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
490 1 _aSpringerBriefs in Computer Science,
_x2191-5776
500 _aMulti-user access
505 0 _aChapter 1 Introduction -- Chapter 2 Multiple Biases-Incorporated Latent Factorization of Tensors -- Chapter 3 PID-Incorporated Latent Factorization of Tensors -- Chapter 4 Diverse Biases Nonnegative Latent Factorization of Tensors -- Chapter 5 ADMM-Based Nonnegative Latent Factorization of Tensors -- Chapter 6 Perspectives and Conclusion.
520 _aA dynamic network is frequently encountered in various real industrial applications, such as the Internet of Things. It is composed of numerous nodes and large-scale dynamic real-time interactions among them, where each node indicates a specified entity, each directed link indicates a real-time interaction, and the strength of an interaction can be quantified as the weight of a link. As the number of involved nodes increases drastically, it becomes impossible to observe their full interactions at each time slot, making the resultant dynamic network High-Dimensional and Incomplete (HDI). Despite its HDI nature, a dynamic network with directed and weighted links contains rich knowledge regarding the involved nodes' various behavior patterns. Therefore, it is essential to study how to build efficient and effective representation learning models for acquiring useful knowledge. In this book, we first model a dynamic network as an HDI tensor and present the basic latent factorization of tensors (LFT) model. Then, we propose four representative LFT-based network representation methods. The first method integrates the short-time bias, long-time bias, and preprocessing bias to precisely represent the volatility of network data. The second method utilizes a proportional-integral-derivative (PID) controller to construct an adjusted instance error and thereby achieve a higher convergence rate. The third method accounts for the non-negativity of fluctuating network data by constraining latent features to be non-negative and incorporating an extended linear bias. The fourth method adopts an alternating direction method of multipliers (ADMM) framework to build a learning model that represents dynamic networks with high precision and efficiency.
541 _fUABC ;
_cPerpetuity
650 0 _aArtificial intelligence
_xData processing.
650 0 _aQuantitative research.
650 1 4 _aData Science.
650 2 4 _aData Analysis and Big Data.
700 1 _aWu, Xuke.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
700 1 _aLuo, Xin.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
710 2 _aSpringerLink (Online service)
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9789811989339
776 0 8 _iPrinted edition:
_z9789811989353
830 0 _aSpringerBriefs in Computer Science,
_x2191-5776
856 4 0 _zElectronic book
_uhttp://libcon.rec.uabc.mx:2048/login?url=https://doi.org/10.1007/978-981-19-8934-6
912 _aZDB-2-SCS
912 _aZDB-2-SXCS
942 _cLIBRO_ELEC
999 _c262952
_d262951