Search results for: resistor network
2828 Multi-Agent System Based Distributed Voltage Control in Distribution Systems
Authors: A. Arshad, M. Lehtonen, M. Humayun
Abstract:
With increasing Distributed Generation (DG) penetration, distribution systems are advancing towards smart grid technology to tackle the voltage control problem in a distributed manner with minimal latency. This paper proposes a multi-agent-based distributed voltage control. The method uses a flat architecture of agents; the agents involved in the controlling procedure are the On Load Tap Changer Agent (OLTCA), the Static VAR Compensator Agent (SVCA), and the agents associated with DGs and loads at their locations. The objectives of the proposed voltage control model are to minimize network losses and DG curtailments while maintaining voltage within statutory limits and as close as possible to the nominal value. The total loss cost is the sum of the network losses cost, the DG curtailment costs, and the voltage damage cost (based on a penalty function). The total cost is iteratively calculated for various stricter limits by plotting the voltage damage cost and the losses cost against a varying voltage limit band. The method provides the optimal limits closest to the nominal value with minimum total loss cost. In order to achieve the objective of voltage control, the whole network is divided into multiple control regions, each downstream of its controlling device. The OLTCA behaves as a supervisory agent and performs all the optimizations. At each time step, a token is generated by the OLTCA and transferred from node to node until a node with a voltage violation is detected. Upon detection of such a node, the token grants permission to the Load Agent (LA) to initiate possible remedial actions. The LA contacts the respective controlling devices depending on the vicinity of the violated node. If the violated node does not lie in the vicinity of a controller, or the controlling capabilities of all the downstream control devices are at their limits, then the OLTC is considered as a last resort. For a realistic study, simulations are performed for a typical Finnish residential medium-voltage distribution system using MATLAB®. These simulations are executed for two cases: simple Distributed Voltage Control (DVC) and DVC with optimized loss cost (DVC + Penalty Function). A sensitivity analysis is performed based on DG penetration. The results indicate that the costs of losses and DG curtailments are directly proportional to DG penetration, while in case 2 there is a significant reduction in total loss cost. For lower DG penetration, losses are reduced by roughly 50%, while for higher DG penetration, the loss reduction is not very significant. Another observation is that the stricter limits calculated by cost optimization move towards the statutory limits of ±10% of nominal as DG penetration increases: for 25, 45 and 65% penetration, the calculated limits are ±5%, ±6.25% and ±8.75%, respectively. The observed results conclude that the novel voltage control algorithm proposed in case 1 is able to deal with the voltage control problem instantly but with higher losses. In contrast, case 2 gradually reduces network losses over time through the proposed iterative loss cost optimization performed by the OLTCA.
Keywords: distributed voltage control, distribution system, multi-agent systems, smart grids
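A minimal sketch of the penalty-based total loss cost described above, assuming made-up cost coefficients and a per-unit voltage band; the function name and weights are illustrative, not taken from the paper:

```python
# Hypothetical sketch of the total loss cost: network loss cost plus DG
# curtailment cost plus a quadratic penalty on voltage-limit violations.
# All names and coefficients are illustrative assumptions.

def total_loss_cost(p_loss_kw, p_curtailed_kw, voltages_pu,
                    v_limit=0.05, c_loss=0.1, c_curt=0.08, c_penalty=50.0):
    """Return the total cost for one time step."""
    loss_cost = c_loss * p_loss_kw
    curtailment_cost = c_curt * p_curtailed_kw
    # Penalty grows with the squared excursion outside the (1 ± v_limit) p.u. band.
    damage_cost = c_penalty * sum(
        max(0.0, abs(v - 1.0) - v_limit) ** 2 for v in voltages_pu
    )
    return loss_cost + curtailment_cost + damage_cost

# Example: sweep candidate voltage bands and keep the cheapest one.
costs = {band: total_loss_cost(120.0, 15.0, [0.97, 1.04, 1.08], v_limit=band)
         for band in (0.05, 0.0625, 0.0875, 0.10)}
best_band = min(costs, key=costs.get)
```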
Procedia PDF Downloads 312
2827 Analyzing the Commentator Network Within the French YouTube Environment
Authors: Kurt Maxwell Kusterer, Sylvain Mignot, Annick Vignes
Abstract:
To the best of our knowledge, YouTube is the largest video hosting platform in the world. A large number of creators, viewers, subscribers and commentators act in this specific ecosystem, which generates huge sums of money. Views, subscribers, and comments help to increase the popularity of content creators. The most popular creators are sponsored by brands and participate in marketing campaigns. For a few of them, this becomes a financially rewarding profession. This is made possible through the YouTube Partner Program, which shares revenue among creators based on their popularity. We believe that the role of comments in increasing popularity deserves emphasis. In what follows, YouTube is considered as a bilateral network between videos and commentators. Analyzing a detailed data set focused on French YouTubers, we consider each comment as a link between a commentator and a video. Our research question asks which features of a video give it the highest probability of being commented on. Following on from this question, how can we use these features to predict the action of an agent commenting on one video instead of another, considering the characteristics of the commentators, videos, topics, channels, and recommendations? We expect to see that the videos of more popular channels generate higher viewer engagement and are thus more frequently commented on. The interest lies in discovering features which have not classically been considered as markers for popularity on the platform. A quick view of our data set shows that 96% of the commentators comment only once on a certain video. Thus, we study a non-weighted bipartite network between commentators and videos built on the sub-sample of 96% of unique comments. A link exists between two nodes when a commentator makes a comment on a video. We run an Exponential Random Graph Model (ERGM) approach to evaluate which characteristics influence the probability of commenting on a video. The creation of a link will be explained in terms of common video features, such as duration, quality, number of likes, number of views, etc. Our data cover the period 2020-2021 and focus on the French YouTube environment. From this set of 391 588 videos, we extract the channels which can be monetized according to YouTube regulations (channels with at least 1000 subscribers and more than 4000 hours of viewing time during the last twelve months). In the end, we have a data set of 128 462 videos from 4093 channels. Based on these videos, we have a data set of 1 032 771 unique commentators, with a mean of 2 comments per commentator, a minimum of 1 comment each, and a maximum of 584 comments.
Keywords: YouTube, social networks, economics, consumer behaviour
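The ERGM estimation itself is usually carried out with dedicated statistical packages (e.g., statnet/ergm in R); the sketch below only illustrates, with assumed toy data, how the non-weighted commentator-video bipartite network described above could be assembled with networkx before such an analysis:

```python
# Illustrative sketch (not the authors' code) of building the commentator-video
# bipartite graph from unique comments and computing simple descriptive statistics.
import networkx as nx

comments = [  # (commentator_id, video_id) pairs, one row per unique comment
    ("u1", "v1"), ("u2", "v1"), ("u2", "v2"), ("u3", "v3"),
]

G = nx.Graph()
G.add_nodes_from({u for u, _ in comments}, bipartite="commentator")
G.add_nodes_from({v for _, v in comments}, bipartite="video")
G.add_edges_from(comments)  # a link = one commentator commented on one video

video_nodes = [n for n, d in G.nodes(data=True) if d["bipartite"] == "video"]
video_degree = {v: G.degree(v) for v in video_nodes}  # comments per video
print(video_degree)
```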
Procedia PDF Downloads 69
2826 Development of Fault Diagnosis Technology for Power System Based on Smart Meter
Authors: Chih-Chieh Yang, Chung-Neng Huang
Abstract:
In power systems, improving the fault diagnosis technology for transmission lines has always been a primary goal of power grid operators. In recent years, due to the rise of green energy, the addition of all kinds of distributed power sources also has an impact on the stability of the power system. Smart meters provide data recording and bidirectional transmission, while the Adaptive Neuro-Fuzzy Inference System (ANFIS) offers the learning and estimation capabilities of artificial intelligence. For the transmission network, in order to avoid misjudging the fault type and location due to the input of these unstable power sources, this study combines the above advantages of smart meters and ANFIS and proposes a method for identifying fault types and fault locations. In ANFIS training, the bus voltage and current information collected by smart meters can be trained through the ANFIS tool in MATLAB to generate fault codes that identify different types of faults and the location of faults. In addition, due to the uncertainty of distributed generation, a wind power system is added to the transmission network to verify the diagnostic correctness of the study. Simulation results show that the method proposed in this study can correctly and efficiently identify the fault type and location of faults, and can deal with the interference caused by the addition of unstable power sources.
Keywords: ANFIS, fault diagnosis, power system, smart meter
Procedia PDF Downloads 140
2825 Water Body Detection and Estimation from Landsat Satellite Images Using Deep Learning
Authors: M. Devaki, K. B. Jayanthi
Abstract:
The identification of water bodies from satellite images has recently received a great deal of attention. Different methods have been developed to distinguish water bodies from various satellite images that vary in terms of time and space. Urban water body identification arises in numerous applications that demand a great deal of certainty. There has been a sharp rise in the usage of satellite images to map natural resources, including urban water bodies and forests, during the past several years. This is because water and forest resources depend on each other so heavily that ongoing monitoring of both is essential to their sustainable management. The relevant elements from satellite pictures have been chosen using a variety of techniques, including machine learning. Then, a convolutional neural network (CNN) architecture is created that classifies each superpixel from a complex metropolitan scene into one of two classes: containing water or not. The deep learning technique, CNN, has advanced tremendously in a variety of visual-related tasks. CNN can improve classification performance by capturing the spectral-spatial regularities of the input data and extracting deep features hierarchically from raw pictures. The water body area is then estimated using the satellite image's resolution. Experimental results demonstrate that the suggested method outperformed conventional approaches in terms of water extraction accuracy from remote-sensing images, with an average overall accuracy of 97%.
Keywords: water body, deep learning, satellite images, convolutional neural network
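A minimal, illustrative Keras sketch of a binary water/no-water superpixel classifier; the patch size, channel count, and layer sizes are assumptions rather than the architecture used in the study:

```python
# Illustrative binary "water / no water" superpixel-patch classifier (assumed
# architecture, not the paper's model).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_superpixel_cnn(patch_size=32, channels=3):
    model = models.Sequential([
        layers.Input(shape=(patch_size, patch_size, channels)),
        layers.Conv2D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # water vs. non-water
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_superpixel_cnn()
model.summary()
```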
Procedia PDF Downloads 90
2824 Using Geospatial Analysis to Reconstruct the Thunderstorm Climatology for the Washington DC Metropolitan Region
Authors: Mace Bentley, Zhuojun Duan, Tobias Gerken, Dudley Bonsal, Henry Way, Endre Szakal, Mia Pham, Hunter Donaldson, Chelsea Lang, Hayden Abbott, Leah Wilcynzski
Abstract:
Air pollution has the potential to modify the lifespan and intensity of thunderstorms and the properties of lightning. Using data mining and geovisualization, we investigate how background climate and weather conditions shape variability in urban air pollution and how this, in turn, shapes thunderstorms as measured by the intensity, distribution, and frequency of cloud-to-ground lightning. A spatiotemporal analysis was conducted in order to identify thunderstorms using high-resolution lightning detection network data. Over seven million lightning flashes were used to identify more than 196,000 thunderstorms that occurred between 2006 and 2020 in the Washington, DC Metropolitan Region. Each lightning flash in the dataset was grouped into thunderstorm events by means of a temporal and spatial clustering algorithm. Once the thunderstorm event database was constructed, hourly wind direction, wind speed, and atmospheric thermodynamic data were added to the initiation and dissipation times and locations for the 196,000 identified thunderstorms. Hourly aerosol and air quality data for the thunderstorm initiation times and locations were also incorporated into the dataset. Developing thunderstorm climatologies using a lightning tracking algorithm and lightning detection network data was found to be useful for visualizing the spatial and temporal distribution of urban augmented thunderstorms in the region.
Keywords: lightning, urbanization, thunderstorms, climatology
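The abstract does not specify the clustering algorithm beyond "temporal and spatial"; the sketch below uses DBSCAN on scaled space-time coordinates as one plausible stand-in, with made-up flash records and thresholds:

```python
# Illustrative grouping of lightning flashes into thunderstorm events by
# clustering in space and time (stand-in algorithm and thresholds).
import numpy as np
from sklearn.cluster import DBSCAN

# columns: latitude (deg), longitude (deg), time (seconds since start of day)
flashes = np.array([
    [38.90, -77.04, 54000], [38.91, -77.03, 54120], [38.92, -77.05, 54300],
    [39.20, -76.60, 61200], [39.21, -76.61, 61350],
])

# Scale so that ~0.1 degree of distance and ~10 minutes of time are comparable.
scaled = np.column_stack([flashes[:, 0] / 0.1,
                          flashes[:, 1] / 0.1,
                          flashes[:, 2] / 600.0])

labels = DBSCAN(eps=1.5, min_samples=2).fit_predict(scaled)
print(labels)  # flashes sharing a label form one thunderstorm event
```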
Procedia PDF Downloads 77
2823 A Study of the Establishment of the Evaluation Index System for Tourist Attraction Disaster Resilience
Authors: Chung-Hung Tsai, Ya-Ping Li
Abstract:
The tourism industry is highly dependent on the natural environment and climate. Compared to other industries, it is more susceptible to environmental and climatic conditions. Taiwan is an island country located in the subtropical monsoon zone, and climate variability and frequent raging typhoons and heavy rainfall regularly cause serious disasters. Traditional disaster assessment usually focuses on disaster damage and risk assessment, and lacks the industry-specific features needed to understand the impact of the restoring force in post-disaster resilience and the main factors that constitute resilience. This study is based on the disaster recovery experience of tourist areas and aims to understand the main factors affecting their disaster resilience. A combination of literature review and interviews with experts is used to prepare a preliminary indicator system of disaster resilience. It is then screened through a Fuzzy Delphi Method and the Analytic Network Process for weight analysis. Finally, this study establishes a tourism disaster resilience evaluation index system that considers the characteristics of Taiwan's tourism industry. We hope this will enhance the disaster resilience of tourist areas and increase the sustainability of industrial development. It is also expected to provide guidance to government departments and the tourism industry, as the future owners of the assets, in responding to extreme climate events.
Keywords: resilience, Fuzzy Delphi Method, Analytic Network Process, industrial development
Procedia PDF Downloads 409
2822 A Framework for Chinese Domain-Specific Distant Supervised Named Entity Recognition
Abstract:
Knowledge graphs have become a new form of knowledge representation. However, there is no consensus on a plausible definition of entities and relationships in domain-specific knowledge graphs. Further, owing to several limitations and deficiencies, various approaches to recognizing domain-specific entities and relationships are far from perfect. Specifically, named entity recognition in Chinese domains is a critical task for natural language processing applications. However, a bottleneck problem with Chinese named entity recognition in new domains is the lack of annotated data. To address this challenge, a domain-specific distant supervised named entity recognition framework is proposed. The framework is divided into two stages: first, the distantly supervised corpus is generated based on an entity linking model built on a graph attention neural network; second, the generated corpus is used to train the distantly supervised named entity recognition model, which then extracts named entities. The linking model is verified on the CCKS2019 entity linking corpus, and its F1 value is 2% higher than that of the benchmark method. A re-pre-trained BERT language model is added to the benchmark method, and the results show that it is more suitable for distantly supervised named entity recognition tasks. Finally, the framework is applied in the computer domain, and the results show that it can obtain domain named entities.
Keywords: distant named entity recognition, entity linking, knowledge graph, graph attention neural network
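As a simplified stand-in for the corpus-generation stage, the sketch below produces distantly supervised character-level BIO labels by matching a small assumed entity dictionary against Chinese text; the paper's actual approach uses a graph-attention entity linking model for this step:

```python
# Simplified distant supervision for NER: sentences are weakly labelled by
# matching an assumed knowledge-graph entity dictionary (illustration only).
entity_dict = {"知识图谱": "CONCEPT", "BERT": "MODEL"}  # assumed KG entities

def weak_label(sentence: str):
    """Return (character, BIO-tag) pairs produced by dictionary matching."""
    labels = ["O"] * len(sentence)
    for entity, etype in entity_dict.items():
        start = sentence.find(entity)
        while start != -1:
            labels[start] = f"B-{etype}"
            for i in range(start + 1, start + len(entity)):
                labels[i] = f"I-{etype}"
            start = sentence.find(entity, start + len(entity))
    return list(zip(sentence, labels))

print(weak_label("BERT可以用于知识图谱构建"))
```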
Procedia PDF Downloads 95
2821 Sexual Cognitive Behavioral Therapy: Psychological Performance and Openness to Experience
Authors: Alireza Monzavi Chaleshtari, Mahnaz Aliakbari Dehkordi, Amin Asadi Hieh, Majid Kazemnezhad
Abstract:
This research was conducted with the aim of determining the effectiveness of sexual cognitive behavioral therapy on psychological performance and openness to experience in women. The research was experimental, with a pre-test/post-test design. The statistical population consisted of all working, married women who are members of the researcher's Instagram social network and have problems in marital-sexual relationships (N=900). From this population, 30 women were selected by convenience sampling and randomly assigned to two groups (15 in the experimental group and 15 in the control group). The anxiety, stress, and depression scale (DASS) and the Costa and McCrae personality questionnaire were used to collect data, and the cognitive behavioral therapy protocol of Dr. Mehrnaz Ali Akbari was used for the treatment sessions. To analyze the data, analysis of covariance was performed in the SPSS22 software environment. The results showed that sexual cognitive behavioral therapy has a positive and significant effect on psychological performance and openness to experience in women. Conclusion: it can be concluded that interventions such as cognitive-behavioral sex therapy can be used to treat marital problems.
Keywords: sexual cognitive behavioral therapy, psychological function, openness to experience, women
Procedia PDF Downloads 78
2820 Preprocessing and Fusion of Multiple Representation of Finger Vein Patterns Using Conventional and Machine Learning Techniques
Authors: Tomas Trainys, Algimantas Venckauskas
Abstract:
The application of biometric features to cryptography for human identification and authentication is a widely studied and promising area in the development of high-reliability cryptosystems. Biometric cryptosystems are typically designed for pattern recognition: biometric data are acquired from an individual, feature sets are extracted and compared against the set stored in the vault, and a result of the comparison is returned. Preprocessing and fusion of biometric data are the most important phases in generating a feature vector for key generation or authentication. Fusion of biometric features is critical for achieving a higher level of security and helps prevent possible spoofing attacks. The paper focuses on the tasks of initial processing and fusion of multiple representations of finger vein modality patterns. These tasks are solved by applying conventional image preprocessing methods and machine learning techniques, in particular the Support Vector Machine (SVM) method, for image segmentation and feature extraction. The article presents a method for generating sets of biometric features from a finger vein network using several instances of the same modality. Extracted feature sets were fused at the feature level. The proposed method was tested and compared with the performance and accuracy results of other authors.
Keywords: bio-cryptography, biometrics, cryptographic key generation, data fusion, information security, SVM, pattern recognition, finger vein method
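A small illustrative sketch (not the authors' pipeline) of feature-level fusion, where feature vectors from several finger-vein instances of the same subject are concatenated and classified with an SVM; the synthetic features are placeholders:

```python
# Feature-level fusion by concatenation, followed by SVM classification.
# The random feature vectors stand in for real finger-vein descriptors.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def fuse(feature_vectors):
    """Feature-level fusion by simple concatenation of per-instance vectors."""
    return np.concatenate(feature_vectors)

# Fake data: 20 subjects x 5 samples, each fused from 3 instances of 64-dim features.
X = np.array([fuse([rng.normal(loc=s, size=64) for _ in range(3)])
              for s in range(20) for _ in range(5)])
y = np.repeat(np.arange(20), 5)

clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```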
Procedia PDF Downloads 152
2819 Emerging Cyber Threats and Cognitive Vulnerabilities: Cyberterrorism
Authors: Oludare Isaac Abiodun, Esther Omolara Abiodun
Abstract:
The purpose of this paper is to demonstrate that cyberterrorism exists and poses a threat to computer security and national security. Nowadays, people have become heavily dependent upon computers, phones, the Internet, and Internet of Things systems to share information, communicate, conduct searches, etc. However, these network systems are at risk from different sources, both known and unknown. These risks are caused by malicious individuals, groups, organizations, or governments that take advantage of vulnerabilities in computer systems to harvest sensitive information from people, organizations, or governments. In doing so, they engage in computer threats, crime, and terrorism, thereby making the use of computers insecure for others. The threat of cyberterrorism takes various forms and varies from one country to another. These threats include disrupting communications and information, stealing data, destroying data, leaking and breaching data, interfering with messages and networks, and, in some cases, demanding financial rewards for stolen data. Hence, this study identifies many ways in which cyberterrorists utilize the Internet as a tool to advance their malicious mission, which negatively affects computer security and safety. It also identifies causes of disparate anomalous behaviors and the theoretical, ideological, and current forms of cyberterrorism and their likelihood. Therefore, as a countermeasure, this paper proposes the use of previous and current computer security models found in the literature to help in countering cyberterrorism.
Keywords: cyberterrorism, computer security, information, internet, terrorism, threat, digital forensic solution
Procedia PDF Downloads 98
2818 Multi-Agent Searching Adaptation Using Levy Flight and Inferential Reasoning
Authors: Sagir M. Yusuf, Chris Baber
Abstract:
In this paper, we describe how to achieve knowledge understanding and prediction (Situation Awareness (SA)) for multiple agents conducting a searching activity using Bayesian inferential reasoning and learning. A Bayesian belief network was used to monitor the agents' knowledge about their environment, and cases were recorded for network training using the expectation-maximisation or gradient descent algorithm. The trained network is then used for decision making and environmental situation prediction. Forest fire searching by multiple UAVs was the use case: UAVs are tasked to explore a forest and find a fire for urgent action by the fire wardens. The paper focuses on two problems: (i) an effective path planning strategy for the agents and (ii) knowledge understanding and prediction (SA). The path planning problem, inspired by the animal mode of foraging and using a Lévy distribution augmented with Bayesian reasoning, is fully described in this paper. Results show that the Lévy flight strategy performs better than previous fixed-pattern approaches (e.g., parallel sweeps) in terms of energy and time utilisation. We also introduce a waypoint assessment strategy called k-previous waypoints assessment. It improves the performance of the ordinary Lévy flight by saving the agents' resources and mission time through redundant search avoidance. The agents (UAVs) report their mission knowledge to a central server for interpretation and prediction purposes. Bayesian reasoning and learning were used for SA, and the results demonstrate effectiveness in different environment scenarios in terms of prediction and effective knowledge representation. The prediction accuracy was measured using the learning error rate, logarithmic loss, and Brier score, and the results show that even limited agent mission data can be used for prediction within the same or a different environment. Finally, we describe a situation-based knowledge visualization and prediction technique for heterogeneous multi-UAV missions. While this paper demonstrates the linkage of Bayesian reasoning and learning with SA and an effective searching strategy, future work will focus on simplifying the architecture.
Keywords: Lévy flight, distributed constraint optimization problem, multi-agent system, multi-robot coordination, autonomous system, swarm intelligence
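An illustrative sketch of Lévy-flight waypoint generation for a searching agent, with heavy-tailed step lengths drawn by inverse-transform sampling; the exponent and step scale are assumed values, not the authors' parameters:

```python
# Lévy-flight waypoint generation: power-law step lengths, uniform headings.
import numpy as np

rng = np.random.default_rng(42)

def levy_flight(n_steps=100, mu=1.8, step_min=1.0, start=(0.0, 0.0)):
    """Return an (n_steps+1, 2) array of positions following a Lévy flight.

    Step lengths follow p(l) ~ l^(-mu) for l >= step_min (1 < mu <= 3).
    """
    pos = [np.array(start, dtype=float)]
    for _ in range(n_steps):
        # Inverse-transform sampling of the power-law step length.
        u = rng.uniform()
        length = step_min * (1.0 - u) ** (-1.0 / (mu - 1.0))
        theta = rng.uniform(0.0, 2.0 * np.pi)
        pos.append(pos[-1] + length * np.array([np.cos(theta), np.sin(theta)]))
    return np.array(pos)

waypoints = levy_flight()
print(waypoints[:5])
```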
Procedia PDF Downloads 145
2817 The Analysis of Internet and Social Media Behaviors of the Students in Vocational High School
Authors: Mehmet Balci, Sakir Tasdemir, Mustafa Altin, Ozlem Bozok
Abstract:
Our globalizing world has almost become a small village, and everyone can access any information at any time. People let each other know who is doing what and where, and we can learn which social events are occurring anywhere in the world. From the perspective of education, the course notes that a lecturer uses at a university in any state of America can be examined by a student studying in a city in Africa or the Far East. This dizzying communication has become possible thanks to fast developments in computer technologies and, in parallel, internet technology. Amid these global developments, Turkey, which has a very large young population and a rapidly evolving electronic communications infrastructure, has been affected by this situation. Research has shown that almost all young people in Turkey have an account on a social network. In particular, the spread of mobile devices has increased data traffic on social networks. In this study, students of different age groups in the Department of Computer Technology at the Selcuk University Vocational School of Technical Sciences were surveyed. Students' opinions about the use of the internet and social media were collected. Internet and social media skills, purposes of use, frequency of use, access facilities and tools, social life, and effects on vocational education, among other topics, were explored. By evaluating the findings from various aspects, results on both the positive and negative effects of internet and social media use on the students of this department were obtained, and relationships and differences were identified statistically.
Keywords: computer technologies, internet use, social network, higher vocational school
Procedia PDF Downloads 544
2816 A Convolutional Neural Network-Based Model for Lassa Fever Virus Prediction Using Patient Blood Smear Image
Authors: A. M. John-Otumu, M. M. Rahman, M. C. Onuoha, E. P. Ojonugwa
Abstract:
A Convolutional Neural Network (CNN) model for predicting Lassa fever was built using the Python 3.8.0 programming language, alongside the Keras 2.2.4 and TensorFlow 2.6.1 libraries as the development environment, in order to reduce the current high risk of Lassa fever in West Africa, particularly in Nigeria. The study was prompted by some major flaws in existing conventional laboratory equipment for diagnosing Lassa fever (RT-PCR), as well as flaws in AI-based techniques that have been used for probing and prognosis of Lassa fever reported in the literature. A total of 15,679 blood smear microscopic images were collected. The proposed model was trained on 70% of the dataset and tested on 30% of the microscopic images to avoid overfitting. A 3x3x3 convolution filter was also used in the proposed system to extract features from the microscopic images. The proposed CNN-based model had a recall of 96%, a precision of 93%, an F1 score of 95%, and an accuracy of 94% in predicting and accurately classifying the images into clean or infected samples. Based on empirical evidence from the results and the literature consulted, the proposed model outperformed the other existing AI-based techniques evaluated. If properly deployed, the model will assist physicians, medical laboratory scientists, and patients in making accurate diagnoses for Lassa fever cases, allowing the mortality rate due to the Lassa fever virus to be reduced through sound decision-making.
Keywords: artificial intelligence, ANN, blood smear, CNN, deep learning, Lassa fever
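A short sketch of the evaluation step, computing recall, precision, F1, and accuracy for a binary clean/infected classifier with scikit-learn; the labels below are made-up placeholders, not the study's data:

```python
# Evaluation metrics for a binary "clean vs. infected" classifier.
from sklearn.metrics import (accuracy_score, f1_score,
                             precision_score, recall_score)

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # 1 = infected, 0 = clean (placeholders)
y_pred = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]   # model predictions (placeholders)

print("recall:   ", recall_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("F1:       ", f1_score(y_true, y_pred))
print("accuracy: ", accuracy_score(y_true, y_pred))
```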
Procedia PDF Downloads 121
2815 Tomato-Weed Classification by RetinaNet One-Step Neural Network
Authors: Dionisio Andujar, Juan López-Correa, Hugo Moreno, Angela Ri
Abstract:
The increased number of weeds in tomato crops greatly lowers yields. Weed identification with the aid of machine learning is important for carrying out site-specific control. The latest advances in computer vision are a powerful tool to face the problem. The analysis of RGB (Red, Green, Blue) images through artificial neural networks has developed rapidly in the past few years, providing new methods for weed classification. The development of algorithms for crop and weed species classification aims at a real-time classification system using object detection algorithms based on convolutional neural networks. The study site was located in commercial corn fields. The classification system has been tested. The procedure can detect and classify weed seedlings in tomato fields. The input to the neural network was a set of 10,000 RGB images with a natural infestation of Cyperus rotundus L., Echinochloa crus-galli L., Setaria italica L., Portulaca oleracea L., and Solanum nigrum L. The validation process was done with a random selection of RGB images containing the aforementioned species. The mean average precision (mAP) was established as the metric for object detection. The results showed agreements higher than 95%. The system will provide the input for an online spraying system. Thus, this work plays an important role in site-specific weed management by reducing herbicide use in a single step.
Keywords: deep learning, object detection, CNN, tomato, weeds
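For illustration only, the sketch below runs a COCO-pretrained RetinaNet from torchvision on a placeholder image tensor; adapting it to the tomato/weed classes would require retraining on the annotated RGB images, so this is not the paper's model:

```python
# Inference with an off-the-shelf, COCO-pretrained RetinaNet (torchvision >= 0.13).
import torch
import torchvision

model = torchvision.models.detection.retinanet_resnet50_fpn(weights="DEFAULT")
model.eval()

img = torch.rand(3, 480, 640)  # placeholder RGB image tensor, values in [0, 1]
with torch.no_grad():
    preds = model([img])[0]    # dict with "boxes", "labels", "scores"

print(preds["boxes"].shape, preds["labels"][:5], preds["scores"][:5])
```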
Procedia PDF Downloads 106
2814 Fake News Detection for Korean News Using Machine Learning Techniques
Authors: Tae-Uk Yun, Pullip Chung, Kee-Young Kwahk, Hyunchul Ahn
Abstract:
Fake news is defined as news articles that are intentionally and verifiably false and could mislead readers. The spread of fake news may provoke anxiety, chaos, fear, or irrational decisions among the public. Thus, detecting fake news and preventing its spread has become a very important issue in our society. However, due to the huge amount of fake news produced every day, it is almost impossible to identify it by a human. In this context, researchers have tried to develop automated fake news detection using machine learning techniques over the past years. However, to the best of our knowledge, no prior study has proposed an automated fake news detection method for Korean news. In this study, we aim to detect Korean fake news using text mining and machine learning techniques. Our proposed method consists of two steps. In the first step, the news content to be analyzed is converted to quantified values using various text mining techniques (topic modeling, TF-IDF, and so on). After that, in step 2, classifiers are trained using the values produced in step 1. As the classifiers, machine learning techniques such as logistic regression, backpropagation networks, support vector machines, and deep neural networks can be applied. To validate the effectiveness of the proposed method, we collected about 200 short Korean news articles from Seoul National University's FactCheck, which provides detailed analysis reports from 20 media outlets and links to source documents for each case. Using this dataset, we will identify which text features are important as well as which classifiers are effective in detecting Korean fake news.
Keywords: fake news detection, Korean news, machine learning, text mining
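A minimal scikit-learn sketch of step 1 (TF-IDF quantification) and step 2 (classifier training); the toy Korean sentences and labels are placeholders, not the FactCheck dataset:

```python
# TF-IDF vectorization followed by a logistic regression classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["백신이 모든 질병을 치료한다",      # assumed fake
         "정부가 새 예산안을 발표했다",      # assumed real
         "외계인이 서울에 착륙했다",         # assumed fake
         "한국은행이 기준금리를 동결했다"]   # assumed real
labels = [1, 0, 1, 0]                        # 1 = fake, 0 = real

# Character n-grams avoid the need for a Korean tokenizer in this toy setup.
clf = make_pipeline(TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
                    LogisticRegression(max_iter=1000))
clf.fit(texts, labels)
print(clf.predict(["중앙은행이 금리를 발표했다"]))
```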
Procedia PDF Downloads 276
2813 Participatory Planning of the III Young Sea Meeting: An Experience of the Young Albatroz Collective
Authors: Victor V. Ribeiro, Thais C. Lopes, Rafael A. A. Monteiro
Abstract:
The Albatroz, Baleia Jubarte, Coral Vivo, Golfinho Rotador and Tamar projects make up the Young Sea Network (YSN), part of the BIOMAR Network, which aims to integrate the environmental youths of the Brazilian coast. To this end, three editions of the Young Sea Meeting (YSM) have been held. Seeking to stimulate belonging, self-knowledge, participation, autonomy and youth protagonism, the Albatroz Project hosted the III YSM in Bertioga (SP) in April 2019 and aimed to plan the meeting collectively. Five pillars of Environmental Education were used (identity, community, dialogue, power to act and happiness), together with the OCA Method and the principles Young Educates Young, Young Chooses Young, and One Generation Learns from the Other. In December 2018, still during the II YSM, the participatory planning of the III YSM began. Two "representatives" of each group were voluntarily elected to facilitate joint decisions and to propose, receive and communicate demands from their groups and coordinators. The Young Albatroz Collective (YAC) facilitated the organization process as a whole. The purpose of the meeting was collectively constructed, answering the following question: "What is the YSM for?". Only two of the five pairs of representatives responded. There was difficulty gathering the young people in each group, because it was the end of the year, with people traveling. Thus, due to the short planning time, the YAC built a preliminary programme to be validated by the other groups, defining the strengthening of youth protagonism within the YSN as the objective of the meeting. In the planning process, the YAC held 20 meetings, with 60 hours of face-to-face work over three months, and made two technical visits to the venue of the III YSM. The participatory dynamics of consultation, when they occurred, required up to two weeks, evidencing the limits of participation. The project coordinators stated that they were not being included in the process by their young people. More work is needed to enable participation, developing skills and understanding of its principles. This training must take place in an articulated way across the network, implying the important role of the five projects in jointly developing and implementing educational processes with this objective on a national scale, but without forgetting the specificities of each youth group. Finally, it is worth highlighting the great potential of the III YSM in stimulating environmental youth leadership among more than 50 young people from the Brazilian coast linked to the YSN, fostering their learning and mobilization in favor of coastal and marine conservation.
Keywords: marine conservation, environmental education, youth, participation, planning
Procedia PDF Downloads 168
2812 A Practice of Zero Trust Architecture in Financial Transactions
Authors: Liwen Wang, Yuting Chen, Tong Wu, Shaolei Hu
Abstract:
In order to enhance the security of critical financial infrastructure, this study transforms the architecture of a financial trading terminal into a zero trust architecture (ZTA), constructs an active defense system for cybersecurity, improves the security level of trading services in the Internet environment, enhances the ability to prevent network attacks and unknown risks, and reduces the industry and security risks brought about by cyber threats. This study introduces the software-defined perimeter (SDP) technology of ZTA and adapts and applies it to a financial trading terminal to achieve security optimization and fine-grained business grading control. The upgraded architecture of the trading terminal moves security protection forward to the user access layer, replaces VPN to optimize remote access, and significantly improves the security protection capability of Internet transactions. The study achieves (1) deep integration with the access control architecture of the transaction system; (2) no impact on the performance of terminals and gateways, and no perception of application system upgrades; (3) customized checklist and policy configuration; and (4) the introduction of industry-leading security technologies such as single-packet authorization (SPA) and secondary authentication. This study presents a successful application of ZTA in the field of financial trading and provides transformation ideas for other similar systems, while improving the security level of financial transaction services in the Internet environment.
Keywords: zero trust, trading terminal, architecture, network security, cybersecurity
Procedia PDF Downloads 171
2811 A Fuzzy Multi-Criteria Model for Sustainable Development of Community-Based Tourism through the Homestay Program in Malaysia
Authors: Azizah Ismail, Zainab Khalifah, Abbas Mardani
Abstract:
Sustainable community-based tourism through the homestay programme is a growing niche market that has impacted destinations in many countries, including Malaysia. With demand predicted to continue increasing, the importance of the homestay product will grow in the tourism industry. This research examines the sustainability criteria for the homestay programme in Malaysia, covering economic, socio-cultural and environmental dimensions. A two-stage methodology is applied for data analysis. Specifically, a hybrid method is implemented that combines two multi-criteria decision-making approaches: in the first stage, the Decision Making Trial and Evaluation Laboratory (DEMATEL) technique is applied; then, the Analytic Network Process (ANP) is employed to achieve the objective of the research. After factor identification and problem formulation, DEMATEL is used to detect complex relationships and to build a Network Relation Map (NRM). Then, ANP is used to prioritize and find the weights of the criteria and sub-criteria of the decision model. The research verifies the multi-criteria framework for sustainable community-based tourism from the perspective of stakeholders. The results also provide a different perspective on the importance of sustainability criteria from the view of multiple stakeholders. Practically, this research provides the framework model and helps stakeholders to improve and innovate the homestay programme and promote community-based tourism.
Keywords: community-based tourism, homestay programme, sustainable tourism criteria, sustainable tourism development
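An illustrative numpy sketch of the DEMATEL stage, turning an assumed direct-influence matrix into the total relation matrix T = N(I - N)^(-1), from which the prominence and relation values used to draw the network relation map are derived; the 3x3 matrix is a made-up example, not the study's expert data:

```python
# DEMATEL: normalise the direct-influence matrix, compute the total relation
# matrix, then derive prominence (D+R) and relation (D-R) per criterion.
import numpy as np

D = np.array([[0.0, 3.0, 2.0],   # pairwise direct influence scores (0-4 scale)
              [1.0, 0.0, 3.0],
              [2.0, 1.0, 0.0]])

# Normalise by the largest row/column sum.
s = max(D.sum(axis=1).max(), D.sum(axis=0).max())
N = D / s

# Total relation matrix T = N (I - N)^(-1).
T = N @ np.linalg.inv(np.eye(len(D)) - N)

D_row = T.sum(axis=1)   # influence given by each criterion
R_col = T.sum(axis=0)   # influence received by each criterion
print("prominence (D+R):", D_row + R_col)
print("relation   (D-R):", D_row - R_col)
```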
Procedia PDF Downloads 134
2810 Geographic Information Systems and a Breath of Opportunities for Supply Chain Management: Results from a Systematic Literature Review
Authors: Anastasia Tsakiridi
Abstract:
Geographic information systems (GIS) have been utilized in numerous spatial problems, such as site research, land suitability, and demographic analysis. Besides, GIS has been applied in scientific fields like geography, health, and economics. In business studies, GIS has been used to provide insights and spatial perspectives on demographic trends, spending indicators, and network analysis. To date, the information regarding the available usages of GIS in supply chain management (SCM) and how these analyses can benefit businesses is limited. A systematic literature review (SLR) of the last 5 years of peer-reviewed academic literature was conducted, aiming to explore the existing usages of GIS in SCM. The searches were performed in 3 databases (Web of Science, ProQuest, and Business Source Premier) and reported using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology. The analysis resulted in 79 papers. The results indicate that the existing GIS applications used in SCM were in the following domains: a) network/transportation analysis (in 53 of the papers), b) location-allocation site search/selection (multiple-criteria decision analysis) (in 45 papers), c) spatial analysis (demographic or physical) (in 34 papers), d) combination of GIS and supply chain/network optimization tools (in 32 papers), and e) visualization/monitoring or building information modeling applications (in 8 papers). An additional categorization of the literature was conducted by examining the usage of GIS in the supply chain (SC) by business sector, as indicated by the volume of papers. The results showed that GIS is mainly being applied in the SC of the biomass biofuel/wood industry (33 papers). Other industries currently utilizing GIS in their SC were the logistics industry (22 papers), the humanitarian/emergency/health care sector (10 papers), the food/agro-industry sector (5 papers), the petroleum/coal/shale gas sector (3 papers), the faecal sludge sector (2 papers), the recycling and product footprint industry (2 papers), and the construction sector (2 papers). The results were also presented by the geography of the included studies and the GIS software used, to provide critical business insights and suggestions for future research. The results showed that research case studies of GIS in SCM were conducted in 26 countries (mainly in the USA) and that the most prominent GIS software provider was the Environmental Systems Research Institute's ArcGIS (in 51 of the papers). This study is a systematic literature review of the usage of GIS in SCM. The results showed that GIS capabilities can offer substantial benefits in SCM decision-making by providing key insights for cost minimization, supplier selection, facility location, SC network configuration, and asset management. However, as presented in the results, only eight industries/sectors are currently using GIS in their SCM activities. These findings may offer essential tools to SC managers who seek to optimize SC activities and/or minimize logistics costs, and to consultants and business owners who want to make strategic SC decisions. Furthermore, the findings may be of interest to researchers aiming to investigate unexplored research areas where GIS may improve SCM.
Keywords: supply chain management, logistics, systematic literature review, GIS
Procedia PDF Downloads 143
2809 Synthesis and Characterisation of Starch-PVP as Encapsulation Material for Drug Delivery System
Authors: Nungki Rositaningsih, Emil Budianto
Abstract:
Starch has been widely used as an encapsulation material for drug delivery systems. However, starch hydrogel is very easily degraded during metabolism in the human stomach. Modification of this material is needed to improve the encapsulation process in drug delivery systems, especially for gastrointestinal drugs. In this research, three modified starch-based hydrogels are synthesized, i.e., crosslinked starch hydrogel and semi- and full-Interpenetrating Polymer Network (IPN) starch hydrogels using poly(N-vinyl-pyrrolidone). A non-modified starch hydrogel was also synthesized as a control. All of these samples were compared as biomaterials, as floating drug delivery systems, and in terms of their drug loading ability. Biomaterial characterization comprised a swelling test, stereomicroscopy observation, Differential Scanning Calorimetry (DSC), and Fourier Transform Infrared Spectroscopy (FTIR). A buoyancy test and stereomicroscopy scanning were performed for floating drug delivery characterization. Lastly, amoxicillin was used as the test drug and characterized with UV-Vis spectroscopy for drug loading observation. Preliminary observation showed that the full-IPN has the densest and most elastic texture, followed by the semi-IPN and the crosslinked hydrogel, with the non-modified hydrogel in the last position. The semi-IPN and crosslinked starch hydrogels have the most suitable properties and will not be degraded easily during metabolism. Therefore, both hydrogels could be considered as promising candidates for encapsulation material. Further analysis and issues will be discussed in the paper.
Keywords: biomaterial, drug delivery system, interpenetrating polymer network, poly(N-vinyl-pyrrolidone), starch hydrogel
Procedia PDF Downloads 252
2808 Incentive-Based Motivation to Network with Coworkers: Strengthening Professional Networks via Online Social Networks
Authors: Jung Lee
Abstract:
The last decade has witnessed more people than ever before using social media and broadening their social circles. Social media users connect not only with their friends but also with professional acquaintances, primarily coworkers and clients; personal and professional social circles are mixed within the same social media platform. Considering the positive aspect of social media in facilitating communication and mutual understanding between individuals, we infer that social media interactions with coworkers could indeed benefit one's professional life. However, given privacy issues, sharing all personal details with one's coworkers is not necessarily the best practice. Should one connect with coworkers via social media? Will social media connections with coworkers eventually benefit one's long-term career? Will the benefit differ across cultures? To answer these questions, this study examines how social media can contribute to organizational communication by tracing the foundation of user motivation based on social capital theory, leader-member exchange (LMX) theory, and the expectancy theory of motivation. Although social media was originally designed for personal communication, users have shown intentions to extend social media use to professional communication, especially when a proper incentive is expected. To articulate user motivation and the mechanism of the incentive expectation scheme, this study applies those three theories and identifies six antecedents and three moderators of social media use motivation, including social network flaunt, shared interest, and perceived social inclusion. It also hypothesizes that the moderating effects of those constructs would differ significantly based on the relationship hierarchy among the workers. To validate the model, this study conducted a survey of 329 active social media users with acceptable levels of job experience. The analysis results confirm the specific roles of the three moderators in social media adoption for organizational communication. The present study contributes to the literature by developing a theoretical model of ambivalent employee perceptions about establishing social media connections with coworkers. This framework shows not only how both positive and negative expectations of social media connections with coworkers are formed based on the expectancy theory of motivation, but also how such expectations lead to behavioral intentions using the career success model. It also enhances understanding of how various relationships among employees can be influenced through social media use and how such usage can potentially affect both performance and careers. Finally, it shows how cultural factors induced by social media use can influence relations among coworkers.
Keywords: social network, workplace, social capital, motivation
Procedia PDF Downloads 124
2807 Deep Learning Framework for Predicting Bus Travel Times with Multiple Bus Routes: A Single-Step Multi-Station Forecasting Approach
Authors: Muhammad Ahnaf Zahin, Yaw Adu-Gyamfi
Abstract:
Bus transit is a crucial component of transportation networks, especially in urban areas. Any intelligent transportation system must have accurate real-time information on bus travel times, since it minimizes waiting times for passengers at different stations along a route, improves service reliability, and significantly optimizes travel patterns. Bus agencies must enhance the quality of their information service to serve their passengers better and draw in more travelers, since people waiting at bus stops are frequently anxious about when the bus will arrive at their starting point and when it will reach their destination. To address this issue, different models have recently been developed for predicting bus travel times, but most of them focus on smaller road networks due to their relatively subpar performance on vast networks in high-density urban areas. This paper develops a deep learning-based architecture using a single-step multi-station forecasting approach to predict average bus travel times for numerous routes, stops, and trips on a large-scale network using heterogeneous bus transit data collected from the GTFS database. Over one week, data were gathered from multiple bus routes in Saint Louis, Missouri. In this study, a Gated Recurrent Unit (GRU) neural network was used to predict the mean vehicle travel times for different hours of the day for multiple stations along multiple routes. The historical time steps and prediction horizon were set to 5 and 1, respectively, which means that five hours of historical average travel time data were used to predict the average travel time for the following hour. Spatial and temporal information and the historical average travel times were captured from the dataset as model input parameters. The station distances and sequence numbers were used as adjacency matrices for the spatial inputs, and the time of day (hour) was considered for the temporal inputs. Other inputs, including volatility information such as the standard deviation and variance of journey durations, were also included in the model to make it more robust. The model's performance was evaluated based on the mean absolute percentage error (MAPE). The observed prediction errors for various routes, trips, and stations remained consistent throughout the day. The results showed that the developed model could predict travel times more accurately during peak traffic hours, with a MAPE of around 14%, and performed less accurately during the latter part of the day. In the context of a complicated transportation network in high-density urban areas, the model showed its applicability for real-time travel time prediction for public transportation and ensured high-quality predictions.
Keywords: gated recurrent unit, mean absolute percentage error, single-step forecasting, travel time prediction
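A minimal Keras sketch of single-step forecasting with a GRU, where five historical hourly averages predict the next hour and MAPE is reported; the shapes, layer sizes, and random data are assumptions, not the paper's configuration:

```python
# Single-step GRU forecaster: 5 past hourly averages -> next-hour average.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

n_samples, history, n_features = 256, 5, 1   # 5 past hours, 1 feature (avg time)
rng = np.random.default_rng(0)
X = rng.random((n_samples, history, n_features)).astype("float32")
y = X.mean(axis=1)                            # toy target: next-hour average

model = models.Sequential([
    layers.Input(shape=(history, n_features)),
    layers.GRU(32),
    layers.Dense(1),                           # single-step forecast
])
model.compile(optimizer="adam", loss="mae",
              metrics=[tf.keras.metrics.MeanAbsolutePercentageError()])
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))         # [MAE, MAPE]
```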
Procedia PDF Downloads 73
2806 A Study of the Planning and Designing of the Built Environment under the Green Transit-Oriented Development
Authors: Wann-Ming Wey
Abstract:
In recent years, the problems of global climate change and natural disasters have drawn public concern and attention to environmental sustainability issues. Aside from environmental planning efforts for the human environment, Transit-Oriented Development (TOD) has been widely regarded as one of the future solutions for sustainable city development. In order to be more consistent with sustainable urban development, built environment planning based on the concept of green TOD, which combines TOD and Green Urbanism, is adopted here. Urban development under green TOD encompasses design oriented towards environmental protection, maximizing resource and energy-use efficiency, using technology to construct green buildings and protected areas, and linking natural ecosystems and communities. Green TOD not only provides a solution to urban traffic problems but also directs more sustainable and greener considerations for future urban development planning and design. In this study, we use both the TOD and Green Urbanism concepts to study the planning and design of the built environment. The Fuzzy Delphi Technique (FDT) is utilized to screen suitable criteria for green TOD. Furthermore, the Fuzzy Analytic Network Process (FANP) and Quality Function Deployment (QFD) are then applied to evaluate the criteria and prioritize the alternatives. The study results can be regarded as future guidelines for built environment planning and design under green TOD development in Taiwan.
Keywords: green TOD, built environment, fuzzy Delphi technique, quality function deployment, fuzzy analytic network process
Procedia PDF Downloads 385
2805 Optimal Wind Based DG Placement Considering Monthly Changes Modeling in Wind Speed
Authors: Belal Mohamadi Kalesar, Raouf Hasanpour
Abstract:
Proper placement of Distributed Generation (DG) units such as wind turbine generators in distribution systems is still a very challenging issue for obtaining their maximum potential benefits, because inappropriate placement may increase system losses. This paper proposes a Particle Swarm Optimization (PSO) technique for optimal placement of wind-based DG (WDG) in the primary distribution system to reduce energy losses and improve the voltage profile, with four different wind levels modeled over the duration of a year. Each wind turbine is modeled as a DFIG operated at unity power factor, and only one wind turbine tower is considered for installation at each bus of the network. Finally, the proposed method is implemented on the widely used 69-bus power distribution system in the MATLAB software environment under four scenarios (without WDG, and with one, two, and three WDG units). To test the capability of the implemented program, it is assumed that all buses of the standard system can be candidates for WDG installation (a large search space), although the program can also consider a predetermined number of candidate locations for WDG placement to model the financial limitations of a project. The obtained results illustrate that increasing wind speed in some months increases the generated output power, but this can increase or decrease the power loss at some wind levels. The results also show that about 3 MW of WDG capacity is required across different buses; when this capacity is distributed over the whole network (a larger number of WDG units), it yields a better solution from the point of view of power loss and voltage profile.
Keywords: wind turbine, DG placement, wind levels effect, PSO algorithm
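A generic, illustrative PSO sketch for the placement problem; the real objective would come from a power-flow calculation on the 69-bus feeder, so a made-up surrogate loss over (bus, capacity) is used here purely to show the PSO mechanics:

```python
# Generic PSO minimising a placeholder "loss" over (bus index, WDG capacity).
import numpy as np

rng = np.random.default_rng(1)

def loss(x):
    bus, capacity_mw = x
    # Placeholder surrogate for network losses (assumption, not a power flow).
    return (bus - 42.0) ** 2 / 100.0 + (capacity_mw - 3.0) ** 2

n_particles, n_iter, dims = 20, 100, 2
lo, hi = np.array([1.0, 0.0]), np.array([69.0, 5.0])   # bus 1-69, 0-5 MW

pos = rng.uniform(lo, hi, (n_particles, dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([loss(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dims))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best bus ~", round(gbest[0]), "capacity ~", round(gbest[1], 2), "MW")
```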
Procedia PDF Downloads 449
2804 Riverine Urban Heritage: A Basis for Green Infrastructure
Authors: Ioanna H. Lioliou, Despoina D. Zavraka
Abstract:
The radical reformation that Greek urban space has undergone over the last century, due to socio-historical developments, technological development and political-geographic factors, has left its imprint on the urban landscape. While the big cities struggle to regain urban landscape balance, small towns are considered to offer high-quality lifescapes, ensuring sustainable development potential. However, their unplanned urbanization process has led to the loss of significant areas of nature, a lack of essential infrastructure, a chaotic built environment, incompatible land uses and a loss of urban cohesiveness. Natural environment reference points, such as springs, streams, rivers, forests and suburban greenbelts, seem to be detached from urban space, while public, open and green spaces, unequally distributed in the built environment, are no longer able to offer a complete experience of nature in the city. This study focuses on Elassona, a small town on the Greek mainland, and aims to restore spatial coherence between the town's homonymous river and its urban surroundings. The linear aquatic ecosystem is considered a precious greenway, also referred to as a blueway, able to initiate natural penetrations and empower ecosystems. The integration of disconnected natural ecosystems forms the basis of a strategic intervention scheme, where the river becomes the urban integration tool/feature, constituting the main urban corridor and an indispensable part of a wider green network that connects open and green spaces, ensuring the function of all the established networks (transportation, commercial, social) of the town. The proposed intervention introduces a green network highlighting the old stone bridge at the 'entrance' of the river into the town and expanding throughout the town with strategic uses and activities, providing accessibility for all users. The methodology used is based on the collection of design tools used in related urban river-design interventions around the world. The reinstallation/reactivation of the balance between the natural and urban landscape, besides the environmental benefits, contributes decisively to the projection of an urban green identity and the re-enhancement of lifescape quality and social interaction.
Keywords: green network, rehabilitation scheme, urban landscape, urban streams
Procedia PDF Downloads 281
2803 Exploring Time-Series Phosphoproteomic Datasets in the Context of Network Models
Authors: Sandeep Kaur, Jenny Vuong, Marcel Julliard, Sean O'Donoghue
Abstract:
Time-series data are useful for modelling as they enable model evaluation. However, when reconstructing models from phosphoproteomic data, non-exact methods are often utilised, as knowledge regarding the network structure, such as which kinases and phosphatases lead to the observed phosphorylation state, is incomplete. Thus, such reactions are often hypothesised, which gives rise to uncertainty. Here, we propose a framework, implemented via a web-based tool (as an extension to Minardo), which, given time-series phosphoproteomic datasets, can generate κ models. The incompleteness and uncertainty in the generated model and reactions are clearly presented to the user via the visual method. Furthermore, we demonstrate, via a toy EGF signalling model, the use of algorithmic verification to verify κ models. Manually formulated requirements were evaluated with regard to the model, leading to the highlighting of the nodes causing unsatisfiability (i.e., error-causing nodes). We aim to integrate such methods into our web-based tool and demonstrate how the identified erroneous nodes can be presented to the user via the visual method. Thus, in this research, we present a framework to enable a user to explore phosphoproteomic time-series data in the context of models. The observer can visualise which reactions in the model are highly uncertain and which nodes cause incorrect simulation outputs. A tool such as this enables an end-user to determine the empirical analysis to perform in order to reduce uncertainty in the presented model, thus enabling a better understanding of the underlying system.
Keywords: κ-models, model verification, time-series phosphoproteomic datasets, uncertainty and error visualisation
Procedia PDF Downloads 259
2802 Assessing the Environmental Efficiency of China’s Power System: A Spatial Network Data Envelopment Analysis Approach
Authors: Jianli Jiang, Bai-Chen Xie
Abstract:
The climate issue has aroused global concern. Achieving sustainable development is a good path for countries to mitigate environmental and climatic pressures, although there are many difficulties. The first step towards sustainable development is to evaluate the environmental efficiency of the energy industry with proper methods. The power sector is a major source of CO2, SO2, and NOx emissions. Evaluating the environmental efficiency (EE) of power systems is the prerequisite for alleviating the serious energy and environmental situation. Data Envelopment Analysis (DEA) has been widely used in efficiency studies. However, measuring the efficiency of a system (be it a nation, region, sector, or business) is a challenging task. Classic DEA treats the decision-making units (DMUs) as independent, which neglects the interaction between DMUs. Ignoring these inter-regional links may result in a systematic bias in the efficiency analysis; for instance, the renewable power generated in a certain region may benefit adjacent regions, while SO2 and CO2 emissions act in the opposite way. This study proposes a spatial network DEA (SNDEA) with a slack measure that can capture the spatial spillover effects of inputs/outputs among DMUs to measure efficiency. This approach is used to study the EE of China's power system, which consists of generation, transmission, and distribution departments, using a panel dataset from 2014 to 2020. In the empirical example, the energy and patent inputs, the undesirable CO2 output, and the renewable energy (RE) power variables are tested for a significant spatial spillover effect. Compared with the classic network DEA, the SNDEA result shows an obvious difference, as tested by the global Moran's I index. From a dynamic perspective, the EE of the power system experiences a visible surge from 2015 and then a sharp downtrend from 2019, following the same trend as the power transmission department. This phenomenon benefits from the market-oriented reform of the Chinese power grid enacted in 2015. The rapid decline in the environmental efficiency of the transmission department in 2020 was mainly due to the Covid-19 epidemic, which seriously hindered economic development. The EE of the power generation department shows a declining trend overall, which is reasonable when RE power is taken into consideration. The installed capacity of RE power in 2020 is 4.40 times that in 2014, while the power generation is 3.97 times; in other words, the power generation per unit of installed capacity shrank. In addition, the consumption cost of renewable power increases rapidly with the increase in RE power generation. These two aspects make the EE of the power generation department show a declining trend. By incorporating the interactions among inputs/outputs into the DEA model, this paper proposes an efficiency evaluation method on the basis of the DEA framework, which sheds some light on efficiency evaluation in regional studies. Furthermore, the SNDEA model and the spatial DEA concept can be extended to other fields, such as industry, country, and so on.
Keywords: spatial network DEA, environmental efficiency, sustainable development, power system
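An illustrative numpy sketch of the global Moran's I statistic used above to test spatial autocorrelation, with a made-up four-region neighbourhood matrix and efficiency scores:

```python
# Global Moran's I for spatial autocorrelation of efficiency scores.
import numpy as np

x = np.array([0.82, 0.78, 0.65, 0.60])          # efficiency scores per region
W = np.array([[0, 1, 0, 0],                      # 1 = regions are neighbours
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W = W / W.sum(axis=1, keepdims=True)             # row-standardise weights

n = len(x)
z = x - x.mean()
moran_i = (n / W.sum()) * (z @ W @ z) / (z @ z)
print("global Moran's I:", round(moran_i, 3))
```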
Procedia PDF Downloads 111
2801 Convolutional Neural Networks versus Radiomic Analysis for Classification of Breast Mammogram
Authors: Mehwish Asghar
Abstract:
Breast cancer (BC) is a common type of cancer among women. Screening is usually performed using different imaging modalities such as magnetic resonance imaging, mammography, X-ray, and CT. Among these modalities, the mammogram is considered a powerful tool for the diagnosis and screening of breast cancer. Sophisticated machine learning approaches have shown promising results in complementing human diagnosis. Generally, these methods can be divided into two major classes: radiomics analysis (RA), in which image features are extracted manually, and convolutional neural networks (CNN), in which the computer learns to recognize image features on its own. This research aims to improve rates of early detection, and thus reduce the mortality caused by breast cancer, through the latest advancements in computer science in general and machine learning in particular. It also aims to ease the burden on doctors by improving and automating the process of breast cancer detection. The work is a comparative analysis of different techniques and models for detecting and classifying breast cancer, with the main goal of providing a detailed view of the results and performance of the different techniques. Specifically, the paper explores the potential of a convolutional neural network both as a feature extractor and as a classifier, and adds a radiomics module so that its results can be compared with those of the deep learning techniques. Keywords: breast cancer (BC), machine learning (ML), convolutional neural network (CNN), radiomics, magnetic resonance imaging, artificial intelligence
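As a minimal sketch of the CNN-as-classifier side of the comparison (assuming a PyTorch implementation on single-channel 224×224 mammogram patches; the architecture, layer sizes, and names are illustrative and not the network evaluated in the paper):

```python
import torch
import torch.nn as nn

class SmallMammogramCNN(nn.Module):
    """A deliberately small CNN for binary (benign/malignant) patch classification."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, 64), nn.ReLU(),  # 224 -> 112 -> 56 after two pools
            nn.Linear(64, 2),                         # two classes: benign / malignant
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# A single synthetic 224x224 grayscale patch, batch size 1
dummy = torch.randn(1, 1, 224, 224)
logits = SmallMammogramCNN()(dummy)
print(logits.shape)  # torch.Size([1, 2])
```

In a radiomics pipeline, by contrast, hand-crafted texture and shape features would be extracted from the same patches and passed to a conventional classifier, which is the alternative the paper compares against.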
Procedia PDF Downloads 228
2800 The Morphogenesis of an Informal Settlement: An Examination of Street Networks through the Informal Development Stages Framework
Authors: Judith Margaret Tymon
Abstract:
As cities struggle to incorporate informal settlements into the fabric of urban areas, the focus has often been on the provision of housing. This study explores the underlying structure of street networks, with the goal of understanding the morphogenesis of informal settlements through the lens of the access network. As the stages of development progress from infill to consolidation and, eventually, to a planned in-situ settlement, the access networks retain the form of the core segments; however, most street patterns are adapted to a grid design to support infrastructure in the final upgraded phase. A case study examines the street network in the informal settlement of Gobabis, Namibia, as it progresses from its initial stages to a planned, in-situ, and permanently upgraded development. The Informal Development Stages framework of foundation, infill, and consolidation, developed by Dr. Jota Samper, is used to examine the evolution of street networks. Data are gathered from historical Google Earth satellite images for the period between 2003 and 2022. The results demonstrate that during the foundation and infill stages, incremental changes follow similar patterns, with pathways extended, lengthened, and densified as housing is created and the settlement grows. In the final stage of consolidation, the resulting street layout is transformed to support the installation of infrastructure; however, some elements of the original street patterns remain. The core pathways stay intact to accommodate the installation of infrastructure and the creation of housing plots, defining the shape of the settlement and providing the basis of its urban form. The adaptations, growth, and consolidation of the street network are critical to the eventual spatial layout of the settlement. The study includes a comparative analysis of its findings with those of recent research by Kamalipour, Dovey, and others on incremental urbanism within informal settlements. Further comparisons include studies of street networks in well-established urban centers that have shown links between the morphogenesis of access networks and the eventual spatial layout of the city. The findings can be used to guide and inform strategies for in-situ upgrading and can contribute to the sustainable development of informal settlements. Keywords: Gobabis Namibia, incremental urbanism, informal development stages, informal settlements, street networks
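As a purely hypothetical sketch of how access networks digitised from successive satellite snapshots could be compared quantitatively (these indicators and toy edge lists are assumptions for illustration, not the measures used in the study), the following Python snippet summarises basic connectivity statistics for an infill-stage and a consolidated-stage network:

```python
import networkx as nx

def network_summary(edges):
    """Basic connectivity indicators for an access network given as an edge list."""
    g = nx.Graph(edges)
    intersections = [n for n in g if g.degree(n) >= 3]
    return {
        "segments": g.number_of_edges(),
        "nodes": g.number_of_nodes(),
        "intersections": len(intersections),
        "avg_degree": round(2 * g.number_of_edges() / g.number_of_nodes(), 2),
    }

# Hypothetical access networks digitised from two satellite snapshots
infill_stage = [("a", "b"), ("b", "c"), ("b", "d"), ("d", "e")]          # tree-like paths
consolidated = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a"),          # grid-like block
                ("b", "d"), ("d", "e")]

print(network_summary(infill_stage))
print(network_summary(consolidated))
```

Tracking such indicators across the foundation, infill, and consolidation stages would show how core pathways persist while connectivity increases as the layout is adapted to a grid.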
Procedia PDF Downloads 66
2799 A Collective Intelligence Approach to Safe Artificial General Intelligence
Authors: Craig A. Kaplan
Abstract:
If AGI proves to be a “winner-take-all” scenario where the first company or country to develop AGI dominates, then the first AGI must also be the safest. The safest, and fastest, path to Artificial General Intelligence (AGI) may be to harness the collective intelligence of multiple AI and human agents in an AGI network. This approach has roots in seminal ideas from four of the scientists who founded the field of Artificial Intelligence: Allen Newell, Marvin Minsky, Claude Shannon, and Herbert Simon. Extrapolating key insights from these founders of AI, and combining them with the work of modern researchers, results in a fast and safe path to AGI. The seminal ideas discussed are: 1) Society of Mind (Minsky), 2) Information Theory (Shannon), 3) Problem Solving Theory (Newell & Simon), and 4) Bounded Rationality (Simon). Society of Mind describes a collective intelligence approach that can be used with AI and human agents to create an AGI network. Information theory helps address the critical issue of how an AGI system will increase its intelligence over time. Problem Solving Theory provides a universal framework that AI and human agents can use to communicate efficiently, effectively, and safely. Bounded Rationality helps us better understand not only the capabilities of SuperIntelligent AGI but also how humans can remain relevant in a world where the intelligence of AGI vastly exceeds that of its human creators. Each key idea can be combined with recent work in the fields of Artificial Intelligence, Machine Learning, and Large Language Models to accelerate the development of a working, safe, AGI system. Keywords: AI Agents, Collective Intelligence, Minsky, Newell, Shannon, Simon, AGI, AGI Safety
Procedia PDF Downloads 93