Search results for: recurrent artificial neural network
4334 Location Choice: The Effects of Network Configuration upon the Distribution of Economic Activities in the Chinese City of Nanning
Authors: Chuan Yang, Jing Bie, Zhong Wang, Panagiotis Psimoulis
Abstract:
Contemporary studies investigating the association between the spatial configuration of the urban network and economic activities at the street level have mostly been conducted within the space syntax conceptual framework. These findings support the theory of the 'movement economy' and demonstrate the impact of street configuration on the distribution of pedestrian movement and land-use shaping, especially retail activities. However, the effects vary between different urban contexts. In this paper, the relationship between economic activity distribution and urban configurational characteristics was examined at the segment level. The study area comprised three neighbourhood types (urban, suburban, and rural), and across all neighbourhoods three kinds of urban network form were recognised: 'tree-like', grid, and organic. To investigate the nested effects of urban configuration, measured by the space syntax approach, and urban context, multilevel zero-inflated negative binomial (ZINB) regression models were constructed. Additionally, to account for spatial autocorrelation, a spatial lag was included in the model as an independent variable. The random-effect ZINB model shows superiority over the plain ZINB model and the multilevel linear (ML) model in explaining how the pattern of economic activities takes shape over the urban environment. After adjusting for neighbourhood type and network form effects, connectivity and syntactic centrality significantly affect the clustering of economic activities. A comparison between accumulated and newly established economic activities illustrates their different preferences in economic activity location choice.
Keywords: space syntax, economic activities, multilevel model, Chinese city
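The zero-inflated negative binomial model used above mixes a point mass at zero (segments that structurally host no activity) with a negative binomial count process. A minimal sketch of the resulting probability mass function; the parameters (zero-inflation share pi, NB size r, success probability p) are hypothetical and not taken from the study:

```python
from math import comb

def nb_pmf(k, r, p):
    """Negative binomial pmf: probability of k failures before the r-th success."""
    return comb(k + r - 1, k) * (p ** r) * ((1 - p) ** k)

def zinb_pmf(k, pi, r, p):
    """Zero-inflated NB: extra probability mass pi at zero, else NB(r, p)."""
    base = (1 - pi) * nb_pmf(k, r, p)
    return pi + base if k == 0 else base

# Hypothetical parameters: 30% structural zeros, NB with r=2, p=0.5.
probs = [zinb_pmf(k, pi=0.3, r=2, p=0.5) for k in range(200)]
print(round(sum(probs), 6))  # → 1.0 (pmf sums to one)
```

A full multilevel fit would add neighbourhood-level random effects on top of this pmf; the sketch only shows the zero-inflation mechanism itself.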
Procedia PDF Downloads 125
4333 A Comparative Soft Computing Approach to Supplier Performance Prediction Using GEP and ANN Models: An Automotive Case Study
Authors: Seyed Esmail Seyedi Bariran, Khairul Salleh Mohamed Sahari
Abstract:
In multi-echelon supply chain networks, optimal supplier selection depends significantly on the accuracy of supplier performance prediction. Various multi-criteria decision-making methods such as ANN, GA, fuzzy logic, and AHP have previously been used to predict supplier performance, but the "black-box" character of these methods remains a major concern to be resolved. The primary objective of this paper is therefore to implement an artificial intelligence-based gene expression programming (GEP) model and compare its prediction accuracy with that of an ANN. A full factorial design with a 95% confidence interval is first applied to determine an appropriate set of criteria for supplier performance evaluation. A train-test approach is then applied to the ANN and GEP separately. The training results are used to find the optimal network architecture, and the testing data determine the prediction accuracy of each method in terms of root mean square error (RMSE) and the coefficient of determination (R2). The results of a case study conducted at Supplying Automotive Parts Co. (SAPCO), with more than 100 local and foreign supply chain members, revealed that gene expression programming significantly outperforms the ANN in predicting supplier performance with respect to the RMSE and R-squared values. Moreover, using GEP, a mathematical function was also derived that resolves the black-box issue of the ANN in modelling performance prediction.
Keywords: Supplier Performance Prediction, ANN, GEP, Automotive, SAPCO
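The two accuracy measures used to compare GEP and ANN are simple to compute. A self-contained sketch with hypothetical held-out supplier scores and predictions (the numbers are illustrative, not SAPCO data):

```python
import math

def rmse(actual, pred):
    """Root mean square error between observed and predicted values."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))

def r_squared(actual, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, pred))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

# Hypothetical held-out supplier scores and two models' predictions.
actual = [70.0, 82.0, 65.0, 90.0, 77.0]
gep    = [71.0, 80.0, 66.0, 88.0, 78.0]
ann    = [75.0, 76.0, 70.0, 84.0, 80.0]

print(rmse(actual, gep) < rmse(actual, ann))              # True: GEP closer here
print(r_squared(actual, gep) > r_squared(actual, ann))    # True
```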
Procedia PDF Downloads 421
4332 Automatic Classification of Lung Diseases from CT Images
Authors: Abobaker Mohammed Qasem Farhan, Shangming Yang, Mohammed Al-Nehari
Abstract:
Pneumonia is a kind of lung disease that creates congestion in the chest, and severe congestion can lead to loss of life. Pneumonic lung disease may be caused by viral pneumonia, bacterial pneumonia, or Covid-19-induced pneumonia. Early prediction and classification of such lung diseases help to reduce the mortality rate. In this paper, we propose an automatic Computer-Aided Diagnosis (CAD) system based on deep learning. The proposed CAD system takes as input raw computed tomography (CT) scans of the patient's chest and automatically predicts the disease class. We designed a Hybrid Deep Learning Algorithm (HDLA) to improve accuracy and reduce processing requirements. The raw CT scans are first pre-processed to enhance their quality for further analysis. We then apply a hybrid model that consists of automatic feature extraction and classification. We propose a robust 2D Convolutional Neural Network (CNN) model to extract features automatically from the pre-processed CT image; this CNN model ensures feature learning with highly effective 1D feature extraction for each input CT image. The output of the 2D CNN model is then normalized using the Min-Max technique. The second step of the proposed hybrid model concerns training and classification using different classifiers. Simulation results on a publicly available dataset demonstrate the robustness and efficiency of the proposed model compared to state-of-the-art algorithms.
Keywords: CT scan, Covid-19, deep learning, image processing, lung disease classification
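The Min-Max step mentioned above rescales the CNN's feature vector to a fixed range before it is handed to the downstream classifiers. A minimal sketch; the feature values are hypothetical, not outputs of the paper's network:

```python
def min_max_normalize(features, lo=0.0, hi=1.0):
    """Rescale a 1-D feature vector to the range [lo, hi]."""
    f_min, f_max = min(features), max(features)
    span = f_max - f_min
    if span == 0:                      # constant vector: map everything to lo
        return [lo for _ in features]
    return [lo + (x - f_min) * (hi - lo) / span for x in features]

# Hypothetical 1-D feature vector produced for one CT slice.
feats = [12.0, 3.5, 27.0, 3.5, 19.5]
norm = min_max_normalize(feats)
print(min(norm), max(norm))  # → 0.0 1.0
```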
Procedia PDF Downloads 159
4331 Fundamental Theory of the Evolution Force: Gene Engineering utilizing Synthetic Evolution Artificial Intelligence
Authors: L. K. Davis
Abstract:
The effects of the evolution force are observable in nature at all structural levels, ranging from small molecular systems to enormous biospheric systems. However, the evolution force and the work associated with the formation of biological structures have yet to be described mathematically or theoretically. In addressing this conundrum, we consider evolution from a unique perspective and, in doing so, introduce the “Fundamental Theory of the Evolution Force” (FTEF). We utilized synthetic evolution artificial intelligence (SYN-AI) to identify genomic building blocks and to engineer 14-3-3 ζ docking proteins by transforming gene sequences into time-based DNA codes derived from protein hierarchical structural levels. These codes served as templates for random DNA hybridizations and genetic assembly. The application of hierarchical DNA codes allowed us to fast-forward evolution while dampening the effect of point mutations. Natural selection was performed at each hierarchical structural level, and mutations were screened using Blosum 80 mutation frequency-based algorithms. Notably, SYN-AI engineered a set of three architecturally conserved docking proteins that retained the motion and vibrational dynamics of native Bos taurus 14-3-3 ζ.
Keywords: 14-3-3 docking genes, synthetic protein design, time-based DNA codes, writing DNA code from scratch
Procedia PDF Downloads 117
4330 Optimization of Feeder Bus Routes at Urban Rail Transit Stations Based on Link Growth Probability
Authors: Yu Song, Yuefei Jin
Abstract:
Urban public transportation can be integrated when there are efficient connections to urban rail lines; however, no effective or quick solutions for this connection are currently being investigated. This paper analyzes the space-time distribution and travel demand of passenger connection trips based on taxi trajectory data and road network data, identifies potential feeder bus stations from potential connection demand data, and introduces the link growth probability model from complex network theory to derive the basic feeder bus lines, in order to ascertain the most-connected bus line directions given the demand characteristics. Then, a constraint-based tree-view exhaustive approach grounded in graph theory is proposed, which can accelerate the convergence of results during chained calculations. This study uses WEI QU NAN Station, the terminal station of Xi'an Metro Line 2 in Shaanxi Province, as an illustration to evaluate the efficacy of the model and the solution method. According to the findings, 153 prospective stations were identified in total, the feeder bus network for the entire line was laid out, and the best route adjustment strategy was found.
Keywords: feeder bus, route optimization, link growth probability, graph theory
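Link growth probability models generally assign each candidate node a chance of receiving the next link in proportion to how well connected it already is. A minimal preferential-attachment-style sketch, with hypothetical candidate stations and degrees (the paper's exact growth rule is not reproduced here):

```python
def link_growth_probabilities(degrees):
    """New-link attachment probability for each candidate station,
    proportional to its current degree in the network."""
    total = sum(degrees.values())
    return {node: k / total for node, k in degrees.items()}

# Hypothetical candidate feeder stations with current network degrees.
degrees = {"S1": 4, "S2": 2, "S3": 1, "S4": 1}
probs = link_growth_probabilities(degrees)
print(probs["S1"])  # → 0.5: the best-connected station is the likeliest next link
```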
Procedia PDF Downloads 78
4329 Virtualization and Visualization Based Driver Configuration in Operating System
Authors: Pavan Shah
Abstract:
In embedded systems, virtualization and visualization technology can provide an effective response and measurable work within a software development environment. Virtualization can also provide resource sharing between real-time hardware applications and a healthy environment. Notably, virtualization works to minimize I/O overhead, and virtualization technology can serve either a software development environment (SDE) or a runtime environment for real-time embedded systems (RTMES) or a real-time operating system (RTOS). In this paper, we focus in particular on the virtualization and visualization overheads of network I/O and on the implementation of standardized I/O (i.e., virtio), which can work as a front-end network driver in a real-time operating system (RTOS) hardware module. Although several studies exist on virtualized operating system environments, and virtio implementations exist for general-purpose OSs, our implementation builds on open-source virtio for a real-time operating system (RTOS). The measurement results show that the implementation can improve the bandwidth and latency of memory management in the real-time operating system environment (RTMES).
Keywords: virtualization, visualization, network driver, operating system
Procedia PDF Downloads 135
4328 Enhancement of Capacity in a MC-CDMA based Cognitive Radio Network Using Non-Cooperative Game Model
Authors: Kalyani Kulkarni, Bharat Chaudhari
Abstract:
This paper addresses the issue of resource allocation in emerging cognitive radio technology. Focusing on the quality of service (QoS) of primary users (PUs), a novel method is proposed for the resource allocation of secondary users (SUs). We propose a unique utility function in a game-theoretic model of cognitive radio that can be maximized to increase the capacity of the cognitive radio network (CRN) and to minimize interference. The utility function is formulated to cater to the needs of PUs by observing the signal-to-noise ratio. The existence of a Nash equilibrium for the postulated game is established.
Keywords: cognitive networks, game theory, Nash equilibrium, resource allocation
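The Nash equilibrium concept invoked above can be illustrated on a tiny discrete game. A sketch that finds pure-strategy equilibria of a hypothetical two-secondary-user channel-access game (the payoff numbers and strategies are illustrative, not the paper's utility function):

```python
def pure_nash_equilibria(payoff_a, payoff_b):
    """Find all pure-strategy Nash equilibria of a 2-player bimatrix game:
    cells where each player's choice is a best response to the other's."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    eq = []
    for i in range(rows):
        for j in range(cols):
            best_a = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
            best_b = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(cols))
            if best_a and best_b:
                eq.append((i, j))
    return eq

# Hypothetical 2-SU game: strategy 0 = low transmit power, 1 = high power.
# High power raises own rate but also the interference seen by others.
payoff_a = [[3, 1],
            [4, 2]]   # row player's utility
payoff_b = [[3, 4],
            [1, 2]]   # column player's utility
print(pure_nash_equilibria(payoff_a, payoff_b))  # → [(1, 1)]
```

In this toy game both users transmitting at high power is the unique equilibrium, the usual prisoner's-dilemma flavour of uncoordinated spectrum sharing.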
Procedia PDF Downloads 481
4327 Network Governance and Renewable Energy Transition in Sub-Saharan Africa: Contextual Evidence from Ghana
Authors: Kyere Francis, Sun Dongying, Asante Dennis, Nkrumah Nana Kwame Edmund, Naana Yaa Gyamea Kumah
Abstract:
With a focus on renewable energy to achieve low-carbon transition objectives, there is a greater demand for effective collaborative strategies for planning, strategic decision mechanisms, and long-term policy designs to steer the transition. Government agencies, NGOs, the private sector, and individual citizens play an important role in sustainable energy production. In Ghana, however, such collaboration is fragile in the fight against climate change. This study seeks to re-examine the position and potential of network governance in Ghana's renewable energy transition. The study adopted a qualitative approach and employed semi-structured interviews for data gathering. To explore network governance and low-carbon transitions in Ghana, we examine key themes such as the political environment and its impact, actor cooperation and stakeholder interactions, financing of the transition, market design and renewable energy integration, existing regulation and policy gaps for the renewable energy transition, and clean cooking accessibility and affordability. The findings reveal the following: a lack of comprehensive consultation with relevant stakeholders leads to lower acceptance of the policy model and sometimes a lack of policy awareness, and the unavailability and unaffordability of renewable energy technologies, together with limited access to credit facilities, are significant hurdles to a long-term renewable transition. Ghana's renewable energy transition requires strong networking and interaction among public, private, and non-governmental organizations. The study participants believe that the involvement of relevant energy experts and stakeholders, devoid of any political biases, is instrumental in accelerating the renewable energy transition, as emphasized in the proposed framework. The study recommends that the national renewable energy transition plan be made evident to all stakeholders and political administrators. Such a policy may encourage renewable energy investment through stable, fixed lending rates from financial institutions and build networks with international organizations and corporations. These findings could serve as valuable information for the transition-based energy process, which primarily aims to govern sustainability changes through network governance.
Keywords: actors, development, sustainable energy, network governance, renewable energy transition
Procedia PDF Downloads 91
4326 Analyzing the Impact of DCF and PCF on WLAN Network Standards 802.11a, 802.11b, and 802.11g
Authors: Amandeep Singh Dhaliwal
Abstract:
Networking solutions, particularly wireless local area networks, have revolutionized technological advancement. Wireless Local Area Networks (WLANs) have gained great popularity as they provide location-independent network access between computing devices. A number of access methods are used in wireless networks, among which DCF and PCF are the fundamental access methods. This paper emphasizes the impact of the DCF and PCF access mechanisms on the performance of the IEEE 802.11a, 802.11b, and 802.11g standards. Performance is evaluated between these three standards using the above-mentioned access mechanisms on the basis of various parameters, viz. throughput, delay, and load. The analysis revealed superior throughput performance with low delays for the 802.11g standard as compared to the 802.11a/b standards using both the DCF and PCF access methods.
Keywords: DCF, IEEE, PCF, WLAN
Procedia PDF Downloads 426
4325 Automating Self-Representation in the Caribbean: AI Autoethnography and Cultural Analysis
Authors: Steffon Campbell
Abstract:
This research explores the potential of using artificial intelligence (AI) autoethnographies to study, document, explore, and understand aspects of Caribbean culture. As a digital research methodology, AI autoethnography merges computer science and technology with ethnography, providing a fresh approach to collecting and analyzing data to generate novel insights. This research investigates how AI autoethnography can best be applied to understanding the various complexities and nuances of Caribbean culture, as well as examining how technology can be a valuable tool for enriching the study of the region. By applying AI autoethnography to Caribbean studies, the research aims to produce new and innovative ways of discovering, understanding, and appreciating the Caribbean. The study found that AI autoethnographies offer a valuable method for exploring Caribbean culture. Specifically, AI autoethnographies can facilitate experiences of self-reflection, facilitate reconciliation with the past, and provide a platform to explore and understand the cultural, social, political, and economic concerns of Caribbean people. Findings also reveal that these autoethnographies can create a space for people to reimagine and reframe the conversation around Caribbean culture by enabling them to participate actively in the process of knowledge creation. The study also finds that AI autoethnography offers the potential for cross-cultural dialogue, allowing participants to connect with one another over cultural considerations and engage in meaningful discourse.
Keywords: artificial intelligence, autoethnography, Caribbean, culture
Procedia PDF Downloads 29
4324 Anomaly Detection Based on System Log Data
Authors: M. Kamel, A. Hoayek, M. Batton-Hubert
Abstract:
With the increase of network virtualization and the disparity of vendors, the continuous monitoring and detection of anomalies cannot rely on static rules. An advanced analytical methodology is needed to discriminate between ordinary events and unusual anomalies. In this paper, we focus on log data (textual data), which is a crucial source of information about network performance. We introduce an algorithm, used as a pipeline, to help with the pre-treatment of such data, group it into patterns, and dynamically label each pattern as an anomaly or not. Such tools provide users and experts with continuous, real-time log monitoring capability to detect anomalies and failures in the underlying system that can affect performance. An application to real-world data illustrates the algorithm.
Keywords: logs, anomaly detection, ML, scoring, NLP
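The group-into-patterns-then-label pipeline can be sketched with a simple templating and rarity score. This is a hypothetical, deliberately minimal version (masking only numbers and hex ids, labeling by frequency), not the paper's algorithm:

```python
import re
from collections import Counter

def template(line):
    """Mask numbers and hex ids so similar log lines collapse into one pattern."""
    line = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", line)
    return re.sub(r"\d+", "<NUM>", line)

def label_anomalies(log_lines, threshold=0.2):
    """Label a pattern anomalous if it covers less than `threshold` of all
    lines (a crude rarity score; real scoring would be much richer)."""
    patterns = Counter(template(l) for l in log_lines)
    total = len(log_lines)
    return {p: (c / total) < threshold for p, c in patterns.items()}

logs = ["conn from 10.0.0.1 ok"] * 9 + ["kernel panic at 0xdeadbeef"]
labels = label_anomalies(logs)
print(labels["kernel panic at <HEX>"])  # → True: rare pattern flagged
```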
Procedia PDF Downloads 96
4323 A Global Business Network Built on Hive: Two Use Cases: Buying and Selling of Products and Services and Carrying Out of Social Impact Projects
Authors: Gheyzer Villegas, Edgardo Cedeño, Veruska Mata, Edmundo Chauran
Abstract:
One of the most significant changes in global commerce was the emergence of cryptocurrencies and blockchain technology. There is still much debate about the adoption of economic models based on crypto assets, and myriad international projects and initiatives are being carried out to explore the potential that this new field offers. The Hive blockchain is a prime example, featuring two use cases that show how trade based on its native currencies can yield successful results in the exchange of goods and services and in the financing of social impact projects. Its decentralized management model and visionary administration of its development fund have become key to its success.
Keywords: Hive, business, network, blockchain
Procedia PDF Downloads 70
4322 Multi Tier Data Collection and Estimation, Utilizing Queue Model in Wireless Sensor Networks
Authors: Amirhossein Mohajerzadeh, Abolghasem Mohajerzadeh
Abstract:
In this paper, a target parameter is estimated with desirable precision in hierarchical wireless sensor networks (WSNs), while the proposed algorithm also tries to prolong network lifetime as much as possible by using an efficient data collection algorithm. The distribution function of the target parameter is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data collection algorithm. The FC reconstructs the underlying phenomenon from the collected data. Considering the aggregation level x, the goal is to provide the essential infrastructure to find the best value of the aggregation level in order to prolong network lifetime as much as possible while guaranteeing the desired accuracy (the required sample size depends entirely on the desired precision). First, the sample size calculation algorithm is discussed; second, the average queue length is determined based on an M/M[x]/1/K queue model and used for the energy consumption calculation. Nodes can decrease transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy.
Keywords: aggregation, estimation, queuing, wireless sensor network
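As a point of reference for the queueing step, the mean queue length of the plain M/M/1/K queue has a closed form; the paper's M/M[x]/1/K batch-arrival model reduces to it when the batch size is 1. A sketch with hypothetical arrival/service rates and buffer size:

```python
def mm1k_queue_length(lam, mu, K):
    """Mean number of packets in an M/M/1/K queue (single arrivals).
    State probabilities: P_k = (1 - rho) * rho**k / (1 - rho**(K+1))."""
    rho = lam / mu
    if rho == 1:
        return K / 2                      # all K+1 states equally likely
    probs = [(1 - rho) * rho ** k / (1 - rho ** (K + 1)) for k in range(K + 1)]
    return sum(k * p for k, p in enumerate(probs))

# Hypothetical node: arrival rate 0.5/s, service rate 1.0/s, buffer K = 10.
print(round(mm1k_queue_length(0.5, 1.0, 10), 4))  # → 0.9946
```

Higher load lengthens the queue, and via the queue length the per-node energy estimate; the batch (aggregated-arrival) variant follows the same logic with batch-size-dependent state probabilities.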
Procedia PDF Downloads 187
4321 An Application of Graph Theory to The Electrical Circuit Using Matrix Method
Authors: Samai'la Abdullahi
Abstract:
A graph is a pair of two sets, so that a graph is a pictorial representation of a system using two basic elements, nodes and edges. A node is represented by a circle (either hollow or shaded), and an edge is represented by a line segment connecting two nodes. In this paper, we present a circuit network as an application of graph theory concepts; circuit models are represented as graphs using the logical connection method, where we formulate the matrix method of the adjacency and incidence matrices and apply truth tables.
Keywords: Euler circuit and path, graph representation of circuit networks, representation of graph models, representation of circuit network using logical truth table
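The adjacency and incidence matrices mentioned above can be built directly from a node list and an edge list. A sketch for a hypothetical 3-node circuit loop (source, two resistors), using the ±1 convention for directed incidence:

```python
# Nodes and directed edges of a hypothetical circuit graph: a 3-node loop.
nodes = ["a", "b", "c"]
edges = [("a", "b"), ("b", "c"), ("c", "a")]

n, m = len(nodes), len(edges)
idx = {v: i for i, v in enumerate(nodes)}

adjacency = [[0] * n for _ in range(n)]   # n x n, symmetric (undirected view)
incidence = [[0] * m for _ in range(n)]   # n x m, +1 leaves / -1 enters
for j, (u, v) in enumerate(edges):
    adjacency[idx[u]][idx[v]] = 1
    adjacency[idx[v]][idx[u]] = 1
    incidence[idx[u]][j] = 1              # edge j leaves node u
    incidence[idx[v]][j] = -1             # edge j enters node v

# Each incidence column sums to zero: the bookkeeping behind KCL.
print(all(sum(incidence[i][j] for i in range(n)) == 0 for j in range(m)))  # → True
```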
Procedia PDF Downloads 564
4320 Retrospective Assessment of the Safety and Efficacy of Percutaneous Microwave Ablation in the Management of Hepatic Lesions
Authors: Suang K. Lau, Ismail Goolam, Rafid Al-Asady
Abstract:
Background: The majority of patients with hepatocellular carcinoma (HCC) are not suitable for curative treatment, in the form of surgical resection or transplantation, due to tumour extent and underlying liver dysfunction. In these non-resectable cases, a variety of non-surgical therapies are available, including microwave ablation (MWA), which has shown increasing popularity due to its low morbidity, low reported complication rate, and the ability to perform multiple ablations simultaneously. Objective: The aim of this study was to evaluate the validity of MWA as a viable treatment option in the management of HCC and hepatic metastatic disease, by assessing its efficacy and complication rate at a tertiary hospital situated in Westmead (Australia). Methods: A retrospective observational study was performed on patients who underwent MWA between 1/1/2017 and 31/12/2018 at Westmead Hospital, NSW, Australia. Outcome measures, including residual disease, recurrence rates, and major and minor complication rates, were retrospectively analysed over a 12-month period following MWA treatment. Excluded patients were those whose lesions were treated on the basis of residual or recurrent disease from treatment that occurred prior to the study window (11 patients) and those who were lost to follow-up (2 patients). Results: Following treatment of 106 new hepatic lesions, the complete response (CR) rate was 86% (91/106) at 12-month follow-up. Ten patients had residual disease on post-treatment follow-up imaging, corresponding to an incomplete response (ICR) rate of 9.4% (10/106). The local recurrence rate (LRR) was 4.6% (5/106) over a follow-up period of up to 12 months. The minor complication rate was 9.4% (10/106), including asymptomatic pneumothorax (n=2), asymptomatic pleural effusion (n=2), right lower lobe pneumonia (n=3), pain requiring admission (n=1), hypotension (n=1), cellulitis (n=1), and intraparenchymal hematoma (n=1). One major complication was reported: a pleuro-peritoneal fistula causing recurrent large pleural effusion necessitating repeated thoracocentesis (n=1). There was no statistically significant association between tumour size, location, or ablation factors and the risk of recurrence or residual disease. A subset analysis identified 6 segment VIII lesions that were treated via a trans-pleural approach. This cohort demonstrated an overall complication rate of 33% (2/6), including 1 minor complication of asymptomatic pneumothorax and 1 major complication of pleuro-peritoneal fistula. Conclusions: Microwave ablation therapy is an effective and safe treatment option in cases of non-resectable hepatocellular carcinoma and liver metastases, with good local tumour control and low complication rates. A trans-pleural approach for high segment VIII lesions is associated with a higher complication rate and warrants greater caution.
Keywords: hepatocellular carcinoma, liver metastases, microwave ablation, trans-pleural approach
Procedia PDF Downloads 139
4319 Nighttime Dehaze - Enhancement
Authors: Harshan Baskar, Anirudh S. Chakravarthy, Prateek Garg, Divyam Goel, Abhijith S. Raj, Kshitij Kumar, Lakshya, Ravichandra Parvatham, V. Sushant, Bijay Kumar Rout
Abstract:
In this paper, we introduce a new computer vision task called nighttime dehaze-enhancement. This task aims to jointly perform dehazing and lightness enhancement. Our task fundamentally differs from nighttime dehazing – our goal is to jointly dehaze and enhance scenes, while nighttime dehazing aims only to dehaze scenes under a nighttime setting. In order to facilitate further research on this task, we release a new benchmark dataset called the Reside-β Night dataset, consisting of 4122 nighttime hazed images from 2061 scenes and 2061 ground truth images. Moreover, we also propose a new network called NDENet (Nighttime Dehaze-Enhancement Network), which jointly performs dehazing and low-light enhancement in an end-to-end manner. We evaluate our method on the proposed benchmark and achieve an SSIM of 0.8962 and a PSNR of 26.25. We also compare our network with other baseline networks on our benchmark to demonstrate the effectiveness of our approach. We believe that nighttime dehaze-enhancement is an essential task, particularly for autonomous navigation applications, and we hope that our work will open up new frontiers in research. Our dataset and code will be made publicly available upon acceptance of our paper.
Keywords: dehazing, image enhancement, nighttime, computer vision
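PSNR, one of the two metrics reported above, is straightforward to compute from the mean squared error between a restored image and its ground truth. A self-contained sketch on a hypothetical 2x2 patch (the pixel values are illustrative, not from the Reside-β Night dataset):

```python
import math

def psnr(reference, restored, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equal-size images,
    given here as flattened pixel lists."""
    mse = sum((r - s) ** 2 for r, s in zip(reference, restored)) / len(reference)
    if mse == 0:
        return float("inf")               # identical images
    return 20 * math.log10(max_val / math.sqrt(mse))

# Hypothetical ground-truth patch vs. a dehazed-enhanced estimate.
gt  = [100, 120, 140, 160]
out = [102, 118, 143, 158]
print(round(psnr(gt, out), 2))  # → 40.93
```

SSIM, the second metric, additionally compares local luminance, contrast, and structure, which is why papers report both.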
Procedia PDF Downloads 159
4318 Quasiperiodic Magnetic Chains as Spin Filters
Authors: Arunava Chakrabarti
Abstract:
A one-dimensional chain of magnetic atoms, representative of a quantum gas in an artificial quasi-periodic potential and modeled by the well-known Aubry-Andre function and its variants, is studied with respect to its capability of working as a spin filter for arbitrary spins. The basic formulation is explained first in terms of a perfectly periodic chain, where it is shown that a definite correlation between the spin S of the incoming particles and the magnetic moment h of the substrate atoms can open up a gap in the energy spectrum. This is crucial for a spin-filtering action. The simple one-dimensional chain is shown to be equivalent to a (2S+1)-strand ladder network. This equivalence is exploited to work out the condition for the opening of gaps. The formulation is then applied to a one-dimensional chain with quasi-periodic variation in the site potentials, the magnetic moments, and their orientations following an Aubry-Andre modulation and its variants. In addition, we show that a certain correlation between the system parameters can generate absolutely continuous bands in such systems, populated only by Bloch-like extended wave functions, signaling the possibility of a metal-insulator transition. This is a case of correlated (deterministic) disorder, and the results provide a non-trivial variation on the famous Anderson localization problem. We work within a tight-binding formalism and present explicit results for spin-1/2, spin-1, spin-3/2, and spin-5/2 particles incident on the magnetic chain to explain our scheme and the central results.
Keywords: Aubry-Andre model, correlated disorder, localization, spin filter
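The localization transition of the scalar Aubry-Andre chain (the starting point for the spinful construction above) can be probed numerically with transfer matrices: the Lyapunov exponent, the inverse localization length, is near zero in the extended phase (modulation strength lam < 2) and positive in the localized phase (lam > 2). A rough sketch; the energy, chain length, and modulation strengths are illustrative choices, not the paper's:

```python
import math

def lyapunov_exponent(lam, energy=0.0, n=20000):
    """Transfer-matrix estimate of the Lyapunov exponent for the
    tight-binding chain with on-site potential V_i = lam*cos(2*pi*beta*i)."""
    beta = (math.sqrt(5) - 1) / 2             # golden-mean (irrational) modulation
    psi_prev, psi = 1.0, 1.0                  # psi_0, psi_1
    log_norm = 0.0
    for i in range(1, n):
        v = lam * math.cos(2 * math.pi * beta * i)
        psi_next = (energy - v) * psi - psi_prev
        psi_prev, psi = psi, psi_next
        norm = abs(psi) + abs(psi_prev)
        if norm > 1e100:                      # renormalize to avoid overflow
            psi /= norm
            psi_prev /= norm
            log_norm += math.log(norm)
    return (log_norm + math.log(abs(psi) + abs(psi_prev))) / n

print(round(lyapunov_exponent(4.0), 3))  # localized phase: >= ln(4/2) ~ 0.69
print(lyapunov_exponent(0.5))            # extended phase: expected near zero
```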
Procedia PDF Downloads 357
4317 Raising Intercultural Awareness in Colombia Classrooms: A Descriptive Review
Authors: Angela Yicely Castro Garces
Abstract:
Aware of the relevance that intercultural education has gained in foreign language learning and teaching, and acknowledging the need to make it part of classroom practices, this literature review explores studies published in the Colombian context from 2012 to 2019. The inquiry covered six national peer-reviewed journals, examining the populations that benefited, the types of studies, and the most recurrent topics of concern for educators. The findings present a promising panorama, as teacher educators from public universities are leading the way in conducting research projects aimed at fostering intercultural awareness and building a critical intercultural discourse. Nonetheless, more studies involving the different stakeholders and contexts need to be developed in order to make intercultural education more visible in Colombian elementary and high school classrooms.
Keywords: Colombian scholarship, foreign language learning, foreign language teaching, intercultural awareness
Procedia PDF Downloads 145
4316 Internal and External Influences on the Firm Objective
Authors: A. Briseno, A. Zorrilla
Abstract:
Firms are increasingly responding to social and environmental claims from society. Practices oriented toward issues such as poverty, work equality, or renewable energy are being implemented more frequently by firms to address impacts on sustainability. However, questions remain about how the responses of firms vary across industries and regions between social and economic objectives. Using concepts from organizational theory and social network theory, this paper aims to create a theoretical framework that explains the internal and external influences that lead a firm to establish its objective. The framework explains why firms might have different objective orientations in terms of economic and social prioritization.
Keywords: organizational identity, social network theory, firm objective, value maximization, social responsibility
Procedia PDF Downloads 311
4315 Cyber-Social Networks in Preventing Terrorism: Topological Scope
Authors: Alessandra Rossodivita, Alexei Tikhomirov, Andrey Trufanov, Nikolay Kinash, Olga Berestneva, Svetlana Nikitina, Fabio Casati, Alessandro Visconti, Tommaso Saporito
Abstract:
It is well known that world and national societies are exposed to diverse threats: anthropogenic, technological, and natural. Anthropogenic threats carry the greatest risks and thus attract special interest from researchers across a wide spectrum of disciplines in efforts to lower the pertinent risks. Some researchers have shown, by means of multilayered complex network models, how media promote the prevention of disease spread. To go further, the paper brings into scope not only mass-media sources but also personified social bots (socbots) linked according to reflexive theory. The novel scope considers information spread over conscious and unconscious agents while counteracting both natural and man-made threats, i.e., infections and terrorist hazards. Contrary to numerous publications on misinformation disseminated by ‘bad’ bots within social networks, this study focuses on ‘good’ bots, which should be mobilized to counter the former. These social bots are deployed in a mixture with real social actors engaged in concerted actions of spreading, receiving, and analyzing information. All the contemporary complex network platforms (multiplexes, interdependent networks, combined stem networks, et al.) are employed to describe and test socbot activities within competing information-sharing tools, namely mass-media hubs, social networks, messengers, and e-mail, at all phases of disasters. The scope and concomitant techniques present evidence that embedding such socbots into the information-sharing process crucially changes the network topology of actor interactions. The change might improve or impair the robustness of the social network environment: it depends on who controls the socbots and how. It is demonstrated that the topological approach elucidates techno-social processes within the field and outlines the roadmap to a safer world.
Keywords: complex network platform, counterterrorism, information sharing topology, social bots
Procedia PDF Downloads 165
4314 Analytic Network Process in Location Selection and Its Application to a Real Life Problem
Authors: Eylem Koç, Hasan Arda Burhan
Abstract:
Location selection presents a crucial decision problem in today's business world, where strategic decision-making processes have critical importance. Thus, location selection has strategic importance for companies in strengthening their competitive position, increasing corporate performance and efficiency, and lowering production and transportation costs. The right choice in location selection has a direct impact on a company's commercial success. In this study, a store location selection problem of Carglass Turkey, which operates in the vehicle glass branch, is handled. As this problem includes both tangible and intangible criteria, the Analytic Network Process (ANP) was adopted as the main methodology. The model consists of a control hierarchy and BOCR subnetworks that include clusters of actors, alternatives, and criteria. In accordance with the management's choices, five different locations were selected. In addition to the literature review, strict cooperation with the actor group was ensured and maintained while determining the criteria and during the whole process. The obtained results were presented to the management as a report, and their feasibility was confirmed accordingly.
Keywords: analytic network process (ANP), BOCR, multi-actor decision making, multi-criteria decision making, real-life problem, location selection
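At the core of ANP (as in AHP) is the derivation of priority weights from a pairwise comparison matrix via its principal eigenvector. A minimal power-iteration sketch on a hypothetical, perfectly consistent 3-criteria matrix (the criteria and comparison values are illustrative, not from the Carglass study):

```python
def priority_vector(matrix, iters=100):
    """Principal-eigenvector priorities of a pairwise comparison matrix,
    computed by power iteration with normalization at each step."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w_new)
        w = [x / s for x in w_new]
    return w

# Hypothetical 3-criteria comparison (Saaty 1-9 scale), fully consistent:
# criterion 1 is judged 2x criterion 2 and 4x criterion 3.
M = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
w = priority_vector(M)
print([round(x, 4) for x in w])  # → [0.5714, 0.2857, 0.1429]
```

ANP extends this step by assembling such local priority vectors into a supermatrix that captures the dependencies between clusters of actors, criteria, and alternatives.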
Procedia PDF Downloads 471
4313 Emotional Artificial Intelligence and the Right to Privacy
Authors: Emine Akar
Abstract:
The majority of privacy-related regulation has traditionally focused on concepts that are perceived to be well-understood or easily describable, such as certain categories of data and personal information or images. In the past century, such regulation appeared reasonably suitable for its purposes. However, technologies such as AI, combined with ever-increasing capabilities to collect, process, and store “big data”, not only require calibration of these traditional understandings but may require re-thinking of entire categories of privacy law. In the presentation, it will be explained, against the background of various emerging technologies under the umbrella term “emotional artificial intelligence”, why modern privacy law will need to embrace human emotions as potentially private subject matter. This argument can be made on a jurisprudential level, given that human emotions can plausibly be accommodated within the various concepts that are traditionally regarded as the underlying foundation of privacy protection, such as, for example, dignity, autonomy, and liberal values. However, the practical reasons for regarding human emotions as potentially private subject matter are perhaps more important (and very likely more convincing from the perspective of regulators). In that respect, it should be regarded as alarming that, according to most projections, the usefulness of emotional data to governments and, particularly, private companies will not only lead to radically increased processing and analysing of such data but, concerningly, to an exponential growth in the collection of such data. In light of this, it is also necessary to discuss options for how regulators could address this emerging threat.
Keywords: AI, privacy law, data protection, big data
Procedia PDF Downloads 89
4312 Alphabet Recognition Using Pixel Probability Distribution
Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay
Abstract:
Our project topic is “Alphabet Recognition using pixel probability distribution”. The project uses techniques of Image Processing and Machine Learning in Computer Vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR applications are sometimes used in signature recognition, which is employed in banks and other high-security buildings. One popular mobile application reads a visiting card and directly stores it to the contacts. OCRs are also known to be used in radar systems for reading speeding vehicles’ license plates, among other things. The implementation of our project has been done using Visual Studio and OpenCV (Open Source Computer Vision). Our algorithm is based on Neural Networks (machine learning). The project was implemented in three modules: (1) Training: This module aims at database generation. The database was generated using two methods: (a) Run-time generation, in which the database is generated at compilation time using inbuilt fonts of the OpenCV library. Human intervention is not necessary for generating this database. (b) Contour detection, in which a ‘jpeg’ template containing different fonts of an alphabet is converted to the weight matrix using specialized functions (contour detection and blob detection) of OpenCV. The main advantage of this type of database generation is that the algorithm becomes self-learning and the final database requires little memory to be stored (119 kB precisely). (2) Preprocessing: The input image is pre-processed using image processing concepts such as adaptive thresholding, binarizing, dilating, etc. and is made ready for segmentation. Segmentation includes extraction of lines, words, and letters from the processed text image. (3) Testing and prediction: The extracted letters are classified and predicted using the neural networks algorithm. The algorithm recognizes an alphabet based on certain mathematical parameters calculated using the database and the weight matrix of the segmented image.
Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix
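As a toy illustration of the “pixel probability distribution” idea, the sketch below averages binarized glyphs into a per-letter weight matrix and classifies a new glyph by the highest pixel-probability score. It stands in for, and is much simpler than, the project’s OpenCV-plus-neural-network pipeline:

```python
# Minimal sketch of classification by pixel probability distribution,
# assuming glyphs are already segmented and binarized to fixed-size grids.
# The real project uses OpenCV preprocessing and a neural network; this
# toy nearest-distribution classifier only illustrates the weight matrix.

def weight_matrix(samples):
    """Average the binary grids of one letter into a pixel-probability matrix."""
    n = len(samples)
    rows, cols = len(samples[0]), len(samples[0][0])
    return [[sum(s[r][c] for s in samples) / n for c in range(cols)]
            for r in range(rows)]

def score(glyph, matrix):
    """Likelihood-style score: probability of each observed pixel value."""
    return sum(matrix[r][c] if glyph[r][c] else 1 - matrix[r][c]
               for r in range(len(glyph)) for c in range(len(glyph[0])))

def classify(glyph, matrices):
    return max(matrices, key=lambda letter: score(glyph, matrices[letter]))

# Tiny 3x3 example: 'I' is a vertical bar, 'T' a top bar with a stem.
db = {
    "I": [[[0, 1, 0], [0, 1, 0], [0, 1, 0]]],
    "T": [[[1, 1, 1], [0, 1, 0], [0, 1, 0]]],
}
matrices = {k: weight_matrix(v) for k, v in db.items()}
print(classify([[0, 1, 0], [0, 1, 0], [0, 1, 0]], matrices))  # I
```

With several training samples per letter, the matrices become genuine probability distributions rather than copies of a single glyph.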
Procedia PDF Downloads 390
4311 Constructing a Semi-Supervised Model for Network Intrusion Detection
Authors: Tigabu Dagne Akal
Abstract:
While advances in computer and communications technology have made the network ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks or intrusions start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or Intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Database process model, which starts from the selection of the datasets. The dataset used in this study has been taken from the Massachusetts Institute of Technology Lincoln Laboratory. After obtaining the data, it was pre-processed. The major pre-processing activities included filling in missing values, removing outliers, resolving inconsistencies, integrating data containing both labelled and unlabelled records, dimensionality reduction, size reduction, and data transformation activities such as discretization. A total of 21,533 intrusion records are used for training the models. For validating the performance of the selected model, a separate set of 3,397 records is used for testing. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms have been tested as classification approaches, both with and without feature selection. The model created using 10-fold cross validation with the J48 decision tree algorithm and the default parameter values showed the best classification accuracy. The model has a prediction accuracy of 96.11% on the training dataset and 93.2% on the test dataset in classifying new instances as normal, DOS, U2R, R2L, and probe classes.
The findings of this study have shown that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research directions are suggested towards developing an applicable system in the area of study.
Keywords: intrusion detection, data mining, computer science
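Of the two classifiers compared, Naïve Bayes is simple enough to sketch from scratch. The records below are invented stand-ins for KDD-style connection attributes, and the implementation is a minimal illustration rather than the Weka-style models used in the study:

```python
# Minimal categorical Naive Bayes with Laplace smoothing, one of the two
# classifiers the study compares (alongside the J48 decision tree). The
# three-feature records are toy stand-ins for connection attributes; the
# real study trained on 21,533 records.
import math
from collections import Counter, defaultdict

class NaiveBayes:
    def fit(self, rows, labels):
        self.classes = Counter(labels)
        self.n = len(labels)
        # (feature index, class) -> per-value counts
        self.counts = defaultdict(Counter)
        for row, y in zip(rows, labels):
            for i, v in enumerate(row):
                self.counts[(i, y)][v] += 1
        return self

    def predict(self, row):
        def log_posterior(y):
            p = math.log(self.classes[y] / self.n)
            for i, v in enumerate(row):
                c = self.counts[(i, y)]
                # Laplace smoothing avoids zero probabilities for unseen values
                p += math.log((c[v] + 1) / (self.classes[y] + len(c) + 1))
            return p
        return max(self.classes, key=log_posterior)

# (protocol, service, flag) -> class
train = [("tcp", "http", "SF"), ("tcp", "http", "SF"),
         ("icmp", "ecr_i", "SF"), ("icmp", "ecr_i", "SF")]
labels = ["normal", "normal", "DOS", "DOS"]
model = NaiveBayes().fit(train, labels)
print(model.predict(("icmp", "ecr_i", "SF")))  # DOS
```

A decision tree such as J48 would instead split on the most informative attribute at each node; both approaches assign one of the five classes (normal, DOS, U2R, R2L, probe) to each new connection record.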
Procedia PDF Downloads 298
4310 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System
Authors: Dong Seop Lee, Byung Sik Kim
Abstract:
In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields. These artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management, which obtains historical disaster information using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle from its occurrence through progress, response, and planning. However, information about status control, response, recovery from natural and social disaster events, etc. is mainly managed in structured and unstructured report form, existing as handouts or hard copies of reports. Such unstructured data is often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data for disaster information. In this paper, the Optical Character Recognition approach is used to convert handouts, hard copies, images, or reports, printed or generated by scanners, etc., into electronic documents. Following that, the converted disaster data is organized into the disaster code system as disaster information and stored in the disaster database system. Gathering and creating disaster information based on Optical Character Recognition for unstructured data is an important element in the realm of smart disaster management. In this paper, a character recognition rate of over 90% was achieved for Korean characters by using upgraded OCR. In character recognition, the recognition rate depends on the fonts, size, and special symbols of the characters; we improved it through a machine learning algorithm.
The converted structured data are managed in a standardized disaster information form connected with the disaster code system. The disaster code system ensures that the structured information is stored and retrieved across the entire disaster cycle, covering historical disaster progress, damages, response, and recovery. The expected effect of this research is that it can be applied to smart disaster management and decision making by combining artificial intelligence technologies and historical big data.
Keywords: disaster information management, unstructured data, optical character recognition, machine learning
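The structuring step that follows OCR can be sketched as a parser from recognized text to database records. The field names and the "DS-YYYY-NNNN" code format below are invented for illustration; the paper’s actual disaster code system is not specified here:

```python
# Hypothetical sketch of the post-OCR structuring step: recognized report
# text is parsed into records keyed by a disaster code. Field names and
# the code format are invented; the paper's real code system differs.
import re

RECORD = re.compile(
    r"Code:\s*(?P<code>DS-\d{4}-\d{4})\s+"
    r"Type:\s*(?P<type>\w+)\s+"
    r"Damage:\s*(?P<damage>\d+)")

def structure(ocr_text):
    """Turn free OCR text into a list of dict records for the database."""
    return [m.groupdict() for m in RECORD.finditer(ocr_text)]

page = ("Code: DS-2019-0042 Type: flood Damage: 1200 ... "
        "Code: DS-2019-0043 Type: storm Damage: 300")
records = structure(page)
print(records[0]["type"], records[1]["damage"])  # flood 300
```

In a production pipeline, fuzzy matching would also be needed, since OCR errors (especially on special symbols) can corrupt the code fields this regex expects.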
Procedia PDF Downloads 131
4309 Collaborative Rural Governance Strategy to Enhance Rural Economy Through Village-Owned Enterprise Using Soft System Methodology and Textual Network Analysis
Authors: Robert Saputra, Tomas Havlicek
Abstract:
This study discusses the design of collaborative rural governance strategies to enhance the rural economy through Village-owned Enterprises (VOE) in Riau Province, Indonesia. Using Soft Systems Methodology (SSM) combined with Textual Network Analysis (TNA) in the Rich Picture stage of SSM, we investigated the current state of VOE management. Significant obstacles identified include insufficient business feasibility analyses, lack of managerial skills, misalignment between strategy and practice, and inadequate oversight. To address these challenges, we propose a collaborative strategy involving regional governments, academic institutions, NGOs, and the private sector. This strategy emphasizes community needs assessments, efficient resource mobilization, and targeted training programs. A dedicated working group will ensure continuous monitoring and iterative improvements. Our research highlights the novel integration of SSM with TNA, providing a robust framework for improving VOE management and demonstrating the potential of collaborative efforts in driving rural economic development.
Keywords: village-owned enterprises (VOE), rural economic development, soft system methodology (SSM), textual network analysis (TNA), collaborative governance
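A minimal sketch of the co-occurrence flavour of Textual Network Analysis: terms mentioned in the same statement are linked, and weighted degree highlights the most connected concepts. The statements below are invented; the study derived its network from SSM Rich Picture material on VOE management:

```python
# Toy Textual Network Analysis: build a weighted co-occurrence network
# from statements, then rank concepts by weighted degree. The statements
# are invented stand-ins for the study's Rich Picture material.
from collections import Counter
from itertools import combinations

statements = [
    ["training", "managerial", "skills"],
    ["oversight", "monitoring"],
    ["training", "skills", "oversight"],
]

# Edge weight = number of statements in which two terms co-occur.
edges = Counter()
for terms in statements:
    for a, b in combinations(sorted(set(terms)), 2):
        edges[(a, b)] += 1

# Weighted degree highlights the most connected concepts.
degree = Counter()
for (a, b), w in edges.items():
    degree[a] += w
    degree[b] += w
print(degree.most_common(3))
```

On a real corpus, centrality measures such as betweenness would typically complement degree to identify bridging concepts between stakeholder groups.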
Procedia PDF Downloads 20
4308 Parameter Identification Analysis in the Design of Rock Fill Dams
Authors: G. Shahzadi, A. Soulaimani
Abstract:
This research work aims to identify the physical parameters of the constitutive soil model in the design of a rockfill dam by inverse analysis. The best parameters of the constitutive soil model are those that minimize the objective function, defined as the difference between the measured and numerical results. The finite element code Plaxis has been utilized for numerical simulation. Polynomial and neural network-based response surfaces have been generated to analyze the relationship between soil parameters and displacements. The performance of the surrogate models has been analyzed and compared by evaluating the root mean square error. A comparative study has been done based on objective functions and optimization techniques. Objective functions are categorized by considering measured data with and without uncertainty in instruments, defined by the least square method, which estimates the norm between the predicted displacements and the measured values. Hydro-Québec provided data sets for the measured values of the Romaine-2 dam. Stochastic optimization, an approach that can overcome local minima and solve non-convex and non-differentiable problems with ease, is used to obtain an optimum value. Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Differential Evolution (DE) are compared for the minimization problem; all these techniques take time to converge to an optimum value, but PSO provided the best convergence and the best soil parameters. Overall, parameter identification analysis could be effectively used for the rockfill dam application and has the potential to become a valuable tool for geotechnical engineers for assessing dam performance and dam safety.
Keywords: rockfill dam, parameter identification, stochastic analysis, regression, Plaxis
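A minimal PSO sketch of the inverse analysis: find the parameter vector that minimizes a least-squares misfit between measured and predicted displacements. A cheap algebraic model stands in for the Plaxis finite element simulation (or its response surface), and all PSO settings are generic defaults, not those of the study:

```python
# Minimal particle swarm optimization for the inverse problem: find soil
# parameters p minimizing the squared misfit between "measured" and
# predicted displacements. A toy algebraic model replaces the expensive
# Plaxis finite element run; all settings are generic illustration values.
import random

random.seed(0)

measured = [3.0, 2.0]          # target "displacements"

def predict(p):                # stand-in for the FE model / response surface
    return [p[0] + p[1], p[0] * p[1]]

def objective(p):              # least-squares misfit
    return sum((m - s) ** 2 for m, s in zip(measured, predict(p)))

def pso(n=20, iters=200, lo=-3.0, hi=3.0, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(lo, hi) for _ in range(2)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=objective)
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                # inertia + cognitive pull to pbest + social pull to gbest
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=objective)
    return gbest

best = pso()
print(best, objective(best))
```

Because each objective evaluation would otherwise mean a full FE run, the surrogate response surfaces mentioned in the abstract are what make such population-based searches affordable.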
Procedia PDF Downloads 147
4307 The Proposal for a Framework to Face Opacity and Discrimination ‘Sins’ Caused by Consumer Creditworthiness Machines in the EU
Authors: Diogo José Morgado Rebelo, Francisco António Carneiro Pacheco de Andrade, Paulo Jorge Freitas de Oliveira Novais
Abstract:
Not everything in AI-powered consumer credit scoring turns out to be a wonder. When using AI in Creditworthiness Assessment (CWA), opacity and unfairness ‘sins’ must be addressed for the task to be deemed responsible. AI software is not always 100% accurate, which can lead to misclassification. Discrimination of some groups can be exponentiated. A hetero-personalized identity can be imposed on the affected individual(s). Also, autonomous CWA sometimes lacks transparency when using black box models. However, for this intended purpose, human analysts ‘on-the-loop’ might not be the best remedy consumers are looking for in credit. This study seeks to explore the legality of implementing a Multi-Agent System (MAS) framework in consumer CWA to ensure compliance with the regulation outlined in Article 14(4) of the Proposal for an Artificial Intelligence Act (AIA), dated 21 April 2021 (as per the last corrigendum by the European Parliament on 19 April 2024). Especially with the adoption of Art. 18(8)(9) of EU Directive 2023/2225 of 18 October, which will go into effect on 20 November 2026, there should be more emphasis on the need for hybrid oversight in AI-driven scoring to ensure fairness and transparency. In fact, the range of EU regulations on AI-based consumer credit will soon impact the AI lending industry locally and globally, as shown by the broad territorial scope of AIA’s Art. 2. Consequently, engineering the law of consumer CWA is imperative. Generally, the proposed MAS framework consists of several layers arranged in a specific sequence, as follows: firstly, the Data Layer gathers legitimate predictor sets from traditional sources; then, the Decision Support System Layer, whose Neural Network model is trained using k-fold Cross Validation, provides recommendations based on the feeder data; the eXplainability (XAI) multi-structure comprises Three-Step-Agents; and, lastly, the Oversight Layer has a 'Bottom Stop' for analysts to intervene in a timely manner.
From the analysis, one can see that a vital component of this software is the XAI layer. It appears as a transparent curtain covering the AI’s decision-making process, enabling comprehension, reflection, and further feasible oversight. Local Interpretable Model-agnostic Explanations (LIME) might act as a pillar by offering counterfactual insights. SHapley Additive exPlanation (SHAP), another agent in the XAI layer, could address potential discrimination issues by identifying the contribution of each feature to the prediction. Alternatively, for thin-file or no-file consumers, the Suggestion Agent can promote financial inclusion. It uses lawful alternative sources, such as share of wallet, among others, to search for more advantageous solutions to incomplete evaluation appraisals based on genetic programming. Overall, this research aspires to bring the concept of Machine-Centered Anthropocentrism to the table of EU policymaking. It acknowledges that, when put into service, credit analysts no longer exert full control over the data-driven entities programmers have given ‘birth’ to. With similar explanatory agents under supervision, AI itself can become self-accountable, prioritizing human concerns and values. AI decisions should not be vilified inherently. The issue lies in how they are integrated into decision-making and whether they align with non-discrimination principles and transparency rules.
Keywords: creditworthiness assessment, hybrid oversight, machine-centered anthropocentrism, EU policymaking
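The Oversight Layer’s 'Bottom Stop' could look something like the following sketch, in which borderline scores or explanations dominated by a suspect feature are routed to a human analyst. All thresholds, feature names, and routing rules are hypothetical illustrations, not the authors’ specification:

```python
# Hypothetical sketch of the Oversight Layer's "Bottom Stop": automated
# credit decisions pass through, but borderline scores or explanations
# dominated by a suspect feature are routed to a human analyst. All
# thresholds and feature names are invented for illustration.

SUSPECT_FEATURES = {"postcode"}  # proxy-discrimination risk

def bottom_stop(score, contributions, cutoff=0.5, band=0.05):
    """Return 'approve', 'reject', or 'human_review'."""
    # Feature with the largest (absolute) explanation weight, e.g. from SHAP.
    top_feature = max(contributions, key=lambda k: abs(contributions[k]))
    if top_feature in SUSPECT_FEATURES:
        return "human_review"          # fairness flag
    if abs(score - cutoff) < band:
        return "human_review"          # borderline: analyst intervenes
    return "approve" if score >= cutoff else "reject"

print(bottom_stop(0.82, {"income": 0.4, "history": 0.3}))    # approve
print(bottom_stop(0.52, {"income": 0.4, "history": 0.3}))    # human_review
print(bottom_stop(0.90, {"postcode": -0.6, "income": 0.2}))  # human_review
```

The point of such a gate is that analysts see only the cases where automation is least trustworthy, which is one reading of the hybrid oversight that Art. 14(4) AIA and Art. 18(8)(9) of Directive 2023/2225 call for.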
Procedia PDF Downloads 36
4306 Feeding Behavior of Sweetpotato Weevil, Cylas formicarius (Fabricius) (Coleoptera: Brentidae) on Three Sweetpotato, Ipomoea batatas L. Cultivars Grown in Tarlac, Philippines
Authors: Jerah Mystica B. Novenario, Flor A. Ceballo-Alcantara
Abstract:
Sweetpotato is grown in tropical countries for its edible tubers, which have become an important source of food. It is usually propagated through vine cuttings, which may be obtained from harvested plants or from nurseries intended for cutting production only. The recurrent use of vines may cause increased weevil infestation. The crop is known to be infested with insect pests, most importantly the sweetpotato weevil, Cylas formicarius, which targets the tubers and thus causes economic losses. Sweetpotato farmers in Tarlac claim that only one sweetpotato cultivar is being attacked by C. formicarius. However, it was found in this experiment that feeding and feeding behavior of the weevil were not affected by the cultivar provided: no significant differences were observed in the average amount of tuber consumed by both males (F=0.86; df=2; P=0.45) and females (F=2.71; df=2; P=0.11) or in feeding time (F=0.9; df=2; P=0.43). Conversely, in terms of damage assessment, significantly different (F=1.64; df=2; P=0.23) results were noted.
Keywords: Cylas formicarius, feeding behavior, insect pest, sweetpotato
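The F statistics quoted above are of the one-way analysis-of-variance kind (variation between cultivars relative to variation within them). The computation can be sketched directly; the feeding data below are invented for illustration, not the study’s measurements:

```python
# One-way ANOVA F statistic from scratch, illustrating the kind of
# between-cultivar comparison reported above. The feeding measurements
# are invented for illustration.

def one_way_f(groups):
    """F = between-group mean square / within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

cultivar_a = [1.2, 1.4, 1.1]   # tuber consumed (g), hypothetical
cultivar_b = [1.3, 1.2, 1.5]
cultivar_c = [1.1, 1.3, 1.2]
print(round(one_way_f([cultivar_a, cultivar_b, cultivar_c]), 2))
```

A small F like this one (well below the critical value for df = 2 between groups) corresponds to the non-significant cultivar effects the abstract reports.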
Procedia PDF Downloads 98
4305 Detection, Analysis and Determination of the Origin of Copy Number Variants (CNVs) in Intellectual Disability/Developmental Delay (ID/DD) Patients and Autistic Spectrum Disorders (ASD) Patients by Molecular and Cytogenetic Methods
Authors: Pavlina Capkova, Josef Srovnal, Vera Becvarova, Marie Trkova, Zuzana Capkova, Andrea Stefekova, Vaclava Curtisova, Alena Santava, Sarka Vejvalkova, Katerina Adamova, Radek Vodicka
Abstract:
ASDs are heterogeneous and complex developmental diseases with a significant genetic background. Recurrent CNVs are known to be a frequent cause of ASD. These CNVs can have, however, a variable expressivity which results in a spectrum of phenotypes from asymptomatic to ID/DD/ASD. ASD is associated with ID in ~75% of individuals. Various platforms are used to detect pathogenic mutations in the genome of these patients. This study focused on determining the frequency of pathogenic mutations in a group of ASD patients and a group of ID/DD patients using various strategies, along with a comparison of their detection rates. The possible role of the origin of these mutations in the aetiology of ASD was assessed. The study included 35 individuals with ASD and 68 individuals with ID/DD (64 males and 39 females in total), who underwent rigorous genetic, neurological and psychological examinations. Screening for pathogenic mutations involved karyotyping, screening for FMR1 mutations and for metabolic disorders, a targeted MLPA test with probe mixes Telomeres 3 and 5, Microdeletion 1 and 2, Autism 1, MRX, and a chromosomal microarray analysis (CMA) (Illumina or Affymetrix). Chromosomal aberrations were revealed in 7 individuals (1 in the ASD group) by karyotyping. FMR1 mutations were discovered in 3 individuals (1 in the ASD group). The detection rate of pathogenic mutations in ASD patients with a normal karyotype was 15.15% by MLPA and CMA. The frequencies of the pathogenic mutations were 25.0% by MLPA and 35.0% by CMA in ID/DD patients with a normal karyotype. CNVs inherited from asymptomatic parents were more abundant than de novo changes in ASD patients (11.43% vs. 5.71%), in contrast to the ID/DD group, where de novo mutations prevailed over inherited ones (26.47% vs. 16.18%). ASD patients shared their mutations with their fathers more frequently than patients from the ID/DD group (8.57% vs. 1.47%).
Maternally inherited mutations predominated in the ID/DD group in comparison with the ASD group (14.7% vs. 2.86%). CNVs of unknown significance were found in 10 patients by CMA and in 3 patients by MLPA. Although the detection rate is highest when using CMA, recurrent CNVs can be easily detected by MLPA. CMA proved to be more efficient in the ID/DD group, where a larger spectrum of rare pathogenic CNVs was revealed. This study determined that maternally inherited, highly penetrant mutations and de novo mutations more often resulted in ID/DD without ASD. The paternally inherited mutations could, however, be a source of the greater variability in the genome of ASD patients and contribute to the polygenic character of the inheritance of ASD. As the number of subjects is limited, a larger cohort is needed to confirm this conclusion. Inherited CNVs have a role in the aetiology of ASD, possibly in combination with additional genetic factors: mutations elsewhere in the genome. The identification of these interactions constitutes a challenge for the future. Supported by MH CZ – DRO (FNOl, 00098892), IGA UP LF_2016_010, TACR TE02000058 and NPU LO1304.
Keywords: autistic spectrum disorders, copy number variant, chromosomal microarray, intellectual disability, karyotyping, MLPA, multiplex ligation-dependent probe amplification
Procedia PDF Downloads 353