Search results for: law enforcement intelligence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1785

675 The Effect of Artificial Intelligence on Construction Development

Authors: Shady Gamal Aziz Shehata

Abstract:

Difficulty in defining construction quality arises because perceptions of quality depend on the nature and requirements of the market, on the partners themselves, and on the results they want. Quantitative methods were used in this constructivist research. A case-based study was conducted to assess the structures of positive attitudes and expectations in the context of quality improvement. A survey of expert opinions was analyzed among organizations/companies operating in the construction industry in Pakistan; the financial strength, management structure and construction experience of the companies formed the basis of their selection. The concept of quality is most visible at the project level, where it is seen as the most valuable part of the construction project. Each quality improvement technique was expected to increase the user's profits by improving the efficiency of the construction project. The survey is useful for construction professionals evaluating current construction concepts and expectations for the application of quality improvement techniques in construction projects.

Keywords: correlation analysis, lean construction tools, lean construction, logistic regression analysis, risk management, safety, construction quality, expectation, improvement, perception

Procedia PDF Downloads 27
674 Application of Artificial Neural Network in Assessing Fill Slope Stability

Authors: An-Jui. Li, Kelvin Lim, Chien-Kuo Chiu, Benson Hsiung

Abstract:

This paper details the use of artificial intelligence (AI) in the field of slope stability, whereby quick and convenient solutions can be obtained using the developed tool. The AI tool used in this study is the artificial neural network (ANN), while the slope stability analyses are carried out with finite element limit analysis methods. The developed tool allows the prompt prediction of the safety factors of fill slopes and their corresponding probability of failure (depending on the degree of variation of the soil parameters), which gives the practicing engineer a reasonable basis for decision making. In fact, the successful use of the Extreme Learning Machine (ELM) algorithm shows that slope stability analysis is no longer confined to conventional modeling methods, which at times may be tedious and repetitive during the preliminary design stage, where the focus is more on cost-saving options than on detailed design. Similar ANN-based tools can therefore be further developed to assist engineers in this respect.
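
As an illustration of the ELM idea mentioned above, the following minimal sketch (not the authors' tool) fits an ELM to synthetic stand-ins for soil parameters and a safety factor: hidden-layer weights are fixed at random, and only the output weights are solved, in closed form, by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for soil parameters (e.g. cohesion, friction angle, unit
# weight) and a synthetic "safety factor" target; values are illustrative.
X = rng.uniform(0.0, 1.0, size=(40, 3))
y = 1.0 + 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2]

n_hidden = 25
W = rng.normal(size=(3, n_hidden))   # fixed random input weights
b = rng.normal(size=n_hidden)        # fixed random biases

H = np.tanh(X @ W + b)               # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form output weights

y_hat = H @ beta
train_mse = float(np.mean((y - y_hat) ** 2))
print(f"training MSE: {train_mse:.6f}")
```

Because training reduces to one least-squares solve, such a surrogate is cheap enough to rerun many times for the probabilistic part of the study (varying the soil parameters).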

Keywords: landslide, limit analysis, artificial neural network, soil properties

Procedia PDF Downloads 183
673 Intrusion Detection Using Dual Artificial Techniques

Authors: Rana I. Abdulghani, Amera I. Melhum

Abstract:

With the abnormal growth in the use of networked computers, and given the consensus among computer security experts that a fully secure system is never achieved in practice, intrusion detection systems (IDS) have become essential. This research compares two techniques for network intrusion detection. The first uses Particle Swarm Optimization (PSO), which falls within the field of swarm intelligence; here the algorithm was enhanced to minimize the error rate by amending the cluster centers whenever a better fitness value is found during the training stages. Results show that this modification explores the search space more efficiently than the original algorithm. The second technique uses a back-propagation neural network (BPNN). The two methods were built and evaluated on the NSL-KDD data set; this research is only concerned with clustering the given connection records into two categories, normal and abnormal. Experiments yielded an intrusion detection rate of 99.183818% for the enhanced PSO (EPSO) and 69.446416% for the BP neural network.
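
The enhancement described above, amending the cluster centers whenever a better fitness value is found, can be sketched roughly as follows. This is a toy one-dimensional illustration with invented data, not the paper's EPSO implementation or the NSL-KDD feature set.

```python
import random

random.seed(42)

# Toy 1-D "connection features": low values ~ normal, high ~ abnormal.
data = [0.1, 0.15, 0.2, 0.25, 0.3, 1.7, 1.8, 1.9, 2.0, 2.1]

def fitness(centers):
    # Within-cluster sum of squared distances to the nearest center.
    return sum(min((x - c) ** 2 for c in centers) for x in data)

# Particle swarm over two cluster centers.
n_particles, iters = 12, 60
w, c1, c2 = 0.7, 1.5, 1.5
pos = [[random.uniform(0, 2.2), random.uniform(0, 2.2)] for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=fitness)[:]

for _ in range(iters):
    for i in range(n_particles):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if fitness(pos[i]) < fitness(pbest[i]):
            # The paper's idea: amend the cluster centers whenever a
            # better fitness value is found during training.
            pbest[i] = pos[i][:]
            if fitness(pbest[i]) < fitness(gbest):
                gbest = pbest[i][:]

centers = sorted(gbest)
labels = ["Normal" if abs(x - centers[0]) < abs(x - centers[1]) else "Abnormal"
          for x in data]
print(centers, labels)
```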

Keywords: IDS, SI, BP, NSL_KDD, PSO

Procedia PDF Downloads 365
672 Digital Immunity System for Healthcare Data Security

Authors: Nihar Bheda

Abstract:

Protecting digital assets such as networks, systems, and data from advanced cyber threats is the aim of Digital Immunity Systems (DIS), which are a subset of cybersecurity. With features like continuous monitoring, coordinated reactions, and long-term adaptation, DIS seeks to mimic biological immunity. This minimizes downtime by automatically identifying and eliminating threats. Traditional security measures, such as firewalls and antivirus software, are insufficient for enterprises, such as healthcare providers, given the rapid evolution of cyber threats. The number of medical record breaches that have occurred in recent years is proof that attackers are finding healthcare data to be an increasingly valuable target. However, obstacles to enhancing security include outdated systems, financial limitations, and a lack of knowledge. DIS is an advancement in cyber defenses designed specifically for healthcare settings. Protection akin to an "immune system" is produced by core capabilities such as anomaly detection, access controls, and policy enforcement. Coordination of responses across IT infrastructure to contain attacks is made possible by automation and orchestration. Massive amounts of data are analyzed by AI and machine learning to find new threats. After an incident, self-healing enables services to resume quickly. The implementation of DIS is consistent with the healthcare industry's urgent requirement for resilient data security in light of evolving risks and strict guidelines. With resilient systems, it can help organizations lower business risk, minimize the effects of breaches, and preserve patient care continuity. DIS will be essential for protecting a variety of environments, including cloud computing and the Internet of medical devices, as healthcare providers quickly adopt new technologies. DIS lowers traditional security overhead for IT departments and offers automated protection, even though it requires an initial investment. 
In the near future, DIS may prove to be essential for small clinics, blood banks, imaging centers, large hospitals, and other healthcare organizations. Cyber resilience can become attainable for the whole healthcare ecosystem with customized DIS implementations.
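
As a hedged illustration of the anomaly-detection capability described above (not a DIS product), a minimal baseline monitor might flag access volumes that deviate sharply from historical norms; the counts and the 3-sigma threshold below are illustrative assumptions.

```python
import statistics

# Hypothetical per-hour access counts to a patient-record service;
# the last incoming value simulates a bulk-exfiltration attempt.
baseline = [42, 38, 45, 40, 44, 39, 41, 43, 37, 46]
incoming = [41, 44, 180]

mu = statistics.mean(baseline)
sigma = statistics.stdev(baseline)

def is_anomalous(count, threshold=3.0):
    # Flag counts more than `threshold` standard deviations above baseline.
    return (count - mu) / sigma > threshold

alerts = [c for c in incoming if is_anomalous(c)]
print(alerts)  # the spike would trigger an automated containment response
```

In a full DIS, such a detector would feed an orchestration layer that isolates the affected service and begins self-healing, rather than merely printing an alert.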

Keywords: digital immunity system, cybersecurity, healthcare data, emerging technology

Procedia PDF Downloads 47
671 Machine Learning Automatic Detection on Twitter Cyberbullying

Authors: Raghad A. Altowairgi

Abstract:

With the widespread adoption of social media platforms, young people tend to use them extensively as their first means of communication because of their ease and modernity. But these platforms often create fertile ground for bullies to direct aggressive behavior at their victims. Platform usage cannot be reduced, but intelligent mechanisms can be implemented to reduce the abuse, and this is where machine learning comes in: understanding and classifying text can help minimize acts of cyberbullying. Artificial intelligence techniques have thus been extended into an applied tool for addressing the phenomenon. In this research, machine learning models are built to classify text into two classes: cyberbullying and non-cyberbullying. The data are preprocessed in four stages: removing characters that provide no meaningful information to the models, tokenization, removing stop words, and lowercasing. BoW and TF-IDF are used as the main features for five classifiers: logistic regression, Naïve Bayes, random forest, XGBoost, and CatBoost, which score 92%, 90%, 92%, 91%, and 86% accuracy, respectively.
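
The four preprocessing stages and the TF-IDF weighting described above can be sketched as follows; the tweets, stop-word list, and helper names are illustrative, not the study's pipeline (note that lowercasing is applied before stop-word filtering here so the stop-word set matches).

```python
import math
import re

STOP_WORDS = {"you", "are", "a", "the", "is", "so"}  # illustrative subset

def preprocess(text):
    text = re.sub(r"[^a-zA-Z\s]", "", text)  # 1. drop non-informative characters
    tokens = text.split()                    # 2. tokenization
    tokens = [t.lower() for t in tokens]     # 4. lowercasing
    return [t for t in tokens if t not in STOP_WORDS]  # 3. stop-word removal

docs = ["You are so stupid!!!", "Nobody likes you, loser...", "Have a great day :)"]
corpus = [preprocess(d) for d in docs]

def tf_idf(term, doc_tokens):
    # Term frequency in the document times inverse document frequency.
    tf = doc_tokens.count(term) / len(doc_tokens)
    df = sum(1 for toks in corpus if term in toks)
    idf = math.log(len(corpus) / df)
    return tf * idf

print(tf_idf("stupid", corpus[0]))
```

The resulting TF-IDF vectors would then be fed to any of the five classifiers the study compares.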

Keywords: cyberbullying, machine learning, Bag-of-Words, term frequency-inverse document frequency, natural language processing, Catboost

Procedia PDF Downloads 109
670 Estimating Poverty Levels from Satellite Imagery: A Comparison of Human Readers and an Artificial Intelligence Model

Authors: Ola Hall, Ibrahim Wahab, Thorsteinn Rognvaldsson, Mattias Ohlsson

Abstract:

The subfield of poverty and welfare estimation that applies machine learning tools and methods to satellite imagery is a nascent but rapidly growing one. This is driven in part by the Sustainable Development Goals, whose overarching principle is that no region is left behind. Among other things, this requires that welfare levels can be estimated accurately and rapidly at different spatial scales and resolutions. The conventional tools of household surveys and interviews do not suffice in this regard: while they are useful for gaining a longitudinal understanding of the welfare levels of populations, they do not offer adequate spatial coverage for the accuracy that is needed, nor is their implementation sufficiently swift to give a timely insight into people and places. It is this void that satellite imagery fills. Previously, this was near-impossible to implement due to the sheer volume of data that needed processing, but recent advances in machine learning, especially deep learning methods such as deep neural networks, have made this a rapidly growing area of scholarship. Despite their unprecedented levels of performance, such models lack transparency and explainability and have thus seen limited downstream application, as humans are generally apprehensive of techniques that are not inherently interpretable and trustworthy. While several studies have demonstrated the superhuman performance of AI models, none has directly compared the performance of such models and human readers in the domain of poverty studies. In the present study, we directly compare the performance of human readers and a deep learning model using different resolutions of satellite imagery to estimate the welfare levels of Demographic and Health Survey clusters in Tanzania, using the wealth quintile ratings from the same survey as the ground truth. The cluster-level imagery covers all 608 cluster locations, of which 428 were classified as rural.
The imagery for the human readers was sourced from the Google Maps Platform at an ultra-high resolution of 0.6 m per pixel at zoom level 18, while that for the machine learning model was sourced from the comparatively lower-resolution Sentinel-2 data (10 m per pixel) for the same cluster locations. Rank correlation coefficients of between 0.31 and 0.32 achieved by the human readers were much lower than those attained by the machine learning model (0.69 to 0.79). This superhuman performance by the model is all the more significant given that it was trained on the relatively low 10-meter resolution satellite data, while the human readers estimated welfare levels from the higher 0.6 m spatial resolution data, in which key markers of poverty and slums, such as roofing and road quality, are discernible. It is important to note, however, that the human readers did not receive any training before rating, and had this been done, their performance might have improved. The stellar performance of the model also comes with the inevitable shortfall of limited transparency and explainability. The findings have significant implications for attaining the objective of the current frontier of deep learning models in this domain of scholarship, eXplainable Artificial Intelligence, through a collaborative rather than a comparative framework.
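
A rank correlation coefficient of the kind reported above (Spearman's rho, assuming untied ranks) can be computed as follows; the cluster scores are invented for illustration.

```python
def ranks(values):
    # Rank positions (1-based) of each value; assumes no ties.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    # Spearman's rho via the classic sum-of-squared-rank-differences formula.
    n = len(x)
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical wealth scores for six clusters: survey ground truth vs. model.
truth = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
model = [1.2, 1.9, 3.4, 3.1, 5.5, 6.0]
print(spearman(truth, model))
```

Only the ordering matters, which is why the metric suits wealth-quintile comparisons between raters working from different imagery.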

Keywords: poverty prediction, satellite imagery, human readers, machine learning, Tanzania

Procedia PDF Downloads 78
669 Key Performance Indicators and the Model for Achieving Digital Inclusion for Smart Cities

Authors: Khalid Obaed Mahmod, Mesut Cevik

Abstract:

The term smart city has appeared recently, accompanied by many definitions and concepts. As a simplified and clear definition, a smart city is a geographical location that has gained efficiency and flexibility in providing public services to citizens through its use of information and communication technologies, and this is what distinguishes it from other cities. Smart cities connect the various components of the city through main networks and sub-networks, in addition to a set of applications, and are thus able to collect the data that form the basis for providing technological solutions to manage resources and deliver services. The work of the smart city rests on artificial intelligence and Internet of Things technology. This paper presents the concept of smart cities; the pillars, standards, and evaluation indicators on which smart cities depend; and the reasons that prompted the world to move towards establishing them. It also provides a simplified hypothetical way to measure the ideal smart city model by defining some indicators and key pillars, simulating them with logic circuits, and testing them to determine whether the city can be considered an ideal smart city or not.
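
The logic-circuit simulation can be approximated in software: each pillar indicator acts as a comparator, and an AND gate over all pillars decides whether the city qualifies as ideal. The pillar names and thresholds below are assumptions for illustration, not the paper's values.

```python
# Illustrative pillars and KPI thresholds (assumed, not the paper's).
THRESHOLDS = {
    "smart_governance": 0.7,
    "smart_mobility": 0.7,
    "smart_environment": 0.7,
    "smart_living": 0.7,
}

def pillar_gate(scores):
    # One boolean per pillar: a comparator in the logic-circuit analogy.
    return {p: scores.get(p, 0.0) >= t for p, t in THRESHOLDS.items()}

def is_ideal_smart_city(scores):
    # AND gate: every pillar indicator must be satisfied.
    return all(pillar_gate(scores).values())

city = {"smart_governance": 0.9, "smart_mobility": 0.8,
        "smart_environment": 0.75, "smart_living": 0.6}
print(is_ideal_smart_city(city))  # False: smart_living misses its threshold
```

Weighted or OR-combined sub-indicators could be layered underneath each comparator in the same style.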

Keywords: factors, indicators, logic gates, pillars, smart city

Procedia PDF Downloads 127
668 Exploring the Risks and Vulnerabilities of Child Trafficking in West Java, Indonesia

Authors: B. Rusyidi, D. Mariana

Abstract:

Although reforms of trafficking regulations have taken place since 2007, Indonesia is still struggling to fight child trafficking. This study aimed to identify and assess risk factors and vulnerabilities in the lives of trafficked children prior to, during, and after being trafficked, in order to inform the child protection system and its policies. The study was qualitative and used in-depth interviews to collect data. Data were gathered in 2014 and 2015 from 15 trafficked and sexually exploited girls aged 14 to 17 years originating from West Java; social workers, safe home personnel and parents were also included as informants. Data analysis was guided by the ecological perspective and thematic analysis. The study found that the risks and vulnerabilities of the victims were associated with conditions at various levels of the environment. At the micro level, risk factors and vulnerabilities included young age, family conflict/violence, involvement with the “wrong” circle of friends/peers, family poverty, lack of social and economic support for the victim’s family, and psychological damage caused by the trafficking experience. At the mezzo level, the lack of structured activities after school, economic inequality, stigma towards victims, lack of services for victims, and minimal public education on human trafficking were among the community hazards that increased vulnerability and risk. Gender inequality, consumerism, the view of children as assets, corruption, weak law enforcement, lack of institutional support, and community-wide ignorance regarding trafficking were found to increase risks and vulnerabilities at the macro level. The findings underline the necessity of reducing risk factors and promoting protective factors at the individual, family, community and societal levels.
Shifting the current focus from tertiary to primary/preventive policies and improving institutional efforts are pressing needs in the context of reducing child trafficking in Indonesia. The roles of human service providers, including social work, should also be promoted.

Keywords: child trafficking, child sexual exploitation, ecological perspective, risks and vulnerabilities

Procedia PDF Downloads 256
667 A Hybrid Derivative-Free Optimization Method for Pass Schedule Calculation in Cold Rolling Mill

Authors: Mohammadhadi Mirmohammadi, Reza Safian, Hossein Haddad

Abstract:

This paper presents an innovative solution to a complex multi-objective optimization problem, as part of efforts toward maximizing rolling mill throughput and minimizing processing costs in tandem cold rolling. This computational-intelligence-based optimization has been applied to the pass schedules of a tandem cold rolling mill. The method combines two derivative-free optimization procedures in the form of nested loops. The first optimization loop is based on the Improving Hit and Run method, which focuses on balancing the power, force and reduction distribution in the rolling schedule. The second loop is a real-coded genetic algorithm that optimizes energy consumption and productivity. Experimental results from application to a five-stand tandem cold rolling mill are presented.
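
The inner Improving Hit and Run loop can be sketched on a toy objective as follows; the feasible box, step sampling, and objective are placeholders, not the rolling-mill model, and the outer genetic-algorithm loop is omitted.

```python
import math
import random

random.seed(1)

# Toy stand-in for the schedule-balance cost over two normalized variables.
def objective(x):
    return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2

lo, hi = 0.0, 1.0
x = [random.uniform(lo, hi), random.uniform(lo, hi)]
best = objective(x)

for _ in range(2000):
    # Hit: pick a random direction and a step length, then clip the
    # candidate to the feasible box (a simplification of the exact
    # feasible-segment computation in Improving Hit and Run).
    theta = random.uniform(0, 2 * math.pi)
    d = (math.cos(theta), math.sin(theta))
    t = random.uniform(-0.2, 0.2)
    cand = [min(hi, max(lo, x[i] + t * d[i])) for i in range(2)]
    # Improving: accept the candidate only if it lowers the objective.
    if objective(cand) < best:
        x, best = cand, objective(cand)

print(x, best)
```

In the paper's scheme, each candidate schedule produced by this inner loop would be scored by the outer real-coded genetic algorithm on energy and productivity.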

Keywords: derivative-free optimization, Improving Hit and Run method, real-coded genetic algorithm, rolling schedules of tandem cold rolling mill

Procedia PDF Downloads 675
666 Enhancing Academic Achievement of University Student through Stress Management Training: A Study from Southern Punjab, Pakistan

Authors: Rizwana Amin, Afshan Afroze Bhatti

Abstract:

The study was a quasi-experimental pre-post test design with two groups. Data were collected from 127 students of Bahaudin Zakariya University, Multan, through non-probability sampling. Both groups were given a pre-test using the Perceived Stress Scale, and information about academic achievement was obtained by self-report. After screening, 27 participants did not meet the criterion; the remaining 100 participants were divided into two groups (experimental and control). Four students of the experimental group declined the intervention, so the remaining 46 students were divided into three subgroups (16, 15 and 15) for training. Each experimental subgroup attended one separate 3-hour stress management training session, while the control group received only the pre-post assessment. The data were analyzed using ANCOVA (analysis of covariance) and t-tests. Results of the study indicate that stress management training leads to increased emotional intelligence and academic achievement in students.

Keywords: stress, stress management, academic achievement, students

Procedia PDF Downloads 320
665 Short-Term and Working Memory Differences Across Age and Gender in Children

Authors: Farzaneh Badinloo, Niloufar Jalali-Moghadam, Reza Kormi-Nouri

Abstract:

The aim of this study was to explore short-term and working memory performance across age and gender in school-aged children. Most studies have looked at memory changes in adult subjects; this study instead explored both short-term and working memory in children over time. In total, 410 schoolchildren in four age groups (approximately 8, 10, 12 and 14 years old), among them 201 girls and 208 boys, took part in the study. The Digits Forward and Digits Backward tests of the Wechsler Intelligence Scale for Children-Revised were used as short-term and working memory measures, respectively. According to the results, both short-term and working memory scores generally increased with age (p < .05): whereas short-term memory performance increased up to 12 years of age, working memory scores showed no significant increase after 10 years of age. No difference was observed in terms of gender (p > .05). In conclusion, this study suggests that both short-term and working memory improve with age in children, with 12 and 10 years of age likely being the crucial periods for short-term and working memory development, respectively.

Keywords: age, gender, short-term memory, working memory

Procedia PDF Downloads 457
664 Hybrid Strategies of Crisis Intervention for Sexualized Violence Using Digital Media

Authors: Katharina Kargel, Frederic Vobbe

Abstract:

Sexualized violence against children and adolescents using digital media poses particular challenges for practitioners focused on crisis intervention (social work, psychotherapy, law enforcement). The technical delimitation of violence increases the burden on those affected and the complexity of interdisciplinary cooperation. Urgently needed recommendations for practical action do not yet exist in Germany. Funded by the Federal Ministry of Education and Research, such recommendations for action are being developed in the HUMAN project together with science and practice. The presentation introduces the participatory approach of the HUMAN project. We discuss the application-oriented, casuistic approach of the project and present its results using the example of concrete case-based recommendations for action. The participants will be presented with concrete prototypical case studies from the project, which are used to illustrate quality criteria for crisis intervention in cases of sexualized violence using digital media. On the basis of case analyses, focus group interviews and interviews with victims of violence, we present the six central challenges of sexualized violence using digital media, namely:
• diffusion (ambiguities regarding the extent and significance of the violence),
• transcendence (independence of the dynamics of violence from space and time; omnipresence),
• omnipresent anxiety (in view of diffusion and transcendence),
• being haunted (repeated confrontation with digital memories of the violence or the perpetrator),
• disparity (conflicts over interpretative power between those affected and their social environment),
• simultaneity (of all the other factors).
We point out generalizable principles for dealing with these challenges professionally. Dealing professionally with sexualized violence using digital media requires stronger networking of professional actors.
A clear distinction must be made between one's own mission and the mission of the network partners. Those affected by violence must be shown options for crisis intervention within the help networks. The different competencies and professional missions of the support services must be made transparent. The need for technical means of deleting abuse images beyond criminal prosecution is discussed. Those affected are stabilized by multimodal strategies, such as a combination of rational emotive therapy, legal support and technical assistance.

Keywords: sexualized violence, intervention, digital media, children and youth

Procedia PDF Downloads 212
663 Risk Tolerance and Individual Worthiness Based on Simultaneous Analysis of the Cognitive Performance and Emotional Response to a Multivariate Situational Risk Assessment

Authors: Frederic Jumelle, Kelvin So, Didan Deng

Abstract:

A method and system for neuropsychological performance testing is presented, comprising a mobile terminal used to interact with a cloud server that stores user information and is logged into by the user through the terminal device. The user information, which comprises facial emotion data, performance test answers and user chronometrics, is accessed directly through the terminal device and processed by an artificial neural network. The assessment evaluates the cognitive performance and emotional response of the subject to a series of dichotomous questions describing various situations of daily life and challenging the user's knowledge, values, ethics, and principles. In industrial applications, the timing of this assessment will depend on the user's need to obtain a service from a provider, such as opening a bank account, getting a mortgage or an insurance policy, authenticating clearance at work, or securing online payments.

Keywords: artificial intelligence, neurofinance, neuropsychology, risk management

Procedia PDF Downloads 117
662 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering

Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel

Abstract:

Classification is an important data mining technique that can be used for data filtering in artificial intelligence. The broad applicability of classification to all kinds of data means it is used in nearly every field of modern life; classification helps us group different items according to the features deemed interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, for detecting spam e-mail. This choice is motivated by the fact that the Naïve Bayes algorithm is based on probability calculus, while the ADTree algorithm is based on decision trees. The parameter settings of both classifiers maximize the true positive rate and minimize the false positive rate. The experimental results present classification accuracy and cost analysis with a view to choosing the optimal classifier for spam detection. We also point out the number of attributes needed to obtain a trade-off between the number of attributes and classification accuracy.
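
A minimal multinomial Naïve Bayes spam filter with Laplace smoothing, of the probabilistic kind compared in the paper, can be sketched as follows; the training messages are toy examples, not the paper's corpus.

```python
import math
from collections import Counter

# Tiny labeled corpus (illustrative only).
train = [
    ("win cash prize now", "spam"),
    ("cheap pills win money", "spam"),
    ("meeting schedule for tomorrow", "ham"),
    ("project report attached", "ham"),
]

word_counts = {"spam": Counter(), "ham": Counter()}
label_counts = Counter()
for text, label in train:
    label_counts[label] += 1
    word_counts[label].update(text.split())

vocab = set(w for c in word_counts.values() for w in c)

def log_posterior(text, label):
    # log P(label) + sum of log P(word | label), Laplace-smoothed.
    lp = math.log(label_counts[label] / sum(label_counts.values()))
    total = sum(word_counts[label].values())
    for w in text.split():
        lp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
    return lp

def classify(text):
    return max(("spam", "ham"), key=lambda lab: log_posterior(text, lab))

print(classify("win money now"))  # prints: spam
```

An ADTree, by contrast, would learn a sequence of weighted decision stumps over the same attributes; comparing the two on a held-out set is the paper's experiment.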

Keywords: classification, data mining, spam filtering, naive bayes, decision tree

Procedia PDF Downloads 393
661 A Review on Water Models of Surface Water Environment

Authors: Shahbaz G. Hassan

Abstract:

Water quality models are very important for predicting changes in surface water quality for environmental management. The aim of this paper is to give an overview of water quality models and to provide directions for selecting a model in a specific situation. Water quality models fall into two kinds: mechanistic models, and empirical models that simulate water quality without considering the underlying mechanisms. Mechanistic models can be widely applied and are capable of long-term simulation, but they are highly complex; more space is therefore devoted to explaining the principles of and application experience with mechanistic models. Mechanistic models make certain assumptions about rivers, lakes and estuaries, which limits their range of application, and this paper introduces their principles and applications for these three scenarios. Empirical models, on the other hand, are easier to compute and are not limited by geographical conditions, but they cannot be used with confidence to simulate long-term changes. This paper divides the empirical models into two broad categories according to their mathematical algorithms: models based on artificial intelligence and models based on statistical methods.

Keywords: empirical models, mathematical, statistical, water quality

Procedia PDF Downloads 243
660 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

Evolutionary processes are not linear: long periods of quiet, slow development give way to rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 that previously existed. Contrary to common credence, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector, which is the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, propose a satisfactory explanation for these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: a) the biosphere is a single living organism, all parts of which are interconnected, and b) the biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but the accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution, as natural life origin and development, is a reality; evolution is a coordinated and controlled process; one of evolution's main development vectors is the growing computational complexity of living organisms and of the biosphere's intelligence; the intelligent matter that conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth; and the information acts like software stored in and controlled by the biosphere.
Random mutations trigger this software, as stipulated by Darwinian evolution theories, and it is further stimulated by the growing demand for the biosphere's global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms to store and handle it; more intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerating evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches the critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life's creation and evolution. It logically resolves many puzzling problems of current evolutionary theory, such as speciation (as a result of purposeful GM design), the evolutionary development vector (as a need for growing global intelligence), punctuated equilibrium (occurring when the two conditions a) and b) above are met), the Cambrian explosion, and mass extinctions (occurring when more intelligent species replace outdated creatures).

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 144
659 Balance of Natural Resources to Manage Land Use Changes in Subosukawonosraten Area

Authors: Sri E. Wati, D. Roswidyatmoko, N. Maslahatun, Gunawan, Andhika B. Taji

Abstract:

Natural resources are the main sources for fulfilling human needs. Their utilization must consider not only human prosperity but also sustainability. A balance of natural resources is a tool for managing natural wealth and controlling land use change. This tool is needed to organize land use planning as stated in the spatial plan of a region. The balance of natural resources is calculated by comparing two series of natural resource data obtained in different years; in this case, land and forest data spanning a four-year period (2010 and 2014) were used. Land use data were acquired through satellite image interpretation and field checking. By means of GIS analysis, the result was then assessed against the land use plan to evaluate whether existing land use conforms to it and, where it does not, what efforts and policies are needed to overcome the situation. Subosukawonosraten is a rapidly developing area in Central Java Province. It consists of seven regencies/cities: Sukoharjo Regency, Boyolali Regency, Surakarta City, Karanganyar Regency, Wonogiri Regency, Sragen Regency, and Klaten Regency. The region comprises several former areas under the Surakarta Residency (Karesidenan Surakarta), located adjacent to Surakarta. The balance of forest resources shows that the extent of forest area has not changed significantly, although some land uses within it have changed slightly: some rice fields have been converted into settlements (0.03%), whereas water bodies have become vacant areas (0.09%). The balance of land resources, on the other hand, shows many land use changes in the region. Rice field area decreased by 428 hectares; more than 50% of this was transformed into settlement areas and 11.21% was converted into buildings such as factories, hotels, and other infrastructure, mostly in Sragen, Sukoharjo, and Karanganyar Regencies.
The results illustrate that land use change in this region is driven mostly by population growth. Agricultural land has been converted into built-up areas as demand for settlements, industrial areas, and other infrastructure increases. Unfortunately, more than half of the total area is currently used in ways inconsistent with the land use plan declared in the spatial planning document. This means local governments should develop strict regulations and enforce the law against violations in land use management.
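The comparison of two survey years can be sketched as a change matrix over land use classes; the parcel lists below are illustrative toy data, not the study's GIS dataset:

```python
# Sketch of the "balance" idea: cross-tabulate land use classes observed
# for the same parcels in two survey years, then count conversions.
from collections import Counter

use_2010 = ["rice", "rice", "forest", "water", "rice", "forest"]
use_2014 = ["rice", "settlement", "forest", "vacant", "building", "forest"]

# change[(a, b)] = number of parcels that went from class a to class b
change = Counter(zip(use_2010, use_2014))
converted = sum(n for (a, b), n in change.items() if a != b)
share_converted = converted / len(use_2010)
```

In a real workflow the two class lists would come from classified satellite imagery resampled to a common grid, and the change matrix would then be checked cell by cell against the spatial plan.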

Keywords: balance, forest, land, spatial plan

Procedia PDF Downloads 304
658 Software-Defined Networks in Utility Power Networks

Authors: Ava Salmanpour, Hanieh Saeedi, Payam Rouhi, Elahe Hamzeil, Shima Alimohammadi, Siamak Hossein Khalaj, Mohammad Asadian

Abstract:

A software-defined network (SDN) is a network architecture designed to control the network centrally through software applications. This enables remote control of the whole network regardless of the underlying network technology. In this architecture, network intelligence is separated from the physical infrastructure, meaning that required network components can be implemented virtually using software applications. Today, power networks are characterized by a high degree of complexity, with a large number of intelligent devices processing huge amounts of data and important information. Reliable and secure communication networks are therefore required, and SDNs are the best choice to meet this need. In this paper, the capabilities and characteristics of SDNs are reviewed and several basic controllers are compared. The importance of using SDNs to improve efficiency and reliability in utility power networks is discussed, and SDN-based power networks are compared with traditional networks.
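The separation of control from forwarding described above can be sketched in a few lines; the class and field names are illustrative and not part of any real controller API:

```python
# Toy sketch of the SDN idea: a central controller holds the intelligence
# (the flow table); switches only perform lookups and punt unknown traffic.
class Controller:
    def __init__(self):
        self.flow_table = {}          # match (destination) -> action (out port)

    def install_rule(self, dst, out_port):
        self.flow_table[dst] = out_port

class Switch:
    def __init__(self, controller):
        self.controller = controller

    def forward(self, dst):
        # Unknown destinations are escalated to the controller,
        # analogous to an OpenFlow "packet-in" event.
        return self.controller.flow_table.get(dst, "send_to_controller")

ctrl = Controller()
ctrl.install_rule("meter-17", out_port=3)
sw = Switch(ctrl)
```

The point of the sketch is only that policy lives in one place: changing `flow_table` reconfigures every switch without touching the devices themselves.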

Keywords: software-defined network, SDNs, utility network, open flow, communication, gas and electricity, controller

Procedia PDF Downloads 88
657 Health Percentage Evaluation for Satellite Electrical Power System Based on Linear Stresses Accumulation Damage Theory

Authors: Lin Wenli, Fu Linchun, Zhang Yi, Wu Ming

Abstract:

To meet the demands of long life and high intelligence in satellites, the electrical power system should be capable of evaluating its own health condition, and any over-stress event during operation should be recorded. Based on linear stress accumulation damage theory, cumulative damage analysis was performed on combined thermal, mechanical, and electrical stresses for three components: the solar array, the batteries, and the power conditioning unit. An overall health percentage evaluation model for the satellite electrical power system was then built. To obtain an accurate value for the system health percentage, an automatic closed-loop feedback correction method for all coefficients in the evaluation model is presented. The evaluation outputs can serve as a reference for early fault forecasting and intervention, whether by the ground control center or by the satellite itself.
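The linear damage accumulation idea can be sketched with the Palmgren-Miner rule, in which damage fractions from each stress level are summed; the cycle counts below are illustrative, not taken from the paper:

```python
def miner_damage(cycle_history):
    """Palmgren-Miner cumulative damage: D = sum(n_i / N_i).

    cycle_history: list of (cycles_applied, cycles_to_failure) pairs,
    one per stress level. Failure is predicted when D reaches 1,
    so 1 - D can be read as a simple health fraction.
    """
    return sum(n / N for n, N in cycle_history)

# Illustrative over-stress log for one component (e.g. a battery):
history = [(1_000, 10_000), (200, 2_000), (50, 1_000)]
damage = miner_damage(history)          # 0.1 + 0.1 + 0.05 = 0.25
health_percentage = (1 - damage) * 100  # 75.0
```

The paper's model goes further, combining such damage terms across thermal, mechanical, and electrical stresses and correcting the coefficients in a closed loop; this sketch shows only the summation at its core.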

Keywords: satellite electrical power system, health percentage, linear stresses accumulation damage, evaluation model

Procedia PDF Downloads 375
656 Walls against Legal Identity: A Qualitative Study on Children of Refugees without Birth Registration in Malaysia

Authors: Rodziana M. Razali, Tamara J. Duraisingham

Abstract:

Malaysia is not a signatory to the 1951 Refugee Convention and its 1967 Protocol, despite receiving the largest share of refugee inflows in Southeast Asia aside from Thailand. In Peninsular Malaysia, the majority of refugees and asylum seekers are from Myanmar, with Rohingya refugees recording the highest number of all ethnicities. In the eastern state of Sabah, the presence of refugees who have long established themselves in the state is connected to those who escaped military persecution in the southern Philippines in the 1970s and 1980s. A combination of legal and non-legal factors has created and sustained an adverse atmosphere of deprivation of legal identity for children of migrants, including refugees, born in Malaysia. This paper aims to qualitatively analyse the barriers to birth registration, the cornerstone of every person's legal identity, for children of refugees born in this country, together with the associated human rights implications. Data obtained through semi-structured interviews with refugees in Kota Kinabalu, Sabah and Rohingya refugees in Peninsular Malaysia are studied alongside secondary sources. Results show that births outside medical facilities, suspension of birth records, illiteracy, lack of awareness of the importance and procedures of birth registration, inability to meet documentary requirements, and fear of immigration enforcement are the key factors hindering birth registration. These challenges exist against the backdrop of a restrictive integration policy intended to avoid destabilising the demographic and racial balance, political sentiment stirring xenophobic prejudices, and other economic and national security considerations. With no proof of their legal identity, the affected children grow up in legal limbo, facing multiple human rights violations across generations.
This research concludes that the country's framework and practice concerning birth registration are in need of serious reform and improvement to reflect equality and universality of access to the birth registration system. Such reform would contribute significantly towards meeting Malaysia's commitments to the post-2015 sustainable development agenda, which pledges to 'Leave no one behind', as well as its recently announced National Human Rights Action Plan.

Keywords: birth registration, children, Malaysia, refugees

Procedia PDF Downloads 151
655 The Implementation of Inclusive Education in Collaboration between Teachers of Special Education Classes and Regular Classes in a Preschool

Authors: Chiou-Shiue Ko

Abstract:

As explicitly stipulated in Article 7 of the Enforcement Rules of the Special Education Act as amended in 1998, "in principle, children with disabilities should be integrated with normal children for preschool education". Since then, all cities and counties have been committed to promoting preschool inclusive education. The Education Department of the New Taipei City Government has been actively recruiting advisory groups of professors to assist in the implementation of inclusive education in preschools since 2001, and since 2011 the author of this study has guided Preschool Rainbow in implementing it. Through field observations, meetings, and teaching demonstration seminars, this study explored how inclusive education has been successfully implemented through collaboration between teachers of the special education classes and the regular classes at Preschool Rainbow. The implementation phases within a single academic year are as follows: 1) Preparatory stage. Prior to implementation, teachers of the special education and regular classes discuss ways of conducting inclusive education and organize reading clubs on curriculum modification, integrating the eight education strategies, early treatment and education, and early childhood education programs, to enhance their capacity to implement inclusive education and compose teaching plans. In addition to the general objectives of inclusive education, objectives for special children are also embedded into the Individualized Education Program (IEP). 2) Implementation stage. Initially, a special education awareness program is conducted so that all the children in the preschool come to understand their own special qualities and those of special children.
After three weeks of reverse inclusion, the children in the special education classes are put into groups and enter the regular classes twice a week, with adjustments made to their inclusion in the learning areas and the curriculum. In 2013, further cooperation was carried out with adjacent hospitals to conduct developmental screening activities for the early detection of children with developmental delays. 3) Review and reflection stage. After the implementation of inclusive education, all teachers in the preschool are divided into two groups to record their teaching plans and the lessons learned during implementation, and the effectiveness of meeting the objectives of inclusive education is reviewed. With the collaboration of all teachers, Preschool Rainbow won New Taipei City's "Preschool Light" award in 2015 as an exceptional model of inclusive education. Its model of implementing inclusive education can serve as a reference for other preschools.

Keywords: collaboration, inclusive education, preschool, teachers, special education classes, regular classes

Procedia PDF Downloads 402
654 Inferential Reasoning for Heterogeneous Multi-Agent Mission

Authors: Sagir M. Yusuf, Chris Baber

Abstract:

We describe issues bedeviling the coordination of heterogeneous multi-agent missions (agents carrying different sensors), such as belief conflict and situation reasoning. We apply Bayesian inference and reasoning over agents' presumptions to address heterogeneous multi-agent belief variation and situation-based reasoning. A Bayesian Belief Network (BBN) was used to model the agents' belief conflicts arising from sensor variations. Simulation experiments were designed, and cases from agents' missions were used to train the BBN with gradient descent and expectation-maximization algorithms. The output is a well-trained BBN for making inferences for both agents and human experts. We show that the prediction capacity of the Bayesian learning algorithm improves with the amount of training data, and argue that the approach enhances multi-agent robustness and resolves agents' sensor conflicts.
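The belief-fusion idea underlying the BBN can be illustrated with a minimal Bayes-rule calculation over a binary world state; the sensor reliabilities below are illustrative assumptions, not the authors' trained network:

```python
# Minimal sketch: fusing two agents' conflicting sensor reports about a
# binary world state via Bayes' rule, assuming conditionally independent
# sensors. Reliability values are invented for illustration.

def fuse_beliefs(prior, reports, reliabilities):
    """Posterior P(state = 1 | reports).

    reports[i] is 1 if agent i reports "target present", else 0.
    reliabilities[i] = P(agent i's report matches the true state).
    """
    p1 = prior          # unnormalised mass for state = 1
    p0 = 1.0 - prior    # unnormalised mass for state = 0
    for r, rel in zip(reports, reliabilities):
        p1 *= rel if r == 1 else (1.0 - rel)
        p0 *= (1.0 - rel) if r == 1 else rel
    return p1 / (p1 + p0)

# Agents disagree (1 vs 0); the more reliable sensor dominates the posterior.
posterior = fuse_beliefs(prior=0.5, reports=[1, 0], reliabilities=[0.9, 0.6])
```

A full BBN adds structure (dependencies between sensors, mission context nodes) and learns the conditional tables from mission data, but the conflict-resolution mechanism is this same weighting of reports by reliability.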

Keywords: distributed constraint optimization problem, multi-agent system, multi-robot coordination, autonomous system, swarm intelligence

Procedia PDF Downloads 122
653 Heterogeneous Intelligence Traders and Market Efficiency: New Evidence from Computational Approach in Artificial Stock Markets

Authors: Yosra Mefteh Rekik

Abstract:

A computational agent-based model of financial markets stresses the interactions and dynamics among a very diverse set of traders. The growing body of research in this area relies heavily on computational tools that bypass the restrictions of analytical methods. The main goal of this research is to understand how the stock market operates and behaves, and to study traders' behavior within the context of artificial stock markets populated by heterogeneous agents. All agents are characterized by adaptive learning behavior represented by artificial neural networks. Using agent-based simulations on an artificial market, we show that the existence of heterogeneous agents can explain price dynamics in the financial market. We investigate the relation between market diversity and market efficiency, and our empirical findings demonstrate that greater market heterogeneity plays a key role in market efficiency.
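A common minimal sketch of heterogeneous-agent price dynamics mixes fundamentalist and chartist demand; this toy model is a stand-in for, not a reproduction of, the paper's ANN-based agents:

```python
# Toy heterogeneous-agent price model: fundamentalists pull the price
# toward a fundamental value, chartists extrapolate the last price change.
# All coefficients are illustrative.

def simulate(prices, fundamental, n_steps, w_fund=0.5, k_fund=0.2, k_chart=0.8):
    """Evolve the price path under a weighted mix of the two trader types."""
    p = list(prices)                      # needs at least two initial prices
    for _ in range(n_steps):
        fund_demand = k_fund * (fundamental - p[-1])
        chart_demand = k_chart * (p[-1] - p[-2])
        p.append(p[-1] + w_fund * fund_demand + (1 - w_fund) * chart_demand)
    return p

# With only fundamentalists (w_fund = 1) the price converges to the
# fundamental value; adding chartist weight introduces momentum swings.
path = simulate([90, 95], fundamental=100, n_steps=50, w_fund=1.0)
```

Even this two-rule market shows how the mix of trader types, rather than any single rule, shapes the price path, which is the intuition behind relating heterogeneity to efficiency.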

Keywords: agent-based modeling, artificial stock market, heterogeneous expectations, financial stylized facts, computational finance

Procedia PDF Downloads 412
652 A Method for Reduction of Association Rules in Data Mining

Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa

Abstract:

The use of association rule algorithms in data mining is recognized as being of great value for knowledge discovery in databases. The number of rules generated is often high, even in databases of small volume, and this quantity can hamper the analysis of results. The purpose of this research is to present a method for reducing the quantity of rules generated by association algorithms. A computational algorithm was therefore developed using the Weka Application Programming Interface (API), which allows the method to be executed on different types of databases. After development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, with even the worst case showing a gain of more than 50%, using support, confidence, and lift as measures. This study concludes that the proposed model is feasible and quite interesting, contributing to the analysis of association rules generated by these algorithms.
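The three pruning measures can be sketched directly; the toy transactions below stand in for the paper's Weka-based implementation:

```python
# Sketch of the standard association-rule measures used for pruning:
# support, confidence, and lift, computed over a toy transaction set.

transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def rule_metrics(antecedent, consequent):
    """Metrics for the rule antecedent -> consequent."""
    sup = support(antecedent | consequent)
    conf = sup / support(antecedent)
    lift = conf / support(consequent)   # lift <= 1 means no positive association
    return sup, conf, lift

sup, conf, lift = rule_metrics({"bread"}, {"milk"})
```

A reduction method along the paper's lines would generate candidate rules and discard those falling below thresholds on these measures (for instance, rules with lift at or below 1, whose antecedent adds no information about the consequent).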

Keywords: data mining, association rules, rules reduction, artificial intelligence

Procedia PDF Downloads 142
651 Phytopathology Prediction in Dry Soil Using Artificial Neural Networks Modeling

Authors: F. Allag, S. Bouharati, M. Belmahdi, R. Zegadi

Abstract:

The rapid expansion of deserts in recent decades, a result of human actions combined with climatic changes, has highlighted the necessity of understanding biological processes in arid environments. Whereas physical processes and the biology of flora and fauna have been relatively well studied in marginally used arid areas, knowledge of desert soil micro-organisms remains fragmentary. The objective of this study is to conduct a diversity analysis of bacterial communities in unvegetated arid soils. Several biological phenomena in hot deserts are related to microbial populations, and micro-organisms could potentially be used to restore hot desert environments. Dry land ecosystems have a highly heterogeneous distribution of resources, with greater nutrient concentrations and microbial densities occurring in vegetated than in bare soils. In this work, artificial intelligence techniques, especially artificial neural networks (ANNs), proved useful for treating these data. The ANN model demonstrates its capability for addressing complex problems involving uncertain data.

Keywords: desert soil, climatic changes, bacteria, vegetation, artificial neural networks

Procedia PDF Downloads 377
650 Impacts of Present and Future Climate Variability on Forest Ecosystem in Mediterranean Region

Authors: Orkan Ozcan, Nebiye Musaoglu, Murat Turkes

Abstract:

Climate change is largely recognized as one of the real, pressing and significant global problems. The concept of ‘climate change vulnerability’ helps us to better comprehend the cause/effect relationships behind climate change and its impact on human societies, socioeconomic sectors, physiographical and ecological systems. In this study, multifactorial spatial modeling was applied to evaluate the vulnerability of a Mediterranean forest ecosystem to climate change. As a result, the geographical distribution of the final Environmental Vulnerability Areas (EVAs) of the forest ecosystem is based on the estimated final Environmental Vulnerability Index (EVI) values. This revealed that at current levels of environmental degradation, physical, geographical, policy enforcement and socioeconomic conditions, the area with a ‘very low’ vulnerability degree covered mainly the town, its surrounding settlements and the agricultural lands found mainly over the low and flat travertine plateau and the plains at the east and southeast of the district. The spatial magnitude of the EVAs over the forest ecosystem under the current environmental degradation was also determined. This revealed that the EVAs classed as ‘very low’ account for 21% of the total area of the forest ecosystem, those classed as ‘low’ account for 36%, those classed as ‘medium’ account for 20%, and those classed as ‘high’ account for 24%. Based on regionally averaged future climate assessments and projected future climate indicators, both the study site and the western Mediterranean sub-region of Turkey will probably become associated with a drier, hotter, more continental and more water-deficient climate. This analysis holds true for all future scenarios, with the exception of RCP4.5 for the period from 2015 to 2030. 
However, the present dry-subhumid climate dominating this sub-region and the study area shows a potential to shift towards a drier, semiarid climate in the period between 2031 and 2050 according to the RCP8.5 high emission scenario. All the observed and estimated results and assessments summarized in the study show clearly that the densest forest ecosystem in the southern part of the study site, characterized mainly by Mediterranean coniferous and some mixed forest and maquis vegetation, will very likely be subject to medium and high degrees of vulnerability to future environmental degradation, climate change and variability.

Keywords: forest ecosystem, Mediterranean climate, RCP scenarios, vulnerability analysis

Procedia PDF Downloads 336
649 Transformer Design Optimization Using Artificial Intelligence Techniques

Authors: Zakir Husain

Abstract:

The main objective of a power transformer design optimization problem is to minimize the total overall cost and/or the mass of the winding and core material while satisfying all constraints imposed by the standards and by the transformer user's requirements. The constraints include appropriate limits on winding fill factor, temperature rise, efficiency, no-load current, and voltage regulation. The design optimization task is thus a constrained minimization of cost and/or mass, achieved by optimally setting the parameters, geometry, and required magnetic properties of the transformer. In this paper, the above design problems are formulated and solved using a genetic algorithm (GA) and simulated annealing (SA) on the MATLAB platform. The importance of the presented approach stems from two main features. First, the proposed technique provides a reliable and efficient solution to design optimization problems with several variables. Second, the obtained solution is guaranteed to be a global optimum. The paper includes a demonstration of the application of the genetic programming (GP) technique to transformer design.
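As an illustration of the simulated annealing component, here is a minimal generic SA minimiser applied to a toy one-variable cost curve; the paper's implementation is in MATLAB and its cost function is the full constrained transformer model, so everything below is a simplified stand-in:

```python
import math
import random

def simulated_annealing(cost, x0, step=0.1, t0=1.0, cooling=0.995,
                        iters=5000, seed=0):
    """Generic SA minimiser: always accept improvements, and accept
    worsening moves with Boltzmann probability exp(-delta / T)."""
    rng = random.Random(seed)
    x, fx, t = x0, cost(x0), t0
    best_x, best_f = x, fx
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = cost(cand)
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling    # geometric cooling schedule
    return best_x, best_f

# Toy stand-in for a transformer cost curve: quadratic with minimum at 2.0.
x, f = simulated_annealing(lambda v: (v - 2.0) ** 2 + 1.0, x0=0.0)
```

In a real transformer design problem the candidate would be a vector (core dimensions, turns, current densities) and the cost would add penalty terms for violated constraints such as temperature rise or no-load current.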

Keywords: optimization, power transformer, genetic algorithm (GA), simulated annealing technique (SA)

Procedia PDF Downloads 560
648 Unmasking Virtual Empathy: A Philosophical Examination of AI-Mediated Emotional Practices in Healthcare

Authors: Eliana Bergamin

Abstract:

This philosophical inquiry, influenced by the seminal works of Annemarie Mol and Jeannette Pols, critically examines the transformative impact of artificial intelligence (AI) on emotional caregiving practices within virtual healthcare. Rooted in the traditions of philosophy of care, philosophy of emotions, and applied philosophy, this study seeks to unravel nuanced shifts in the moral and emotional fabric of healthcare mediated by AI-powered technologies. Departing from traditional empirical studies, the approach embraces the foundational principles of care ethics and phenomenology, offering a focused exploration of the ethical and existential dimensions of AI-mediated emotional caregiving. At its core, this research addresses the introduction of AI-powered technologies mediating emotional and care practices in the healthcare sector. By drawing on Mol and Pols' insights, the study offers a focused exploration of the ethical and existential dimensions of AI-mediated emotional caregiving. Anchored in ethnographic research within a pioneering private healthcare company in the Netherlands, this critical philosophical inquiry provides a unique lens into the dynamics of AI-mediated emotional practices. The study employs in-depth, semi-structured interviews with virtual caregivers and care receivers alongside ongoing ethnographic observations spanning approximately two and a half months. Delving into the lived experiences of those at the forefront of this technological evolution, the research aims to unravel subtle shifts in the emotional and moral landscape of healthcare, critically examining the implications of AI in reshaping the philosophy of care and human connection in virtual healthcare. Inspired by Mol and Pols' relational approach, the study prioritizes the lived experiences of individuals within the virtual healthcare landscape, offering a deeper understanding of the intertwining of technology, emotions, and the philosophy of care. 
In the realm of philosophy of care, the research elucidates how virtual tools, particularly those driven by AI, mediate emotions such as empathy, sympathy, and compassion—the bedrock of caregiving. Focusing on emotional nuances, the study contributes to the broader discourse on the ethics of care in the context of technological mediation. In the philosophy of emotions, the investigation examines how the introduction of AI alters the phenomenology of emotional experiences in caregiving. Exploring the interplay between human emotions and machine-mediated interactions, the nuanced analysis discerns implications for both caregivers and caretakers, contributing to the evolving understanding of emotional practices in a technologically mediated healthcare environment. Within applied philosophy, the study transcends empirical observations, positioning itself as a reflective exploration of the moral implications of AI in healthcare. The findings are intended to inform ethical considerations and policy formulations, bridging the gap between technological advancements and the enduring values of caregiving. In conclusion, this focused philosophical inquiry aims to provide a foundational understanding of the evolving landscape of virtual healthcare, drawing on the works of Mol and Pols to illuminate the essence of human connection, care, and empathy amid technological advancements.

Keywords: applied philosophy, artificial intelligence, healthcare, philosophy of care, philosophy of emotions

Procedia PDF Downloads 42
647 Applying Theory of Self-Efficacy in Intelligent Transportation Systems by Potential Usage of Vehicle as a Sensor

Authors: Aby Nesan Raj, Sumil K. Raj, Sumesh Jayan

Abstract:

The objective of the study is to formulate a self-regulation model that enhances the usage of Intelligent Transportation Systems by applying the theory of self-efficacy. The core logic of the self-regulation model monitors driver behavior based on situations related to the four sources of self-efficacy (enactive mastery, vicarious experience, verbal persuasion, and physiological arousal) in addition to vehicle data. For this study, four types of vehicle data are considered: speed, drowsiness, diagnostic data, and surround camera views. These data are given to the self-regulation model for evaluation. The "oddness" score, which is the output of the self-regulation model, feeds into the Intelligent Transportation System, where appropriate actions are taken. These actions include warnings to the driver as well as inputs to related transportation systems. It is also observed that using the vehicle as a sensor reduces resource waste and duplication. Altogether, this approach enhances the intelligence of transportation systems, especially in safety, productivity, and environmental performance.
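One hypothetical way such an "oddness" score could combine vehicle signals is a weighted sum of normalised deviations; the weights, thresholds, and signal choices below are invented for illustration and are not the authors' model:

```python
# Hypothetical "oddness" score from vehicle-as-sensor signals.
# Weights and the alert threshold are illustrative assumptions.

def oddness(speed_kmh, speed_limit, drowsiness, fault_codes):
    """Combine normalised deviations into a 0..1 oddness score.

    drowsiness is assumed already normalised to 0..1;
    fault_codes is a count of active diagnostic trouble codes.
    """
    over_speed = max(0.0, (speed_kmh - speed_limit) / speed_limit)
    faults = min(1.0, fault_codes / 5)
    score = 0.4 * min(1.0, over_speed) + 0.4 * drowsiness + 0.2 * faults
    return min(1.0, score)

# A score above some threshold would trigger a warning to the driver
# and an input to the surrounding transportation systems.
alert = oddness(speed_kmh=110, speed_limit=100, drowsiness=0.5, fault_codes=1) > 0.2
```

The actual model described in the abstract additionally conditions on the four self-efficacy sources; this sketch covers only the vehicle-data side of the input.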

Keywords: emergency management, intelligent transportation system, self-efficacy, traffic management

Procedia PDF Downloads 225
646 Informal Economy: Case Study of Street Vendors in Bangkok

Authors: Kangrij Roeksiripat

Abstract:

Street vending is one of the informal economy activities considered significant to Thai people in economic and day-to-day social life. Street vendors have long been believed to be poor and uneducated. With increasing numbers of street vendors occupying space on public sidewalks, especially in central business districts, it has become unclear whether street vending continues to serve as a solution to unemployment for excess labor. This research attempts to study and analyze the types of street vendors in Bangkok under the informal economy framework. The debate on the heterogeneous informal economy has been categorized into four schools: dualism, structuralism, legalism, and voluntarism. The examination also applies the market concept through Porter's Five Forces of Competitive Position Model and draws on interviews with street vendors in three case study areas: the inner zone (Pathumwan district, the sidewalk opposite Siam Paragon mall), the middle zone (Ramkhamhaeng district, the sidewalk opposite Ramkhamhaeng University), and the outer zone (Minburi district, the sidewalk of Sriburanukit Road). The results indicate that most street vendors in Siam Square voluntarily choose to make a living by vending on the sidewalk and tend to treat it as a long-term occupation, even though they could be in formal wage employment. Moreover, average income and a positive attitude towards self-employment are the important factors that drive them to operate street vending businesses. Meanwhile, street vending is often a family enterprise in the Ramkhamhaeng area, and most vendors there do not wish to transform their businesses into the formal sector. The survey conducted on Sriburanukit Road reveals that almost all street vendors there migrated from other provinces and were previously paid as unskilled workers in formal sectors.
They moved into informal trade because of the uncertainty of employment and inconsistent income in the mainstream sectors, supported by the knowledge of friends and relatives from the same hometown. In particular, the results reveal a common pattern: street vending is the very first occupation of some groups of vendors, and they will continue to engage in this activity. It is therefore important for the government to design an optimal policy that not only integrates informal workers into the formal economy but also monitors the enforcement of regulations on the modern informal economy.

Keywords: informal economy, sidewalks, street vendors, occupation

Procedia PDF Downloads 262