Search results for: intelligent programming tutors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1751

1241 Intelligent Indoor Localization Using WLAN Fingerprinting

Authors: Gideon C. Joseph

Abstract:

The ability to localize mobile devices is important, as some applications require location information to operate or to deliver better services to users. Although there are several ways of acquiring location data for mobile devices, this work considers the WLAN fingerprinting approach. This approach uses the Received Signal Strength Indicator (RSSI), measured as a function of the position of the mobile device. RSSI is a quantitative measure of the radio frequency power carried by a signal. It can be used to assess RF link quality and is particularly useful in dense traffic scenarios where interference is a major concern, such as indoor environments. This research aims to design a system that can predict the location of a mobile device when supplied with the device's RSSIs. The developed system takes as input the RSSIs relating to the mobile device and outputs parameters that describe the location of the device, such as longitude, latitude, floor, and building. The relationship between the Received Signal Strengths (RSSs) of mobile devices and their corresponding locations is modelled so that subsequent locations of mobile devices can be predicted with the developed model. Describing explicit mathematical relationships between the RSSI measurements and the localization parameters is one way to model the problem, but the complexity of such an approach makes it unattractive. In contrast, we propose an intelligent system that learns the mapping from RSSI measurements to the localization parameters to be predicted. The system is capable of improving its performance as more experiential knowledge is acquired. The most appealing aspect of using such a system for this task is that complicated mathematical analysis and theoretical frameworks are not needed; the intelligent system itself learns the underlying relationship between the supplied data (RSSI levels) and the localization parameters. These localization parameters fall into two different tasks: the longitude and latitude of mobile devices are real values (a regression problem), while the floor and building are integer or categorical values (a classification problem). This work presents artificial neural network based intelligent systems that model the relationship between the RSSI predictors and the mobile device localization parameters. The designed systems were trained and validated on the collected WLAN fingerprint database. The trained networks were then tested on a separate database to assess their performance in terms of Mean Absolute Error (MAE) for the regression task and error rates for the classification tasks.
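The abstract does not include code, but the split into a regression network (longitude, latitude) and a classification network (floor, building) can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration in Python using scikit-learn rather than the authors' actual setup; the array names, data, and network sizes are placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor, MLPClassifier
from sklearn.preprocessing import StandardScaler

# Hypothetical fingerprint data: one row per scan, one column per access-point RSSI.
rng = np.random.default_rng(0)
X = rng.uniform(-100, 0, size=(1000, 20))    # RSSI levels in dBm (placeholder)
lonlat = rng.uniform(0, 1, size=(1000, 2))   # longitude/latitude targets (placeholder)
floor = rng.integers(0, 4, size=1000)        # floor labels (placeholder)

X_scaled = StandardScaler().fit_transform(X)

# Regression network for the real-valued targets (longitude, latitude).
reg = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500).fit(X_scaled, lonlat)

# Classification network for the categorical target (floor); building would be analogous.
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500).fit(X_scaled, floor)

print("MAE:", np.mean(np.abs(reg.predict(X_scaled) - lonlat)))
print("floor error rate:", np.mean(clf.predict(X_scaled) != floor))
```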

Keywords: indoor localization, WLAN fingerprinting, neural networks, classification, regression

Procedia PDF Downloads 347
1240 Customer Experience Management in Food and Beverage Outlet at Indian School of Business: Methodology and Recommendations

Authors: Anupam Purwar

Abstract:

In the conventional consumer product industry, stockouts are handled by carrying buffer stock to guard against under-serving caused by changes in customer demand, incorrect forecasts, or variability in lead times. For food outlets, however, the alternative of carrying buffer stock is unviable because of the indispensable need to serve freshly cooked meals. Besides, the food outlet, being the sole provider, has no incentive to reduce stockouts, as it has no fear of losing revenue, gross profit, customers, or market share. Hence, innovative, easy-to-implement, and practical ways of addressing the twin problems of long queues and poor customer experience need to be investigated. The current work analyses the demand pattern of 11 different food items across a routine day. Based on this, an optimum resource allocation for all food items has been carried out by solving a linear programming problem with cost minimization as the objective. Concurrently, recommendations have been devised to address this demand- and supply-side problem, keeping their practicability in mind. The recommendations are currently being discussed and implemented at the ISB (Indian School of Business) Hyderabad campus.
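As a rough illustration of the cost-minimizing allocation described above, the sketch below sets up a small linear program with SciPy. The item names, costs, demand figures, and capacity are invented placeholders, not the study's data.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: portions of each item to prepare per service window.
items = ["sandwich", "curry", "salad"]   # placeholder items
cost = np.array([12.0, 20.0, 8.0])       # preparation cost per portion (placeholder)
demand = np.array([50, 80, 30])          # forecast demand per window (placeholder)
capacity = 200                           # total portions the kitchen can prepare (placeholder)

# Minimize total preparation cost subject to meeting demand and kitchen capacity.
res = linprog(
    c=cost,
    A_ub=np.ones((1, 3)),                     # sum of portions <= kitchen capacity
    b_ub=[capacity],
    bounds=[(d, None) for d in demand],       # prepare at least the forecast demand
    method="highs",
)
print(dict(zip(items, res.x)), "total cost:", res.fun)
```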

Keywords: F&B industry, resource allocation, demand management, linear programming, LP, queuing analysis

Procedia PDF Downloads 137
1239 Fuzzy Adaptive Control of an Intelligent Hybrid HPS (PV-Wind-Battery), Grid Power System Applied to a Dwelling

Authors: A. Derrouazin, N. Mekkakia-M, R. Taleb, M. Helaimi, A. Benbouali

Abstract:

Nowadays, the use of different sources of renewable energy for the production of electricity is everyone's concern, whether for domestic use in isolated sites or in towns. As conventional sources of energy are shrinking, a need has arisen to look for alternative sources of energy, with more emphasis on their optimal use. This paper presents the design of a sustainable Hybrid Power System (PV-Wind-Storage), assisted by the grid as a supplementary source, applied to a case-study residential house to meet its entire energy demand. A fuzzy control system model has been developed to optimize and control the flow of power from these sources. The energy requirement is mainly fulfilled from PV and wind energy stored in a battery module for the critical load of the residential house, and supplemented by the grid for the base and peak load. The system has been developed for a maximum daily household load energy of 3 kWh and can be scaled to any higher value, from 3 kWh/day to 10 kWh/day, as per the requirement of an individual or community house. The simulation work, using intelligent energy management, has resulted in an optimal yield leading to an average reduction in the cost of electricity of 50% per day.
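The abstract does not detail the fuzzy rule base, so the snippet below is only a hedged, minimal sketch of the kind of fuzzy power-flow decision the paper describes: triangular memberships over battery state of charge and net load, with a handful of hand-written rules. All membership breakpoints and rules are invented for illustration, not taken from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def power_flow_decision(soc, net_load_kw):
    """soc: battery state of charge in %; net_load_kw: load minus PV + wind output."""
    low_soc, high_soc = tri(soc, 0, 20, 50), tri(soc, 50, 80, 100)
    deficit, surplus = tri(net_load_kw, 0, 2, 4), tri(-net_load_kw, 0, 2, 4)

    # Rule strengths (min as fuzzy AND); defuzzified here by simple argmax.
    rules = {
        "discharge battery": min(high_soc, deficit),  # battery covers the deficit
        "import from grid":  min(low_soc, deficit),   # grid covers the deficit
        "charge battery":    min(low_soc, surplus),   # store renewable surplus
        "export to grid":    min(high_soc, surplus),  # battery full, sell surplus
    }
    return max(rules, key=rules.get), rules

print(power_flow_decision(soc=85, net_load_kw=1.5)[0])   # expected: "discharge battery"
print(power_flow_decision(soc=15, net_load_kw=3.0)[0])   # expected: "import from grid"
```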

Keywords: photovoltaic (PV), wind turbine, battery, microcontroller, fuzzy control (FC), Matlab

Procedia PDF Downloads 648
1238 Computational Neurosciences: An Inspiration from Biological Neurosciences

Authors: Harsh Sadawarti, Kamal Malik

Abstract:

Humans are the most powerful creatures on this planet because of the high level of intelligence gifted to them by nature. Computational intelligence is strongly influenced by natural intelligence, the neurosciences, and mathematics. To study computational intelligence in depth and to use it in real-life applications, it is important to understand its correspondence with the human brain. In this paper, three important parts of the human brain, the frontal lobe, occipital lobe, and parietal lobe, are compared with the Artificial Neural Network (ANN), Convolutional Neural Network (CNN), and Recurrent Neural Network (RNN), respectively. Intelligent computational systems are created by combining deductive reasoning, logical concepts, and high-level algorithms with the simulation and study of the human brain. The human brain combines physiology, psychology, emotions, calculations, and many other parameters of utmost importance that determine overall intelligence. To create intelligent algorithms and smart machines, and to simulate the human brain effectively, it is important to have an insight into the human brain and the basic concepts of the biological neurosciences.

Keywords: computational intelligence, neurosciences, convolutional neural network, recurrent neural network, artificial neural network, frontal lobe, occipital lobe, parietal lobe

Procedia PDF Downloads 111
1237 A Parallel Computation Based on GPU Programming for a 3D Compressible Fluid Flow Simulation

Authors: Sugeng Rianto, P.W. Arinto Yudi, Soemarno Muhammad Nurhuda

Abstract:

Computing a 3D compressible fluid flow for a virtual environment with haptic interaction is a non-trivial issue, especially when good performance and a balance between visualization, tactile feedback interaction, and computation must be achieved. In this paper, we describe our computation methods based on parallel programming on a GPU. The 3D fluid flow solvers have been developed for smoke dispersion simulation by combining cubic interpolated propagation (CIP) based fluid flow solvers with the parallelism and programmability of the GPU. The fluid flow solver is built on a GPU-CPU message passing scheme to allow rapid development of haptic feedback modes for fluid dynamic data. A fast solution is obtained by applying the CIP fluid flow solvers, so that multiphase fluid flow equations can be solved simultaneously. To accelerate the computation further, the Navier-Stokes equations (NSEs) are packed into texel channels, where the computation is performed on pixels that can be considered a grid of cells. Therefore, despite the complexity of the obstacle geometry, multiple vertices and pixels can be processed simultaneously in parallel. The data are also shared in global memory so that the CPU can control the haptics and provide kinaesthetic interaction and feeling. The results show that GPU-based parallel computation provides effective simulation of a compressible fluid flow model for real-time interaction in 3D computer graphics on a PC platform. This report has shown the feasibility of a new approach to solving the compressible fluid flow equations on the GPU. The experimental tests proved that compressible flow around several model obstacles with haptic interaction can be simulated effectively and efficiently at a reasonable frame rate with realistic visualization. These results confirm that good performance and a balance between visualization, tactile feedback interaction, and computation can be achieved.
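The key idea of treating the field as a grid of cells that are all updated simultaneously can be shown with a small, hedged sketch. The snippet below uses NumPy's vectorized indexing as a stand-in for the per-pixel GPU kernels; it performs a simple bilinear semi-Lagrangian advection step rather than the paper's cubic CIP scheme, and all grid sizes and velocities are placeholders.

```python
import numpy as np

N, dt, dx = 128, 0.1, 1.0
density = np.random.rand(N, N)   # scalar field carried by the flow (placeholder)
u = np.full((N, N), 0.5)         # velocity along rows (placeholder)
v = np.full((N, N), 0.2)         # velocity along columns (placeholder)

def advect(field, u, v):
    # Every cell is updated in one vectorized step: the CPU analogue of the
    # per-pixel GPU kernel described in the paper.
    i, j = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    x = np.clip(i - dt / dx * u, 0, N - 1)   # back-trace each cell centre
    y = np.clip(j - dt / dx * v, 0, N - 1)
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    x1, y1 = np.minimum(x0 + 1, N - 1), np.minimum(y0 + 1, N - 1)
    sx, sy = x - x0, y - y0
    # Bilinear interpolation; the paper uses cubic (CIP) interpolation instead.
    return ((1 - sx) * (1 - sy) * field[x0, y0] + sx * (1 - sy) * field[x1, y0]
            + (1 - sx) * sy * field[x0, y1] + sx * sy * field[x1, y1])

density = advect(density, u, v)
```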

Keywords: CIP, compressible fluid, GPU programming, parallel computation, real-time visualisation

Procedia PDF Downloads 432
1236 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement

Authors: Hadi Ardiny, Amir Mohammad Beigzadeh

Abstract:

Through the combined use of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data sources. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were used, one of them carrying a radioactive source. An algorithm was developed to identify the contaminated robot through the correlation between the camera images and the detector data. The computer vision method extracts the movements of all robots in the XY-plane coordinate system, and the detector system records the gamma-ray count. The positions of the robots and the corresponding counts of the moving source were modeled using the MCNPX simulation code while considering the experimental geometry. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. The modeling techniques prove valuable for designing different scenarios and intelligent systems before initiating any experiments.
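A hedged sketch of the identification step: for each tracked robot, compute the gamma count one would expect if that robot carried the source (falling off roughly with the inverse square of its distance to the detector) and correlate it with the measured count series; the robot with the highest correlation is flagged. The positions, counts, and detector location below are synthetic placeholders, not the experiment's data.

```python
import numpy as np

rng = np.random.default_rng(1)
detector = np.array([0.0, 0.0])   # detector position (placeholder)
T = 200                           # number of time steps

# Synthetic robot trajectories in the XY plane (placeholders for the vision output).
tracks = {name: np.cumsum(rng.normal(0, 0.1, (T, 2)), axis=0) + offset
          for name, offset in [("robot_A", [2, 1]), ("robot_B", [4, -2]), ("robot_C", [-3, 3])]}

# Simulate the measured count rate as if robot_B carries the source (1/r^2 falloff + noise).
r_true = np.linalg.norm(tracks["robot_B"] - detector, axis=1)
measured = 1000.0 / r_true**2 + rng.normal(0, 2, T)

# Correlate the measured counts with each robot's expected 1/r^2 signature.
scores = {}
for name, xy in tracks.items():
    expected = 1.0 / np.linalg.norm(xy - detector, axis=1) ** 2
    scores[name] = np.corrcoef(expected, measured)[0, 1]

print(scores, "-> contaminated:", max(scores, key=scores.get))
```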

Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems

Procedia PDF Downloads 123
1235 Applications Using Geographic Information System for Planning and Development of Energy Efficient and Sustainable Living for Smart-Cities

Authors: Javed Mohammed

Abstract:

As urbanization proceeds at an unprecedented scale worldwide, there is strong demand from both academic research and practice for the smart management and intelligent planning of cities, to handle increasing demands on infrastructure and the potential risks of inhabitant agglomeration in disaster management. Geo-spatial data and Geographic Information Systems (GIS) are essential components for building smart cities, at the most basic level by mapping the physical world into a virtual environment as a referencing framework. At a higher level, GIS has become very important to smart cities across different sectors. In the digital city era, digital maps and geospatial databases have long been integrated into government workflows in land management, urban planning, and transportation. GIS is anticipated to become more powerful, not only as an archival and data management tool but also as a source of spatial models supporting decision-making in intelligent cities. The purpose of this project is to offer observations and analysis based on a detailed discussion of a GIS-driven framework for the development of smart and sustainable cities through high penetration of renewable energy technologies.

Keywords: digital maps, geo-spatial, geographic information system, smart cities, renewable energy, urban planning

Procedia PDF Downloads 526
1234 Factors Influencing International Second Language Students' Perceptions of Academic Writing Practices

Authors: A. Shannaq

Abstract:

English is the accepted lingua franca of the academic world, and English medium higher education institutions host many second-language speakers of English (L2) who wish to pursue their studies through the medium of English. Assessment in higher education institutions is largely done in writing, which makes the mastery of academic writing essential. While such mastery can be, and often is, difficult for students who speak English as a first language, it is undoubtedly more so for L2 students attempting to adopt Anglophone academic written norms. There does not appear to be a great deal of research with regard to L2 students’ perceptions of their academic writing practices. This research investigates the writing practices of international L2 students in their first year of undergraduate study at NZ universities. Qualitative longitudinal data in the form of semi-structured interviews and documentation (assignments’ written instructions, students’ written assignments, tutors’ feedback on the students’ assignments) were collected from 4 undergraduate international L2 students at the beginning, middle, and end of the academic year 2017. Findings reveal that motivation, agency, and self-efficacy impact students’ perceptions of their academic writing practices and define the course of actions learners take under the time constraints which are set for their assignments.

Keywords: academic writing, English as a second language, international second language students, undergraduate writing practices

Procedia PDF Downloads 139
1233 A Mixed Integer Programming Model for Optimizing the Layout of an Emergency Department

Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee

Abstract:

In recent years, demand for healthcare services has increased dramatically. As the demand for healthcare services increases, so does the necessity of constructing new healthcare buildings and redesigning and renovating existing ones. Increasing demands necessitate the use of optimization techniques to improve overall service efficiency in healthcare settings. However, the high complexity of care processes remains the major challenge to accomplishing this goal. This study proposes a method based on process mining results to address the high complexity of care processes and to find the optimal layout of the various medical centers in an emergency department. The ProM framework is used to discover clinical pathway patterns and relationships between activities. The sequence clustering plug-in is used to remove infrequent events and to derive the process model in the form of a Markov chain. The process mining results serve as input for the next phase, which consists of the development of the optimization model. Comparison of the current ED design with the one obtained from the proposed method indicates that a carefully designed layout can significantly decrease the distances that patients must travel.
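The optimization phase can be illustrated with a small, hedged sketch: given transition frequencies between departments (as process mining would produce) and candidate room slots with pairwise distances, search for the assignment that minimizes total expected patient travel. A brute-force search over permutations stands in for the paper's mixed integer program, and all frequencies and distances are invented.

```python
import itertools
import numpy as np

departments = ["triage", "imaging", "lab", "treatment"]   # placeholder departments
# Patient flow between departments, e.g. from a process-mined Markov chain (placeholder values).
flow = np.array([[0, 30, 10, 25],
                 [5,  0, 20, 15],
                 [2, 10,  0, 18],
                 [0,  0,  0,  0]])
# Walking distance between candidate room slots (placeholder values, metres).
dist = np.array([[ 0, 10, 25, 40],
                 [10,  0, 15, 30],
                 [25, 15,  0, 12],
                 [40, 30, 12,  0]])

def total_travel(assign):
    # assign[d] = slot index given to department d
    return sum(flow[a, b] * dist[assign[a], assign[b]]
               for a in range(len(departments)) for b in range(len(departments)))

best = min(itertools.permutations(range(len(departments))), key=total_travel)
print({departments[d]: f"slot {s}" for d, s in enumerate(best)}, total_travel(best))
```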

Keywords: mixed integer programming, facility layout problem, process mining, healthcare operation management

Procedia PDF Downloads 339
1232 The Trajectory of the Ball in a Football Game

Authors: Mahdi Motahari, Mojtaba Farzaneh, Ebrahim Sepidbar

Abstract:

Tracking moving and flying targets is one of the most important issues in image processing. Estimating the trajectory of a desired object on short-term and long-term scales is even more important than tracking it. In this paper, a new way of identifying a moving ball and estimating its future trajectory on a long-term scale is presented, using the combination and interaction of image processing algorithms (noise removal and image segmentation), a Kalman filter algorithm for estimating the trajectory of the ball in a football game on a short-term scale, and an intelligent adaptive neuro-fuzzy algorithm based on the time series of traversed distance. Relying on these algorithms and a video database, the proposed system attains more than 96% identification accuracy. Although the present method has high precision, it is time-consuming. By comparing this method with other methods, we assess its accuracy and efficiency.
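The short-term estimation step the paper attributes to a Kalman filter can be sketched as below: a constant-velocity state model for the ball's image-plane position, updated with noisy detections. The noise covariances and the measurement sequence are placeholders, not the paper's configuration.

```python
import numpy as np

dt = 1 / 30.0                                   # video frame interval (placeholder)
F = np.array([[1, 0, dt, 0],                    # constant-velocity state transition
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],                     # only the position (x, y) is measured
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 1e-3                            # process noise (placeholder)
R = np.eye(2) * 2.0                             # measurement noise (placeholder)

x = np.zeros(4)                                 # state: [x, y, vx, vy]
P = np.eye(4) * 10.0

def kalman_step(x, P, z):
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the detected ball position z
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

for frame in range(10):                         # fake detections moving diagonally
    z = np.array([frame * 3.0, frame * 1.5]) + np.random.normal(0, 1.0, 2)
    x, P = kalman_step(x, P, z)

print("estimated position:", x[:2], "estimated velocity:", x[2:])
```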

Keywords: tracking, signal processing, moving and flying targets, artificial intelligence systems, trajectory estimation, Kalman filter

Procedia PDF Downloads 459
1231 ePA-Coach: Design of the Intelligent Virtual Learning Coach for Senior Learners in Support of Digital Literacy in the Context of Electronic Patient Record

Authors: Ilona Buchem, Carolin Gellner

Abstract:

Over the last few years, the call for supporting senior learners in the development of their digital literacy has become prevalent, mainly due to the progression towards ageing societies paired with advances in digitalisation in all spheres of life, including e-health and electronic patient records (EPA). While major research efforts in supporting senior learners in developing digital literacy have so far been invested in e-learning focused on knowledge acquisition and cognitive tasks, little research exists on learning models which target virtual mentoring and coaching with the help of pedagogical agents and address the social dimensions of learning. Research from studies with students in the context of formal education has already provided methods for designing intelligent virtual agents in support of personalised learning. However, this research has mostly focused on cognitive skills and has not yet been applied to the mentoring and coaching of senior learners, who have different characteristics and learn in different contexts. In this paper, we describe how insights from previous research can be used to develop an intelligent virtual learning coach (agent) for senior learners, with a focus on building the social relationship between the agent and the learner and on the agent's key task of socializing learners into the larger context of digital literacy with a focus on electronic health records. Following current approaches to mentoring and coaching, the agent is designed not to enhance and monitor the cognitive performance of the learner but to serve as a trusted friend and advisor, whose role is to provide one-to-one guidance and to support the sharing of experiences among learners (peers). Based on a literature review and a synopsis of research on virtual agents and current coaching/mentoring models, and taking into account the specific characteristics and requirements of senior learners, we describe the design framework which was applied to design an intelligent virtual learning coach as part of the e-learning system for digital literacy of senior learners in the ePA-Coach project funded by the German Ministry of Education and Research. This paper also presents the results of the evaluation study, which compared the first prototype of the virtual learning coach designed according to the design framework with a voice narration in a multimedia learning environment with senior learners. The focus of the study was to validate the agent design in the context of the persona effect (Lester et al., 1997). Since the persona effect is related to the hypothesis that animated agents are perceived as more socially engaging, the study evaluated possible impacts of agent coaching, in comparison with voice coaching, on motivation, engagement, experience, and digital literacy.

Keywords: virtual learning coach, virtual mentor, pedagogical agent, senior learners, digital literacy, electronic health records

Procedia PDF Downloads 117
1230 A Hybrid Expert System for Generating Stock Trading Signals

Authors: Hosein Hamisheh Bahar, Mohammad Hossein Fazel Zarandi, Akbar Esfahanipour

Abstract:

In this paper, a hybrid expert system is developed using fuzzy genetic network programming with reinforcement learning (GNP-RL). In this system, the frame-based structure uses trading rules extracted by GNP. These rules are extracted using technical indices of the stock prices in the training period. In developing this system, we applied fuzzy node transition and decision making in both the processing and judgment nodes of GNP-RL. Consequently, these methods not only increased the accuracy of node transition and decision making in GNP's nodes but also extended GNP's binary signals to ternary trading signals. In other words, in our proposed fuzzy GNP-RL model, a No Trade signal is added to the conventional Buy and Sell signals. Finally, the obtained rules are used in a frame-based system implemented in the Kappa-PC software. The developed trading system has been used to generate trading signals for ten companies listed on the Tehran Stock Exchange (TSE). The simulation results for the testing period show that the developed system performs more favorably than the buy-and-hold strategy.
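The extension from binary to ternary signals can be illustrated with a hedged toy sketch: fuzzy memberships over a single technical index (RSI here, purely as an example) vote for Buy, No Trade, or Sell. This stands in for the paper's fuzzy GNP-RL rule extraction; the thresholds and membership shapes are invented.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def ternary_signal(rsi):
    """Map a relative strength index value to Buy / No Trade / Sell (illustrative only)."""
    mu_oversold = tri(rsi, 0, 20, 40)       # supports Buy
    mu_neutral = tri(rsi, 30, 50, 70)       # supports No Trade
    mu_overbought = tri(rsi, 60, 80, 100)   # supports Sell
    labels = ["Buy", "No Trade", "Sell"]
    memberships = [mu_oversold, mu_neutral, mu_overbought]
    return labels[memberships.index(max(memberships))]

print(ternary_signal(25.0), ternary_signal(55.0), ternary_signal(85.0))
```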

Keywords: fuzzy genetic network programming, hybrid expert system, technical trading signal, Tehran stock exchange

Procedia PDF Downloads 332
1229 The Effectiveness of Homeschooling: Stakeholders' Perceptions in the East London Education District

Authors: N. M. Zukani, E. O. Adu

Abstract:

Homeschooling has long been a method for parents to educate their children and has become a growing educational phenomenon across the globe. Homeschooling is an alternative form of education in which children are instructed at home rather than in mainstream schools. This study evaluated the effectiveness of homeschooling in the East London Education District, looking at stakeholders' perceptions and reviewing issues that bear on this as reflected in the literature. This is a qualitative study done in selected homeschools. Semi-structured interviews were used to collect data. The data were scrutinized and grouped into themes. The study revealed the importance of differentiated instruction and the need for flexibility in the process of homeschooling for children who faced difficulties or had special learning needs in mainstream schooling. It is therefore concluded that the participants in the study clearly showed that homeschooling is an educational choice for parents who have concerns about the quality of their children's education. Furthermore, homeschooling has the potential to be a highly learner-centered, nurturing educational approach. It was recommended that effective homeschooling practice should pay attention to child-parent goals and learning structure. Although homeschooling seeks to overcome the drawbacks of mainstream schooling, there were also cases reflecting the incompetence of parents or tutors conducting the homeschooling, as well as a need for support material and other educational support from the government.

Keywords: homeschooling, effectiveness, stakeholders, parents, perception

Procedia PDF Downloads 138
1228 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

Evolutionary processes are not linear. Long periods of quiet, slow development give way to rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing ones. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector, which is the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, proposes a satisfactory explanation for these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: a) the biosphere is a single living organism, all parts of which are interconnected, and b) the biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but the accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution, as a natural origin and development of life, is a reality; evolution is a coordinated and controlled process; one of evolution's main development vectors is the growing computational complexity of living organisms and the biosphere's intelligence; the intelligent matter which conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth; the information acts like software stored in and controlled by the biosphere. Random mutations trigger this software, as stipulated by Darwinian evolutionary theories, and it is further stimulated by the growing demand for the biosphere's global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerating evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches a critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life's creation and evolution. It logically resolves many puzzling problems of current evolutionary theory, such as speciation (as a result of GM purposeful design), the direction of evolutionary development (as a need for growing global intelligence), punctuated equilibrium (occurring when the two conditions above are met), the Cambrian explosion, and mass extinctions (occurring when more intelligent species replace outdated creatures).

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 164
1227 Legal Personality and Responsibility of Robots

Authors: Mehrnoosh Abouzari, Shahrokh Sahraei

Abstract:

The arrival of artificial intelligence and smart robots in the modern world places them in positions of both responsibility and risk, so carrying out human activities with robots raises questions of criminal or civil responsibility for their acts or behavior. The practical use of smart robots puts them in a unique situation once naturalization happens and smart robots are identified as members of society, and adopting these new smart citizens creates several legal situations. The first concerns the legal responsibility of robots. Recognizing the naturalization of a robot involves granting some basic rights, so human rights such as employment, property, housing, and the use of energy may be extended to robots. How would these rights be exercised in society, and if problems arise with these rights, how would civil responsibility and punishment work? May we count robots as part of the population and include them in social programs? The second issue concerns the criminal responsibility of robots that perform important activities in place of humans, which is the very aim of inventing robots that handle work through AI technology; the problem arises when accidents are caused by robots in charge of important activities such as the military, surgery, transport, judgment, and so on. Moreover, recognizing an independent identity for robots in the legal world, through registered ID cards, naturalization, and civil rights, prepares for them the same rights and obligations as humans. Civil responsibility is therefore unavoidable, and if a robot commits a crime, it would bear criminal responsibility and have to be punished. The basic components of criminal responsibility may change in such a situation: for example, where criminal responsibility for humans is bound to sanity, maturity, and voluntariness, for robots it might be bound to being intelligent, well programmed, not hacked, and so on. It would therefore be irrational to punish robots by imprisonment, execution, or other corporal punishments; we may instead devise digital punishments such as changing or repairing programs, exchanging some parts of the robot's body, or dismantling it completely. Finally, the responsibility of the smart robot's creators, programmers, supervisors, the organization that employed the robot, and the government that permitted the robot's use in important facilities and activities is analyzed and investigated in this article.

Keywords: robot, artificial intelligence, personality, responsibility

Procedia PDF Downloads 147
1226 A Tuning Method for Microwave Filter via Complex Neural Network and Improved Space Mapping

Authors: Shengbiao Wu, Weihua Cao, Min Wu, Can Liu

Abstract:

This paper presents an intelligent tuning method for microwave filters based on a complex neural network and improved space mapping. The tuning process consists of two stages: initial tuning and fine tuning. At the beginning of the tuning, the return loss of the filter is transferred to the passband via the phase error. During the fine tuning, the phase shift caused by the transmission line and the higher-order modes is removed by curve fitting. Then, a Cauchy method based on the admittance parameter (Y-parameter) is used to extract the coupling matrix. The influence of resonant cavity loss is eliminated during the parameter extraction process. Using processed data pairs (the amount of screw variation and the variation of the coupling matrix), a tuning model is established with the complex neural network. With the improved space mapping algorithm, the mapping relationship between the actual model and the ideal model is established, and the amplitude and direction of the tuning are continually updated. Finally, a tuning experiment on an eighth-order coaxial cavity filter shows that the proposed method performs well in terms of tuning time and tuning precision.

Keywords: microwave filter, scattering parameter, coupling matrix, intelligent tuning

Procedia PDF Downloads 311
1225 Blockchain-Resilient Framework for Cloud-Based Network Devices within the Architecture of Self-Driving Cars

Authors: Mirza Mujtaba Baig

Abstract:

Artificial Intelligence (AI) is evolving rapidly, and one of the areas it has influenced is automation. The automobile, healthcare, education, and robotics industries deploy AI technologies constantly, and the automation of tasks is beneficial, freeing time for knowledge-based tasks and introducing convenience into everyday human endeavors. The paper reviews the challenges faced by current implementations of autonomous self-driving cars by exploring the machine learning, robotics, and artificial intelligence techniques employed in the development of this innovation. The controversy surrounding the development and deployment of autonomous machines, e.g., vehicles, calls for an exploration of the configuration of the programming modules. This paper seeks to add to the body of knowledge assisting researchers in decreasing the inconsistencies in current programming modules. Blockchain is a technology whose applications are mostly found within the financial, pharmaceutical, manufacturing, and artificial intelligence domains. The registering of events in a secure manner, as well as the application of external algorithms required for data analytics, is especially helpful for integrating, adapting, maintaining, and extending to new domains, especially predictive analytics applications.
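The "registering of events in a secure manner" that the paper attributes to blockchain can be sketched minimally as a hash-chained log of vehicle events: each block commits to the previous block's hash, so tampering with any recorded event breaks the chain. This is a bare illustrative sketch, not the paper's framework, and the event fields are invented.

```python
import hashlib
import json
import time

def make_block(event, prev_hash):
    """Create a block that commits to its payload and to the previous block's hash."""
    block = {"timestamp": time.time(), "event": event, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    """Recompute every hash; any tampered event or broken link is detected."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Hypothetical sensor events from a self-driving car's network devices.
chain = [make_block({"type": "genesis"}, prev_hash="0" * 64)]
for event in [{"type": "lane_change", "speed_kmh": 62}, {"type": "obstacle_detected", "dist_m": 14.2}]:
    chain.append(make_block(event, prev_hash=chain[-1]["hash"]))

print("chain valid:", verify(chain))
chain[1]["event"]["speed_kmh"] = 120          # tamper with a recorded event
print("after tampering:", verify(chain))
```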

Keywords: artificial intelligence, automation, big data, self-driving cars, machine learning, neural networking algorithm, blockchain, business intelligence

Procedia PDF Downloads 119
1224 The Mathematics of Fractal Art: Using a Derived Cubic Method and the Julia Programming Language to Make Fractal Zoom Videos

Authors: Darsh N. Patel, Eric Olson

Abstract:

Fractals can be found everywhere, whether it be the shape of a leaf or a system of blood vessels. Fractals are used to help study and understand different physical and mathematical processes; however, their artistic nature is also beautiful to simply explore. This project explores fractals generated by a cubically convergent extension to Newton's method. With this iteration as a starting point, a complex plane spanning from -2 to 2 is created with a color wheel mapped onto it. Next, the polynomial whose roots the fractal will generate from is established. From the Fundamental Theorem of Algebra, it is known that any polynomial has as many roots (counted by multiplicity) as its degree. When generating the fractals, each root will receive its own color. The complex plane can then be colored to indicate the basins of attraction that converge to each root. From a computational point of view, this project’s code identifies which points converge to which roots and then obtains fractal images. A zoom path into the fractal was implemented to easily visualize the self-similar structure. This path was obtained by selecting keyframes at different magnifications through which a path is then interpolated. Using parallel processing, many images were generated and condensed into a video. This project illustrates how practical techniques used for scientific visualization can also have an artistic side.
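As a hedged sketch of the basin-of-attraction coloring described above (written here in Python with NumPy rather than the Julia language the project uses, and with plain Newton iteration standing in for the derived cubic method), each point of the complex plane is iterated until it converges to a root and is then colored by the index of that root.

```python
import numpy as np

# Roots of z^3 - 1 = 0; any polynomial's roots could be used instead.
roots = np.array([1, -0.5 + 0.8660254j, -0.5 - 0.8660254j])

def f(z):
    return z**3 - 1

def df(z):
    return 3 * z**2

# Sample the square [-2, 2] x [-2, 2] of the complex plane.
n = 400
re, im = np.meshgrid(np.linspace(-2, 2, n), np.linspace(-2, 2, n))
z = re + 1j * im

for _ in range(40):                       # Newton iteration (the cubic variant is omitted here)
    z = z - f(z) / df(z)

# Color each pixel by the index of the root it converged to.
basin = np.argmin(np.abs(z[..., None] - roots), axis=-1)
print(np.bincount(basin.ravel()))         # pixel count per basin of attraction
```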

Keywords: fractals, cubic method, Julia programming language, basin of attraction

Procedia PDF Downloads 251
1223 The Development of Online Lessons in Integration Model

Authors: Chalermpol Tapsai

Abstract:

The objectives of this research were to develop integrated online lessons and to determine their efficiency by investigating the usage of the online lessons and the relationship between learners' background knowledge and achievement after learning with the online lessons. The sample group in this study consisted of 97 students randomly selected from the 121 students registered in semester 1/2012 at Trimitwittayaram Learning Center. Stratified sampling was employed to form 4 groups according to proficiency: high, moderate, low, and no prior knowledge. The research instruments included online lessons in the integration model on the topic of Java programming, a test after each lesson, an achievement test at the end of the course, and questionnaires on learners' satisfaction. The results showed that the efficiency of the online lessons was 90.20/89.18, with achievement after learning with the lessons higher than before the lessons at the 0.05 level of statistical significance. Moreover, the learners' background knowledge of programming showed a positive relationship with learning achievement at the 0.05 level of statistical significance. Learners with high background knowledge used fewer exercises and examples than those with lower background knowledge, while learners with moderate and low background knowledge did not differ significantly in their use of examples and exercises.

Keywords: integration model, online lessons, learners’ background knowledge, efficiency

Procedia PDF Downloads 359
1222 Assignment of Airlines Technical Members under Disruption

Authors: Walid Moudani

Abstract:

The Crew Reserve Assignment Problem (CRAP) considers the assignment of crew members to a set of reserve activities covering all the scheduled flights, in order to ensure a continuous plan so that operating costs are minimized, while its solution must meet hard constraints resulting from the safety regulations of civil aviation as well as from the airlines' internal agreements. The problem considered in this study is of the highest interest for airlines and may have important consequences on service quality and on the economic return of operations. In this communication, a new mathematical formulation for the CRAP is proposed which takes into account the regulations and the internal agreements. While current solutions make use of artificial intelligence techniques run on mainframe computers, a low-cost approach is proposed to provide efficient on-line solutions to face perturbed operating conditions. The proposed solution method uses a dynamic programming approach for the duties scheduling problem; when applied to the case of a medium airline, it provides efficient solutions and shows good potential for acceptance by operations staff. This optimization scheme can then be considered as the core of an on-line decision support system for crew reserve assignment operations management.
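A hedged toy sketch of the dynamic programming idea: flights are sorted by time, and reserve duty blocks, each covering a contiguous window of flights at a given cost, are chosen so that every flight is covered at minimum total cost. This is only an illustrative recurrence, not the paper's formulation, and the costs and duty windows are invented.

```python
from functools import lru_cache

# Hypothetical scheduled flights (sorted by departure) and candidate reserve duty blocks.
n_flights = 8
# Each duty block covers flights [start, end] inclusive at some cost (placeholder data).
duties = [(0, 2, 100), (1, 4, 160), (3, 5, 120), (4, 7, 200), (6, 7, 90)]

@lru_cache(maxsize=None)
def min_cost(i):
    """Minimum cost to cover flights i..n_flights-1 with reserve duty blocks."""
    if i >= n_flights:
        return 0
    best = float("inf")
    for start, end, cost in duties:
        # A duty can extend coverage only if it covers flight i (no gap is allowed).
        if start <= i <= end:
            best = min(best, cost + min_cost(end + 1))
    return best

print("minimum reserve cost:", min_cost(0))
```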

Keywords: airlines operations management, combinatorial optimization, dynamic programming, crew scheduling

Procedia PDF Downloads 354
1221 JaCoText: A Pretrained Model for Java Code-Text Generation

Authors: Jessica Lopez Espejel, Mahaman Sanoussi Yahaya Alassan, Walid Dahhane, El Hassane Ettifouri

Abstract:

Pretrained transformer-based models have shown high performance in natural language generation tasks. However, a new wave of interest has surged: automatic programming language code generation. This task consists of translating natural language instructions into source code. Despite the fact that well-known pre-trained models for language generation have achieved good performance in learning programming languages, effort is still needed in automatic code generation. In this paper, we introduce JaCoText, a model based on the Transformer neural network. It aims to generate Java source code from natural language text. JaCoText leverages the advantages of both natural language and code generation models. More specifically, we study some findings from the state of the art and use them to (1) initialize our model from powerful pre-trained models, (2) explore additional pretraining on our Java dataset, (3) conduct experiments combining unimodal and bimodal data in training, and (4) scale the input and output length during the fine-tuning of the model. Experiments conducted on the CONCODE dataset show that JaCoText achieves new state-of-the-art results.
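A hedged sketch of the text-to-code inference step with a generic seq2seq Transformer via the Hugging Face transformers library. The checkpoint name below is a hypothetical placeholder, not the JaCoText model, and the generation settings are arbitrary.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder checkpoint: any natural-language-to-Java seq2seq model could be substituted here.
checkpoint = "your-org/your-nl-to-java-model"   # hypothetical model name
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

instruction = "return the maximum value stored in the member variable values"
inputs = tokenizer(instruction, return_tensors="pt")

# Beam search over the decoder to produce a Java method body.
outputs = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```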

Keywords: java code generation, natural language processing, sequence-to-sequence models, transformer neural networks

Procedia PDF Downloads 284
1220 Deep Routing Strategy: Deep Learning-Based Intelligent Routing in Software Defined Internet of Things

Authors: Zabeehullah, Fahim Arif, Yawar Abbas

Abstract:

Software Defined Network (SDN) is a next-generation networking model which simplifies traditional network complexities and improves the utilization of constrained resources. Currently, most SDN-based Internet of Things (IoT) environments use traditional network routing strategies which work on the basis of a maximum or minimum metric value. However, IoT network heterogeneity, dynamic traffic flow, and complexity demand intelligent and self-adaptive routing algorithms, because traditional routing algorithms lack self-adaptation, intelligence, and efficient utilization of resources. To some extent, SDN, due to its flexibility and centralized control, has managed IoT complexity and heterogeneity, but Software Defined IoT (SDIoT) still lacks intelligence. To address this challenge, we propose a model called Deep Routing Strategy (DRS), which uses a deep learning algorithm to perform routing in SDIoT intelligently and efficiently. Our model uses real-time traffic for training and learning. Results demonstrate that the proposed model achieves high accuracy and a low packet loss rate during path selection and outperforms the benchmark routing algorithm (OSPF). Moreover, the proposed model provided encouraging results under highly dynamic traffic flow.

Keywords: SDN, IoT, DL, ML, DRS

Procedia PDF Downloads 110
1219 Proposal of a Communication Protocol in Hybrid Sensors and Vehicular Networks for Intelligent Transport Systems

Authors: Taha Bensiradj, Samira Moussaoui

Abstract:

Hybrid Sensors and Vehicular Networks (HSVN) represent a hybrid network that uses several generations of ad-hoc networks and is used especially in Intelligent Transport Systems (ITS). HSVN enables collaboration between a Wireless Sensor Network (WSN) deployed along the border of the road and a Vehicular Network (VANET). This collaboration is defined by messages exchanged between the two networks in order to inform drivers about the state of the road, provide road safety information, and give more information about traffic on the road. Moreover, the collaboration created by HSVN also allows one network to be used to the advantage of the other. For example, disseminating information between the sensors quickly depletes their energy; therefore, vehicles, which have no energy constraint, can be used to disseminate the information between sensors. On the other hand, to solve the disconnection problem in VANET, the sensors can be used as gateways that forward messages received from one vehicle to another. However, because of the short communication range of a sensor and its low storage and processing capacity, it is difficult to ensure the exchange of road messages between it and a vehicle that may be moving at high speed at the time of exchange, i.e., during the time the vehicle is within communication range of the sensor. This work proposes a communication protocol between the sensors and the vehicles used in HSVN, whose purpose is to ensure the exchange of road messages within the available exchange time.

Keywords: HSVN, ITS, VANET, WSN

Procedia PDF Downloads 361
1218 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming

Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero

Abstract:

Tumor growth, from a single transformed cancer cell up to a clinically apparent mass, spans a range of spatial and temporal magnitudes. Through computer simulations, cellular automata (CA) can accurately describe the complexity of tumor development. Tumor development prognosis could be made, without making patients undergo annoying medical examinations or painful invasive procedures, if we develop appropriate CA-based software tools. In silico testing mainly refers to computational biology research studies applicable to clinical actions in medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time with respect to carrying out experiments in vitro in labs or in vivo with living cells and organisms. These models aim to produce scientifically relevant results compared to traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the specific literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic or stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the research community concerned. The use of stochastic cellular automata (SCA), whose parallel programming implementations can yield high computational performance, is of much interest and should be explored up to its computational limits. There have been some approaches based on optimizations to advance multiparadigm models of tumor growth, which mainly pursue improved performance of these models through guarantees of efficient memory access, or by considering the dynamic evolution of the memory space (grids, trees, ...) that holds crucial data in simulations. In our opinion, the different optimizations mentioned above are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework to develop new programming techniques that speed up the computation time of simulations has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speed-up provided by specific parallel syntactic constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is tested using implementations in the Java and C++ languages on two different platforms: an Intel Core i-X chipset and an HPC cluster of processors at our university. The parallelization proposed here of the Polesczuk and Enderling model (commonly used by researchers in mathematical oncology) is analyzed with respect to performance gain. We intend to apply the model and the overall parallelization technique presented here to solid tumors of specific affiliation such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, the growth inhibition induced by chemotaxis, and the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
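The paper's parallel CA updates use Java executors and C++; purely as a hedged illustration of the same pattern in Python, the sketch below splits a cell grid into row bands and updates each band in a thread pool, with a double-buffered grid providing the synchronization point between generations. The division rule and probability are invented and do not correspond to the Polesczuk and Enderling model, and the snippet illustrates the structure of the approach rather than any performance claim.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(42)
N = 200
grid = np.zeros((N, N), dtype=np.int8)
grid[N // 2, N // 2] = 1                      # seed a single tumor cell in the centre

P_DIVIDE = 0.3                                # placeholder probability, not a model value

def update_band(args):
    """Compute one generation for a horizontal band of rows, reading only from the old grid."""
    old, new, r0, r1 = args
    for i in range(r0, r1):
        for j in range(N):
            if old[i, j] == 1:
                new[i, j] = 1
                if rng.random() < P_DIVIDE:   # try to divide into a random neighbour
                    di, dj = rng.integers(-1, 2, size=2)
                    new[(i + di) % N, (j + dj) % N] = 1   # cross-band writes only ever set cells to 1
    return None

def step(old, pool, n_workers=4):
    new = old.copy()                          # double buffer: all reads come from `old`
    bands = [(old, new, k * N // n_workers, (k + 1) * N // n_workers) for k in range(n_workers)]
    list(pool.map(update_band, bands))        # barrier: wait for every band before swapping
    return new

with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(50):
        grid = step(grid, pool)

print("tumor cells after 50 generations:", int(grid.sum()))
```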

Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up

Procedia PDF Downloads 244
1217 Prediction of Temperature Distribution during Drilling Process Using Artificial Neural Network

Authors: Ali Reza Tahavvor, Saeed Hosseini, Nazli Jowkar, Afshin Karimzadeh Fard

Abstract:

The experimental and numerical study of temperature distribution during the milling process is important for milling quality and tool life. In the present study, the temperature over the milling cross-section is determined using Artificial Neural Networks (ANN), based on the temperature at certain points of the workpiece, the specifications of those points, and the rotational speed of the milling blade. First, a three-dimensional model of the workpiece is built, and Computational Heat Transfer (CHT) simulations are used to obtain the temperature at different nodes of the workpiece under steady-state conditions. The results obtained from CHT are used for training and testing the ANN. Using reverse engineering, with the desired x, y, z coordinates and the rotational speed of the milling blade as input data to the network, the milling surface temperature determined by the neural network is presented as output. The temperatures at the desired points for different milling blade rotational speeds are obtained experimentally, and the milling surface temperature is obtained by extrapolation; a comparison is then performed among the ANN soft programming results, the CHT results, and the experimental data, and it is observed that the ANN soft programming code can be used more efficiently to determine the temperature in a milling process.
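A hedged sketch of the ANN stage: a small multilayer perceptron regressor mapping (x, y, z, rotational speed) to temperature, trained on synthetic stand-ins for the CHT nodal data. The analytic function generating the targets below is a placeholder, not the paper's heat-transfer model.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)

# Placeholder training set standing in for CHT nodal results: (x, y, z, rpm) -> temperature.
X = np.column_stack([
    rng.uniform(0, 50, 2000),      # x position (mm)
    rng.uniform(0, 50, 2000),      # y position (mm)
    rng.uniform(0, 10, 2000),      # z depth (mm)
    rng.uniform(500, 3000, 2000),  # blade rotational speed (rpm)
])
# Invented temperature field: hotter near the surface and at higher speeds.
y = 25 + 0.05 * X[:, 3] - 3.0 * X[:, 2] + rng.normal(0, 1.0, 2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)

ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
ann.fit(scaler.transform(X_train), y_train)

pred = ann.predict(scaler.transform(X_test))
print("mean absolute error (deg C):", np.mean(np.abs(pred - y_test)))
```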

Keywords: artificial neural networks, milling process, rotational speed, temperature

Procedia PDF Downloads 405
1216 An Intelligent Tutoring System Enriched with 3D Virtual Reality for Dentistry Students

Authors: Meltem Eryılmaz

Abstract:

With the emergence of the COVID-19 outbreak, the socio-cultural, political, economic, and educational dynamics of the world have gone through a major change. This is especially true in education, and specifically in preclinical dentistry education, where students must gain a certain amount of real-time experience in endodontics and various other procedures. Virtual reality is the totality of digital and physical elements that make our five sense organs feel as if we really exist in a virtual world. Virtual reality, which is very popular today, has started to be used in education: with the inclusion of developing technology in education and training environments, virtual learning platforms have been designed to enrich students' learning experiences. The field of health is also affected by these developments, and the number of virtual reality applications developed for dentistry students is increasing day by day. The most widely used tools of this technology are virtual reality glasses, with which one can look in any direction in a world designed in 3D and navigate at will. This project produces solutions that respond to the different types of dental practice required of dentistry students, using virtual reality applications. With this application, students who cannot find the opportunity to work with patients in distance education, or who want to improve themselves at home, have unlimited opportunities to practise. Unity 2021, Visual Studio 2019, and the Cardboard SDK are used in the study.

Keywords: dentistry, intelligent tutoring system, virtual reality, online learning, COVID-19

Procedia PDF Downloads 203
1215 Automation of AAA Game Development Using AI

Authors: Branden Heng, Harsheni Siddharthan, Allison Tseng, Paul Toprac, Sarah Abraham, Etienne Vouga

Abstract:

The goal of this project was to evaluate and document the capabilities and limitations of AI tools for empowering small teams to create high-budget, high-profile (AAA) 3D games typically developed by large studios. Two teams of novice game developers attempted to create two different games using AI and Unreal Engine 5.3. First, the teams evaluated 60 AI art, design, sound, and programming tools by considering their capability, ease of use, cost, and license restrictions. Then, the teams used a shortlist of 12 AI tools for game development. During this process, the following tools were found to be the most productive: (i) ChatGPT 4.0 for both game and narrative concepts and documentation; (ii) Dall-E 3 and OpenArt for concept art; (iii) Beatoven for music drafting; (iv) ChatGPT 4.0 and Github Copilot for generating simple code and to complement human-made tutorials as an additional learning resource. While current generative AI may appear impressive at first glance, the assets they produce fall short of AAA industry standards. Generative AI tools are helpful when brainstorming ideas such as concept art and basic storylines, but they still cannot replace human input or creativity at this time. Regarding programming, AI can only effectively generate simple code and act as an additional learning resource. Thus, generative AI tools are, at best, tools to enhance developer productivity rather than as a system to replace developers.

Keywords: AAA games, AI, automation tools, game development

Procedia PDF Downloads 26
1214 Higher Language Education in Australia: Uncovering Language Positioning

Authors: Mobina Sahraee Juybari

Abstract:

There are around 300 languages spoken in Australia, and more than one-fifth of the population speaks a language other than English at home; the presence of international students in schools raises this number still further. Although the multilingual and multicultural status of Australia has been acknowledged by the government in education policy, the strong focus on English in institutional settings threatens the maintenance and learning of other languages. This is particularly true of universities' language provision. To cope with the financial impact of COVID-19, the government has cut funding for a number of Asian languages, such as Indonesian, Japanese, and Chinese. This threatens the maintenance of other languages in Australia and leaves students unprepared for the future job market. Taking account of the current reality of Australia's diverse cultural and linguistic makeup, this research intends to uncover the positioning of languages by taking a historical look at Australia's language policy and examining the value of languages and the probable impact of COVID-19 on the place of languages taught in Australian universities. A qualitative approach will be adopted, with semi-structured interviews of language program tutors and course coordinators and an analysis of government language policy. This research hopes to provide insights into the maintenance and learning of international language programs in tertiary language education in Australia and more widely.

Keywords: Australia, COVID-19, higher education sector, language maintenance, language and culture diversity

Procedia PDF Downloads 105
1213 Intelligent Transport System: Classification of Traffic Signs Using Deep Neural Networks in Real Time

Authors: Anukriti Kumar, Tanmay Singh, Dinesh Kumar Vishwakarma

Abstract:

Traffic control has been one of the most common and irritating problems since automobiles first hit the roads. Problems like traffic congestion impose a significant time burden around the world, and one significant solution can be the proper implementation of an Intelligent Transport System (ITS). ITS involves the integration of various tools like smart sensors, artificial intelligence, positioning technologies, and mobile data services to manage traffic flow, reduce congestion, and enhance drivers' ability to avoid accidents during adverse weather. Road and traffic sign recognition is an emerging field of research in ITS. The classification problem for traffic signs needs to be solved, as it is a major step in the journey towards building semi-autonomous and autonomous driving systems. This work implements an approach to the traffic sign classification problem by developing a Convolutional Neural Network (CNN) classifier on the GTSRB (German Traffic Sign Recognition Benchmark) dataset. Rather than using hand-crafted features, our model addresses the concern of an exploding number of parameters and uses data augmentation methods. Our model achieved an accuracy of around 97.6%, which is comparable to various state-of-the-art architectures.
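A hedged sketch of a small CNN classifier of the kind described, written with Keras; the layer sizes are illustrative choices rather than the paper's architecture, and the dataset loading is stubbed with random tensors in place of GTSRB images.

```python
import numpy as np
from tensorflow.keras import layers, models

NUM_CLASSES = 43                               # GTSRB has 43 traffic sign classes

# Stand-in tensors; in practice these would be GTSRB images resized to 32x32 RGB.
x_train = np.random.rand(256, 32, 32, 3).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, size=256)

model = models.Sequential([
    layers.Input(shape=(32, 32, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dropout(0.3),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=32, verbose=0)
print(model.evaluate(x_train, y_train, verbose=0))
```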

Keywords: multiclass classification, convolution neural network, OpenCV

Procedia PDF Downloads 176
1212 Evolving Credit Scoring Models using Genetic Programming and Language Integrated Query Expression Trees

Authors: Alexandru-Ion Marinescu

Abstract:

There exist a plethora of methods in the scientific literature which tackle the well-established task of credit score evaluation. In its most abstract form, a credit scoring algorithm takes as input several credit applicant properties, such as age, marital status, employment status, loan duration, etc. and must output a binary response variable (i.e. “GOOD” or “BAD”) stating whether the client is susceptible to payment return delays. Data imbalance is a common occurrence among financial institution databases, with the majority being classified as “GOOD” clients (clients that respect the loan return calendar) alongside a small percentage of “BAD” clients. But it is the “BAD” clients we are interested in since accurately predicting their behavior is crucial in preventing unwanted loss for loan providers. We add to this whole context the constraint that the algorithm must yield an actual, tractable mathematical formula, which is friendlier towards financial analysts. To this end, we have turned to genetic algorithms and genetic programming, aiming to evolve actual mathematical expressions using specially tailored mutation and crossover operators. As far as data representation is concerned, we employ a very flexible mechanism – LINQ expression trees, readily available in the C# programming language, enabling us to construct executable pieces of code at runtime. As the title implies, they model trees, with intermediate nodes being operators (addition, subtraction, multiplication, division) or mathematical functions (sin, cos, abs, round, etc.) and leaf nodes storing either constants or variables. There is a one-to-one correspondence between the client properties and the formula variables. The mutation and crossover operators work on a flattened version of the tree, obtained via a pre-order traversal. A consequence of our chosen technique is that we can identify and discard client properties which do not take part in the final score evaluation, effectively acting as a dimensionality reduction scheme. We compare ourselves with state of the art approaches, such as support vector machines, Bayesian networks, and extreme learning machines, to name a few. The data sets we benchmark against amount to a total of 8, of which we mention the well-known Australian credit and German credit data sets, and the performance indicators are the following: percentage correctly classified, area under curve, partial Gini index, H-measure, Brier score and Kolmogorov-Smirnov statistic, respectively. Finally, we obtain encouraging results, which, although placing us in the lower half of the hierarchy, drive us to further refine the algorithm.
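The abstract's expression-tree representation and its mutated subtrees can be sketched in Python (the paper itself uses C# LINQ expression trees); here trees are nested tuples of operators, variables, and constants, and mutation swaps a random subtree. The operator set, depth limits, applicant fields, and scoring rule are illustrative only, not the paper's configuration.

```python
import operator
import random

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}
VARS = ["age", "loan_duration", "income"]          # placeholder applicant properties

def random_tree(depth=3):
    """Build a random expression tree: ('op', left, right) | ('var', name) | ('const', value)."""
    if depth <= 0 or random.random() < 0.3:
        return ("var", random.choice(VARS)) if random.random() < 0.5 else ("const", random.uniform(-1, 1))
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, applicant):
    kind = tree[0]
    if kind == "var":
        return applicant[tree[1]]
    if kind == "const":
        return tree[1]
    return OPS[kind](evaluate(tree[1], applicant), evaluate(tree[2], applicant))

def mutate(tree, depth=3):
    """Replace a random subtree, mirroring mutation on a flattened pre-order view of the tree."""
    if tree[0] in OPS and random.random() < 0.5:
        left, right = tree[1], tree[2]
        if random.random() < 0.5:
            return (tree[0], mutate(left, depth - 1), right)
        return (tree[0], left, mutate(right, depth - 1))
    return random_tree(depth)

applicant = {"age": 0.4, "loan_duration": 0.7, "income": 0.2}   # normalized placeholder values
tree = random_tree()
score = evaluate(tree, applicant)
print("score:", score, "-> class:", "GOOD" if score >= 0 else "BAD")
print("mutated tree:", mutate(tree))
```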

Keywords: expression trees, financial credit scoring, genetic algorithm, genetic programming, symbolic evolution

Procedia PDF Downloads 117