Search results for: savings algorithm
2539 The Impacts of New Digital Technology Transformation on Singapore Healthcare Sector: Case Study of a Public Hospital in Singapore from a Management Accounting Perspective
Authors: Junqi Zou
Abstract:
As one of the world’s most tech-ready countries, Singapore has initiated the Smart Nation plan to harness the full power and potential of digital technologies to transform the way people live and work, making government and business processes more efficient and the economy more productive. The key evolutions of digital technology transformation in healthcare and the increasing deployment of the Internet of Things (IoT), Big Data, AI/cognitive computing, Robotic Process Automation (RPA), Electronic Health Record (EHR) systems, Electronic Medical Record (EMR) systems, and Warehouse Management Systems (WMS) in the most recent decade have significantly stepped up the move towards an information-driven healthcare ecosystem. The advances in information technology not only bring benefits to patients but also act as a key force in changing management accounting in the healthcare sector. The aim of this study is to investigate the impacts of digital technology transformation on Singapore’s healthcare sector from a management accounting perspective. Adopting a Balanced Scorecard (BSC) analysis approach, this paper conducted an exploratory case study of a newly launched Singapore public hospital, which has been recognized as among the most digitally advanced healthcare facilities in the Asia-Pacific region. Specifically, this study gains insights into how the new technology is changing healthcare organizations’ management accounting from four perspectives under the Balanced Scorecard approach: 1) Financial Perspective, 2) Customer (Patient) Perspective, 3) Internal Processes Perspective, and 4) Learning and Growth Perspective. Based on a thorough review of archival records from the government and the public, and the interview reports with the hospital’s CIO, this study finds improvements from all four perspectives under the Balanced Scorecard framework as follows: 1) Learning and Growth Perspective: The Government (Ministry of Health) works with the hospital to open up multiple training pathways for health professionals that upgrade and develop new IT skills among the healthcare workforce to support the transformation of healthcare services. 2) Internal Process Perspective: The hospital achieved digital transformation through Project OneCare to integrate clinical, operational, and administrative information systems (e.g., EHR, EMR, WMS, EPIB, RTLS) that enable the seamless flow of data and the implementation of a JIT system to help the hospital operate more effectively and efficiently. 3) Customer Perspective: The fully integrated EMR suite enhances the patient experience by achieving the 5 Rights (Right Patient, Right Data, Right Device, Right Entry and Right Time). 4) Financial Perspective: Cost savings are achieved from improved inventory management and effective supply chain management. The use of process automation also results in a reduction of manpower and logistics costs. To summarize, these improvements identified under the Balanced Scorecard framework confirm the success of utilizing integrated, advanced ICT to enhance a healthcare organization’s customer service, productivity, efficiency, and cost savings. Moreover, the Big Data generated from this integrated EMR system can be particularly useful in aiding the management control system to optimize decision making and strategic planning.
To conclude, the new digital technology transformation has raised the usefulness of management accounting, across both financial and non-financial dimensions, to new heights in the area of healthcare management. Keywords: balanced scorecard, digital technology transformation, healthcare ecosystem, integrated information system
Procedia PDF Downloads 161
2538 Inverse Mode Shape Problem of Hand-Arm Vibration (Humerus Bone) for Bio-Dynamic Response Using Varying Boundary Conditions
Authors: Ajay R, Rammohan B, Sridhar K S S, Gurusharan N
Abstract:
The objective of this work is to develop a numerical method that solves the inverse mode shape problem by determining the cross-sectional area of a structure for a desired mode shape, via a vibration response study of the humerus bone, which is modeled as a cantilever beam with anisotropic material properties. The humerus is the long bone in the arm that connects the shoulder to the elbow. The mode shape is assumed to be a higher-order polynomial satisfying a prescribed set of boundary conditions so that the numerical algorithm converges. The natural frequencies and mode shapes are calculated for different boundary conditions, and the cross-sectional area of the humerus bone is recovered from the eigenmode shape with the aid of the inverse mode shape algorithm. The recovered cross-sectional area is used to validate the mode shapes for the specific boundary conditions. The numerical method for solving the inverse mode shape problem is validated in a biomedical application by finding the cross-sectional area of a humerus bone in the human arm. Keywords: cross-sectional area, humerus bone, inverse mode shape problem, mode shape
Procedia PDF Downloads 127
2537 A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters
Authors: Wanyi Zhu, Xia Ming, Huafeng Wang, Junda Chen, Lu Liu, Jiangwei Jiang, Guohua Liu
Abstract:
Anomaly detection is a first and imperative step needed to respond to unexpected problems and to assure high performance and security in large data center management. This paper presents an online anomaly detection system built on an innovative combination of ensemble machine learning and adaptive differentiation algorithms, and applies it to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. We evaluate the effectiveness and efficiency of this algorithm with production traffic data and compare it with traditional anomaly detection approaches such as static thresholds and other deviation-based detection techniques. The experiment results show that our algorithm correctly identifies unexpected performance variances of any running application with an acceptable false positive rate. The proposed approach has already been deployed in real-time production environments to enhance the efficiency and stability of daily data center operations. Keywords: Alibaba data centers, anomaly detection, big data computation, dynamic ensemble learning
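The general idea of combining several detectors into an ensemble vote can be sketched in a few lines. The snippet below is an illustration only, not the authors' production system: it uses a hypothetical static threshold and a rolling z-score detector as stand-ins for the ensemble members, and the window size and limits are assumed values.

```python
import numpy as np

def static_threshold(values, limit=500.0):
    """Flag samples that exceed a fixed limit (e.g., response time in ms)."""
    return values > limit

def rolling_zscore(values, window=30, z_limit=3.0):
    """Flag samples that deviate strongly from a rolling mean."""
    flags = np.zeros(len(values), dtype=bool)
    for i in range(window, len(values)):
        hist = values[i - window:i]
        std = hist.std()
        if std > 0 and abs(values[i] - hist.mean()) / std > z_limit:
            flags[i] = True
    return flags

def ensemble_detect(values):
    """Simple ensemble: a point is anomalous only if both detectors agree,
    which lowers the false positive rate compared to either detector alone."""
    return static_threshold(values) & rolling_zscore(values)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    latency = rng.normal(200, 20, 500)   # synthetic latency trace (ms)
    latency[400:405] += 600              # injected performance anomaly
    print("anomalous indices:", np.where(ensemble_detect(latency))[0])
```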
Procedia PDF Downloads 200
2536 Flood Monitoring in the Vietnamese Mekong Delta Using Sentinel-1 SAR with Global Flood Mapper
Authors: Ahmed S. Afifi, Ahmed Magdy
Abstract:
Satellite monitoring is an essential tool to study, understand, and map large-scale environmental changes that affect humans, climate, and biodiversity. The Sentinel-1 Synthetic Aperture Radar (SAR) instrument provides a large volume of data in all weather conditions, with a short revisit time and high spatial resolution, which can be used effectively in flood management. Floods occur when an overflow of water submerges normally dry land, and the flooded areas need to be distinguished reliably in the imagery. In this study, we use the Global Flood Mapper (GFM), a new Google Earth Engine application that allows users to quickly map floods using Sentinel-1 SAR. The GFM enables users to manually adjust the flood map parameters, e.g., the threshold on the Z-value for the VV and VH bands and the elevation and slope mask thresholds. The composite R:G:B image obtained by coupling the Sentinel-1 bands (VH:VV:VH) reduces false classification to a large extent compared to using one band alone (e.g., the VH polarization band). The flood mapping algorithm in the GFM and Otsu thresholding are compared with Sentinel-2 optical data, and the results show that the GFM algorithm can overcome the misclassification of a flooded area in An Giang, Vietnam. Keywords: SAR backscattering, Sentinel-1, flood mapping, disaster
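The Otsu thresholding baseline mentioned in the comparison can be illustrated directly. The sketch below computes an Otsu threshold over a 1-D array of backscatter values in dB and labels the darker pixels as water; it is not the GFM implementation, and the synthetic data and the assumption that water appears darker than land are for illustration only.

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Return the threshold that maximizes between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    weights = hist / hist.sum()

    best_t, best_var = centers[0], -1.0
    for i in range(1, nbins):
        w0, w1 = weights[:i].sum(), weights[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (weights[:i] * centers[:i]).sum() / w0
        mu1 = (weights[i:] * centers[i:]).sum() / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, centers[i]
    return best_t

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    land = rng.normal(-8.0, 1.5, 5000)    # synthetic VH backscatter over land (dB)
    water = rng.normal(-20.0, 1.5, 1000)  # open water backscatter is much darker
    vh = np.concatenate([land, water])
    t = otsu_threshold(vh)
    flood_mask = vh < t                   # pixels darker than the threshold -> water
    print(f"Otsu threshold: {t:.2f} dB, flagged pixels: {flood_mask.sum()}")
```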
Procedia PDF Downloads 105
2535 Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction
Authors: Cherry Yieng Siang Ling, Joong Hee Lee, Myung Hwan Yun
Abstract:
Usability has become a basic requirement from the consumer's perspective; a product that fails this requirement ends up not being used. Identifying usability issues by analyzing the quantitative and qualitative data collected from usability testing and evaluation activities aids the process of product design, yet the lack of studies and research on analysis methodologies for qualitative text data in the usability field limits the potential of these data for more useful applications. The possibility of analyzing qualitative text data has grown with the rapid development of data analysis fields such as natural language processing, which helps computers understand human language, and machine learning, which provides predictive models and clustering tools. Therefore, this research aims to study how well text processing algorithms can analyze qualitative text data collected from usability activities. The research utilized datasets collected from an LG neckband headset usability experiment, consisting of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with the text-processing algorithm, includes training the comments into a vector space, labeling them with the subject and product physical feature data, and clustering to validate the result of comment vector clustering. The results show 'volume and music control button' as the usability feature that matches best with the cluster of comment vectors: the centroid comments of one cluster emphasized button positions, while the centroid comments of the other cluster emphasized button interface issues. When the volume and music control buttons were designed separately, the participants experienced less confusion, and thus the comments mentioned only the buttons' positions. When the volume and music control buttons were designed as a single button, the participants experienced interface issues with the buttons, such as the operating methods of functions and confusion between the functions' buttons. The relevance of the cluster centroid comments to the extracted features demonstrates the capability of text processing algorithms to analyze qualitative text data from usability testing and evaluations. Keywords: usability, qualitative data, text-processing algorithm, natural language processing
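The pipeline described here (comments mapped into a vector space, clustered, then interpreted through the terms nearest each centroid) can be sketched with scikit-learn. The snippet below is a generic illustration rather than the authors' pipeline; the toy comments and the choice of two clusters are assumptions.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy usability comments standing in for the headset survey responses.
comments = [
    "volume button is hard to reach near the back",
    "music control button position feels awkward",
    "single button confuses volume and track skipping",
    "could not tell which function the button triggers",
]

# Map comments into a TF-IDF vector space.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(comments)

# Cluster the comment vectors (two clusters assumed for illustration).
km = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = km.fit_predict(X)

# Inspect the terms closest to each cluster centroid, mirroring the
# "centroid comments" interpretation step described in the abstract.
terms = vectorizer.get_feature_names_out()
for c in range(2):
    top = km.cluster_centers_[c].argsort()[::-1][:3]
    print(f"cluster {c}: {[terms[i] for i in top]}")
print("labels:", labels.tolist())
```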
Procedia PDF Downloads 285
2534 Accelerated Evaluation of Structural Reliability under Tsunami Loading
Authors: Sai Hung Cheung, Zhe Shao
Abstract:
It is of our great interest to quantify the risk to structural dynamic systems due to earthquake-induced tsunamis in view of recent earthquake-induced tsunamis in Padang, 2004 and Tohoku, 2011 which brought huge losses of lives and properties. Despite continuous advancement in computational simulation of the tsunami and wave-structure interaction modeling, it still remains computationally challenging to evaluate the reliability of a structural dynamic system when uncertainties related to the system and its modeling are taken into account. The failure of the structure in a tsunami-wave-structural system is defined as any response quantities of the system exceeding specified thresholds during the time when the structure is subjected to dynamic wave impact due to earthquake-induced tsunamis. In this paper, an approach based on a novel integration of a recently proposed moving least squares response surface approach for stochastic sampling and the Subset Simulation algorithm is proposed. The effectiveness of the proposed approach is discussed by comparing its results with those obtained from the Subset Simulation algorithm without using the response surface approach.Keywords: response surface, stochastic simulation, structural reliability tsunami, risk
Procedia PDF Downloads 675
2533 Efficient Semi-Systolic Finite Field Multiplier Using Redundant Basis
Authors: Hyun-Ho Lee, Kee-Won Kim
Abstract:
The arithmetic operations over GF(2m) have been extensively used in error correcting codes and public-key cryptography schemes. Finite field arithmetic includes addition, multiplication, division and inversion operations. Addition is very simple and can be implemented with an extremely simple circuit. The other operations are much more complex. The multiplication is the most important for cryptosystems, such as the elliptic curve cryptosystem, since computing exponentiation, division, and computing multiplicative inverse can be performed by computing multiplication iteratively. In this paper, we present a parallel computation algorithm that operates Montgomery multiplication over finite field using redundant basis. Also, based on the multiplication algorithm, we present an efficient semi-systolic multiplier over finite field. The multiplier has less space and time complexities compared to related multipliers. As compared to the corresponding existing structures, the multiplier saves at least 5% area, 50% time, and 53% area-time (AT) complexity. Accordingly, it is well suited for VLSI implementation and can be easily applied as a basic component for computing complex operations over finite field, such as inversion and division operation.Keywords: finite field, Montgomery multiplication, systolic array, cryptography
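Because multiplication over GF(2^m) underpins the exponentiation, inversion, and division operations mentioned above, a plain software reference is useful for checking hardware designs. The sketch below is a straightforward polynomial-basis multiply-and-reduce in GF(2^8), not the redundant-basis Montgomery semi-systolic scheme proposed in the paper; the AES field polynomial is chosen only as an example.

```python
def gf2m_multiply(a: int, b: int, poly: int = 0x11B, m: int = 8) -> int:
    """Polynomial-basis multiplication in GF(2^m).

    Elements are integers whose bits are polynomial coefficients over GF(2);
    `poly` is the irreducible field polynomial (default x^8+x^4+x^3+x+1,
    the AES field, used here purely for illustration).
    """
    result = 0
    while b:
        if b & 1:                # add (XOR) a for each set coefficient of b
            result ^= a
        b >>= 1
        a <<= 1
        if a & (1 << m):         # reduce modulo the field polynomial
            a ^= poly
    return result

def gf2m_pow(a: int, e: int, poly: int = 0x11B, m: int = 8) -> int:
    """Exponentiation by square-and-multiply, built on repeated multiplication."""
    acc = 1
    while e:
        if e & 1:
            acc = gf2m_multiply(acc, a, poly, m)
        a = gf2m_multiply(a, a, poly, m)
        e >>= 1
    return acc

if __name__ == "__main__":
    # Inversion via a^(2^m - 2) = a^-1 for nonzero a in GF(2^m).
    a = 0x57
    inv = gf2m_pow(a, (1 << 8) - 2)
    assert gf2m_multiply(a, inv) == 1
    print(f"inverse of {a:#x} in GF(2^8) is {inv:#x}")
```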
Procedia PDF Downloads 294
2532 Performance Evaluation of MIMO-OFDM Communication Systems
Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany
Abstract:
This paper evaluates the bit error rate (BER) performance of a MIMO-OFDM communication system. A MIMO system uses multiple transmitting and receiving antennas with different coding techniques to enhance either the transmission diversity or the spatial multiplexing gain. With the Alamouti algorithm, the same information is transmitted over multiple antennas at different time intervals and then collected again at the receivers to minimize the probability of error, combat fading, and thus improve the received signal-to-noise ratio. With the V-BLAST algorithm, the transmitted signals are divided into different transmitting channels and transferred over the channel to be received by different receiving antennas to increase the transmitted data rate and achieve higher throughput. The paper provides a study of different diversity gain coding schemes and spatial multiplexing coding for MIMO systems. A comparison of various channel estimation and equalization techniques is given. The simulation is implemented using MATLAB, and the results show the performance of the transmission models under different channel environments. Keywords: MIMO communication, BER, space codes, channels, Alamouti, V-BLAST
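The Alamouti scheme described above has a compact closed form for two transmit antennas and one receive antenna. The numpy sketch below is a stand-in for the authors' MATLAB simulation: it encodes two symbols over two time slots and shows the linear combining that recovers them with full transmit diversity; the flat Rayleigh fading, QPSK symbols, and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two QPSK symbols to send (assumed modulation, unit energy).
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)

# Flat Rayleigh fading from the two transmit antennas to one receive antenna.
h1, h2 = (rng.standard_normal(2) + 1j * rng.standard_normal(2)) / np.sqrt(2)

noise = 0.05 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))

# Alamouti space-time block code over two symbol periods:
#   t1: antenna 1 sends s1, antenna 2 sends s2
#   t2: antenna 1 sends -conj(s2), antenna 2 sends conj(s1)
r1 = h1 * s1 + h2 * s2 + noise[0]
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1) + noise[1]

# Linear combining at the receiver (assumes perfect channel knowledge).
s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)

# Each estimate equals (|h1|^2 + |h2|^2) * s plus noise, i.e. full diversity.
gain = abs(h1) ** 2 + abs(h2) ** 2
print("s1 estimate:", s1_hat / gain, "vs", s1)
print("s2 estimate:", s2_hat / gain, "vs", s2)
```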
Procedia PDF Downloads 175
2531 A Constructivist Approach and Tool for Autonomous Agent Bottom-up Sequential Learning
Authors: Jianyong Xue, Olivier L. Georgeon, Salima Hassas
Abstract:
During the initial phase of cognitive development, infants exhibit amazing abilities to generate novel behaviors in unfamiliar situations, and explore actively to learn the best while lacking extrinsic rewards from the environment. These abilities set them apart from even the most advanced autonomous robots. This work seeks to contribute to understand and replicate some of these abilities. We propose the Bottom-up hiErarchical sequential Learning algorithm with Constructivist pAradigm (BEL-CA) to design agents capable of learning autonomously and continuously through interactions. The algorithm implements no assumption about the semantics of input and output data. It does not rely upon a model of the world given a priori in the form of a set of states and transitions as well. Besides, we propose a toolkit to analyze the learning process at run time called GAIT (Generating and Analyzing Interaction Traces). We use GAIT to report and explain the detailed learning process and the structured behaviors that the agent has learned on each decision making. We report an experiment in which the agent learned to successfully interact with its environment and to avoid unfavorable interactions using regularities discovered through interaction.Keywords: cognitive development, constructivist learning, hierarchical sequential learning, self-adaptation
Procedia PDF Downloads 181
2530 Isolation and Classification of Red Blood Cells in Anemic Microscopic Images
Authors: Jameela Ali Alkrimi, Abdul Rahim Ahmad, Azizah Suliman, Loay E. George
Abstract:
Red blood cells (RBCs) are among the most commonly and intensively studied types of blood cells in cell biology. A lack of RBCs is a condition characterized by a lower than normal hemoglobin level; this condition is referred to as 'anemia'. In this study, software was developed to isolate RBCs using a machine learning approach and to classify anemic RBCs in microscopic images. Several features of RBCs were extracted using image processing algorithms, including principal component analysis (PCA). With the proposed method, RBCs were isolated in 34 seconds from an image containing 18 to 27 cells. We also propose that PCA can be performed to increase the speed and efficiency of classification. Our classifiers yielded accuracy rates of 100%, 99.99%, and 96.50% for the K-nearest neighbor (K-NN) algorithm, the support vector machine (SVM), and the artificial neural network (ANN), respectively. Classification was evaluated in terms of sensitivity, specificity, and the kappa statistic. In conclusion, the classification results were obtained in a shorter time and more efficiently when PCA was used. Keywords: red blood cells, pre-processing image algorithms, classification algorithms, principal component analysis PCA, confusion matrix, kappa statistical parameters, ROC
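The combination of PCA for dimensionality reduction followed by a K-NN classifier, as evaluated above, can be sketched with scikit-learn. The snippet below uses randomly generated feature vectors in place of the extracted RBC features, so the numbers of samples, features, components, and classes are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import cohen_kappa_score, accuracy_score

rng = np.random.default_rng(3)

# Stand-in for extracted RBC features: 300 cells x 20 morphological features,
# with class 1 (anemic) shifted so the classes are separable.
X = rng.normal(size=(300, 20))
y = rng.integers(0, 2, size=300)
X[y == 1] += 1.5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# PCA keeps the leading components, speeding up the downstream classifier.
pca = PCA(n_components=5)
X_tr_p = pca.fit_transform(X_tr)
X_te_p = pca.transform(X_te)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_tr_p, y_tr)
pred = knn.predict(X_te_p)

print("accuracy:", accuracy_score(y_te, pred))
print("kappa   :", cohen_kappa_score(y_te, pred))
```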
Procedia PDF Downloads 405
2529 IT Workforce Enablement: How Cloud Computing Changes the Competence Mix of the IT Workforce
Authors: Dominik Krimpmann
Abstract:
Cloud computing has provided the impetus for change in the demand, sourcing, and consumption of IT-enabled services. The technology developed from an emerging trend towards a ‘must-have’. Many organizations harnessed on the quick-wins of cloud computing within the last five years but nowadays reach a plateau when it comes to sustainable savings and performance. This study aims to investigate what is needed from an organizational perspective to make cloud computing a sustainable success. The study was carried out in Germany among senior IT professionals, both in management and delivery positions. Our research shows that IT executives must be prepared to realign their IT workforce to sustain the advantage of cloud computing for today and the near future. While new roles will undoubtedly emerge, roles alone cannot ensure the success of cloud deployments. What is needed is a change in the IT workforce’s business behaviour, or put more simply, the ways in which the IT personnel works. It gives clear guidance on which dimensions of an employees’ working behaviour need to be adapted. The practical implications are drawn from a series of semi-structured interviews, resulting in a high-level workforce enablement plan. Lastly, it elaborates on tools and gives clear guidance on which pitfalls might arise along the proposed workforce enablement process.Keywords: cloud computing, organization design, organizational change, workforce enablement
Procedia PDF Downloads 310
2528 An Assessment of Poland's Current Macroeconomic Conditions to Determine Whether It Is in a Middle Income Trap
Authors: Bozena Leven
Abstract:
The middle-income trap (MIT) describes a situation faced by countries at a relatively mature stage of development that often poses an obstacle to sustainable long-term growth. MIT is characterized by declining factor productivity from the exhaustion of labor intensive, import and Foreign Direct Investment (FDI) based strategies when middle-income status is achieved. In this paper, we focus on MIT and Poland. In the past two decades, Poland experienced steady growth based largely on imported technologies and low-cost labor. Recently, that economic growth has slowed, prompting economists to ask whether Poland is experiencing MIT. To answer this question, we analyze changes in investment in Poland; specifically- its growth and composition – as well as savings, FDI, educational attainments of the labor force, development of new technologies and products, the role of imports, diversification of exports, and product complexity. We also examine the development of modern infrastructure, institutions (including legal environment) and demographic changes in Poland that support growth. Our findings indicate that certain factors consistent with MIT are gaining importance in Poland, and represent a challenge to that country’s future growth rate.Keywords: engines of growth, factor productivity, middle income trap, sustainable development
Procedia PDF Downloads 211
2527 A QoE-driven Cross-layer Resource Allocation Scheme for High Traffic Service over Open Wireless Network Downlink
Authors: Liya Shan, Qing Liao, Qinyue Hu, Shantao Jiang, Tao Wang
Abstract:
In this paper, a Quality of Experience (QoE)-driven cross-layer resource allocation scheme for high traffic service over an Open Wireless Network (OWN) downlink is proposed, and the related problem for the users in the whole cell, including the users in the overlap region of different cells, is solved. A method that adopts assessment models for the Best Effort service and a no-reference assessment algorithm for the video service is introduced to calculate the Mean Opinion Score (MOS) value for high traffic service. The cross-layer architecture jointly considers parameters in the application layer, the media access control layer, and the physical layer. Based on this architecture and the MOS value, the Binary Constrained Particle Swarm Optimization (B_CPSO) algorithm is used to solve the cross-layer resource allocation problem. In addition, simulation results show that the proposed scheme significantly outperforms other schemes in terms of maximizing the average users' MOS value for the whole system as well as maintaining fairness among users. Keywords: high traffic service, cross-layer resource allocation, QoE, B_CPSO, OWN
Procedia PDF Downloads 541
2526 A Genetic Algorithm Approach to Solve a Weaving Job Scheduling Problem, Aiming Tardiness Minimization
Authors: Carolina Silva, João Nuno Oliveira, Rui Sousa, João Paulo Silva
Abstract:
This study uses genetic algorithms to solve a job scheduling problem in a weaving factory. The underlying problem is an NP-hard problem concerning unrelated parallel machines with sequence-dependent setup times. This research uses real data from a weaving company located in the north of Portugal, with a capacity of 96 looms and a production, on average, of 440,000 meters of fabric per month. Moreover, this study involves a high level of complexity, since most of the real production constraints are applied and several real data instances are tested. Topics such as data analysis and algorithm performance are addressed and tested in order to offer a solution that can generate reliable, due-date-aware results. All the approaches will be tested in the operational environment and the KPIs monitored to understand the solution's impact on production, with a particular focus on the total number of weeks of late deliveries to clients. Thus, the main goal of this research is to develop a solution that allows the automatic generation of optimized production plans, aiming at tardiness minimization. Keywords: genetic algorithms, textile industry, job scheduling, optimization
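As a hedged illustration of the approach, and greatly simplified from the unrelated-parallel-machine problem with sequence-dependent setups described above, the sketch below evolves job permutations on a single machine to minimize total tardiness. The processing times, due dates, and GA parameters are made up for the example.

```python
import random

random.seed(0)

# (processing_time, due_date) for each job -- illustrative values only.
jobs = [(4, 10), (2, 6), (6, 14), (3, 7), (5, 20), (7, 18)]

def total_tardiness(order):
    """Sum of lateness beyond each job's due date for a given sequence."""
    t, tardiness = 0, 0
    for j in order:
        p, d = jobs[j]
        t += p
        tardiness += max(0, t - d)
    return tardiness

def crossover(a, b):
    """Order crossover (OX): keep a slice of parent a, fill the rest from b."""
    i, k = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:k] = a[i:k]
    rest = [g for g in b if g not in child]
    for idx in range(len(a)):
        if child[idx] is None:
            child[idx] = rest.pop(0)
    return child

def mutate(order, rate=0.2):
    """Swap two positions with a small probability."""
    if random.random() < rate:
        i, k = random.sample(range(len(order)), 2)
        order[i], order[k] = order[k], order[i]
    return order

population = [random.sample(range(len(jobs)), len(jobs)) for _ in range(30)]
for _ in range(200):
    population.sort(key=total_tardiness)
    parents = population[:10]              # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    population = parents + children

best = min(population, key=total_tardiness)
print("best sequence:", best, "total tardiness:", total_tardiness(best))
```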
Procedia PDF Downloads 157
2525 Investigation of Passive Solutions of Thermal Comfort in Housing Aiming to Reduce Energy Consumption
Authors: Josiane R. Pires, Marco A. S. González, Bruna L. Brenner, Luciana S. Roos
Abstract:
The concern with sustainability has brought the need to optimize buildings in order to reduce the consumption of natural resources. Almost one third of the energy demanded by Brazilian housing is used to provide thermal comfort. The AEC sector may contribute by applying bioclimatic strategies in building design. The aim of this research is to investigate the viability of applying some alternative solutions in residential buildings. The research was developed with computational simulation of single-family social housing, examining envelope type, absorptance, and insolation. The analysis of thermal performance applied both the Brazilian standard NBR 15575 and the degree-hour method, for the scenario of Porto Alegre, a southern Brazilian city. We used BIM modeling through Revit/Autodesk and used EnergyPlus for thermal simulation. The payback of the investment was calculated by comparing energy savings and building costs over a period of 50 years. The results show that with the increment of the envelope's insulation there is an improvement in thermal comfort and an energy economy, with a payback period of 24 to 36 years in some cases. Keywords: civil construction, design, thermal performance, energy, economic analysis
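The payback figure quoted above comes from weighing the extra construction cost against yearly energy savings. The short sketch below shows a simple, non-discounted payback calculation of that kind; the cost, savings, and tariff numbers are placeholders, not values from the study.

```python
def simple_payback(extra_cost, annual_energy_savings_kwh, tariff_per_kwh,
                   horizon_years=50):
    """Years until cumulative energy savings cover the extra envelope cost.

    Returns None if the investment does not pay back within the horizon.
    """
    annual_saving = annual_energy_savings_kwh * tariff_per_kwh
    cumulative = 0.0
    for year in range(1, horizon_years + 1):
        cumulative += annual_saving
        if cumulative >= extra_cost:
            return year
    return None

# Placeholder numbers for an insulated-envelope upgrade (not from the paper).
years = simple_payback(extra_cost=12000.0,
                       annual_energy_savings_kwh=900.0,
                       tariff_per_kwh=0.5)
print("payback period:", years, "years")
```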
Procedia PDF Downloads 552
2524 Study of University Course Scheduling for Crowd Gathering Risk Prevention and Control in the Context of Routine Epidemic Prevention
Authors: Yuzhen Hu, Sirui Wang
Abstract:
As a training base for intellectual talents, universities have a large number of students. Teaching is a primary activity in universities, and during the teaching process, a large number of people gather both inside and outside the teaching buildings, posing a strong risk of close contact. The class schedule is the fundamental basis for teaching activities in universities and plays a crucial role in the management of teaching order. Different class schedules can lead to varying degrees of indoor gatherings and trajectories of class attendees. In recent years, highly contagious diseases have frequently occurred worldwide, and how to reduce the risk of infection has always been a hot issue related to public safety. "Reducing gatherings" is one of the core measures in epidemic prevention and control, and it can be controlled through scientific scheduling in specific environments. Therefore, the scientific prevention and control goal can be achieved by considering the reduction of the risk of excessive gathering of people during the course schedule arrangement. Firstly, we address the issue of personnel gathering in various pathways on campus, with the goal of minimizing congestion and maximizing teaching effectiveness, establishing a nonlinear mathematical model. Next, we design an improved genetic algorithm, incorporating real-time evacuation operations based on tracking search and multidimensional positive gradient cross-mutation operations, considering the characteristics of outdoor crowd evacuation. Finally, we apply undergraduate course data from a university in Harbin to conduct a case study. It compares and analyzes the effects of algorithm improvement and optimization of gathering situations and explores the impact of path blocking on the degree of gathering of individuals on other pathways.Keywords: the university timetabling problem, risk prevention, genetic algorithm, risk control
Procedia PDF Downloads 88
2523 Iterative Solver for Solving Large-Scale Frictional Contact Problems
Authors: Thierno Diop, Michel Fortin, Jean Deteix
Abstract:
Since the precise formulation of the elastic part is irrelevant for the description of the algorithm, we shall consider a generic case. In practice, however, we will have to deal with a non linear material (for instance a Mooney-Rivlin model). We are interested in solving a finite element approximation of the problem, leading to large-scale non linear discrete problems and, after linearization, to large linear systems and ultimately to calculations needing iterative methods. This also implies that penalty method, and therefore augmented Lagrangian method, are to be banned because of their negative effect on the condition number of the underlying discrete systems and thus on the convergence of iterative methods. This is in rupture to the mainstream of methods for contact in which augmented Lagrangian is the principal tool. We shall first present the problem and its discretization; this will lead us to describe a general solution algorithm relying on a preconditioner for saddle-point problems which we shall describe in some detail as it is not entirely standard. We will propose an iterative approach for solving three-dimensional frictional contact problems between elastic bodies, including contact with a rigid body, contact between two or more bodies and also self-contact.Keywords: frictional contact, three-dimensional, large-scale, iterative method
Procedia PDF Downloads 210
2522 An Innovative Auditory Impulsed EEG and Neural Network Based Biometric Identification System
Authors: Ritesh Kumar, Gitanjali Chhetri, Mandira Bhatia, Mohit Mishra, Abhijith Bailur, Abhinav
Abstract:
The prevalence of the internet and technology in our day-to-day lives is creating more security issues than ever. The need to protect and provide secure access to private and business data has led to the development of many security systems. One of the potential solutions is to employ a biometric authentication technique. In this paper we present an innovative biometric authentication method that utilizes a person's EEG signal, which is acquired in response to an auditory stimulus and transferred wirelessly to a computer that runs the necessary ANN algorithm, a multilayer perceptron neural network, chosen because of its ability to differentiate between information which is not linearly separable. In order to determine the weights of the hidden layer we use Gaussian random weight initialization. The MLP utilizes a supervised learning technique called back propagation for training the network. The complex algorithm used for EEG classification reduces the chances of intrusion into protected public or private data. Keywords: EEG signal, auditory evoked potential, biometrics, multilayer perceptron neural network, back propagation rule, Gaussian random weight initialization
Procedia PDF Downloads 409
2521 A Distributed Mobile Agent Based on Intrusion Detection System for MANET
Authors: Maad Kamal Al-Anni
Abstract:
This study concerns an artificial neural network based on the multilayer perceptron (MLP), applied to the classification and clustering of mobile ad hoc network vulnerabilities. A mobile ad hoc network (MANET) consists of ubiquitous intelligent internetworking devices that can sense their environment through an autonomous system of mobile nodes connected via wireless links. Security is the most important subject in MANETs due to the easy penetration scenarios that occur in such auto-configuring networks. One of the powerful techniques used for inspecting network packets is the Intrusion Detection System (IDS); in this article, we show the effectiveness of artificial neural networks used as machine learning, along with a stochastic approach (information gain), to classify malicious behaviors in a simulated network with respect to different IDS techniques. The monitoring agent is responsible for the detection inference engine; the audit data is collected from the collecting agent by simulating node attacks, and the outputs are contrasted with the normal behaviors of the framework. Whenever there is any deviation from the ordinary behaviors, the monitoring agent considers the event an attack. In this article we demonstrate the signature-based IDS approach in a MANET by implementing the back propagation algorithm over an ensemble-based Traffic Table (TT), so that the signatures of malicious behaviors or undesirable activities are significantly predicted and efficiently figured out by tuning the parametric set-up of the back propagation algorithm; the experimental results empirically show its effectiveness, with a detection rate of up to 98.6 percent. Performance metrics are also included in this article, presented with Xgraph plots of measures such as Packet Delivery Ratio (PDR), Throughput (TP), and Average Delay (AD). Keywords: Intrusion Detection System (IDS), Mobile Adhoc Networks (MANET), Back Propagation Algorithm (BPA), Neural Networks (NN)
Procedia PDF Downloads 194
2520 Using Personalized Spiking Neural Networks, Distinct Techniques for Self-Governing
Authors: Brwa Abdulrahman Abubaker
Abstract:
Recently, there has been a lot of interest in the difficult task of applying reinforcement learning to autonomous mobile robots. Traditional reinforcement learning (TRL) techniques have many drawbacks, such as lengthy computation times, intricate control frameworks, a great deal of trial-and-error searching, and sluggish convergence. In this paper, a modified Spiking Neural Network (SNN) is used to offer a distinct method for autonomous mobile robot learning and control in unexpected surroundings. As a learning algorithm, the suggested model combines dopamine modulation with spike-timing-dependent plasticity (STDP). In order to create more computationally efficient, biologically inspired control systems that are adaptable to changing settings, this work uses the effective and physiologically credible Izhikevich neuron model. This study is primarily focused on creating an algorithm for target tracking in the presence of obstacles. Results show that the SNN trained with three obstacles yielded an impressive 96% success rate for our proposal, with collisions happening in about 4% of the 214 simulated seconds. Keywords: spiking neural network, spike-timing-dependent plasticity, dopamine modulation, reinforcement learning
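The Izhikevich model mentioned above reduces to two coupled update equations, which is part of what makes it attractive for efficient SNN controllers. The sketch below integrates a single regular-spiking neuron driven by a constant input current; the parameter values are the standard regular-spiking set from Izhikevich's published model, and the controller, STDP, and dopamine terms from the study are not included.

```python
# Standard regular-spiking parameters for the Izhikevich neuron model.
a, b, c, d = 0.02, 0.2, -65.0, 8.0

dt = 0.5            # integration step (ms)
T = 1000.0          # total simulated time (ms)
I = 10.0            # constant input current (dimensionless drive)

v, u = -65.0, b * -65.0
spike_times = []

for step in range(int(T / dt)):
    t = step * dt
    # Membrane potential and recovery variable updates (Izhikevich 2003).
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * (a * (b * v - u))
    if v >= 30.0:   # spike: reset membrane potential, bump recovery variable
        spike_times.append(t)
        v, u = c, u + d

rate = len(spike_times) / (T / 1000.0)
print(f"{len(spike_times)} spikes in {T:.0f} ms (~{rate:.1f} Hz)")
```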
Procedia PDF Downloads 21
2519 Hybridized Approach for Distance Estimation Using K-Means Clustering
Authors: Ritu Vashistha, Jitender Kumar
Abstract:
Clustering with the K-means algorithm is a very common way to understand and analyze output data. Grouping similar objects is the basis of clustering. A set of objects is partitioned into clusters, where the number of clusters is always smaller than the number of objects and each cluster has its own centroid; the major problem is how to verify that the identified clusters are correct for the given data. Clusters are not formed in a single pass over the records; they are built through an iterative process in which each and every record, tuple, or entity is checked and its similarity or dissimilarity is examined. This iterative process can be very lengthy and may fail to give an optimal output for the clusters or for the time taken to find them. To overcome this drawback, we propose a formula to find the clusters at run time, so that this approach can give optimal results. The proposed approach uses the Euclidean distance formula to find the minimum distance between slots, which we technically call clusters, and the same approach has also been applied to the Ant Colony Optimization (ACO) algorithm, resulting in the production of two- and multi-dimensional matrices. Keywords: ant colony optimization, data clustering, centroids, data mining, k-means
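The assignment/update loop at the heart of K-means, with Euclidean distance driving the assignment step, can be written in a few lines of numpy. This is a generic sketch of the standard algorithm, not the run-time formula proposed in the paper; the synthetic 2-D data and the choice of three clusters are assumptions.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain K-means: assign each point to its nearest centroid (Euclidean
    distance), then move each centroid to the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: pairwise Euclidean distances, nearest centroid wins.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: recompute centroids; keep the old one if a cluster is empty.
        new_centroids = np.array([
            X[labels == c].mean(axis=0) if np.any(labels == c) else centroids[c]
            for c in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    X = np.vstack([rng.normal(loc, 0.3, size=(50, 2)) for loc in (0.0, 3.0, 6.0)])
    labels, centroids = kmeans(X, k=3)
    print("centroids:\n", centroids.round(2))
```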
Procedia PDF Downloads 128
2518 Integrated Genetic-A* Graph Search Algorithm Decision Model for Evaluating Cost and Quality of School Renovation Strategies
Authors: Yu-Ching Cheng, Yi-Kai Juan, Daniel Castro
Abstract:
Energy consumption of buildings has been an increasing concern for researchers and practitioners in the last decade. Sustainable building renovation can reduce energy consumption and carbon dioxide emissions; meanwhile, it also can extend existing buildings useful life and facilitate environmental sustainability while providing social and economic benefits to the society. School buildings are different from other designed spaces as they are more crowded and host the largest portion of daily activities and occupants. Strategies that focus on reducing energy use but also improve the students’ learning environment becomes a significant subject in sustainable school buildings development. A decision model is developed in this study to solve complicated and large-scale combinational, discrete and determinate problems such as school renovation projects. The task of this model is to automatically search for the most cost-effective (lower cost and higher quality) renovation strategies. In this study, the search process of optimal school building renovation solutions is by nature a large-scale zero-one programming determinate problem. A* is suitable for solving deterministic problems due to its stable and effective search process, and genetic algorithms (GA) provides opportunities to acquire global optimal solutions in a short time via its indeterminate search process based on probability. These two algorithms are combined in this study to consider trade-offs between renovation cost and improved quality, this decision model is able to evaluate current school environmental conditions and suggest an optimal scheme of sustainable school buildings renovation strategies. Through adoption of this decision model, school managers can overcome existing limitations and transform school buildings into spaces more beneficial to students and friendly to the environment.Keywords: decision model, school buildings, sustainable renovation, genetic algorithm, A* search algorithm
Procedia PDF Downloads 118
2517 A Fast Optimizer for Large-scale Fulfillment Planning based on Genetic Algorithm
Authors: Choonoh Lee, Seyeon Park, Dongyun Kang, Jaehyeong Choi, Soojee Kim, Younggeun Kim
Abstract:
Market Kurly is the first South Korean online grocery retailer that guarantees same-day, overnight shipping. More than 1.6 million customers place an average of 4.7 million orders and add 3 to 14 products into a cart per month. The company has sold almost 30,000 kinds of various products in the past 6 months, including food items, cosmetics, kitchenware, toys for kids/pets, and even flowers. The company is operating and expanding multiple dry, cold, and frozen fulfillment centers in order to store and ship these products. Due to the scale and complexity of the fulfillment, pick-pack-ship processes are planned and operated in batches, and thus, the planning that decides the batch of the customers’ orders is a critical factor in overall productivity. This paper introduces a metaheuristic optimization method that reduces the complexity of batch processing in a fulfillment center. The method is an iterative genetic algorithm with heuristic creation and evolution strategies; it aims to group similar orders into pick-pack-ship batches to minimize the total number of distinct products. With a well-designed approach to create initial genes, the method produces streamlined plans, up to 13.5% less complex than the actual plans carried out in the company’s fulfillment centers in the previous months. Furthermore, our digital-twin simulations show that the optimized plans can reduce 3% of operation time for packing, which is the most complex and time-consuming task in the process. The optimization method implements a multithreading design on the Spring framework to support the company’s warehouse management systems in near real-time, finding a solution for 4,000 orders within 5 to 7 seconds on an AWS c5.2xlarge instance.Keywords: fulfillment planning, genetic algorithm, online grocery retail, optimization
Procedia PDF Downloads 83
2516 Design Standardization in Aramco: Strategic Analysis
Authors: Mujahid S. Alharbi
Abstract:
The construction of process plants in oil and gas-producing countries, such as Saudi Arabia, necessitates substantial investment in design and building. Each new plant, while unique, includes common building types, suggesting an opportunity for design standardization. This study investigates the adoption of standardized Issue For Construction (IFC) packages for non-process buildings in Saudi Aramco. A SWOT analysis presents the strengths, weaknesses, opportunities, and threats of this approach. The approach's benefits are illustrated using the Hawiyah Unayzah Gas Reservoir Storage Program (HUGRSP) as a case study. Standardization not only offers significant cost savings and operational efficiencies but also expedites project timelines, reduces the potential for change orders, and fosters local economic growth by allocating building tasks to local contractors. Standardization also improves project management by easing interface constraints between different contractors and promoting adaptability to future industry changes. This research underscores the standardization of non-process buildings as a powerful strategy for cost optimization, efficiency enhancement, and local economic development in process plant construction within the oil and gas sector.Keywords: building, construction, management, project, standardization
Procedia PDF Downloads 64
2515 Automating 2D CAD to 3D Model Generation Process: Wall pop-ups
Authors: Mohit Gupta, Chialing Wei, Thomas Czerniawski
Abstract:
In this paper, we have built a neural network that can detect walls on 2D sheets and subsequently create a 3D model in Revit using Dynamo. The training set includes 3,500 labeled images, and the detection algorithm used is YOLO. Typically, engineers and designers make concentrated efforts to convert 2D CAD drawings to 3D models, which costs a considerable amount of time and human effort. This paper contributes to automating the task of 3D wall modeling by 1) detecting walls in 2D CAD and generating 3D pop-ups in Revit, and 2) saving designers modeling time when drafting elements like walls from 2D CAD into a 3D representation. An object detection algorithm, YOLO, is used for wall detection and localization. The neural network is trained on 3,500 labeled images of size 256x256x3. Then, Dynamo is interfaced with the output of the neural network to pop up 3D walls in Revit. The research uses modern technological tools like deep learning and artificial intelligence to automate the process of generating 3D walls without needing humans to model them manually, thus contributing to saving time, human effort, and money. Keywords: neural networks, Yolo, 2D to 3D transformation, CAD object detection
Procedia PDF Downloads 144
2514 Land Cover Classification Using Sentinel-2 Image Data and Random Forest Algorithm
Authors: Thanh Noi Phan, Martin Kappas, Jan Degener
Abstract:
The recently launched Sentinel-2 (S2) satellite (June 2015) brings great potential and opportunities for land use/cover mapping applications, due to its fine spatial resolution and multispectral bands as well as its high temporal resolution. So far, there are only a handful of studies using real S2 data for land cover classification. In particular, for northern Vietnam, to the best of our knowledge, there exist no studies using S2 data for land cover mapping. The aim of this study is to provide a preliminary result of land cover classification using Sentinel-2 data with a rising state-of-the-art classifier, Random Forest. A case study with heterogeneous land use/cover in the east of the Hanoi capital region of Vietnam was chosen for this study. All ten spectral bands of 10 and 20 m pixel size of the S2 images were used, and the 10 m bands were resampled to 20 m. Among several classification algorithms, the supervised Random Forest classifier (RF) was applied because it has been reported as one of the most accurate methods for satellite image classification. The results showed that the red-edge and shortwave infrared (SWIR) bands play an important role in the land cover classification results. A very high overall accuracy, above 90%, was achieved. Keywords: classify algorithm, classification, land cover, random forest, sentinel 2, Vietnam
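A supervised Random Forest classification of per-pixel band values, as used in this study, can be set up in a few lines with scikit-learn. The sketch below trains on synthetic ten-band spectra rather than real Sentinel-2 pixels, so the band values, class count, and forest size are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(5)

# Stand-in for Sentinel-2 training pixels: 10 band reflectances per sample,
# 4 land cover classes with different mean spectra (purely synthetic).
n_per_class, n_bands, n_classes = 200, 10, 4
X = np.vstack([rng.normal(0.1 * (c + 1), 0.03, size=(n_per_class, n_bands))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_tr, y_tr)
pred = rf.predict(X_te)

print("overall accuracy:", accuracy_score(y_te, pred))
# Feature importances hint at which bands matter most (e.g., red-edge and
# SWIR in the real study); here they reflect only the synthetic data.
print("band importances:", rf.feature_importances_.round(3))
```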
Procedia PDF Downloads 388
2513 Arabic Lexicon Learning to Analyze Sentiment in Microblogs
Authors: Mahmoud B. Rokaya
Abstract:
The study of opinion mining and sentiment analysis includes analysis of opinions, sentiments, evaluations, attitudes, and emotions. The rapid growth of social media, social networks, reviews, forum discussions, microblogs, and Twitter, leads to a parallel growth in the field of sentiment analysis. The field of sentiment analysis tries to develop effective tools to make it possible to capture the trends of people. There are two approaches in the field, lexicon-based and corpus-based methods. A lexicon-based method uses a sentiment lexicon which includes sentiment words and phrases with assigned numeric scores. These scores reveal if sentiment phrases are positive or negative, their intensity, and/or their emotional orientations. Creation of manual lexicons is hard. This brings the need for adaptive automated methods for generating a lexicon. The proposed method generates dynamic lexicons based on the corpus and then classifies text using these lexicons. In the proposed method, different approaches are combined to generate lexicons from text. The proposed method classifies the tweets into 5 classes instead of +ve or –ve classes. The sentiment classification problem is written as an optimization problem, finding optimum sentiment lexicons are the goal of the optimization process. The solution was produced based on mathematical programming approaches to find the best lexicon to classify texts. A genetic algorithm was written to find the optimal lexicon. Then, extraction of a meta-level feature was done based on the optimal lexicon. The experiments were conducted on several datasets. Results, in terms of accuracy, recall and F measure, outperformed the state-of-the-art methods proposed in the literature in some of the datasets. A better understanding of the Arabic language and culture of Arab Twitter users and sentiment orientation of words in different contexts can be achieved based on the sentiment lexicons proposed by the algorithm.Keywords: social media, Twitter sentiment, sentiment analysis, lexicon, genetic algorithm, evolutionary computation
Procedia PDF Downloads 188
2512 Advantages of Neural Network Based Air Data Estimation for Unmanned Aerial Vehicles
Authors: Angelo Lerro, Manuela Battipede, Piero Gili, Alberto Brandl
Abstract:
Redundancy requirements for UAVs (Unmanned Aerial Vehicles) are hard to meet due to the generally restricted amount of available space and allowable weight for the aircraft systems, which limits their exploitation. Essential equipment such as the Air Data, Attitude and Heading Reference System (ADAHRS) requires several external probes to measure significant data such as the angle of attack or the sideslip angle. Previous research focused on the analysis of a patented technology named Smart-ADAHRS (Smart Air Data, Attitude and Heading Reference System) as an alternative method to obtain reliable and accurate estimates of the aerodynamic angles. This solution is based on an innovative sensor fusion algorithm implementing soft computing techniques, and it allows a simplified inertial and air data system to be obtained, reducing the number of external devices. In fact, only one external source of dynamic and static pressure is needed. This paper focuses on the benefits that would be gained by implementing this system in UAV applications. A simplification of the entire ADAHRS architecture would reduce the overall cost while improving safety performance. Smart-ADAHRS has currently reached Technology Readiness Level (TRL) 6. Real flight tests took place on an ultralight aircraft equipped with a suitable Flight Test Instrumentation (FTI) system. The output of the algorithm on the flight test measurements demonstrates the capability of this fusion algorithm to embed multiple physical and virtual sensors in a single device. Any source of dynamic and static pressure can be integrated with this system, gaining a significant improvement in terms of versatility. Keywords: aerodynamic angles, air data system, flight test, neural network, unmanned aerial vehicle, virtual sensor
Procedia PDF Downloads 221
2511 Resource Creation Using Natural Language Processing Techniques for Malay Translated Qur'an
Authors: Nor Diana Ahmad, Eric Atwell, Brandon Bennett
Abstract:
Text processing techniques for English have been developed over several decades, but for the Malay language, text processing methods are still far behind. Moreover, there are limited resources and tools for computational linguistic analysis available for the Malay language. Therefore, this research presents the use of natural language processing (NLP) in processing the Malay translated Qur’an text. As a result, a new language resource for the Malay translated Qur’an was created. This resource will help other researchers to build the necessary processing tools for the Malay language. This research also develops a simple question-answer prototype to demonstrate the use of the Malay Qur’an resource for text processing. The prototype has been developed using Python. It pre-processes the Malay Qur’an and an input query using a stemming algorithm and then searches for occurrences of the query word stem. The results produced show improved matching likelihood between a user query and its answer. A POS-tagging algorithm has also been produced. The stemming and tagging algorithms can be used as tools for research related to other Malay texts and can support applications such as information retrieval, question answering systems, ontology-based search, and other text analysis tasks. Keywords: language resource, Malay translated Qur'an, natural language processing (NLP), text processing
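The stem-and-search step of the prototype, where both the query and the corpus are reduced to word stems before matching, can be illustrated with a tiny Python sketch. The affix lists and example sentences below are deliberately naive simplifications for illustration only, not the stemmer or the data used in the study.

```python
# A toy suffix/prefix-stripping stemmer and stem search, mimicking the
# prototype's pre-process-then-match flow. The Malay affix handling here is
# a crude assumption (a handful of common affixes), not the study's algorithm.
SUFFIXES = ("kan", "an", "i", "nya")
PREFIXES = ("me", "di", "ber", "ke")

def stem(word: str) -> str:
    w = word.lower()
    for p in PREFIXES:
        if w.startswith(p) and len(w) - len(p) >= 3:
            w = w[len(p):]
            break
    changed = True
    while changed:                 # strip suffixes repeatedly
        changed = False
        for s in SUFFIXES:
            if w.endswith(s) and len(w) - len(s) >= 3:
                w = w[:-len(s)]
                changed = True
                break
    return w

def search(query: str, verses: list[str]) -> list[int]:
    """Return indices of verses sharing a stem with any query word."""
    query_stems = {stem(t) for t in query.split()}
    hits = []
    for i, verse in enumerate(verses):
        verse_stems = {stem(t) for t in verse.split()}
        if query_stems & verse_stems:
            hits.append(i)
    return hits

verses = [
    "makanan yang baik",           # illustrative Malay snippets, not Qur'an text
    "mereka memakan rezeki itu",
    "langit dan bumi",
]
print(search("makan", verses))     # -> [0, 1]: verses sharing the stem of "makan"
```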
Procedia PDF Downloads 318
2510 Improved Multi-Objective Particle Swarm Optimization Applied to Design Problem
Authors: Kapse Swapnil, K. Shankar
Abstract:
Aiming at optimizing the weight and deflection of a cantilever beam subject to constraints on maximum stress and maximum deflection, Multi-objective Particle Swarm Optimization (MOPSO) with a Utopia-point-based local search is implemented. The Utopia point is used to guide the search towards the Pareto-optimal set. The elite candidates obtained during the iterations are stored in an archive according to non-dominated sorting, and the archive is truncated based on least crowding distance. Local search is also performed on the elite candidates, and the most diverse particle is selected as the global best. This method is implemented on standard test functions, and it is observed that the improved algorithm gives better convergence and diversity than NSGA-II in fewer iterations. Implementation on a practical structural problem shows that in 5 to 6 iterations the improved algorithm converges with better diversity, as evidenced by an average improvement of 0.78% in the weight and 9.28% in the deflection of the cantilever beam compared to NSGA-II. Keywords: Utopia point, multi-objective particle swarm optimization, local search, cantilever beam
Procedia PDF Downloads 519