Search results for: corpus based approach
35438 A Deep Learning Approach to Detect Complete Safety Equipment for Construction Workers Based on YOLOv7
Authors: Shariful Islam, Sharun Akter Khushbu, S. M. Shaqib, Shahriar Sultan Ramit
Abstract:
In the construction sector, ensuring worker safety is of the utmost significance. In this study, a deep learning-based technique is presented for identifying safety gear worn by construction workers, such as helmets, goggles, jackets, gloves, and footwear. The suggested method precisely locates these safety items using the YOLOv7 (You Only Look Once) object detection algorithm. The dataset utilized in this work consists of labeled images split into training, testing, and validation sets. Each image has bounding box labels that indicate where the safety equipment is located within the image. The model is trained on a custom dataset to identify and categorize the safety equipment through an iterative training approach. Our trained model performed admirably well, with good precision, recall, and F1-score for safety equipment recognition. The model's evaluation also produced encouraging results, with a mAP@0.5 score of 87.7%. The model performs effectively, making it possible to quickly identify safety equipment violations on building sites. A thorough evaluation of the outcomes reveals the model's advantages and points out potential areas for development. By offering an automatic and trustworthy method for safety equipment detection, this research contributes to the fields of computer vision and workplace safety. The proposed deep learning-based approach will increase safety compliance and reduce the risk of accidents in the construction industry.
Keywords: deep learning, safety equipment detection, YOLOv7, computer vision, workplace safety
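A minimal sketch, not the authors' code, of how the reported precision, recall, and F1 are typically computed for a detector: predicted boxes are greedily matched one-to-one to ground-truth boxes at the IoU >= 0.5 threshold that also underlies the mAP@0.5 figure. The (x1, y1, x2, y2) box format and the greedy matching rule are assumptions.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def detection_metrics(predictions, ground_truths, iou_thresh=0.5):
    """Greedily match each prediction to an unused ground-truth box."""
    matched, tp = set(), 0
    for pred in predictions:
        for gi, gt in enumerate(ground_truths):
            if gi not in matched and iou(pred, gt) >= iou_thresh:
                matched.add(gi)
                tp += 1
                break
    fp = len(predictions) - tp           # unmatched predictions
    fn = len(ground_truths) - tp         # missed ground truths
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```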
Procedia PDF Downloads 68
35437 Ecological and Economical Indicators of Successful Community Based Forest Management: A Case of Lowland Community Forestry in Nepal
Authors: Bikram Jung Kunwar, Pralhad Kunwor
Abstract:
The Community-Based Forest Management (CBFM) approach is often glorified as the best forest management alternative in developing countries. However, how the approach is understood by the local user households who implement it remains unanswered for many planners, policy makers, and sometimes researchers as well. The study attempts to assess the understanding of the ecology and economics of CBFM in Nepal, where the community forestry program has been implemented since the 1970s. In order to understand the impacts of the program, eight criteria and sixteen indicators for ecological conservation, and the same numbers of criteria and indicators for the socio-economic impacts of the program, were designed and compared between before and after program implementation. The community forestry program has positive effects on forest ecology conservation and, at the same time, on the rural livelihood improvement of local people. The study revealed that a collective understanding of forest ecology and economics leads the CBFM approach towards the sustainability of the program in a win-win situation. The recommendations of the study are expected to be useful to natural resource managers, planners, and policy makers.
Keywords: community, forest management, ecology, economics, Nepal
Procedia PDF Downloads 394
35436 Memory Types in Hemodialysis (HD) Patients; A Study Based on Hemodialysis Duration, Zahedan: South East of Iran
Authors: Behnoush Sabayan, Ali Alidadi, Saeid Ebarhimi, N. M. Bakhshani
Abstract:
Hemodialysis (HD) patients are at a high risk of atherosclerotic and vascular disease, yet little information is available on the impact of HD on the brain structure of these patients. We studied brain abnormalities in HD patients, with the aim of investigating the effect of long-term HD on brain structure. Non-contrast MRI was used to evaluate imaging findings. Our study included 80 HD patients, of whom 39 had less than six months of HD and 41 had a history of HD of more than six months. The population had a mean age of 51.60 years, and 27.5% were female. According to the study, HD patients who had been hemodialyzed for a long time (median duration of HD up to 4 years) showed more small vessel ischemia than the HD patients who underwent HD for a shorter term (median duration 3 to 5 months). Most of the small vessel ischemia was located in periventricular, subcortical, and white matter regions (1.33 ± 0.471, 1.23 ± 0.420, and 1.39 ± 0.490). Other brain damage, such as central pons abnormality, global brain atrophy, thinning of the corpus callosum, and frontal lobe atrophy, was also found (P<0.01). The present study demonstrated that patients who were under HD for a longer time had more small vessel ischemia, and we conclude that this small vessel ischemia might be a causative mechanism of brain atrophy in chronic hemodialysis patients. However, additional research is needed in this area.
Keywords: hemodialysis patients, duration of hemodialysis, MRI, Zahedan
Procedia PDF Downloads 213
35435 Radar Fault Diagnosis Strategy Based on Deep Learning
Authors: Bin Feng, Zhulin Zong
Abstract:
Radar systems are critical in modern military, aviation, and maritime operations, and their proper functioning is essential for the success of these operations. However, due to the complexity and sensitivity of radar systems, they are susceptible to various faults that can significantly affect their performance. Traditional radar fault diagnosis strategies rely on expert knowledge and rule-based approaches, which are often limited in effectiveness and require a lot of time and resources. Deep learning has recently emerged as a promising approach for fault diagnosis due to its ability to learn features and patterns from large amounts of data automatically. In this paper, we propose a radar fault diagnosis strategy based on deep learning that can accurately identify and classify faults in radar systems. Our approach uses convolutional neural networks (CNNs) to extract features from radar signals and to classify faults from those features. The proposed strategy is trained and validated on a dataset of measured radar signals with various types of faults, and the results show that it achieves high accuracy in fault diagnosis. To further evaluate its effectiveness, we compare it with traditional rule-based approaches and other machine learning-based methods, including decision trees, support vector machines (SVMs), and random forests. The results demonstrate that our deep learning-based approach outperforms the traditional approaches in terms of accuracy and efficiency. Finally, we discuss the potential applications and limitations of the proposed strategy, as well as future research directions. Our study highlights the importance and potential of deep learning for radar fault diagnosis and suggests that it can be a valuable tool for improving the performance and reliability of radar systems. In summary, this paper presents a radar fault diagnosis strategy based on deep learning that achieves high accuracy and efficiency in identifying and classifying faults in radar systems. The proposed strategy has significant potential for practical applications and can pave the way for further research.
Keywords: radar system, fault diagnosis, deep learning, radar fault
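A hedged sketch of the kind of 1-D CNN classifier the strategy describes; the layer sizes, signal window length, and four fault classes are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class RadarFaultCNN(nn.Module):
    """Convolutional layers extract features from a raw radar signal
    window; a linear head classifies the fault type."""
    def __init__(self, n_classes=4, signal_len=1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(32 * (signal_len // 16), n_classes)

    def forward(self, x):                      # x: (batch, 1, signal_len)
        return self.classifier(self.features(x).flatten(1))

model = RadarFaultCNN()
logits = model(torch.randn(8, 1, 1024))        # 8 dummy signal windows
print(logits.shape)                            # torch.Size([8, 4])
```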
Procedia PDF Downloads 90
35434 Risk Management and Security Practice in Customs Supply Chain: Application of Cross ABC Method to the Moroccan Customs
Authors: Lamia Hammadi, Abdellah Ait Ouhman, Aomar Ibourk
Abstract:
It is widely assumed that the customs supply chain is a complex system, due not only to the variety and large number of actors but also to their complex structural links and the interactions between them; this is why the system is subject to various types of risks. The economic, political, and social impacts of those risks are highly detrimental to countries, businesses, and the public. For this reason, risk management in the customs supply chain is becoming a crucial issue for ensuring sustainability, security, and safety. The main characteristic of a customs risk management approach is determining which goods and means of transport should be examined, to what extent, and where future compliance resources should be directed. The purposes of this article are, firstly, to deal with the concept of the customs supply chain; secondly, to present our risk management approach based on the cross Activity Based Costing (ABC) method as an interactive tool to support decision making in customs risk management; and finally, to analyze a case study of the Moroccan customs that puts theory into practice and draws together the various elements of a structured and efficient risk management approach.
Keywords: cross ABC method, customs supply chain, risk, risk management
Procedia PDF Downloads 379
35433 Implementation of an Economic – Probabilistic Model to Risk Analysis of ERP Project in Technological Innovation Firms – A Case Study of ICT Industry in Iran
Authors: Reza Heidari, Maryam Amiri
Abstract:
In a technological world, many countries tend to fortify their companies and technological infrastructures. One of the most important requirements for developing technology is innovation, so all companies strive to consider innovation as a basic principle. Since the expansion of a product needs to combine different technologies, different innovative projects are run in firms as a basis of technology development. In such an environment, enterprise resource planning (ERP) has special significance for developing and strengthening innovations. In this article, an economic-probabilistic analysis is provided for an ERP implementation project in technological innovation (TI) based firms. The model used in this article assesses risk and economics simultaneously, in view of the probability of each event, joining the economic approach with the risk investigation approach. To provide an economic-probabilistic analysis of the risk of the project, activities and milestones in the cash flow were extracted, and the probability of occurrence of each was assessed. Since resource planning in an innovative firm is the object of this project, we extracted the various risks related to innovative projects and evaluated them in the form of cash flow. This model, by considering the risks affecting the project and the probability of each of them, and by assigning them to the project's cash flow categories, presents an adjusted cash flow based on Net Present Value (NPV) with a probabilistic simulation approach. Indeed, the model presents a risk-adjusted economic analysis of the project: it measures the NPV of the project, considering the risks that have the most effect on technological innovation projects, and then measures the probability associated with the NPV for each category. As a result of applying the presented model in the information and communication technology (ICT) industry, an appropriate analysis of the feasibility of the project was provided from the point of view of cash flow, based on the impact of risk on the project. The obtained results can be given to decision makers so that they can have a practical, systematic analysis of the feasibility of the project from an economic, risk-adjusted perspective.
Keywords: cash flow categorization, economic evaluation, probabilistic, risk assessment, technological innovation
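A sketch of the risk-adjusted NPV idea under stated assumptions: each risk is a (period, probability, impact) triple that, when sampled as occurring, perturbs that period's cash flow before discounting. The project figures, discount rate, and risk list are hypothetical, not the paper's case-study data.

```python
import numpy as np

rng = np.random.default_rng(0)

def risk_adjusted_npv(cash_flows, risks, rate=0.12, n_sims=10_000):
    """Monte Carlo over risk events mapped onto cash flow periods."""
    periods = np.arange(len(cash_flows))
    npvs = np.empty(n_sims)
    for s in range(n_sims):
        cf = np.array(cash_flows, dtype=float)
        for period, prob, impact in risks:
            if rng.random() < prob:
                cf[period] += impact        # impact is negative for losses
        npvs[s] = np.sum(cf / (1 + rate) ** periods)
    return npvs

# Hypothetical ERP project: initial outlay, yearly benefits, two risks.
npvs = risk_adjusted_npv(
    cash_flows=[-500_000, 180_000, 200_000, 220_000, 240_000],
    risks=[(1, 0.30, -60_000),     # integration delay in year 1
           (2, 0.15, -90_000)],    # key-user turnover in year 2
)
print(f"mean NPV: {npvs.mean():,.0f}, P(NPV > 0): {(npvs > 0).mean():.2f}")
```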
Procedia PDF Downloads 403
35432 Examples of Parameterization of Stabilizing Controllers with One-Side Coprime Factorization
Authors: Kazuyoshi Mori
Abstract:
Examples of parameterization of stabilizing controllers that require only one of the right-/left-coprime factorizations are presented. One parameterization method requires a one-sided coprime factorization; the other requires no coprime factorization. The methods are based on the factorization approach, so that the method we use in this paper can be applied to a number of models.
Keywords: parametrization, coprime factorization, factorization approach, linear systems
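For background, this is the classical two-sided parameterization that such one-sided examples relax (the standard Youla-Kucera result over a ring of stable transfer functions, not the paper's construction): given a coprime factorization of the plant and a Bezout identity, the stabilizing controllers are generated by a free stable parameter Q.

```latex
\[
  P = \frac{N}{D}, \qquad N X + D Y = 1 \quad (N, D, X, Y \in \mathcal{A}),
\]
\[
  C(Q) = \frac{X + D\,Q}{\,Y - N\,Q\,}, \qquad Q \in \mathcal{A}, \quad Y - N\,Q \neq 0,
\]
```

where \(\mathcal{A}\) denotes the ring of stable transfer functions; the paper's contribution is obtaining such a family without requiring both right and left factorizations.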
Procedia PDF Downloads 373
35431 Variation in Italian Specialized Economic Texts
Authors: Abdelmagid Basyouny Sakr
Abstract:
Terminological variation is a reality, and it is now recognized by terminologists. This paper investigates terminological variation in the context of specialized economic texts in Italian. It aims to find out whether certain patterns or tendencies can be derived from the analysis of these texts. Term variants pose two different kinds of difficulties. The first is being able to recognize linguistic expressions that denote the same concept in running text. The other lies in knowing which variant should be considered, and for what purpose; this would help to differentiate between variants that could be candidates for inclusion in terminological resources and those which are synonyms or contextual variants. New insights into terminological variation in specialized texts could contribute to improving specialized dictionaries, which would better account for the different ways in which a given thought is expressed.
Keywords: corpus linguistics, specialized communication, terms and concepts, terminological variation
Procedia PDF Downloads 159
35430 Reaching to the Unreachable: Can Local Adaptation Plan of Action (LAPA) Overcome the Current Barriers to Reach to the Vulnerable?
Authors: Bimal Raj Regmi, Cassandra Star
Abstract:
Climate change adaptation is now the priority of many Least Developed Countries (LDCs). The governments of LDCs are designing institutional and financing architectures to implement adaptation programmes. Nepal has introduced the concept of the Local Adaptation Plan of Action (LAPA) to facilitate adaptation at the local level. However, there is a lack of clarity about whether or not LAPA can be an effective means to reach the most vulnerable. This research paper aims to generate evidence to assess the applicability and significance of LAPA. The study used a case study approach and relied on data gathered from field studies carried out in the Pyuthan and Nawalparasi districts of Nepal. The findings show that LAPA has the potential to link community-based adaptation with national adaptation initiatives and thus act as a middle-range approach to adaptation planning. However, the current scale of LAPA and its approaches to planning and delivery are constrained by socio-economic and governance barriers. This research paper argues that, in order to address these constraints, a more flexible, co-management approach to LAPA is needed.
Keywords: community based adaptation, local adaptation, co-management, climate change
Procedia PDF Downloads 259
35429 RAPDAC: Role Centric Attribute Based Policy Driven Access Control Model
Authors: Jamil Ahmed
Abstract:
Access control models aim to decide whether a user should be denied or granted access to the user's requested activity. Various access control models have been established and proposed. The most prominent of these include the role-based, attribute-based, and policy-based access control models, as well as the role-centric attribute-based access control model. In this paper, a novel access control model is presented, called the "Role centric Attribute based Policy Driven Access Control (RAPDAC) model". RAPDAC incorporates the concept of "policy" into the role-centric attribute-based access control model. It leverages the concept of policy by precisely combining the evaluation of conditions, attributes, permissions, and roles in order to authorize access. This approach allows capturing the access control policy of a real-time application in a well-defined manner. The RAPDAC model allows making access decisions at a much finer granularity, as illustrated by the case study of a real-time library information system.
Keywords: authorization, access control model, role based access control, attribute based access control
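A minimal sketch of the role-centric, attribute-based, policy-driven decision the model describes; the library-system roles, permissions, and attribute conditions below are hypothetical illustrations, not the paper's policy.

```python
POLICY = {
    # (role, permission) -> attribute condition that must also hold
    ("librarian", "issue_book"): lambda attrs: attrs["on_shift"],
    ("member", "borrow_book"):   lambda attrs: attrs["fines_due"] == 0
                                            and attrs["books_out"] < 5,
}

def authorize(role, permission, attrs):
    """Grant access only if the role holds the permission AND the
    policy's attribute condition evaluates to true."""
    condition = POLICY.get((role, permission))
    return condition is not None and condition(attrs)

print(authorize("member", "borrow_book",
                {"fines_due": 0, "books_out": 2}))   # True
print(authorize("member", "borrow_book",
                {"fines_due": 3, "books_out": 2}))   # False: policy condition fails
```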
Procedia PDF Downloads 159
35428 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation
Authors: Mohammad Anwar, Shah Waliullah
Abstract:
This study investigated panel data regression models, using Bayesian and classical methods to study the impact of institutions on economic growth with data from 1990-2014, especially in developing countries. Under both the classical and Bayesian methodologies, two panel data models were estimated: common effects and fixed effects. For the Bayesian approach, prior information is used, with a normal-gamma prior for the panel data models. The analysis was done through the WinBUGS14 software. The estimated results showed that panel data models are valid models in the Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, we conclude that the fixed effect model is the best model in the Bayesian estimation of panel data models; it was shown to have the lowest standard error compared to the other models.
Keywords: Bayesian approach, common effect, fixed effect, random effect, dynamic random effect model
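A minimal sketch of a Bayesian fixed-effects panel model with a normal-gamma style prior, written in PyMC rather than the WinBUGS14 used in the study; the dimensions, hyperparameters, and simulated data are all illustrative assumptions.

```python
import numpy as np
import pymc as pm

# Toy panel: N units, T periods, one regressor standing in for the
# institutions index.
N, T = 10, 8
rng = np.random.default_rng(1)
x = rng.normal(size=(N, T))
y_obs = rng.normal(size=N)[:, None] + 0.5 * x + rng.normal(0.0, 0.3, size=(N, T))

with pm.Model() as fixed_effects:
    alpha = pm.Normal("alpha", mu=0.0, sigma=10.0, shape=N)  # unit intercepts
    beta = pm.Normal("beta", mu=0.0, sigma=10.0)             # slope of interest
    tau = pm.Gamma("tau", alpha=2.0, beta=1.0)               # error precision
    pm.Normal("y", mu=alpha[:, None] + beta * x,
              sigma=1.0 / pm.math.sqrt(tau), observed=y_obs)
    idata = pm.sample(1000, tune=1000, progressbar=False)

print(float(idata.posterior["beta"].mean()))   # posterior mean of the slope
```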
Procedia PDF Downloads 68
35427 Introducing a Practical Model for Instructional System Design Based on Determining of the knowledge Level of the Organization: Case Study of Isfahan Public Transportation Co.
Authors: Mojtaba Aghajari, Alireza Aghasi
Abstract:
The first challenge faced by the current research was to identify or determine the level of knowledge in the Isfahan public transportation corporation; the second was to recognize and choose a proper approach for instructional system design. Responding to these two challenges yields an appropriate model of instructional system design. In order to respond to the first challenge, the Nonaka and Takeuchi KM model was utilized due to its universality among the 26 models proposed so far. The statistical population of this research included 2200 people, among whom 200 were chosen as the research sample using Morgan's method. Data gathering was carried out by means of a questionnaire based on the Nonaka and Takeuchi KM model, and the analysis was done with SPSS. The output of this questionnaire, yielding a score of 1.96 out of 5, revealed that the general condition of the Isfahan public transportation corporation is weak in terms of being knowledge-centered. After placing this output on Jonassen's continuum, it was revealed that the appropriate approach for instructional system design is the systems (or behavioral) approach. Accordingly, the steps of the general ADDIE model, which covers all of the ISO 10015 standards, were adopted in the design. This process in the Isfahan public transportation corporation was designed and divided into three main steps: instructional design and planning, instructional course planning, and determination of the evaluation and effectiveness of the instructional courses.
Keywords: instructional system design, system approach, knowledge management, employees
Procedia PDF Downloads 326
35426 Students' Perception of Using Dental E-Models in an Inquiry-Based Curriculum
Authors: Yanqi Yang, Chongshan Liao, Cheuk Hin Ho, Susan Bridges
Abstract:
Aim: To investigate students' perceptions of using e-models in an inquiry-based curriculum. Approach: 52 second-year dental students completed a pre- and post-test questionnaire relating to their perceptions of e-models and their use in inquiry-based learning. The pre-test occurred prior to any learning with e-models; the follow-up survey was conducted after one year's experience of using e-models. Results: There was no significant difference between the two sets of questionnaires regarding students' perceptions of the usefulness of e-models and their willingness to use e-models in future inquiry-based learning. Most of the students preferred using both plaster models and e-models in tandem. Conclusion: Students did not change their attitude towards e-models, and most of them agreed or were neutral that e-models are useful in inquiry-based learning. Whilst recognizing the utility of 3D models for learning, students' preference for combining these with solid models has implications for the development of haptic sensibility in an operative discipline.
Keywords: e-models, inquiry-based curriculum, education, questionnaire
Procedia PDF Downloads 431
35425 Visual and Chemical Servoing of a Hexapod Robot in a Confined Environment Using Jacobian Estimator
Authors: Guillaume Morin-Duponchelle, Ahmed Nait Chabane, Benoit Zerr, Pierre Schoesetters
Abstract:
Industrial inspection can be achieved through robotic systems, allowing visual and chemical servoing. A popular scheme for visual servo-controlled robotics is the image-based servoing system. In this paper, an approach to visual and chemical servoing of a hexapod robot using visual and chemical Jacobian matrices is proposed. The basic idea behind the visual Jacobian matrix is modeling the differential relationship between the camera system and the robotic control system to detect and accurately track points of interest in confined environments. This approach allows the robot to easily detect and navigate to a QR code, or to seek a gas source using the surge-cast algorithm. To track the QR code target, visual servoing based on the Jacobian matrix is used. For chemical servoing, three gas sensors are embedded on the hexapod; a Jacobian matrix applied to the gas concentration measurements allows estimating the direction of the main gas source. The effectiveness of the proposed scheme is first demonstrated in simulation. Finally, a hexapod prototype is designed and built, and the experimental validation of the approach is presented and discussed.
Keywords: chemical servoing, hexapod robot, Jacobian matrix, visual servoing, navigation
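A sketch of the chemical-servoing step under an assumed sensor geometry (not the authors' code): fitting a local linear concentration model to the three sensor readings by least squares yields a gradient whose direction estimates the bearing of the gas source. Sensor positions and readings are dummy values.

```python
import numpy as np

# Body-frame positions (m) of the three gas sensors on the hexapod.
sensor_pos = np.array([[0.15, 0.00],     # front
                       [-0.10, 0.12],    # rear-left
                       [-0.10, -0.12]])  # rear-right
readings = np.array([2.4, 1.1, 0.9])     # concentrations in ppm (dummy)

# Fit c_i ~ c0 + g . p_i; the gradient g points toward rising concentration.
A = np.hstack([np.ones((3, 1)), sensor_pos])      # columns: c0, gx, gy
c0, gx, gy = np.linalg.lstsq(A, readings, rcond=None)[0]
heading = np.degrees(np.arctan2(gy, gx))
print(f"steer {heading:.1f} degrees toward the estimated gas source")
```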
Procedia PDF Downloads 125
35424 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups
Authors: Naushad Mamode Khan
Abstract:
The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs), handling equi-, over-, and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify, which restricts likelihood-based estimation methodology. The joint generalized quasi-likelihood approach (GQL-I) was instead considered, but it is rather computationally intensive and may not even estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments, and it is shown to yield estimates as efficient as GQL-I while being far more computationally stable.
Keywords: longitudinal, Com-Poisson, ill-conditioned, INAR(1), GLMs, GQL
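A generic sketch of the quasi-likelihood estimating iteration that GQL approaches build on, beta <- beta + (D'V^-1 D)^-1 D'V^-1 (y - mu), where D is the derivative of the mean and V a working covariance. The CMP-specific mean and covariance of the paper are abstracted behind user-supplied functions; the log-linear example is illustrative.

```python
import numpy as np

def gql_estimate(y, X, mu_fn, dmu_fn, cov_fn, beta0, n_iter=25):
    """Newton-type iteration on the quasi-likelihood score equation."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(n_iter):
        mu, D, V = mu_fn(X, beta), dmu_fn(X, beta), cov_fn(X, beta)
        Vinv = np.linalg.inv(V)
        beta += np.linalg.solve(D.T @ Vinv @ D, D.T @ Vinv @ (y - mu))
    return beta

# Illustrative log-linear mean with an independent working covariance:
mu_fn = lambda X, b: np.exp(X @ b)
dmu_fn = lambda X, b: np.exp(X @ b)[:, None] * X    # d mu / d beta
cov_fn = lambda X, b: np.diag(np.exp(X @ b))

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = rng.poisson(np.exp(X @ np.array([0.3, 0.7])))
print(gql_estimate(y, X, mu_fn, dmu_fn, cov_fn, beta0=[0.0, 0.0]))
```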
Procedia PDF Downloads 354
35423 Voltage Stability Margin-Based Approach for Placement of Distributed Generators in Power Systems
Authors: Oludamilare Bode Adewuyi, Yanxia Sun, Isaiah Gbadegesin Adebayo
Abstract:
Voltage stability analysis is crucial to the reliable and economic operation of power systems. The power systems of developing nations are more susceptible to failures due to continuously increasing load demand, which is not matched by increases in generation or efficient transmission infrastructure. Thus, most power systems are heavily stressed, and the planning of extra generation from distributed generation sources needs to be done efficiently so as to ensure the security of the power system. Voltage stability index-based approaches for DG siting have been reported in the literature. However, most of the existing voltage stability indices, though useful, are found to be inaccurate, especially for overloaded power systems. In this paper, the performance of a relatively different approach using a line voltage stability margin indicator, which has proven to have better accuracy, is presented and compared with a conventional line voltage stability index for DG siting using the Nigerian 28-bus system. The critical boundary index (CBI) for voltage stability margin estimation was deployed to identify suitable locations for DG placement, and its performance was compared with DG placement using the novel line stability index (NLSI) approach. From the simulation results, CBI and NLSI largely agreed on suitable locations for DG on the test system; while CBI identified bus 18 as the most suitable at system overload, NLSI identified bus 8. Considering the effect of DG placement at the selected buses on the voltage magnitude profile, the results show that the DG placed on bus 18, identified by CBI, improved the performance of the power system more.
Keywords: voltage stability analysis, voltage collapse, voltage stability index, distributed generation
Procedia PDF Downloads 93
35422 Multi-Level Priority Based Task Scheduling Algorithm for Workflows in Cloud Environment
Authors: Anju Bala, Inderveer Chana
Abstract:
Task scheduling is the key concern for the execution of performance-driven workflow applications. As efficient scheduling can have a major impact on system performance, task scheduling is often chosen for assigning requests to resources in an efficient way based on cloud resource characteristics. In this paper, a priority-based task scheduling algorithm is proposed that prioritizes tasks based on the length of their instructions. The proposed scheduling approach prioritizes the tasks of cloud applications according to limits set by six sigma control charts based on dynamic threshold values. Further, the proposed algorithm has been validated through the CloudSim toolkit. The experimental results demonstrate that the proposed algorithm is effective in handling multiple task lists from workflows and in considerably reducing makespan and execution time.
Keywords: cloud computing, priority based scheduling, task scheduling, VM allocation
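A simplified sketch, not the CloudSim experiment, of bucketing tasks into priority levels with control-chart style limits computed dynamically from the mean and standard deviation of instruction lengths; the limit width k and the task lengths are illustrative (the paper derives its limits from six sigma control charts).

```python
import numpy as np

def prioritize(lengths, k=1.0):
    """Assign each task a priority level from control-chart limits
    mean +/- k*sigma computed on the current task list."""
    lengths = np.asarray(lengths, dtype=float)
    mu, sigma = lengths.mean(), lengths.std()
    ucl, lcl = mu + k * sigma, mu - k * sigma    # upper/lower control limits
    return ["high" if l > ucl else "low" if l < lcl else "medium"
            for l in lengths]

tasks = [1200, 900, 15000, 1100, 950, 40, 1000]  # instruction counts (dummy)
for length, prio in zip(tasks, prioritize(tasks, k=0.5)):
    print(f"task of length {length:>6} -> {prio} priority")
```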
Procedia PDF Downloads 517
35421 CRISPR Technology: A Tool in the Potential Cure for COVID-19 Virus
Authors: Chijindu Okpalaoka, Charles Chinedu Onuselogu
Abstract:
COVID-19, the coronavirus disease caused by SARS-CoV-2, was first detected in late 2019 in Wuhan, China. COVID-19 lacked an established conventional pharmaceutical therapy, and as a result, the outbreak quickly grew into a pandemic affecting the entire world. Only a qPCR assay is reliable for diagnosing COVID-19. Clustered regularly interspaced short palindromic repeats (CRISPR) technology is being researched for speedy and specific identification of COVID-19, among other techniques. Apart from its diagnostic capabilities, the CRISPR technique is being evaluated for developing antiviral therapies; nevertheless, no CRISPR-based medication has been approved for human use to date. Prophylactic antiviral CRISPR in living cells, a Cas13-based approach against coronaviruses, has been developed. While this method could be developed into a treatment approach, it may face substantial obstacles in human clinical trials for licensure. This study discusses the potential applications of CRISPR-based techniques for developing a speedy, accurate, and feasible treatment alternative for COVID-19.
Keywords: COVID-19, CRISPR technique, Cas13, SARS-CoV-2, prophylactic antiviral
Procedia PDF Downloads 129
35420 Streamlining Cybersecurity Risk Assessment for Industrial Control and Automation Systems: Leveraging the National Institute of Standard and Technology’s Risk Management Framework (RMF) Using Model-Based System Engineering (MBSE)
Authors: Gampel Alexander, Mazzuchi Thomas, Sarkani Shahram
Abstract:
The cybersecurity landscape is constantly evolving, and organizations must adapt to the changing threat environment to protect their assets. The implementation of the NIST Risk Management Framework (RMF) has become critical to ensuring the security and safety of industrial control and automation systems. However, cybersecurity professionals face challenges in implementing the RMF, leading to systems operating without authorization and non-compliant with regulations. The current approach to RMF implementation, based on business practices, is limited and insufficient, leaving organizations vulnerable to cyberattacks that result in the loss of personal consumer data and critical infrastructure details. To address these challenges, this research proposes a Model-Based Systems Engineering (MBSE) approach to implementing cybersecurity controls and assessing risk through the RMF process. The study emphasizes the need to shift to a modeling approach, which can streamline the RMF process and eliminate bloated structures that make it difficult to receive an Authorization-To-Operate (ATO). The study focuses on the practical application of MBSE in industrial control and automation systems to improve the security and safety of operations. It is concluded that MBSE can be used to solve the implementation challenges of the NIST RMF process and improve the security of industrial control and automation systems; MBSE provides a more effective and efficient method for implementing cybersecurity controls and assessing risk through the RMF process. Future work for this research involves exploring the broader applicability of MBSE in different industries and domains beyond industrial control and automation systems.
Keywords: authorization-to-operate (ATO), industrial control systems (ICS), model-based systems engineering (MBSE), risk management framework (RMF)
Procedia PDF Downloads 95
35419 Direct-Displacement Based Design for Buildings with Non-Linear Viscous Dampers
Authors: Kelly F. Delgado-De Agrela, Sonia E. Ruiz, Marco A. Santos-Santiago
Abstract:
An approach is proposed for the design of regular buildings equipped with non-linear viscous dissipating devices. The approach is based on a direct-displacement seismic design method which satisfies seismic performance objectives. The global system involved is formed by structural regular moment frames capable of supporting gravity and lateral loads with elastic response behavior, plus a set of non-linear viscous dissipating devices which reduce the structural seismic response. The dampers are characterized by two design parameters: (1) a positive real exponent α, which represents the non-linearity of the damper, and (2) the damping coefficient C of the device, whose constitutive force-velocity law is given by F = Cv^α, where v is the velocity between the ends of the damper. The procedure is carried out using a substitute structure. Two limit states are verified: serviceability and near collapse. The reduction of the spectral ordinates by the additional damping assumed in the design process, and introduced to the structure by the non-linear viscous dampers, is performed according to a damping reduction factor. For the design of the non-linear damper system, the real velocity is considered instead of the pseudo-velocity. The proposed design methodology is applied to an 8-story steel moment frame building equipped with non-linear viscous dampers, located in the intermediate soil zone of Mexico City, with a dominant period Tₛ = 1 s. In order to validate the approach, nonlinear static analyses and nonlinear time history analyses are performed.
Keywords: based design, direct-displacement based design, non-linear viscous dampers, performance design
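A small numeric sketch of the damper law quoted above, F = Cv^α with the sign of the velocity preserved, evaluated over one cycle of harmonic motion at the site period T = 1 s; C, α, and the displacement amplitude are illustrative values, not the paper's design results.

```python
import numpy as np

C, alpha = 300.0, 0.5                # illustrative: kN*(s/m)^alpha, exponent
omega, u0 = 2.0 * np.pi / 1.0, 0.05  # rad/s for T = 1 s; amplitude in m

t = np.linspace(0.0, 1.0, 2001)
u = u0 * np.sin(omega * t)                 # relative displacement
v = u0 * omega * np.cos(omega * t)         # velocity between damper ends
F = C * np.sign(v) * np.abs(v) ** alpha    # F = C*v^alpha, sign-preserving

# Energy dissipated over one cycle: line integral of F du (trapezoidal rule).
E = float(np.sum(0.5 * (F[1:] + F[:-1]) * np.diff(u)))
print(f"peak force {np.abs(F).max():.1f} kN, energy per cycle {E:.2f} kJ")
```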
Procedia PDF Downloads 193
35418 Flowing Online Vehicle GPS Data Clustering Using a New Parallel K-Means Algorithm
Authors: Orhun Vural, Oguz Bayat, Rustu Akay, Osman N. Ucan
Abstract:
This study presents a new parallel approach to clustering GPS data. Evaluation has been made by comparing the execution times of various clustering algorithms on GPS data. This paper proposes a parallel algorithm based on neighborhood K-means to make clustering faster. The proposed parallelization approach assumes that each GPS data point represents a vehicle, and that vehicles close to each other communicate after the vehicles are clustered. This parallelization approach has been examined on continuously changing GPS data of different sizes and compared with the serial K-means algorithm and other serial clustering algorithms. The results demonstrate that the proposed parallel K-means algorithm works much faster than the other clustering algorithms.
Keywords: parallel k-means algorithm, parallel clustering, clustering algorithms, clustering on flowing data
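A generic data-parallel K-means sketch, not the authors' neighborhood variant: the assignment step is distributed across worker processes, each labelling a chunk of the points, and centroids are then updated serially. The worker count, cluster count, and toy GPS coordinates are assumptions.

```python
import numpy as np
from multiprocessing import Pool

def assign_chunk(args):
    """Label one chunk of points with the index of its nearest centroid."""
    points, centroids = args
    d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

def parallel_kmeans(points, k=5, n_iter=20, n_workers=4):
    rng = np.random.default_rng(0)
    centroids = points[rng.choice(len(points), k, replace=False)]
    with Pool(n_workers) as pool:
        for _ in range(n_iter):
            chunks = np.array_split(points, n_workers)
            labels = np.concatenate(
                pool.map(assign_chunk, [(c, centroids) for c in chunks]))
            for j in range(k):                      # serial centroid update
                if np.any(labels == j):
                    centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

if __name__ == "__main__":
    # 10,000 dummy (lat, lon) points in a city-sized bounding box
    gps = np.random.default_rng(1).uniform([40.9, 28.6], [41.2, 29.3],
                                           (10_000, 2))
    centroids, labels = parallel_kmeans(gps)
    print(centroids)
```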
Procedia PDF Downloads 221
35417 Optimal Tuning of a Fuzzy Immune PID Parameters to Control a Delayed System
Authors: S. Gherbi, F. Bouchareb
Abstract:
This paper deals with novel intelligent bio-inspired control strategies. It presents a novel approach based on optimal tuning of fuzzy immune PID parameters: a combination of a PID controller inspired by the human immune mechanism with fuzzy logic. Such a controller offers more possibilities for dealing with the control difficulties of delayed systems due to the delay term. Indeed, we use an optimization approach to tune the four parameters of the controller in addition to the fuzzy function; the obtained controller is implemented in a modified Smith predictor structure, which is well known to be the most efficient structure for the control of delayed systems. The application of the presented approach to the control of a three-tank delayed system shows good performance and proves the efficiency of the method.
Keywords: delayed systems, fuzzy immune PID, optimization, Smith predictor
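A hedged sketch of the immune-PID idea in its common literature form, not the paper's optimized controller: the proportional action is modulated by an immune-style feedback that throttles control effort as it grows, with tanh standing in for the fuzzy-approximated suppression function and illustrative gains, exercised on a simple first-order plant with input delay rather than the three-tank system.

```python
import numpy as np

def fuzzy_immune_pid(e_hist, u_prev, k0=1.2, eta=0.6, ki=0.3, kd=0.05, dt=0.1):
    """PID whose P gain is modulated by the previous control effort,
    mimicking immune suppression; tanh stands in for the fuzzy function."""
    e, e_prev = e_hist[-1], e_hist[-2]
    K = k0 * (1.0 - eta * np.tanh(u_prev))        # immune gain modulation
    return K * e + ki * dt * sum(e_hist) + kd * (e - e_prev) / dt

# Closed loop on a first-order plant with a 0.5 s input delay.
dt, delay_steps = 0.1, 5
y, u_buf, errors = 0.0, [0.0] * delay_steps, [0.0, 0.0]
for _ in range(200):
    errors.append(1.0 - y)                        # unit-step reference
    u = fuzzy_immune_pid(errors, u_buf[-1], dt=dt)
    u_buf.append(u)
    y += dt * (-y + u_buf.pop(0))                 # dy/dt = -y + u(t - L)
print(f"output after 20 s: {y:.3f}")
```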
Procedia PDF Downloads 433
35416 Probabilistic Approach to Contrast Theoretical Predictions from a Public Corruption Game Using Bayesian Networks
Authors: Jaime E. Fernandez, Pablo J. Valverde
Abstract:
This paper presents a methodological approach that aims to contrast and validate theoretical results from a corruption network game through probabilistic analysis of simulated microdata using Bayesian Networks (BNs). The research develops a public corruption model in a game theory framework. Theoretical results suggest a series of 'optimal settings' of the model's exogenous parameters that boost the emergence of corruption. The paper contrasts these outcomes with probabilistic inference results based on BNs fitted to simulated microdata. The principal findings indicate that probabilistic reasoning based on BNs significantly improves parameter specification and causal analysis in a public corruption game.
Keywords: Bayesian networks, probabilistic reasoning, public corruption, theoretical games
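A minimal sketch of the kind of BN inference involved, using the pgmpy library with a hypothetical two-parent structure and made-up probability tables; the paper's actual network is fitted to the simulated game microdata.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# States: Oversight/Penalty 0 = weak, 1 = strong; Corruption 0 = no, 1 = yes.
model = BayesianNetwork([("Oversight", "Corruption"),
                         ("Penalty", "Corruption")])
model.add_cpds(
    TabularCPD("Oversight", 2, [[0.6], [0.4]]),
    TabularCPD("Penalty", 2, [[0.5], [0.5]]),
    TabularCPD("Corruption", 2,
               # columns: (O,P) = (0,0), (0,1), (1,0), (1,1)
               [[0.2, 0.6, 0.7, 0.9],    # P(no corruption | O, P)
                [0.8, 0.4, 0.3, 0.1]],   # P(corruption | O, P)
               evidence=["Oversight", "Penalty"], evidence_card=[2, 2]),
)
posterior = VariableElimination(model).query(
    ["Corruption"], evidence={"Oversight": 0})   # weak oversight scenario
print(posterior)
```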
Procedia PDF Downloads 210
35415 Conjunctive Management of Surface and Groundwater Resources under Uncertainty: A Retrospective Optimization Approach
Authors: Julius M. Ndambuki, Gislar E. Kifanyi, Samuel N. Odai, Charles Gyamfi
Abstract:
Conjunctive management of surface and groundwater resources is a challenging task due to the spatial and temporal variability of the hydrology, as well as the hydrogeology, of the water storage systems. Surface water-groundwater hydrogeology is highly uncertain; thus it is imperative that this uncertainty is explicitly accounted for when managing water resources. Various methodologies have been developed and applied by researchers in an attempt to account for the uncertainty. For example, simulation-optimization models are often used for conjunctive water resources management. However, direct application of such an approach, in which all realizations are considered at each iteration of the optimization process, leads to a very expensive optimization in terms of computational time, particularly when the number of realizations is large. The aim of this paper, therefore, is to introduce and apply an efficient approach, referred to as Retrospective Optimization Approximation (ROA), that can be used for optimizing conjunctive use of surface water and groundwater over multiple hydrogeological model simulations. This work is based on a stochastic simulation-optimization framework using the recently emerged technique of sample average approximation (SAA), a sampling-based method implemented within the ROA approach. The ROA approach solves and evaluates a sequence of generated optimization sub-problems with an increasing number of realizations (sample size). The response matrix technique was used to link the simulation model with the optimization procedure, and the k-means clustering sampling technique was used to map the realizations. The methodology is demonstrated through application to a hypothetical example, in which the generated optimization sub-problems were solved and analysed using the "Active-Set" core optimizer implemented under the MATLAB 2014a environment. Through the k-means clustering sampling technique, the ROA - Active-Set procedure was able to arrive at a (nearly) converged maximum expected total optimal conjunctive water use withdrawal rate within relatively few iterations (6 to 7). The results indicate that the ROA approach is a promising technique for optimizing the conjunctive use of surface water and groundwater withdrawal rates under hydrogeological uncertainty.
Keywords: conjunctive water management, retrospective optimization approximation approach, sample average approximation, uncertainty
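A toy sketch of the ROA loop under stated assumptions: each sub-problem is a sample-average approximation over an increasing number of realizations of an uncertain parameter and is warm-started from the previous solution. The quadratic objective stands in for the groundwater simulation, and the sample-size schedule is illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def sample_average_cost(x, realizations):
    # Stand-in for: simulate withdrawal plan x under each hydrogeological
    # realization and average the objective over the sample.
    return np.mean([(x[0] - r) ** 2 + 0.1 * x[0] for r in realizations])

x = np.array([0.0])
for n in [4, 8, 16, 32, 64, 128, 256]:            # increasing sample sizes
    # Here plain random draws; the paper maps realizations with k-means
    # clustering before solving each sub-problem.
    realizations = rng.normal(5.0, 1.0, size=n)
    x = minimize(sample_average_cost, x, args=(realizations,)).x  # warm start
    print(f"n = {n:3d}: x* = {x[0]:.3f}")
```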
Procedia PDF Downloads 231
35414 Multi-Criteria Inventory Classification Process Based on Logical Analysis of Data
Authors: Diana López-Soto, Soumaya Yacout, Francisco Ángel-Bello
Abstract:
Although inventories are considered stocks of money sitting on shelves, they are needed in order to secure constant and continuous production. Therefore, companies need to have control over the amount of inventory in order to find the balance between excess and shortage. The classification of items according to certain criteria, such as price, usage rate, and lead time before arrival, allows any company to concentrate its inventory investment according to a certain ranking or priority of items. This makes the decision-making process for inventory management easier and more justifiable. The purpose of this paper is to present a new approach for the classification of new items based on already existing criteria. This approach is called Logical Analysis of Data (LAD), and it is used in this paper to assist the process of ABC item classification based on multiple criteria. LAD is a data mining technique based on Boolean theory that is used for pattern recognition. This technique has been tested in medicine, industry, credit risk analysis, and engineering with remarkable results. An application to ABC inventory classification is presented for the first time, and the results are compared with those obtained when using the well-known AHP and ANN techniques. The results show that LAD presents very good classification accuracy.
Keywords: ABC multi-criteria inventory classification, inventory management, multi-class LAD model, multi-criteria classification
Procedia PDF Downloads 881
35413 New Machine Learning Optimization Approach Based on Input Variables Disposition Applied for Time Series Prediction
Authors: Hervice Roméo Fogno Fotsoa, Germaine Djuidje Kenmoe, Claude Vidal Aloyem Kazé
Abstract:
One of the main applications of machine learning is the prediction of time series, but more accurate prediction requires a better-optimized machine learning model. Several optimization techniques have been developed, but without considering the disposition of the system's input variables. Thus, this work presents a new machine learning architecture optimization technique based on the optimal disposition of the input variables. The validations are done on the prediction of wind time series, using data collected in Cameroon. The number of possible dispositions with four input variables is twenty-four, and each disposition is used to perform the prediction, with the main criteria being the training and prediction performances. The results obtained from a static architecture and a dynamic architecture of neural networks show that these performances are a function of the disposition of the input variables, and in a different way for the two architectures. This analysis reveals that it is necessary to take the disposition of the input variables into account when developing a more optimal neural network model. Thus, a new neural network training algorithm is proposed by introducing the search for the optimal input variable disposition into the traditional back-propagation algorithm. The results of applying this new optimization approach to the two single neural network architectures are compared step by step with the previously obtained results. Moreover, the proposed approach is validated in a collaborative optimization method with a single-objective optimization technique, i.e., genetic algorithm back-propagation neural networks. From these comparisons, it is concluded that each proposed model outperforms its traditional counterpart in terms of training and prediction performance, which shows that the proposed optimization approach can be useful in improving the accuracy of time series prediction based on machine learning.
Keywords: input variable disposition, machine learning, optimization, performance, time series prediction
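A toy sketch of the disposition search: all 4! = 24 orderings of four inputs are scored on held-out data and the best is kept. A deliberately order-sensitive stand-in model (fixed positional weights) replaces the neural network, whose training is likewise sensitive to input ordering; the data and weights are made up, not the Cameroon wind series.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                        # four candidate inputs
y = X @ np.array([0.5, -0.2, 0.8, 0.1]) + 0.05 * rng.normal(size=500)

# Order-sensitive stand-in model: a fixed weight per input *position*.
POSITION_WEIGHTS = np.array([0.8, 0.4, 0.2, 0.1])

def score(perm):
    z = X[:, list(perm)] @ POSITION_WEIGHTS          # disposition-dependent
    A = np.column_stack([z, np.ones(len(z))])
    coef, *_ = np.linalg.lstsq(A[:400], y[:400], rcond=None)
    return np.mean((A[400:] @ coef - y[400:]) ** 2)  # held-out MSE

scores = {p: score(p) for p in itertools.permutations(range(4))}  # 24 cases
best = min(scores, key=scores.get)
print(f"best disposition: {best}, test MSE = {scores[best]:.4f}")
```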
Procedia PDF Downloads 109
35412 Analysis of Risk-Based Disaster Planning in Local Communities
Authors: R. A. Temah, L. A. Nkengla-Asi
Abstract:
Planning for future disasters sets the stage for a variety of activities that may trigger multiple recurring operations and expose the community to opportunities to minimize risks. Local communities are increasingly embracing the necessity of planning based on local risks, but are also significantly challenged to plan for and respond to disasters effectively. This research examines a basic risk-based disaster planning model and compares it with advanced risk-based planning, which introduces the identification and alignment of a variety of local capabilities, within and outside the local community, that can be pivotal in managing local risks and cascading effects prior to a disaster. A critical review shows that the identification and alignment of capabilities can potentially enhance risk-based disaster planning. A tailored holistic approach to risk-based disaster planning is pivotal to enhancing collective action and reducing the collective cost of disasters.
Keywords: capabilities, disaster planning, hazards, local community, risk-based
Procedia PDF Downloads 206
35411 Hybrid Approach for the Min-Interference Frequency Assignment
Authors: F. Debbat, F. T. Bendimerad
Abstract:
Efficient frequency assignment for radio communications becomes more and more crucial as new information technologies and their applications are developed. It consists in defining an assignment of frequencies to radio links to be established between base stations and mobile transmitters. Separation of the assigned frequencies is necessary to avoid interference; however, unnecessary separation causes an excess requirement for spectrum, the cost of which may be very high. This problem is NP-hard and cannot be solved by conventional optimization algorithms, so it is necessary to use metaheuristic methods. This paper proposes a hybrid approach based on simulated annealing (SA) and tabu search (TS) to solve the problem. Computational results, obtained on a number of standard problem instances, testify to the effectiveness of the proposed approach.
Keywords: cellular mobile communication, frequency assignment problem, optimization, tabu search, simulated annealing
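A toy sketch of the hybrid metaheuristic: simulated annealing proposes single-link frequency changes while a short tabu list blocks immediate reversals of recent moves. The interference graph, separation constraint, cooling schedule, and tabu tenure are illustrative assumptions, not the benchmark instances of the paper.

```python
import math
import random

random.seed(0)
N_LINKS, FREQS, MIN_SEP = 12, list(range(8)), 2
# Toy interference graph: links within distance 2 of each other interfere.
neighbors = {i: [j for j in range(N_LINKS) if 0 < abs(i - j) <= 2]
             for i in range(N_LINKS)}

def violations(a):
    """Count neighbor pairs whose frequencies are separated by < MIN_SEP."""
    return sum(1 for i in range(N_LINKS) for j in neighbors[i]
               if i < j and abs(a[i] - a[j]) < MIN_SEP)

assign = [random.choice(FREQS) for _ in range(N_LINKS)]
cur = violations(assign)
best, best_cost, temp, tabu = assign[:], cur, 4.0, []

for _ in range(20000):
    link, freq = random.randrange(N_LINKS), random.choice(FREQS)
    if freq == assign[link] or (link, freq) in tabu:
        continue                                   # move is tabu: skip it
    old = assign[link]
    assign[link] = freq
    delta = violations(assign) - cur
    if delta <= 0 or random.random() < math.exp(-delta / temp):
        cur += delta                               # SA acceptance rule
        tabu = (tabu + [(link, old)])[-10:]        # short tabu tenure
        if cur < best_cost:
            best, best_cost = assign[:], cur
    else:
        assign[link] = old                         # reject, restore
    temp *= 0.9995                                 # geometric cooling

print(f"best assignment has {best_cost} interference violations")
```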
Procedia PDF Downloads 385
35410 Introduction of an Approach of Complex Virtual Devices to Achieve Device Interoperability in Smart Building Systems
Authors: Thomas Meier
Abstract:
One of the major challenges for sustainable smart building systems is to support device interoperability, i.e., connecting sensor or actuator devices from different vendors and presenting their functionality to external applications. Furthermore, smart building systems are supposed to connect with devices that are not available yet, i.e., devices that become available on the market sometime later. It is of vital importance that a sustainable smart building platform provides an appropriate external interface that can be leveraged by external applications and smart services. An external platform interface must be stable and independent of specific devices and should support flexible and scalable usage scenarios. A typical approach applied in smart home systems is based on a generic device interface used within the smart building platform; device functions, even of rather complex devices, are mapped to that generic base type interface by means of specific device drivers. Our new approach, presented in this work, extends that approach by using the smart building system's rule engine to create complex virtual devices that can represent the most diverse properties of real devices. We examined and evaluated both approaches by means of a practical case study using a smart building system that we have developed. We show that our solution allows the highest degree of flexibility without affecting the stability and scalability of the external application interface. In contrast to other systems, our approach supports complex virtual device configuration at the application layer (e.g., by administration users) instead of device configuration at the platform layer (e.g., by platform operators). Based on our work, we can show that our approach supports almost arbitrarily flexible use case scenarios without affecting the external application interface stability. However, the cost of this approach is additional configuration overhead and additional resource consumption at the IoT platform level that must be considered by platform operators. We conclude that the concept of complex virtual devices presented in this work can significantly improve the usability and device interoperability of sustainable intelligent building systems.
Keywords: Internet of Things, smart building, device interoperability, device integration, smart home
Procedia PDF Downloads 271
35409 A Risk Management Approach to the Diagnosis of Attention Deficit-Hyperactivity Disorder
Authors: Lloyd A. Taylor
Abstract:
An increase in the prevalence of Attention Deficit-Hyperactivity Disorder (ADHD) highlights the need to consider factors that may be exacerbating symptom presentation. Traditional diagnostic criteria provide little framework for healthcare providers to consider as they attempt to diagnose and treat children with behavioral problems. In fact, aside from exclusion criteria, limited alternative considerations are available, and approaches fail to consider the impact of outside factors that could increase or decrease the likelihood of appropriate diagnosis and the success of interventions. This paper considers specific systems-based factors that influence behavior and intervention success and that, when not considered, could account for the upsurge in diagnoses. These include understanding (1) challenges in the healthcare system, (2) the influence and impact of educators and the educational system, (3) technology use, and (4) patient and parental attitudes about the diagnosis of ADHD. These factors must be considered both individually and as a whole when examining both the increase in diagnoses and the subsequent increase in prescriptions for psychostimulant medication. A theoretical model based on a risk management approach is presented. Finally, data are presented that demonstrate pediatric provider satisfaction with this approach to the diagnosis and treatment of ADHD as it relates to practice trends.
Keywords: ADHD, diagnostic criteria, risk management model, pediatricians
Procedia PDF Downloads 94