Search results for: partially observable Markov decision processes
9595 Unintended Health Inequity: Using the Relationship Between the Social Determinants of Health and Employer-Sponsored Health Insurance as a Catalyst for Organizational Development and Change
Authors: Dinamarie Fonzone
Abstract:
Employer-sponsored health insurance (ESI) strategic decision-making processes rely on financial analysis to guide leadership in choosing plans that will produce optimal organizational spending outcomes. These financial decision-making methods have not abated ESI costs. Previously unrecognized external social determinants, their impact on ESI plan spending, and other organizational strategies are emerging as important considerations for organizational decision-makers and change management practitioners. The purpose of this study is to examine the relationship between the social determinants of health (SDoH), employer-sponsored health insurance (ESI) plans, and the unintended consequence of health inequity. A quantitative research design will be used: select employee records from an existing employer human capital management database will be analyzed. Statistical regression methods will be used to study the relationships between certain SDoH (employee income, neighborhood geographic living area, and health care access) and health plan utilization, cost, and chronic disease prevalence. The discussion will include an application of the social gradient of health theory to the study findings, organizational transformation through changes in ESI decision-making mental models, and the connection of ESI health inequity to organizational development and change and to diversity, equity, and inclusion strategies.
Keywords: employer-sponsored health insurance, social determinants of health, health inequity, mental models, organizational development, organizational change, social gradient of health theory
Procedia PDF Downloads 107
9594 Importance of Developing a Decision Support System for Diagnosis of Glaucoma
Authors: Murat Durucu
Abstract:
Glaucoma is a condition of irreversible blindness; early diagnosis and appropriate intervention can help patients retain their vision longer. This study addresses the importance of developing a decision support system for glaucoma diagnosis. Glaucoma occurs when elevated pressure around the eyes damages the optic nerves and causes deterioration of vision. The disease progresses through different levels of severity, up to blindness. Diagnosis at an early stage allows a chance for therapies that slow the progression of the disease. In recent years, imaging technologies such as Heidelberg Retinal Tomography (HRT), Stereoscopic Disc Photo (SDP), and Optical Coherence Tomography (OCT) have been used for the diagnosis of glaucoma. Owing to its better accuracy and faster imaging, OCT has become the most common method used by experts. Despite the precision and speed of OCT and HRT, difficulties remain and mistakes still occur in the diagnosis of glaucoma, especially in the early stages, and it is difficult for doctors to obtain objective diagnostic results. It therefore seems very important to develop an objective decision support system that diagnoses and grades glaucoma for patients. By using OCT images and pattern recognition systems, it is possible to develop a support system that helps doctors make their decisions on glaucoma. Thus, in this study, we develop an evaluation and support system for doctors' use. Pattern-recognition-based computer software would help doctors make an objective evaluation of their patients. After the development and evaluation processes, the system is planned to serve doctors in different hospitals.
Keywords: decision support system, glaucoma, image processing, pattern recognition
Procedia PDF Downloads 302
9593 Efficient Design of Distribution Logistics by Using a Model-Based Decision Support System
Abstract:
The design of distribution logistics has a decisive impact on a company's logistics costs and performance, so such solutions make an essential contribution to corporate success. This article describes a decision support system for analyzing the potential of distribution logistics in terms of logistics costs and performance. In contrast to previous business process re-engineering (BPR) procedures, this method maps distribution logistics holistically under variable distribution structures. Combined with qualitative measures, the decision support system will contribute to a more efficient design of distribution logistics.
Keywords: decision support system, distribution logistics, potential analyses, supply chain management
Procedia PDF Downloads 406
9592 The Evaluation of Child Maltreatment Severity and the Decision-Making Processes in the Child Protection System
Authors: Maria M. Calheiros, Carla Silva, Eunice Magalhães
Abstract:
Professionals working in child protection services (CPS) need common and clear criteria to identify cases of maltreatment and to differentiate levels of severity in order to determine when CPS intervention is required, its nature and urgency, and, in most countries, the service that will be in charge of the case (community or specialized CPS). The decision-making process in CPS is complex, and for that reason such criteria are particularly important for those who contribute significantly to decision-making in child maltreatment cases. The main objective of this presentation is to describe the Maltreatment Severity Assessment Questionnaire (MSQ), specifically designed to be used by professionals in the CPS, which adopts a multidimensional approach and uses a scale of severity within subtypes. Specifically, we aim to provide evidence of the validity and reliability of this tool, in order to improve the quality and validity of assessment processes and, consequently, decision making in CPS. The total sample was composed of 1000 children and/or adolescents (51.1% boys), aged between 0 and 18 years old (M = 9.47; SD = 4.51). All the participants were referred to official institutions of the children and youth protective system. Child and adolescent maltreatment (abuse, neglect experiences, and sexual abuse) was assessed with 21 items of the Maltreatment Severity Questionnaire (MSQ) by professionals of CPS. Each item (sub-type) was composed of four descriptors of increasing severity. Professionals rated the level of severity using a 4-point scale (1 = minimally severe; 2 = moderately severe; 3 = highly severe; 4 = extremely severe). The construct validity of the Maltreatment Severity Questionnaire was assessed with a holdout method, performing an Exploratory Factor Analysis (EFA) followed by a Confirmatory Factor Analysis (CFA). The final solution comprised 18 items organized in three factors, with 47.3% of variance explained.
‘Physical neglect’ (eight items) was defined by parental omissions concerning the insurance and monitoring of the child’s physical well-being and health, namely in terms of clothing, hygiene, housing conditions, and contextual environmental security. ‘Physical and psychological abuse’ (four items) described abusive physical and psychological actions, namely coercive/punitive disciplinary methods, physically violent methods, or verbal interactions that offend and denigrate the child, with the potential to disrupt psychological attributes (e.g., self-esteem). ‘Psychological neglect’ (six items) involved omissions related to children’s emotional development, mental health monitoring, school attendance, and developmental needs, as well as inappropriate relationship patterns with attachment figures. Results indicated good reliability for all the factors. The assessment of child maltreatment cases with the MSQ could have a set of practical and research implications: a) it is a valid and reliable multidimensional instrument to measure child maltreatment; b) it is an instrument integrating the co-occurrence of various types of maltreatment and a within-subtypes scale of severity; c) specifically designed for professionals, it may assist them in decision-making processes; d) rather than using case file reports to evaluate maltreatment experiences, researchers could more appropriately guide their research on the determinants and consequences of maltreatment.
Keywords: assessment, maltreatment, children and youth, decision-making
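The factor reliability reported above is conventionally summarized with Cronbach's alpha. As a minimal illustration of how such a coefficient is computed (the item scores below are invented, not MSQ data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns (one list per scale item)."""
    k, n = len(items), len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent across all items.
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly correlated items give alpha = 1; as the items diverge, alpha drops toward zero.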
Procedia PDF Downloads 290
9591 Sustainable Approach for Strategic Planning of Construction of Buildings Using Multi-Criteria Decision Making Tools
Authors: Kishor Bhagwat, Gayatri Vyas
Abstract:
The construction industry is marked by complex processes depending on the nature and scope of the project. In recent years, developments in this sector have been remarkable and have had both positive and negative impacts on the environment and human beings. Sustainable construction can be looked upon as one of the solutions to overcome the negative impacts. Since sustainable construction is a vast concept which includes many parameters, the use of multi-criteria decision making (MCDM) tools sometimes becomes necessary. The main objective of this study is to determine the weights of sustainable building parameters with the help of MCDM tools. A questionnaire survey was conducted to examine respondents' perspectives on the importance of the criteria weights; the respondents were architects, green building consultants, and civil engineers. This paper presents an overview of research related to Indian and international green building rating systems and MCDM. The results depict that economy, environmental health and safety, site selection, climatic condition, etc., are important parameters in sustainable construction.
Keywords: green building, sustainability, multi-criteria decision making method [MCDM], analytical hierarchy process [AHP], technique for order preference by similarity to an ideal solution [TOPSIS], entropy
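As a concrete illustration of one of the MCDM tools named in the keywords, here is a minimal TOPSIS sketch; the alternatives, weights, and benefit flags are made-up placeholders, not the study's survey data:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix:  rows = alternatives, columns = criterion scores
    weights: criterion weights summing to 1
    benefit: True where a higher score is better for that criterion
    """
    n_alt, n_crit = len(matrix), len(matrix[0])
    # Vector-normalise each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    # Ideal and anti-ideal points per criterion.
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)  # distance to the ideal solution
        d_neg = math.dist(row, anti)   # distance to the anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient in [0, 1]
    return scores
```

The alternative closest to the ideal point (and farthest from the anti-ideal) gets the highest closeness coefficient.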
Procedia PDF Downloads 99
9590 Empirical and Indian Automotive Equity Portfolio Decision Support
Authors: P. Sankar, P. James Daniel Paul, Siddhant Sahu
Abstract:
A brief review of empirical studies on stock market decision support methodology indicates that they are at the threshold of validating the accuracy of traditional models against fuzzy, artificial neural network, and decision tree models. Many researchers have attempted to compare these models using various data sets worldwide. However, the research community has yet to reach conclusive confidence in the emerging models. This paper uses automotive sector stock prices from the National Stock Exchange (NSE), India, and analyzes them for intra-sectorial support for stock market decisions. The study identifies the significant variables, and their lags, which affect the price of the stocks, using OLS analysis and decision tree classifiers.
Keywords: Indian automotive sector, stock market decisions, equity portfolio analysis, decision tree classifiers, statistical data analysis
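The lagged-variable OLS idea can be sketched in a few lines of pure Python; the price series below is hypothetical and stands in for NSE data:

```python
def ols_fit(x, y):
    """Ordinary least squares fit y = a + b * x for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b  # intercept a, slope b

# Regress a stock's closing price on its own one-day lag (made-up series).
prices = [100.0, 102.0, 101.0, 104.0, 106.0, 105.0, 108.0]
lagged, current = prices[:-1], prices[1:]
a, b = ols_fit(lagged, current)
```

In the paper's setting, significant lags of related stocks in the same sector would be added as further predictors; this one-lag sketch only shows the mechanics.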
Procedia PDF Downloads 485
9589 Occurrence of Illicit Drugs in Aqueous Environment and Removal Efficiency of Wastewater Treatment Plants
Authors: Meena K. Yadav, Rupak Aryal, Michael D. Short, Ben Van Den Akker, Christopher P. Saint, Cobus Gerber
Abstract:
Illicit drugs are considered emerging contaminants of concern that have attracted the attention of the scientific community in the last few years due to their presence in the water environment, and a number of studies have revealed their occurrence in the environment. This is mainly because some drugs are only partially removed during wastewater treatment processes, with the remainder able to enter the environment and contaminate surface water and groundwater and, subsequently, drinking water. Therefore, this paper evaluates the occurrence of key illicit drugs in wastewater (influent and effluent) samples from four wastewater treatment plants (WWTPs) across Adelaide, South Australia over a one-year period. This paper also compares the efficiency of WWTPs adopting different technologies in the removal of selected illicit drugs, especially in the context of which technology has higher removal rates. The influent and effluent samples were analysed using liquid chromatography tandem mass spectrometry (LC-MS/MS). The levels of drugs detected in effluent samples were in the range of mg/L – ng/L, thus emphasising the influence on the water quality of receiving water bodies and the significance of the removal efficiency of WWTPs. The results show that removal of the drugs differed depending on the treatment processes used by the WWTPs.
Keywords: illicit drugs, removal efficiency, treatment technology, wastewater
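Removal efficiency in studies of this kind is typically the relative drop from influent to effluent concentration; a one-line sketch with hypothetical concentrations (not the study's measurements):

```python
def removal_efficiency(influent, effluent):
    """Per-analyte removal efficiency (%) across a treatment plant."""
    return 100.0 * (influent - effluent) / influent

# Hypothetical influent/effluent concentrations in ng/L for one drug.
eff = removal_efficiency(influent=850.0, effluent=120.0)
```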
Procedia PDF Downloads 262
9588 The Potential Impact of Big Data Analytics on Pharmaceutical Supply Chain Management
Authors: Maryam Ziaee, Himanshu Shee, Amrik Sohal
Abstract:
Big Data Analytics (BDA) in supply chain management has recently drawn the attention of academics and practitioners. Big data refers to a massive amount of data from different sources, in different formats, generated at high speed through transactions in business environments and supply chain networks. Traditional statistical tools and techniques find it difficult to analyse such massive data. BDA can assist organisations in capturing, storing, and analysing data, specifically in the field of supply chain. Currently, there is a paucity of research on BDA in the pharmaceutical supply chain context. In this research, the Australian pharmaceutical supply chain was selected as the case study. This industry is highly significant, since the right medicine must reach the right patients, at the right time, in the right quantity, in good condition, and at the right price to save lives. However, drug shortages remain a substantial problem for hospitals across Australia, with implications for patient care, staff resourcing, and expenditure. Furthermore, a massive volume and variety of data is generated at high speed from multiple sources in the pharmaceutical supply chain, which needs to be captured and analysed to benefit operational decisions at every stage of supply chain processes. As the pharmaceutical industry lags behind other industries in using BDA, this raises the question of whether the use of BDA can improve transparency across the pharmaceutical supply chain by enabling the partners to make informed decisions in their operational activities. This presentation explores the impacts of BDA on supply chain management. An exploratory qualitative approach was adopted to analyse data collected through interviews. This study also explores the potential of BDA in the whole pharmaceutical supply chain rather than focusing on a single entity.
Twenty semi-structured interviews were undertaken with top managers in fifteen organisations (five pharmaceutical manufacturers, five wholesalers/distributors, and five public hospital pharmacies) to investigate their views on the use of BDA. The findings revealed that BDA can give pharmaceutical entities improved visibility over the whole supply chain and the market; it enables entities, especially manufacturers, to monitor consumption and the demand rate in real time and make accurate demand forecasts, which reduces drug shortages. Timely and precise decision-making allows the entities to source and manage their stocks more effectively, which can address drug demand at hospitals and help respond to unanticipated issues such as drug shortages. Earlier studies explored BDA in the context of clinical healthcare; this presentation instead investigates the benefits of BDA in the Australian pharmaceutical supply chain. Furthermore, this research enhances managers' insight into the potential of BDA at every stage of supply chain processes and helps to improve decision-making in their supply chain operations. The findings will turn the rhetoric of data-driven decision-making into a reality where managers may opt for analytics for improved decision-making in supply chain processes.
Keywords: big data analytics, data-driven decision, pharmaceutical industry, supply chain management
Procedia PDF Downloads 106
9587 An Efficient Motion Recognition System Based on LMA Technique and a Discrete Hidden Markov Model
Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier
Abstract:
Interest in human motion recognition has increased extensively in recent years due to its importance in a wide range of applications, such as human-computer interaction, intelligent surveillance, augmented reality, content-based video compression and retrieval, etc. However, it is still regarded as a challenging task, especially in realistic scenarios. It can be seen as a general machine learning problem which requires an effective human motion representation and an efficient learning method. In this work, we introduce a descriptor based on the Laban Movement Analysis (LMA) technique, a formal and universal language for human movement, to capture both quantitative and qualitative aspects of movement. We use a Discrete Hidden Markov Model (DHMM) for training and classifying motions. We improve the classification algorithm by proposing two DHMMs for each motion class, processing the motion sequence in two different directions, forward and backward. This modification avoids the misclassification that can occur when recognizing similar motions. Two experiments are conducted. In the first, we evaluate our method on a public dataset, the Microsoft Research Cambridge-12 Kinect gesture dataset (MSRC-12), which is widely used for evaluating action/gesture recognition methods. In the second experiment, we build a dataset composed of 10 gestures (introduce yourself, wave, dance, move, turn left, turn right, stop, sit down, increase velocity, decrease velocity) performed by 20 persons. The evaluation of the system includes testing the efficiency of our LMA-based descriptor vector with the basic DHMM method and comparing the recognition results of the modified DHMM with the original one. Experimental results demonstrate that our method outperforms most existing methods that use the MSRC-12 dataset and achieves a near-perfect classification rate on our dataset.
Keywords: human motion recognition, motion representation, Laban Movement Analysis, Discrete Hidden Markov Model
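To make the DHMM classification idea concrete, here is a minimal sketch that scores a discrete observation sequence with the forward algorithm and picks the best-scoring class model. The two-state models and their parameters are invented toys, not the paper's trained DHMMs, and the backward-direction pass is omitted:

```python
import math

def forward_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM with initial probs pi,
    transition matrix A, and discrete emission matrix B)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    s = sum(alpha)
    loglik = math.log(s)
    alpha = [a / s for a in alpha]  # rescale to avoid underflow
    for o in obs[1:]:
        alpha = [B[j][o] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
        s = sum(alpha)
        loglik += math.log(s)
        alpha = [a / s for a in alpha]
    return loglik

def classify(obs, models):
    """Assign obs to the motion class whose DHMM scores it highest."""
    return max(models, key=lambda name: forward_loglik(obs, *models[name]))

A = [[0.9, 0.1], [0.1, 0.9]]  # shared toy transition matrix
models = {
    "wave": ([1.0, 0.0], A, [[0.9, 0.1], [0.1, 0.9]]),  # mostly emits symbol 0
    "stop": ([1.0, 0.0], A, [[0.1, 0.9], [0.9, 0.1]]),  # mostly emits symbol 1
}
```

In the paper's modified scheme, each class would hold a forward-direction and a backward-direction DHMM and combine their scores before the arg-max.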
Procedia PDF Downloads 207
9586 Team Cognitive Heterogeneity and Strategic Decision-Making Flexibility: The Role of Transactive Memory System and Task Complexity
Authors: Rui Xing, Baolin Ye, Nan Zhou, Guohong Wang
Abstract:
Drawing upon a perspective of cognitive interaction, this study explores the relationship between team cognitive heterogeneity and team strategic decision-making flexibility, treating the transactive memory system as a mediator and task complexity as a moderator. The hypotheses were tested in linear regression models using data gathered from 67 strategic decision-making teams in the new-energy vehicle industry. It is found that team cognitive heterogeneity has a positive impact on strategic decision-making flexibility through the mediation of the specialization and coordination of the transactive memory system, and that this effect is positively moderated by task complexity.
Keywords: strategic decision-making flexibility, team cognitive heterogeneity, transactive memory system, task complexity
Procedia PDF Downloads 79
9585 Decision Support System Based on GIS and MCDM to Identify Land Suitability for Agriculture
Authors: Abdelkader Mendas
Abstract:
The integration of multicriteria decision making (MCDM) approaches in a geographical information system (GIS) provides a powerful spatial decision support system which offers the opportunity to efficiently produce land suitability maps for agriculture. Indeed, GIS is a powerful tool for analyzing spatial data and establishing a process for decision support. Because of their spatial aggregation functions, MCDM methods can facilitate decision making in situations where several solutions are available, various criteria have to be taken into account, and decision-makers are in conflict. The parameters and the classification system used in this work are inspired by the FAO (Food and Agriculture Organization) approach dedicated to sustainable agriculture. A spatial decision support system has been developed for establishing the land suitability map for agriculture. It incorporates the multicriteria analysis method ELECTRE Tri (ELimination Et Choix Traduisant la REalité) within a GIS environment. The main purpose of this research is to propose a conceptual and methodological framework for the combination of GIS and multicriteria methods in a single coherent system that takes into account the whole process, from the acquisition of spatially referenced data to decision-making. In this context, a spatial decision support system for developing land suitability maps for agriculture has been developed: the algorithm of ELECTRE Tri is incorporated into a GIS environment and added to the other analysis functions of GIS. This approach has been tested on an area in Algeria, and a land suitability map for durum wheat has been produced. The obtained results show that the ELECTRE Tri method, integrated into a GIS, is well suited to the problem of land suitability for agriculture.
The coherence of the obtained maps confirms the system's effectiveness.
Keywords: multicriteria decision analysis, decision support system, geographical information system, land suitability for agriculture
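The sorting step of ELECTRE Tri can be sketched in simplified form: each land unit is compared against category-boundary profiles and assigned pessimistically. This toy keeps only the concordance part of the method (no discordance or veto thresholds), and the weights, profiles, and cutting level below are illustrative assumptions:

```python
def concordance(alt, profile, weights):
    """Weighted share of criteria on which alt is at least as good as profile."""
    return sum(w for a, p, w in zip(alt, profile, weights) if a >= p) / sum(weights)

def electre_tri_pessimistic(alt, profiles, weights, lam=0.7):
    """Assign alt to a suitability category (0 = least suitable).

    profiles[k] is the boundary between category k and k+1, sorted ascending.
    Descending scan: the first profile that alt outranks (concordance >= lam)
    fixes the category, per the pessimistic assignment rule.
    """
    for k in range(len(profiles) - 1, -1, -1):
        if concordance(alt, profiles[k], weights) >= lam:
            return k + 1
    return 0
```

A full ELECTRE Tri implementation would add indifference/preference thresholds and veto conditions per criterion before computing the outranking credibility.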
Procedia PDF Downloads 637
9584 Fast Prediction Unit Partition Decision and Accelerating the Algorithm Using CUDA for Intra and Inter Prediction of HEVC
Authors: Qiang Zhang, Chun Yuan
Abstract:
Since the PU (Prediction Unit) decision process is the most time-consuming part of the emerging HEVC (High Efficiency Video Coding) standard in intra- and inter-frame coding, this paper proposes a fast PU decision algorithm and speeds it up using CUDA (Compute Unified Device Architecture). In intra-frame coding, the fast PU decision algorithm uses texture features to skip intra-frame prediction or to terminate it early for smaller PU sizes. In inter-frame coding, the fast PU decision algorithm makes use of the similarity of a CU's two Nx2N-size PUs' motion vectors and the hierarchical structure of the CU (Coding Unit) partition to skip some PU partition modes, so as to reduce the number of motion estimations. The CUDA-accelerated algorithm is based on the fast PU decision algorithm and uses the GPU so that the motion search and the gradient computation can be computed in parallel. The proposed algorithm achieves up to 57% time saving compared to HM 10.0, with little rate-distortion loss (0.043 dB drop and 1.82% bitrate increase on average).
Keywords: HEVC, PU decision, inter prediction, intra prediction, CUDA, parallel
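The texture-feature early-termination idea can be illustrated with a trivial variance test: smooth blocks need no finer PU split. The variance threshold and block layout below are made-up stand-ins, not the paper's actual features or decision rule:

```python
def block_variance(block):
    """Pixel variance of a 2-D luma block (rows of pixel values)."""
    n = len(block) * len(block[0])
    mean = sum(sum(row) for row in block) / n
    return sum((p - mean) ** 2 for row in block for p in row) / n

def skip_smaller_pu(block, threshold=25.0):
    """Early-terminate PU splitting for smooth (low-texture) blocks:
    if the texture measure is below the threshold, do not test smaller PUs."""
    return block_variance(block) < threshold
```

In a real encoder this check would sit inside the recursive CU/PU partition loop, pruning the rate-distortion search for the skipped modes.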
Procedia PDF Downloads 399
9583 SeCloudBPMN: A Lightweight Extension for BPMN Considering Security Threats in the Cloud
Authors: Somayeh Sobati Moghadam
Abstract:
Business processes are crucial for organizations and help businesses to evaluate and optimize their performance and processes against current and future-state business goals. Outsourcing business processes to the cloud has become popular due to a wide variety of benefits and cost savings. However, cloud outsourcing raises enterprise data security concerns, which must be incorporated in Business Process Model and Notation (BPMN). This paper presents SeCloudBPMN, a lightweight extension of BPMN which extends the notation to explicitly support security threats in the cloud as an outsourcing environment. SeCloudBPMN helps a business's security experts to outsource business processes to the cloud while considering different threats from inside and outside the cloud. In this way, appropriate security countermeasures can be considered to preserve data security when outsourcing business processes to the cloud.
Keywords: BPMN, security threats, cloud computing, business processes outsourcing, privacy
Procedia PDF Downloads 269
9582 Verification and Proposal of Information Processing Model Using EEG-Based Brain Activity Monitoring
Authors: Toshitaka Higashino, Naoki Wakamiya
Abstract:
Human beings perform a task by perceiving information from outside, recognizing it, and responding to it. There have been various attempts to analyze and understand the internal processes behind the reaction to a given stimulus by conducting psychological experiments and analyses from multiple perspectives. Among these, we focused on the Model Human Processor (MHP). However, it was built on psychological experiments, and thus its relation to brain activity has been unclear so far. To verify the validity of the MHP and propose our own model from the viewpoint of neuroscience, EEG (electroencephalography) measurements were performed during experiments in this study. More specifically, experiments were first conducted where Latin alphabet characters were used as visual stimuli. In addition to response time, ERPs (event-related potentials) such as N100 and P300 were measured using EEG. By comparing the cycle time predicted by the MHP and the latency of the ERPs, it was found that N100, related to the perception of stimuli, appeared at the end of the perceptual processor. Furthermore, an additional experiment revealed that P300, related to decision making, appeared during the response decision process, not at its end. Second, these findings were confirmed by experiments using Japanese Hiragana characters, i.e., Japan's own phonetic symbols. Finally, Japanese Kanji characters were used as more complicated visual stimuli. A Kanji character usually has several readings and several meanings. Despite this difference, a reading-related task and a meaning-related task exhibited similar results, meaning that they involved similar information-processing processes in the brain. Based on these results, our model was proposed, which reflects response time and ERP latency. It consists of three processors: the perception processor, from the input of a stimulus to the appearance of N100; the cognitive processor, from N100 to P300; and the decision-action processor, from P300 to the response.
Using our model, an application system which reflects brain activity can be established.
Keywords: brain activity, EEG, information processing model, model human processor
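The MHP's additive-stage logic that the study tests against ERP latencies can be sketched in one line. The 100/70/70 ms values are the MHP's commonly cited nominal cycle times (Card, Moran and Newell), used here only as illustrative defaults, not the study's measured latencies:

```python
def predicted_rt(tau_p=100, tau_c=70, tau_m=70, cognitive_cycles=1):
    """Reaction time (ms) predicted by an MHP-style additive stage model:
    perceptual cycle + n cognitive cycles + motor cycle."""
    return tau_p + cognitive_cycles * tau_c + tau_m

rt = predicted_rt()  # simple reaction: one cognitive cycle
```

The study's comparison effectively asks where ERP components fall along this additive timeline (N100 at the perceptual/cognitive boundary, P300 within the decision stage).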
Procedia PDF Downloads 98
9581 Jointly Optimal Statistical Process Control and Maintenance Policy for Deteriorating Processes
Authors: Lucas Paganin, Viliam Makis
Abstract:
With the advent of globalization, market competition has become a major issue for most companies. One of the main strategies to overcome this situation is quality improvement of the product at a lower cost to meet customers' expectations. In order to achieve the desired quality of products, it is important to control the process to meet the specifications and to implement the optimal maintenance policy for the machines and the production lines. Thus, the overall objective is to reduce process variation and the production and maintenance costs. In this paper, an integrated model involving Statistical Process Control (SPC) and maintenance is developed to achieve this goal. The main focus of this paper is therefore to develop the jointly optimal maintenance and statistical process control policy minimizing the total long-run expected average cost per unit time. In our model, the production process can go out of control due to either the deterioration of equipment or other assignable causes. The equipment is also subject to failures in any of the operating states due to deterioration and aging. Hence, the process mean is controlled by an Xbar control chart using equidistant sampling epochs. We assume that the machine inspection epochs are the times when the control chart signals an out-of-control condition, considering both true and false alarms. At these times, the production process will be stopped, and an investigation will be conducted not only to determine whether it is a true or false alarm, but also to identify the causes of a true alarm: whether it was caused by a change in the machine setting, by other assignable causes, or by both. If the system is out of control, the proper actions will be taken to bring it back to the in-control state. At these epochs, a maintenance action can be taken, which can be no action or preventive replacement of the unit.
When the equipment is in the failure state, a corrective maintenance action is performed, which can be minimal repair or replacement of the machine, and the process is brought back to the in-control state. A semi-Markov decision process (SMDP) framework is used to formulate and solve the joint control problem. A numerical example is developed to demonstrate the effectiveness of the control policy.
Keywords: maintenance, semi-Markov decision process, statistical process control, Xbar control chart
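The Xbar chart signalling logic described above can be sketched as a generic 3-sigma Xbar-R chart using the standard A2 constants; the subgroup data are invented for illustration, and the SMDP optimization layer is not shown:

```python
# Standard Xbar-R chart control constants A2, indexed by subgroup size n.
A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}

def xbar_limits(reference_subgroups):
    """3-sigma control limits (LCL, UCL) estimated from in-control data."""
    n = len(reference_subgroups[0])
    means = [sum(g) / n for g in reference_subgroups]
    ranges = [max(g) - min(g) for g in reference_subgroups]
    grand_mean = sum(means) / len(means)
    r_bar = sum(ranges) / len(ranges)
    return grand_mean - A2[n] * r_bar, grand_mean + A2[n] * r_bar

def signals(subgroup, lcl, ucl):
    """True if the subgroup mean falls outside the control limits, i.e. the
    sampling epoch at which the process would be stopped and inspected."""
    mean = sum(subgroup) / len(subgroup)
    return mean < lcl or mean > ucl
```

In the paper's model, each such signal (true or false alarm) triggers the investigation and the maintenance decision.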
Procedia PDF Downloads 91
9580 Decision Analysis Module for Excel
Authors: Radomir Perzina, Jaroslav Ramik
Abstract:
The Analytic Hierarchy Process (AHP) is a frequently used approach for solving decision-making problems. There exists a wide range of software programs utilizing this approach. Their main disadvantages are that they are relatively expensive and do not display intermediate calculations. This work introduces a Microsoft Excel add-in called DAME – Decision Analysis Module for Excel. Compared to other computer programs, DAME is free, can work with scenarios or multiple decision makers, and displays intermediate calculations. Users can structure their decision models into three levels – scenarios/users, criteria, and variants. Items on all levels can be evaluated either by weights or by pair-wise comparisons. Three different methods are provided for evaluating the weights of the criteria, the variants, and the scenarios – Saaty's method, the geometric mean method, and Fuller's triangle method. Multiplicative and additive syntheses are supported. The proposed software package is demonstrated on a couple of illustrative examples of real-life decision problems.
Keywords: analytic hierarchy process, multi-criteria decision making, pair-wise comparisons, Microsoft Excel, scenarios
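Of the weighting methods named above, the geometric mean method is the simplest to sketch (this is an illustrative snippet, not DAME's implementation; the pairwise matrix is a made-up consistent example):

```python
import math

def gmm_weights(pairwise):
    """Geometric Mean Method: the weight of item i is proportional to the
    geometric mean of row i of the pairwise comparison matrix."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Consistent pairwise matrix built from the weights (0.5, 0.3, 0.2):
# entry [i][j] is w_i / w_j, so GMM should recover the weights exactly.
matrix = [[1.0, 0.5 / 0.3, 0.5 / 0.2],
          [0.3 / 0.5, 1.0, 0.3 / 0.2],
          [0.2 / 0.5, 0.2 / 0.3, 1.0]]
weights = gmm_weights(matrix)
```

Saaty's method would instead take the principal eigenvector of the matrix; for a perfectly consistent matrix the two coincide.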
Procedia PDF Downloads 452
9579 Metareasoning Image Optimization Q-Learning
Authors: Mahasa Zahirnia
Abstract:
The purpose of this paper is to explore new and effective ways of optimizing satellite images using artificial intelligence, and the process of implementing reinforcement learning to enhance the quality of data captured within the image. In our implementation of the Bellman equations of reinforcement learning, associated state diagrams, and multi-stage image processing, we were able to enhance image quality and detect and define objects. Reinforcement learning is the differentiator in the area of artificial intelligence, and Q-learning relies on trial and error to achieve its goals. The reward system embedded in Q-learning allows the agent to self-evaluate its performance and decide on the best possible course of action based on the current and future environment. Results show that within a simulated environment built on commercially available images, the rate of detection was 40-90%. Reinforcement learning through the Q-learning algorithm is not just a desired but a required design criterion for image optimization and enhancement. The proposed methods are a cost-effective way of resolving uncertainty in the data, because reinforcement learning finds ideal policies to manage the process using a smaller sample of images.
Keywords: Q-learning, image optimization, reinforcement learning, Markov decision process
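A minimal tabular Q-learning sketch shows the Bellman update the abstract refers to; the toy chain environment, rewards, and hyperparameters below are illustrative assumptions standing in for the paper's image-processing MDP:

```python
import random

def q_learning(step, n_states, n_actions, episodes=500,
               alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration.
    step(s, a) -> (next_state, reward, done)."""
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            if rng.random() < eps:                             # explore
                a = rng.randrange(n_actions)
            else:                                              # exploit
                a = max(range(n_actions), key=lambda act: Q[s][act])
            s2, r, done = step(s, a)
            target = r if done else r + gamma * max(Q[s2])
            Q[s][a] += alpha * (target - Q[s][a])              # Bellman update
            s = s2
    return Q

# Toy 4-state chain standing in for a scan over image regions:
# action 1 moves right toward goal state 3, which pays reward 1.
def step(s, a):
    s2 = min(s + 1, 3) if a == 1 else max(s - 1, 0)
    return s2, (1.0 if s2 == 3 else 0.0), s2 == 3

Q = q_learning(step, n_states=4, n_actions=2)
```

After training, the greedy policy (arg-max over each row of Q) prefers moving toward the rewarded state, which is the "ideal policy" behaviour the abstract describes.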
Procedia PDF Downloads 215
9578 Advances in Artificial Intelligence Using Speech Recognition
Authors: Khaled M. Alhawiti
Abstract:
This research study aims to present a retrospective study of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. Precisely, it can be affirmed that speech recognition facilitates its users and helps them to perform their daily routine tasks in a more convenient and effective manner. This research intends to present an illustration of recent technological advancements associated with artificial intelligence. Recent research has revealed that the decoding of speech is the foremost issue affecting speech recognition. In order to overcome these issues, different statistical models have been developed by researchers. Some of the most prominent statistical models include the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMMs). This research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic phonetics, and artificial intelligence. Artificial intelligence has been recognized as the most efficient and reliable of the methods used in speech recognition.
Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden markov models (HMM), statistical models of speech recognition, human machine performance
Procedia PDF Downloads 477
9577 Precise Identification of Clustered Regularly Interspaced Short Palindromic Repeats-Induced Mutations via Hidden Markov Model-Based Sequence Alignment
Authors: Jingyuan Hu, Zhandong Liu
Abstract:
CRISPR genome editing technology has transformed molecular biology by accurately targeting and altering an organism's DNA. Despite the state-of-the-art precision of CRISPR genome editing, imprecise mutation outcomes and off-target effects present considerable risk, potentially leading to unintended genetic changes. Targeted deep sequencing, combined with bioinformatics sequence alignment, can detect such unwanted mutations. Nevertheless, the classical Needleman-Wunsch (NW) algorithm may produce false alignment outcomes, resulting in inaccurate mutation identification. The key to precisely identifying CRISPR-induced mutations lies in determining optimal parameters for the sequence alignment algorithm. Hidden Markov models (HMM) are ideally suited for this task, offering flexibility across CRISPR systems by leveraging forward-backward algorithms for parameter estimation. In this study, we introduce CRISPR-HMM, statistical software to precisely call CRISPR-induced mutations. We demonstrate that the software significantly improves precision in identifying CRISPR-induced mutations compared to NW-based alignment, thereby enhancing the overall understanding of the CRISPR gene-editing process.
Keywords: CRISPR, HMM, sequence alignment, gene editing
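The Needleman-Wunsch baseline discussed above can be sketched as follows. This is an illustrative, score-only implementation of NW global alignment with fixed match/mismatch/gap parameters; it is exactly this kind of fixed parameterization, as opposed to HMM parameters estimated by forward-backward, that can misalign reads around CRISPR-induced indels.

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    """Global alignment score for sequences a and b (fixed scoring scheme)."""
    n, m = len(a), len(b)
    # F[i][j]: best score aligning a[:i] with b[:j]
    F = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        F[i][0] = i * gap          # a prefix aligned against gaps
    for j in range(1, m + 1):
        F[0][j] = j * gap          # b prefix aligned against gaps
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = F[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            F[i][j] = max(diag,            # align a[i-1] with b[j-1]
                          F[i - 1][j] + gap,   # gap in b
                          F[i][j - 1] + gap)   # gap in a
    return F[n][m]
```

The choice of `match`, `mismatch`, and `gap` penalties directly determines where the algorithm places indels, which is why parameter choice matters so much for mutation calling.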
Procedia PDF Downloads 51
9576 Management Information System to Help Managers for Providing Decision Making in an Organization
Authors: Ajayi Oluwasola Felix
Abstract:
A management information system (MIS) provides information for the managerial activities in an organization. The main point of this research is that an MIS provides accurate and timely information necessary to facilitate the decision-making process and enables the organization's planning, control, and operational functions to be carried out effectively. An MIS is basically concerned with processing data into information, which is then communicated to the various departments in an organization to support appropriate decision-making. MIS is a subset of the overall planning and control activities covering the application of humans, technologies, and procedures of the organization. The information system is the mechanism that ensures that information is available to the managers in the form they want it and when they need it.
Keywords: Management Information Systems (MIS), information technology, decision-making, MIS in organizations
Procedia PDF Downloads 556
9575 Business Intelligence Proposal to Improve Decision Making in Companies Using Google Cloud Platform and Microsoft Power BI
Authors: Joel Vilca Tarazona, Igor Aguilar-Alonso
Abstract:
The problem addressed by this research is the lack of a business intelligence tool that supports automated, efficient financial analysis for decision-making and allows an evaluation of the financial statements; as a result, information that is relevant to managers and users as an instrument in financial and administrative decision-making is difficult to access. To address this, a business intelligence solution is proposed that will reduce information access time and personnel costs through process automation, based on a 4-layer architecture derived from the literature reviewed according to the research methodology.
Keywords: decision making, business intelligence, Google Cloud, Microsoft Power BI
Procedia PDF Downloads 99
9574 The Act of Care: Reimagined Rituals towards Unattachment
Authors: Ioana G. Turcan
Abstract:
Reimagined Rituals towards Unattachment looks at ambiguous loss through the perspective of caregivers: those who accompany us at the beginning and possibly the end of life, those who observe, accumulate, and are impacted by our behavior and needs, and those who are witnesses of human vulnerability. Someone taking care of a patient with dementia experiences ambiguous loss, being in the presence of a person who is partially present, partially absent. The one offering care needs care, not isolation, and the aim of the project is to consolidate existing communities or engage other possible ones using performance, storytelling, and other artistic methods. The long-term aim is that, through community work, we will manage to co-create rituals that help us live with this kind of loss. Looking at rituals through the lens of different cultures and individuals exercises both the ability to extract the universal essence of a ritual and the need and freedom to express the specificity of each situation. To be seen and acknowledged by others, but more importantly, to see oneself from the outside with dignity, is very powerful. Oftentimes we forget to express, look at, and appreciate our own stories, and instead we choose to outcast them.
Keywords: grief, socio-politics of loss, ambiguous loss, rituals
Procedia PDF Downloads 177
9573 A Mathematical Model for Reliability Redundancy Optimization Problem of K-Out-Of-N: G System
Authors: Gak-Gyu Kim, Won Il Jung
Abstract:
With the remarkable development of science and technology, the function and role of engineered systems have recently diversified. Systems have become increasingly complex and precise, and thus system designers who intend to maximize reliability concentrate more effort at the design stage. This study deals with the reliability redundancy optimization problem (RROP) for a k-out-of-n: G system configuration with cold standby and warm standby components. This paper presents an optimal mathematical model through which the following three elements may be combined in order to maximize the reliability of the system: (i) multiple component choices, (ii) redundant component quantity, and (iii) the choice of redundancy strategy. We therefore focus on three issues. First, we consider an RROP in which components have a warm standby state as well as a cold standby state. Second, eliminating the approximation approach of previous RROP studies, we construct a precise model for system reliability. Third, given the transition time when the state of components changes, we present not simply a workable solution but an advanced method. Moreover, for the wide applicability of RROPs, we use an absorbing continuous-time Markov chain and matrix analytic methods in the suggested mathematical model.
Keywords: RROP, matrix analytic methods, k-out-of-n: G system, MTTF, absorbing continuous time Markov chain
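For context, the reliability of the simplest k-out-of-n: G system, with independent, identical, non-standby components, reduces to a binomial sum; the sketch below computes it. This is the textbook baseline case only, not the authors' Markov-chain model with standby states and transition times.

```python
from math import comb

def k_out_of_n_reliability(k, n, p):
    """Probability that at least k of n independent components, each
    working with probability p, are functional (k-out-of-n:G system)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

# Example: a 2-out-of-3:G system with component reliability 0.9
r = k_out_of_n_reliability(2, 3, 0.9)
```

The standby and transition-time effects studied in the paper are exactly what this closed form cannot capture, which motivates the absorbing continuous-time Markov chain formulation.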
Procedia PDF Downloads 254
9572 Analysis of Tilting Cause of a Residential Building in Durres by the Use of CPTU Test
Authors: Neritan Shkodrani
Abstract:
On November 26, 2019, an earthquake hit the central western part of Albania. It was assessed as Mw 6.4. Its epicenter was located offshore northwest of Durrës, about 7 km north of the city. In this paper, the consequences of settlements of very soft soils are discussed for the case of a residential building, referred to as the "K Building", which suffered significant tilting after the earthquake. The "K Building" is an RC framed building with 12+1 (basement) stories and a floor area of 21000 m2. The construction of the building was completed in 2012. The "K Building", located in the city of Durrës, suffered severe non-structural damage during the November 26, 2019, Durrës earthquake sequence. During the on-site inspections immediately after the earthquake, the general condition of the building, the presence of observable settlements in the ground, and the crack situation in the structure were determined, and damage inspections were performed. It was significant to note that the "K Building" presented tilting that might be attributed, as was believed at the beginning, partially to the failure of the columns of the ground floor and partially to liquefaction phenomena, but it did not collapse. At first it was not clear whether the foundation had a bearing capacity failure or failed because of soil liquefaction. Geotechnical soil investigations using the CPTU test were executed, and their data are used to evaluate the bearing capacity, the consolidation settlement of the mat foundation, and soil liquefaction, since these were believed to be the main reasons for the building's tilting. The geotechnical soil investigation consists of 5 (five) static cone penetration tests with pore pressure measurement (piezocone tests). They reached a penetration depth of 20.0 m to 30.0 m and clearly showed the presence of very soft and organic soils in the soil profile of the site.
Geotechnical CPT-based analyses of bearing capacity, consolidation, and secondary settlement were applied, and results are reported for each test. These results show very small values of allowable bearing capacity and very high values of consolidation and secondary settlement. Liquefaction analysis based on the data of the CPTU tests and the characteristics of the ground shaking of the mentioned earthquake showed the possibility of liquefaction for some layers of the considered soil profile, but the estimated vertical settlements are in a small range and clearly show that the main reason for the building's tilting was not related to the consequences of liquefaction but was an existing settlement caused by the applied bearing pressure of the building. All the CPTU tests were carried out in August 2021, almost two years after the November 26, 2019, Durrës earthquake, when the building itself had been demolished. After the mat foundation was removed in September 2021, it was possible to carry out CPTU tests even on the footprint of the existing building, which made it possible to observe the effects of the long-term application of foundation bearing pressure on the consolidation of the considered soil profile.
Keywords: bearing capacity, cone penetration test, consolidation settlement, secondary settlement, soil liquefaction
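As an illustration of the kind of calculation behind such a liquefaction screening, the sketch below implements the simplified Seed-Idriss cyclic stress ratio (CSR) with the Liao-Whitman depth reduction factor. The abstract does not specify which procedure was used, so this is an assumed, standard formulation with hypothetical input values.

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Simplified Seed-Idriss CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * rd.

    a_max_g     : peak ground acceleration as a fraction of g
    sigma_v     : total vertical stress at depth (kPa)
    sigma_v_eff : effective vertical stress at depth (kPa)
    depth_m     : depth below ground surface (m)
    """
    # Liao-Whitman stress reduction coefficient rd (valid to ~23 m)
    if depth_m <= 9.15:
        rd = 1.0 - 0.00765 * depth_m
    else:
        rd = 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

# Hypothetical example: 5 m depth, a_max = 0.25 g, sigma_v = 100 kPa, sigma_v' = 50 kPa
csr = cyclic_stress_ratio(0.25, 100.0, 50.0, 5.0)
```

In a full CPT-based screening, this demand (CSR) is compared layer by layer against the cyclic resistance ratio (CRR) estimated from the corrected cone tip resistance to obtain a factor of safety against liquefaction.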
Procedia PDF Downloads 96
9571 Approach to Formulate Intuitionistic Fuzzy Regression Models
Authors: Liang-Hsuan Chen, Sheng-Shing Nien
Abstract:
This study aims to develop approaches to formulate intuitionistic fuzzy regression (IFR) models for many decision-making applications in fuzzy environments using intuitionistic fuzzy observations. Intuitionistic fuzzy numbers (IFNs) are used to characterize the fuzzy input and output variables in the IFR formulation process. A mathematical programming problem (MPP) is built to optimally determine the IFR parameters. Each parameter in the MPP is defined as a pair of alternative numerical variables with opposite signs, and an intuitionistic fuzzy error term is added to the MPP to characterize the uncertainty of the model. The IFR model is formulated based on the distance measure, minimizing the total distance error between estimated and observed intuitionistic fuzzy responses in the MPP resolution process. The proposed approaches are simple and efficient in the formulation and resolution processes, in which the sign of parameters is determined automatically, so the need to predetermine the sign of parameters is avoided. Furthermore, the proposed approach has the advantage that the spread of the predicted IFN response will not be over-increased, since the parameters in the established IFR model are crisp. The performance of the obtained models is evaluated and compared with existing approaches.
Keywords: fuzzy sets, intuitionistic fuzzy number, intuitionistic fuzzy regression, mathematical programming method
Procedia PDF Downloads 138
9570 Child Protection Decision Making in England and Finland: A Comparative Analysis
Authors: Rachel Falconer
Abstract:
Background: The United Nations Convention on the Rights of the Child sets out the duties placed on signatory nations to take measures to protect children from all forms of violence, abuse, neglect and maltreatment. The systems for ensuring this protection vary globally, shaped by national welfare policies. In England and Finland, past research has highlighted differences in how child protection issues are framed and how state agencies respond. However, less is known about how such differences impact processes of social work judgment and decision making in practice. Method: Data was collected as part of a wider PhD project in three stages. First, social workers in sites across England and Finland were asked to complete a short questionnaire. Participants were then asked to comment on two constructed case vignettes, and were interviewed about their experiences of child protection decision making at the point of referral. Interviews were analyzed using NVivo to draw out key themes. Findings: There were similarities in how the English and Finnish social workers responded to the case vignettes; for example, participants in both countries expressed concerns about similar risk factors and all felt further assessment was needed. Differences were observed, in particular, in regard to the sources of support and guidance participants referred to, with the English social workers appearing to rely more upon managerial input for their decisions than the Finnish social workers. These findings suggest evidence for two distinct decision making approaches: ‘supervised’ and ‘supported’ judgement. Implications for practice: The findings have relevance to the conference theme of research and evaluation of social work practice, and support the findings of previous studies that have emphasized the significance of organizational factors in child protection decision making. 
The comparative methodology has also helped to demonstrate how organizational factors can influence practice in different child protection system 'orientations'. The presentation will discuss the potential practice implications of 'supervised', manager-led approaches to decision making as contrasted with 'supported', team-led approaches, inviting discussion about the relevance of these findings for social work in other countries.
Keywords: child protection, comparative research, decision making, social work, vignettes
Procedia PDF Downloads 253
9569 Structural Testing and the Finite Element Modelling of Anchors Loaded Against Partially Confined Surfaces
Authors: Ali Karrech, Alberto Puccini, Ben Galvin, Davide Galli
Abstract:
This paper summarises the laboratory tests, numerical models, and statistical approach developed to investigate the behaviour of concrete blocks loaded in shear through metallic anchors. This research bridges a gap in the state of the art and practice related to anchors loaded against partially confined concrete surfaces. Eight concrete blocks (420 mm x 500 mm x 1000 mm) with 150 mm and/or 250 mm deep anchors were tested. The stainless-steel anchors, 16 mm in diameter, were bonded with HIT-RE 500 V4 injection epoxy resin and subjected to shear loading against partially supported edges. In addition, finite element models were constructed to validate the laboratory tests and explore the influence of key parameters such as anchor depth, anchor distance from the edge, and compressive strength on the stability of the block. Once validated experimentally, the numerical results were used to populate, develop, and interpret a systematic parametric study based on the Design of Experiments approach, through the Box-Behnken design and Response Surface Methodology. An empirical model has been derived from this approach which predicts the load capacity with desirable confidence intervals.
Keywords: finite element modelling, design of experiment, response surface methodology, Box-Behnken design, empirical model, confidence interval, load capacity
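The Box-Behnken/response-surface workflow named above can be sketched generically. The code below generates a three-factor Box-Behnken design (12 edge-midpoint runs plus center points) and fits a full quadratic response surface by least squares to synthetic, noiseless data; the coefficients and responses are hypothetical, not taken from the study.

```python
import numpy as np
from itertools import combinations

def box_behnken(n_factors=3, n_center=3):
    """Box-Behnken design: for each factor pair, runs at (+/-1, +/-1)
    with all other factors at 0, plus replicated center points."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * n_factors for _ in range(n_center)]
    return np.array(runs, dtype=float)

def quadratic_design_matrix(X):
    """Full quadratic model: intercept, linear, pure quadratic, interactions."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1 * x2, x1 * x3, x2 * x3])

X = box_behnken()                      # 15 runs, coded levels -1/0/+1
beta_true = np.array([5.0, 1.5, -2.0, 0.5, 1.0, 2.0, -1.0, 0.8, -0.3, 0.6])
y = quadratic_design_matrix(X) @ beta_true   # synthetic noiseless response
beta_fit, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
```

Because the Box-Behnken design supports estimation of the full quadratic model, the least-squares fit recovers the generating coefficients exactly in this noiseless setting; with real test data, the fitted surface is what yields the predicted load capacity and its confidence intervals.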
Procedia PDF Downloads 24
9568 Temporal Case-Based Reasoning System for Automatic Parking Complex
Authors: Alexander P. Eremeev, Ivan E. Kurilenko, Pavel R. Varshavskiy
Abstract:
In this paper, the problem of applying temporal reasoning and case-based reasoning in intelligent decision support systems is considered. A method of case-based reasoning with temporal dependences for solving problems of real-time diagnostics and forecasting in intelligent decision support systems is described. This paper demonstrates how the temporal case-based reasoning system can be used in intelligent decision support systems for car access control. This work was supported by RFBR.
Keywords: analogous reasoning, case-based reasoning, intelligent decision support systems, temporal reasoning
Procedia PDF Downloads 529
9567 Knowledge Management in a Combined/Joint Environment
Authors: Cory Cannon
Abstract:
In the current era of shrinking budgets, an increasing number of worldwide natural disasters, and state and non-state initiated conflicts, the response has involved multinational coalitions conducting military operations. The need for a knowledge management strategy when developing these coalitions has been overlooked in the past, and developing these accords early on will save time and help shape the way information and knowledge are transferred from the staff and action officers of the coalition to the decision-makers, in order to make timely decisions within an ever-changing environment. The aim of this paper is to show how knowledge management has developed within the United States military and how the transformation of working within a combined/joint environment in both the Middle East and the Far East has improved relations between members of the coalitions as well as made them more effective as a military force. These same principles could be applied to multinational corporations when dealing with cultures and decision-making processes.
Keywords: civil-military, culture, joint environment, knowledge management
Procedia PDF Downloads 364
9566 Risk-Realistic Decision Support Intervention for Women in the Workplace
Authors: Joshua Midha
Abstract:
This paper provides an evaluation of an intervention designed to promote a risk-realistic environment for women in the workplace and to regulate their risk-related decision-making. Past research has found that women, specifically women of color, are highly risk-averse, and this may prove to be an obstacle to gender progress in corporations. By helping women see both the risks and the benefits, and by increasing potential benefits, we can increase the chances of success in the workplace. Our intervention was a success and significantly increased comfort, trust, and frequency in the use of decision-making skills in the workplace. In this paper, we explore the intervention, the methods, the results, and the implications.
Keywords: behavioral economics, decision support, risk, gender equality
Procedia PDF Downloads 221