Search results for: decision processing
6020 Ethical Decision-Making by Healthcare Professionals during Disasters: Izmir Province Case
Authors: Gulhan Sen
Abstract:
Disasters can result in many deaths and injuries. In these difficult times, accessible resources are limited, the balance between demand and supply is distorted, and urgent interventions are needed. The disproportion between accessible resources and intervention capacity makes triage a necessity in every stage of disaster response. Healthcare professionals in charge of triage have to evaluate swiftly and make ethical decisions about which patients need priority and urgent intervention given the limited available resources. For such critical times in disaster triage, 'doing the greatest good for the greatest number of casualties' is adopted as a code of practice. However, there is no guide for healthcare professionals on ethical decision-making during disasters, and this study is expected to serve as a source for the preparation of such a guide. The study aimed to examine whether the disaster-triage-related qualities of healthcare professionals in Izmir were adequate and whether these qualities influence their capacity to make ethical decisions. The researcher used a purpose-developed survey for data collection. The survey included two parts. Part one comprised 14 questions soliciting information about socio-demographic characteristics and respondents' knowledge of the ethical principles of disaster triage and allocation of scarce resources. Part two included four disaster scenarios adapted from the existing literature, and respondents were asked to make ethical triage decisions based on the scenarios provided. The survey was completed by 215 healthcare professionals working in Emergency Medical Stations, National Medical Rescue Teams and Search-Rescue-Health Teams in Izmir. The data were analyzed with SPSS software using the Chi-Square test, Mann-Whitney U test, Kruskal-Wallis test and linear regression analysis. According to the results, 51.2% of the participants had inadequate knowledge of the ethical principles of disaster triage and allocation of scarce resources. Participants also did not tend to make ethical decisions on the four disaster scenarios containing ethical dilemmas; they remained caught in dilemmas over performing cardio-pulmonary resuscitation, managing limited resources and making end-of-life decisions. Results also showed that participants with more experience in disaster triage teams were more likely to make ethical decisions on disaster triage than those with little or no such experience (p < 0.01). Moreover, as their knowledge of the ethical principles of disaster triage and allocation of scarce resources increased, their tendency to make ethical decisions also increased (p < 0.001). In conclusion, inadequate knowledge of ethical principles and lack of experience affect healthcare professionals' ethical decision-making during disasters. The results therefore suggest that more training on disaster triage should be provided during the pre-impact phase of disasters. In addition, the ethical dimension of disaster triage should be included in the syllabi of ethics classes in the vocational training of healthcare professionals. Drills, simulations and tabletop exercises can be used to improve the ethical decision-making abilities of healthcare professionals, and disaster scenarios involving ethical dilemmas should be prepared for such applied training programs.
Keywords: disaster triage, medical ethics, ethical principles of disaster triage, ethical decision-making
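The statistical comparisons named in the abstract (Chi-Square, Mann-Whitney U, Kruskal-Wallis and linear regression) can be reproduced outside SPSS. The sketch below uses Python/SciPy with hypothetical score arrays and a made-up contingency table purely for illustration; none of the numbers come from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical ethical-decision scores (0-4 scenarios answered ethically)
# for respondents with and without prior disaster-triage experience.
experienced = np.array([3, 4, 2, 4, 3, 3, 4, 2])
inexperienced = np.array([1, 2, 2, 1, 0, 2, 1, 1])

# Mann-Whitney U: do the two independent groups differ in decision scores?
u_stat, p_mwu = stats.mannwhitneyu(experienced, inexperienced, alternative="two-sided")

# Kruskal-Wallis: the same question across three knowledge-level groups.
low, medium, high = np.array([0, 1, 1]), np.array([2, 2, 3]), np.array([3, 4, 4])
h_stat, p_kw = stats.kruskal(low, medium, high)

# Chi-square test of independence on a 2x2 contingency table
# (adequate vs. inadequate knowledge x ethical vs. non-ethical decision).
table = np.array([[40, 65], [70, 40]])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

# Simple linear regression: knowledge score as a predictor of decision score.
knowledge = np.array([2, 4, 5, 7, 8, 9, 10, 12])
decisions = np.array([0, 1, 1, 2, 2, 3, 3, 4])
slope, intercept, r, p_reg, se = stats.linregress(knowledge, decisions)

print(p_mwu, p_kw, p_chi2, p_reg)
```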
Procedia PDF Downloads 245
6019 Studies of Carbohydrate, Antioxidant, Nutrient and Genomic DNA Characterization of Fresh Olive Treated with Alkaline and Acidic Solvent: An Innovation
Authors: A. B. M. S. Hossain, A. Abdelgadir, N. A. Ibrahim
Abstract:
Fresh ripe olives cannot be consumed immediately after harvest because of their excessive bitterness, caused by polyphenols that act as antioxidants. Industrial processing is needed to make the fruit edible. In this work, a laboratory processing technique was used to render the fruit edible using an acid (vinegar, 5% acetic acid) and an alkaline solvent (NaOH). Based on the treatments and their outcomes, innovative data have been obtained. The experiment investigated the biochemical content, nutritional composition and DNA characterization of olive fruit treated with an alkaline solvent (anhydrous sodium hydroxide, NaOH) and an acidic solvent (5% acetic acid, vinegar). The treatments were: control (no water), water control, 10% anhydrous sodium hydroxide (NaOH), vinegar (5% acetic acid), vinegar + NaOH, and vinegar + NaOH + hot water. Our results showed that inverted sugar and glucose contents were higher in the vinegar- and NaOH-treated olives than in the other treatments, while fructose content was highest in the vinegar + NaOH-treated fruit. The nutrient contents NO3, K, Ca and Na were higher in the treated fruit than in the control fruit, and K was the most abundant nutrient across all treatments. The most acidic (lowest pH, sour) condition was found in the treated fruit. DNA yield was higher in the water control than in the acid- and alkaline-treated olives, and the DNA band was wider in the water-control olives than in the NaOH, vinegar, vinegar + NaOH and vinegar + NaOH + hot water treatments. Overall, the results suggest that the vinegar + NaOH treatment was the best option for homemade processing of fresh olives for edible purposes after harvest.
Keywords: olive, vinegar, sugars, DNA band, bioprocess biotechnology
Procedia PDF Downloads 185
6018 A Soft System Approach to Explore Ill-Defined Issues in Distance Education System - A Case of Saudi Arabia
Authors: Sulafah Basahel
Abstract:
Nowadays, Higher Education Institutions (HEIs) around the world are attempting to utilize Information and Communication Technologies (ICTs) to enhance learning process and strategies of knowledge delivery for students through Distance Education (DE) system. Stakeholders in DE system face a complex situation of different ill-defined and related issues that influence decision making process. In this study system thinking as a body of knowledge is used to explore the emergent properties that produced from these connections between issues and could have either positive or negative outcomes for the DE development. Checkland Soft System Methodology (SSM) - Mode 2 is employed in a cultural context of Saudi Arabia for more knowledge acquisition purposes among multiple stakeholders in DE rather than solving problems to achieve an overall development of DE system. This paper will discuss some political, cultural issues and connections between them that impact on effectiveness of stakeholders’ activities and relations. This study will significantly contribute to both system thinking and education fields by leading decision makers in DE to reconsider future plans, strategies and right actions for more successful educational practices.Keywords: distance education, higher education institutions, ill-defined issues, soft system methodology-Mode 2
Procedia PDF Downloads 270
6017 Critical Assessment of Herbal Medicine Usage and Efficacy by Pharmacy Students
Authors: Anton V. Dolzhenko, Tahir Mehmood Khan
Abstract:
An ability to make evidence-based decisions is a critically important skill for practicing pharmacists, and the development of this skill is incorporated into the pharmacy curriculum. In this study, we aimed to assess pharmacy students' perceptions of herbal medicines and their ability to assess information on herbal medicines professionally. The current Monash University Pharmacy curriculum does not provide comprehensive study material on herbal medicines, so students must find information on their own, assess its quality and make a professional decision. In the Pharmacy course, students are trained to apply this process to conventional medicines. In our survey of 93 undergraduate students from years 1-4 of the Pharmacy course at Monash University Malaysia, we found that students' views on herbal medicines are sometimes associated with common beliefs, which affect their ability to draw evidence-based conclusions about the therapeutic potential of herbal medicines. The use of herbal medicines is widespread, and 95.7% of the participating students had prior experience of using them. On a scale of 1 to 10, students rated the importance of acquiring herbal medicine knowledge at 8.1±1.6. More than half (54.9%) agreed that herbal medicines have the same clinical significance as conventional medicines in treating diseases. Even more students agreed that healthcare settings should give equal importance to conventional and herbal medicine use (80.6%) and that herbal medicines should comply with quality control procedures as strict as those for conventional medicines (84.9%). The latter finding also indicates that students take the safety issues associated with herbal medicines seriously; this was further confirmed by 94.6% of students stating that safety and toxicity information on herbs and spices is important to pharmacists and 95.7% acknowledging that drug-herb interactions may affect therapeutic outcomes. Only 36.5% of students consider herbal medicines a safer alternative to conventional medicines. Students obtain information on herbal medicines from various sources and media: most (81.7%) use the Internet, and only 20.4% mentioned lectures, workshops or seminars as a source of such information. We therefore conclude that students who have attained the skills for critical assessment of the therapeutic properties of conventional medicines have the potential to apply those skills to evidence-based decisions regarding herbal medicines.
Keywords: evidence-based decision, pharmacy education, student perception, traditional medicines
Procedia PDF Downloads 282
6016 Rule Based Architecture for Collaborative Multidisciplinary Aircraft Design Optimisation
Authors: Nickolay Jelev, Andy Keane, Carren Holden, András Sóbester
Abstract:
In aircraft design, the jump from the conceptual to preliminary design stage introduces a level of complexity which cannot be realistically handled by a single optimiser, be that a human (chief engineer) or an algorithm. The design process is often partitioned along disciplinary lines, with each discipline given a level of autonomy. This introduces a number of challenges including, but not limited to: coupling of design variables; coordinating disciplinary teams; handling of large amounts of analysis data; reaching an acceptable design within time constraints. A number of classical Multidisciplinary Design Optimisation (MDO) architectures exist in academia specifically designed to address these challenges. Their limited use in the industrial aircraft design process has inspired the authors of this paper to develop an alternative strategy based on well established ideas from Decision Support Systems. The proposed rule based architecture sacrifices possibly elusive guarantees of convergence for an attractive return in simplicity. The method is demonstrated on analytical and aircraft design test cases and its performance is compared to a number of classical distributed MDO architectures.Keywords: Multidisciplinary Design Optimisation, Rule Based Architecture, Aircraft Design, Decision Support System
Procedia PDF Downloads 355
6015 Understanding Informal Settlements: The Role of Geo-Information Tools
Authors: Musyimi Mbathi
Abstract:
Information regarding social, political, demographic, economic and other attributes of human settlement is important for decision makers at all levels of planning, as they have to grapple with dynamic environments often associated with settlements. At the local level, it is particularly important for both communities and urban managers to have accurate and reliable information regarding all planning attributes. Settlement mapping, in particular, informal settlements mapping in Kenya, has over the past few years been carried out using modern tools like Geographic information systems (GIS) and remote sensing for spatial data analysis and planning. GIS tools offer a platform for integration of spatial and non-spatial data as well as visualisation of the settlements. The capabilities offered by these tools have enabled communities to participate especially in the planning and management of new infrastructure as well as settlement upgrading. Land tenure based projects within informal settlements have also relied on GIS and related tools with considerable success. Additionally, the adoption of participatory approaches and use of geo-information tools helped to provide a basis for all inclusive planning thus promoting accountability, transparency, legitimacy, and other dimensions of governance within human settlement planning. The paper examines the context and application of geo-information tools for planning within low-income settlements of Kenya. A case study of Kiambiu settlement will be used to demonstrate how the tools have been applied for planning and decision-making purposes.Keywords: informal settlements, GIS, governance, modern tools
Procedia PDF Downloads 500
6014 The Integration of Geographical Information Systems and Capacitated Vehicle Routing Problem with Simulated Demand for Humanitarian Logistics in Tsunami-Prone Area: A Case Study of Phuket, Thailand
Authors: Kiatkulchai Jitt-Aer, Graham Wall, Dylan Jones
Abstract:
Since the Indian Ocean tsunami of 2004, logistics applied to disaster relief operations has received great attention in the humanitarian sector. As that disaster showed, preparing for and managing the delivery of essential items from distribution centres to affected locations is vital for relief operations, because the nature of disasters is uncertain, especially with respect to casualty figures, which are normally proportional to the quantity of supplies required. This study therefore proposes a spatial decision support system (SDSS) for humanitarian logistics that integrates Geographical Information Systems (GIS) and the capacitated vehicle routing problem (CVRP). The GIS is used in the first stage to acquire demands simulated from a tsunami flooding model of the affected area, and in the last stage to visualise the resulting solutions. The CVRP in this study covers designing relief routes for a set of homogeneous vehicles from a relief centre to a set of geographically distributed evacuation points whose demands are estimated using both simulation and randomisation techniques. The CVRP is modelled as a multi-objective optimization problem in which total travelling distance and total transport resources used are minimized, while the demand-cost efficiency of each route is maximized in order to determine route priority. Because the model is an NP-hard combinatorial optimization problem, the Clarke and Wright savings heuristic is proposed to obtain near-optimal solutions. Real-case instances in the coastal area of Phuket, Thailand are studied to demonstrate the SDSS, which allows a decision maker to visually analyse simulation scenarios under different decision factors.
Keywords: demand simulation, humanitarian logistics, geographical information systems, relief operations, capacitated vehicle routing problem
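For readers unfamiliar with the routing method named in the abstract, the sketch below implements a basic parallel Clarke and Wright savings heuristic for a single-depot CVRP in Python. The coordinates, demands and vehicle capacity are hypothetical placeholders, and the study's multi-objective extensions (resource minimisation, demand-cost route priority) are deliberately not reproduced.

```python
import math

def clarke_wright(depot, customers, demands, capacity):
    """Simplified parallel Clarke-Wright savings heuristic for a single-depot CVRP.

    depot: (x, y); customers: dict id -> (x, y); demands: dict id -> demand.
    Uses a simplified merge rule (no route reversal); returns a list of routes.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Start with one dedicated route per customer; track each route's load via its head.
    routes = {i: [i] for i in customers}
    load = {i: demands[i] for i in customers}

    # Savings s(i, j) = d(depot, i) + d(depot, j) - d(i, j), largest first.
    savings = sorted(
        ((dist(depot, customers[i]) + dist(depot, customers[j])
          - dist(customers[i], customers[j]), i, j)
         for i in customers for j in customers if i < j),
        reverse=True)

    for s, i, j in savings:
        ri, rj = routes[i], routes[j]
        if ri is rj:
            continue
        # Merge only if i ends its route, j starts its route, and capacity allows it.
        if ri[-1] == i and rj[0] == j and load[ri[0]] + load[rj[0]] <= capacity:
            merged = ri + rj
            total = load[ri[0]] + load[rj[0]]
            for node in merged:
                routes[node] = merged
            load[merged[0]] = total

    # Return each merged route exactly once (keyed by its head node).
    return [r for k, r in routes.items() if r[0] == k]

# Hypothetical toy instance (an evacuation-point layout, not the Phuket data).
depot = (0.0, 0.0)
pts = {1: (2, 3), 2: (5, 1), 3: (-3, 4), 4: (4, 4), 5: (-2, -3)}
dem = {1: 4, 2: 3, 3: 5, 4: 2, 5: 6}
print(clarke_wright(depot, pts, dem, capacity=10))
```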
Procedia PDF Downloads 248
6013 Binary Decision Diagram Based Methods to Evaluate the Reliability of Systems Considering Failure Dependencies
Authors: Siqi Qiu, Yijian Zheng, Xin Guo Ming
Abstract:
In many reliability and risk analyses, component failures are assumed to be independent. In reality, however, ignoring failure dependencies among components may render the results of reliability and risk analyses incorrect. There are two principal ways to incorporate failure dependencies into system reliability and risk analysis: implicit and explicit methods. In the implicit method, failure dependencies are modeled by joint probabilities, correlation values or conditional probabilities. In the explicit method, certain types of dependencies are modeled in a fault tree as mutually independent basic events for specific component failures. In this paper, explicit and implicit methods based on binary decision diagrams (BDD) are proposed to evaluate the reliability of systems with failure dependencies. The results obtained prove the equivalence of the proposed implicit and explicit methods. It is found that accounting for failure dependencies decreases the computed reliability of systems; this observation is intuitive, because more components fail when failures are dependent. Accounting for failure dependencies helps designers reduce the dependencies between components during the design phase and thus make the system more reliable.
Keywords: reliability assessment, risk assessment, failure dependencies, binary decision diagram
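As a small numerical illustration of the implicit approach mentioned above, the sketch below computes the reliability of a two-component parallel system under independence and under a conditional-probability failure dependency. The probabilities are made-up examples, and the evaluation is done directly rather than through a BDD package.

```python
# Two-component parallel system: it fails only if both A and B fail.
p_fail_a = 0.10                 # P(A fails), hypothetical
p_fail_b = 0.10                 # P(B fails), hypothetical
p_fail_b_given_a = 0.40         # P(B fails | A fails), hypothetical dependency

# Independent case: P(system fails) = P(A fails) * P(B fails).
r_independent = 1.0 - p_fail_a * p_fail_b

# Dependent case (implicit method with a conditional probability):
# P(system fails) = P(A fails) * P(B fails | A fails).
r_dependent = 1.0 - p_fail_a * p_fail_b_given_a

print(f"Reliability assuming independence : {r_independent:.4f}")  # 0.9900
print(f"Reliability with failure dependency: {r_dependent:.4f}")   # 0.9600
# Accounting for the positive dependency lowers the computed system reliability.
```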
Procedia PDF Downloads 472
6012 A Survey and Analysis on Inflammatory Pain Detection and Standard Protocol Selection Using Medical Infrared Thermography from Image Processing View Point
Authors: Mrinal Kanti Bhowmik, Shawli Bardhan Jr., Debotosh Bhattacharjee
Abstract:
Human skin, being at a temperature above absolute zero, emits infrared radiation whose spectrum is related to body temperature. Differences in infrared radiation from the skin surface reflect abnormalities present in the human body, and detecting and tracking these temperature variations of the skin surface is the main objective of using Medical Infrared Thermography (MIT) as a diagnostic tool for pain detection. MIT is a non-invasive imaging technique that records and monitors the temperature distribution of the body by receiving the infrared radiation emitted from the skin and representing it as a thermogram. The intensity of the thermogram measures the inflammation at the skin surface related to pain in the human body, and analysis of thermograms provides automated detection of anomalies associated with suspicious pain regions through several image processing steps. This paper presents a rigorous survey of the processing and analysis of thermograms, based on previous works published in the area of infrared thermal imaging for detecting inflammatory pain conditions such as arthritis, spondylosis and shoulder impingement. The study also summarizes, in tabular format, the performance of thermogram processing together with thermogram acquisition protocols, thermography camera specifications and the types of pain detected by thermography; the tables give a clear structural overview of past work. The major contribution of the paper is a new thermogram acquisition standard for inflammatory pain detection in the human body intended to improve performance. The FLIR T650sc infrared camera, with high sensitivity and resolution, is adopted to increase the accuracy of thermogram acquisition and analysis. The survey of previous research highlights that intensity-distribution-based comparison of comparable, symmetric regions of interest, together with their statistical analysis, yields adequate results in identifying and detecting physiological disorders related to inflammatory diseases.
Keywords: acquisition protocol, inflammatory pain detection, medical infrared thermography (MIT), statistical analysis
Procedia PDF Downloads 343
6011 Resource-Constrained Assembly Line Balancing Problems with Multi-Manned Workstations
Authors: Yin-Yann Chen, Jia-Ying Li
Abstract:
Assembly line balancing problems can be categorized as one-sided, two-sided, or multi-manned according to the number of operators deployed at each workstation. This study explores the balancing problem of a resource-constrained assembly line with multi-manned workstations, where resources include machines or tools such as jigs, fixtures, and hand tools. A mathematical programming model was developed to support decision-making and planning with the aim of minimizing the numbers of workstations, resources, and operators and thereby achieving optimal production efficiency. To improve solution-finding efficiency, a genetic algorithm (GA) and a simulated annealing algorithm (SA) were designed and developed in this study and applied to a practical case in car manufacturing. The results of the GA/SA and the mathematical programming model were compared to verify their validity. Finally, the target values, production efficiency, and deployment combinations provided by the algorithms were analyzed and compared so that the results of this study can serve as a reference for decision-making on production deployment.
Keywords: heuristic algorithms, line balancing, multi-manned workstation, resource-constrained
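To illustrate the kind of metaheuristic the abstract refers to, here is a minimal simulated annealing sketch for a much simpler line balancing variant (single-manned stations, no precedence, resource or operator constraints). The task times, station count and cooling schedule are hypothetical; the study's multi-manned and resource-constrained model is not reproduced.

```python
import random
import math

def simulated_annealing_balance(task_times, n_stations, t0=10.0, cooling=0.995, iters=20000):
    """Assign tasks to stations so the maximum station workload (cycle time) is minimal."""
    n = len(task_times)
    assign = [random.randrange(n_stations) for _ in range(n)]

    def cycle_time(a):
        loads = [0.0] * n_stations
        for task, st in enumerate(a):
            loads[st] += task_times[task]
        return max(loads)

    cost = cycle_time(assign)
    best, best_cost, temp = assign[:], cost, t0
    for _ in range(iters):
        cand = assign[:]
        cand[random.randrange(n)] = random.randrange(n_stations)   # move one task
        c = cycle_time(cand)
        # Accept improvements always, worse moves with Boltzmann probability.
        if c < cost or random.random() < math.exp((cost - c) / temp):
            assign, cost = cand, c
            if c < best_cost:
                best, best_cost = cand[:], c
        temp *= cooling
    return best, best_cost

# Hypothetical task durations (minutes) and a 3-station line.
times = [4, 2, 5, 3, 6, 2, 4, 3]
assignment, cycle = simulated_annealing_balance(times, n_stations=3)
print(assignment, cycle)
```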
Procedia PDF Downloads 208
6010 Detection of Powdery Mildew Disease in Strawberry Using Image Texture and Supervised Classifiers
Authors: Sultan Mahmud, Qamar Zaman, Travis Esau, Young Chang
Abstract:
Strawberry powdery mildew (PM) is a serious disease with a significant impact on strawberry production. Field scouting is still the main way to detect PM, but it is labor intensive and makes it almost impossible to monitor disease severity. To reduce the losses caused by PM and to achieve faster, automatic detection of the disease, this paper proposes a detection approach based on image texture, with classification by support vector machines (SVMs) and k-nearest neighbors (kNNs). The methodology of the proposed study is based on image processing and consists of five main steps: image acquisition, pre-processing, segmentation, feature extraction and classification. Two strawberry fields were used in this study, and images of healthy leaves and of leaves infected with PM (Sphaerotheca macularis) were acquired under artificial cloud lighting conditions. Colour thresholding was used to segment all images before textural analysis, and the colour co-occurrence matrix (CCM) was introduced for the extraction of textural features. Forty textural features, related to physiological parameters of the leaves, were extracted from CCMs of the National Television System Committee (NTSC) luminance and the hue, saturation and intensity (HSI) channel images. The normalized feature data were used for training and validation of the developed classifiers, which were evaluated using internal, external and cross-validation schemes; the best classifier was selected based on performance and accuracy. Experimental results showed that the SVM classifier achieved 98.33%, 85.33%, 87.33%, 93.33% and 95.0% accuracy on internal, external-I, external-II, 4-fold cross- and 5-fold cross-validation, respectively, whereas the kNN classifier achieved 90.0%, 72.00%, 74.66%, 89.33% and 90.3% accuracy, respectively. Overall, the SVM classified PM disease with the highest overall accuracy of 91.86% and a processing time of 1.1211 seconds. These results indicate that the proposed approach can significantly support accurate and automatic identification and recognition of strawberry PM disease using an SVM classifier.
Keywords: powdery mildew, image processing, textural analysis, color co-occurrence matrix, support vector machines, k-nearest neighbors
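The feature-extraction-plus-classification pipeline described above can be sketched in Python. Note the hedges: scikit-image's gray-level co-occurrence matrix applied to HSV channels is used here as a stand-in for the paper's colour co-occurrence matrix on NTSC/HSI images, only six standard co-occurrence properties per channel are computed (not the paper's forty features), and random arrays replace real leaf images.

```python
import numpy as np
from skimage import color, img_as_ubyte
from skimage.feature import graycomatrix, graycoprops
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

PROPS = ["contrast", "dissimilarity", "homogeneity", "energy", "correlation", "ASM"]

def texture_features(rgb_image):
    """Co-occurrence texture features from the H, S and V channels of a leaf image."""
    hsv = color.rgb2hsv(rgb_image)
    feats = []
    for ch in range(3):
        channel = img_as_ubyte(hsv[:, :, ch])
        glcm = graycomatrix(channel, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        feats.extend(graycoprops(glcm, p).mean() for p in PROPS)
    return np.array(feats)

# Synthetic stand-ins for segmented leaf images (replace with real acquisitions).
rng = np.random.default_rng(0)
images = [rng.random((64, 64, 3)) for _ in range(40)]
labels = np.array([0] * 20 + [1] * 20)   # 0 = healthy, 1 = powdery mildew

X = np.array([texture_features(img) for img in images])
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
for name, clf in [("SVM", svm), ("kNN", knn)]:
    print(name, cross_val_score(clf, X, labels, cv=5).mean())
```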
Procedia PDF Downloads 120
6009 Artificial Law: Legal AI Systems and the Need to Satisfy Principles of Justice, Equality and the Protection of Human Rights
Authors: Begum Koru, Isik Aybay, Demet Celik Ulusoy
Abstract:
The discipline of law is quite complex and has its own terminology. Apart from written legal rules, there is also living law, which refers to legal practice. Basic legal rules aim at the happiness of individuals in social life and have different characteristics in different branches such as public or private law. On the other hand, law is a national phenomenon. The law of one nation and the legal system applied on the territory of another nation may be completely different. People who are experts in a particular field of law in one country may have insufficient expertise in the law of another country. Today, in addition to the local nature of law, international and even supranational law rules are applied in order to protect basic human values and ensure the protection of human rights around the world. Systems that offer algorithmic solutions to legal problems using artificial intelligence (AI) tools will perhaps serve to produce very meaningful results in terms of human rights. However, algorithms to be used should not be developed by only computer experts, but also need the contribution of people who are familiar with law, values, judicial decisions, and even the social and political culture of the society to which it will provide solutions. Otherwise, even if the algorithm works perfectly, it may not be compatible with the values of the society in which it is applied. The latest developments involving the use of AI techniques in legal systems indicate that artificial law will emerge as a new field in the discipline of law. More AI systems are already being applied in the field of law, with examples such as predicting judicial decisions, text summarization, decision support systems, and classification of documents. Algorithms for legal systems employing AI tools, especially in the field of prediction of judicial decisions and decision support systems, have the capacity to create automatic decisions instead of judges. When the judge is removed from this equation, artificial intelligence-made law created by an intelligent algorithm on its own emerges, whether the domain is national or international law. In this work, the aim is to make a general analysis of this new topic. Such an analysis needs both a literature survey and a perspective from computer experts' and lawyers' point of view. In some societies, the use of prediction or decision support systems may be useful to integrate international human rights safeguards. In this case, artificial law can serve to produce more comprehensive and human rights-protective results than written or living law. In non-democratic countries, it may even be thought that direct decisions and artificial intelligence-made law would be more protective instead of a decision "support" system. Since the values of law are directed towards "human happiness or well-being", it requires that the AI algorithms should always be capable of serving this purpose and based on the rule of law, the principle of justice and equality, and the protection of human rights.Keywords: AI and law, artificial law, protection of human rights, AI tools for legal systems
Procedia PDF Downloads 75
6008 A Multi-Criteria Decision Making (MCDM) Approach for Assessing the Sustainability Index of Building Façades
Authors: Golshid Gilani, Albert De La Fuente, Ana Blanco
Abstract:
Sustainability assessment of new and existing buildings has attracted growing interest because of the evident environmental, social and economic impacts buildings generate during construction and over their service life. Façades, as one of the most important exterior elements of a building, may contribute to building sustainability by reducing energy consumption and providing thermal comfort for the inhabitants, thus minimizing the impact on both the building and the environment. Given the importance of this issue, various methods have been used for the sustainability assessment of buildings. However, most existing methods concentrate mainly on environmental and economic aspects, disregarding the third pillar of sustainability, the social aspect. In addition, little attention has been paid to the comprehensive sustainability assessment of façades as an important building element. This confirms the need to develop methods for assessing the sustainability performance of building façades as an important step towards building sustainability. Accordingly, this paper presents a model for assessing the global sustainability of façade systems. For that purpose, the Integrated Value Model for Sustainable Assessment (MIVES), a Multi-Criteria Decision Making model that integrates the main sustainability requirements (economic, environmental and social) and includes the concept of value functions, is used as the assessment tool.
Keywords: façade, MCDM, MIVES, sustainability
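The core MIVES idea referenced above is to map each raw indicator onto a 0-1 satisfaction value through a value function and then aggregate the values with requirement weights into a single sustainability index. The sketch below illustrates that mechanism only: the power-shaped value function, the three indicators and the weights are invented placeholders, not the paper's calibrated MIVES tree.

```python
def value_function(x, x_min, x_max, shape=1.0, decreasing=False):
    """Map a raw indicator x onto a 0-1 satisfaction value.

    A simple power-shaped value function; MIVES typically uses a richer
    S-shaped family, so treat this as an illustrative stand-in.
    """
    t = (x - x_min) / (x_max - x_min)
    t = min(max(t, 0.0), 1.0)
    if decreasing:          # e.g. lower cost or lower CO2 -> higher satisfaction
        t = 1.0 - t
    return t ** shape

# Hypothetical indicators for one facade alternative, with weights summing to 1
# across the economic, environmental and social pillars.
indicators = {
    "construction_cost_eur_m2": (250, dict(x_min=100, x_max=500, decreasing=True, weight=0.35)),
    "embodied_co2_kg_m2":       (80,  dict(x_min=20,  x_max=200, decreasing=True, weight=0.30)),
    "thermal_comfort_score":    (7.5, dict(x_min=0,   x_max=10,  decreasing=False, weight=0.35)),
}

sustainability_index = sum(
    cfg["weight"] * value_function(x, cfg["x_min"], cfg["x_max"], decreasing=cfg["decreasing"])
    for x, cfg in indicators.values()
)
print(f"Sustainability index: {sustainability_index:.3f}")   # 0-1, higher is better
```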
Procedia PDF Downloads 345
6007 Design and Implementation of a Software Platform Based on Artificial Intelligence for Product Recommendation
Authors: Giuseppina Settanni, Antonio Panarese, Raffaele Vaira, Maurizio Galiano
Abstract:
Nowadays, artificial intelligence is used successfully in academia and industry thanks to its ability to learn from large amounts of data. In particular, in recent years the use of machine learning algorithms in e-commerce has spread worldwide. In this research study, a prototype software platform was designed and implemented to suggest to users the products most suitable for their needs. The platform includes a chatbot and a recommender system based on artificial intelligence algorithms that provide suggestions and decision support to the customer. Recommendation systems perform the important function of automatically filtering and personalizing information, allowing users to cope with the information overload to which they are exposed daily. Recently, international research has experimented with machine learning technologies with the aim of increasing the potential of traditional recommendation systems. Specifically, support vector machine algorithms were implemented in combination with natural language processing techniques that allow the user to interact with the system, express requests and receive suggestions. An interested user can access the web platform over the Internet using a computer, tablet or mobile phone, register, provide the necessary information and view the products that the system deems most appropriate for them. The platform also integrates a dashboard that allows its various functions to be used in an intuitive and simple way. The artificial intelligence algorithms were implemented and trained on historical data collected from user browsing. Finally, the testing phase allowed the implemented model to be validated, and it will be further tested by letting customers use it.
Keywords: machine learning, recommender system, software platform, support vector machine
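The combination of a support vector machine with basic natural language processing described above can be sketched as follows: free-text customer requests are vectorized with TF-IDF and classified into a product category whose items are then suggested. The training utterances, categories and catalogue are invented placeholders, not the platform's actual data or architecture.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Hypothetical training utterances labelled with the product category they ask for.
requests = [
    "I need a light laptop for travel", "looking for a powerful gaming notebook",
    "suggest a phone with a good camera", "cheap smartphone with long battery life",
    "quiet vacuum cleaner for a small flat", "robot vacuum for pet hair",
]
categories = ["laptop", "laptop", "phone", "phone", "vacuum", "vacuum"]

# Hypothetical catalogue: category -> recommended items.
catalogue = {
    "laptop": ["UltraBook 13", "GamerPro 17"],
    "phone": ["PixelLite 8", "CamMaster X"],
    "vacuum": ["SilentClean 200", "RoboSweep Mini"],
}

# TF-IDF features + linear SVM: the request-understanding step of the chatbot.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(requests, categories)

def recommend(user_text):
    category = model.predict([user_text])[0]
    return category, catalogue[category]

print(recommend("any notebook good for video editing?"))
```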
Procedia PDF Downloads 134
6006 Detection and Classification of Rubber Tree Leaf Diseases Using Machine Learning
Authors: Kavyadevi N., Kaviya G., Gowsalya P., Janani M., Mohanraj S.
Abstract:
Hevea brasiliensis, also known as the rubber tree, is one of the most valuable crops in the world. One of the most significant advantages of the rubber plant in terms of air oxygenation is its capacity to reduce the likelihood of an individual developing respiratory allergies such as asthma. To build a system that can properly identify crop diseases and pests, a database of insecticides for each pest and disease must also be created so that treatment can be provided for the illness that has been detected. This article focuses primarily on three major leaf diseases that cause significant economic losses: bird's eye spot, algal spot and powdery mildew. The proposed work concentrates on disease identification on rubber tree leaves, accomplished using a convolutional neural network (CNN). The processing pipeline consists of input, preprocessing, image segmentation, feature extraction and classification, replacing the time-consuming manual procedures currently used to detect the disease. Accordingly, the main ailments, their underlying causes, and the signs and symptoms of the diseases that harm the rubber tree are covered in this study.
Keywords: image processing, python, convolution neural network (CNN), machine learning
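Since the keywords point to a convolutional neural network implemented in Python, a minimal Keras sketch of a four-class leaf classifier (healthy plus the three diseases named above) is given below. The image size, directory layout, architecture and hyperparameters are assumptions for illustration, not the authors' actual model.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4   # healthy, bird's eye spot, algal spot, powdery mildew
IMG_SIZE = (128, 128)

# Assumed directory layout: leaf_images/<class_name>/<image>.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "leaf_images", validation_split=0.2, subset="training",
    seed=42, image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "leaf_images", validation_split=0.2, subset="validation",
    seed=42, image_size=IMG_SIZE, batch_size=32)

# Small CNN: three conv/pool blocks followed by a dense classifier head.
model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    layers.Conv2D(16, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```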
Procedia PDF Downloads 76
6005 Mathematics Bridging Theory and Applications for a Data-Driven World
Authors: Zahid Ullah, Atlas Khan
Abstract:
In today's data-driven world, the role of mathematics in bridging the gap between theory and applications is becoming increasingly vital. This abstract highlights the significance of mathematics as a powerful tool for analyzing, interpreting, and extracting meaningful insights from vast amounts of data. By integrating mathematical principles with real-world applications, researchers can unlock the full potential of data-driven decision-making processes. This abstract delves into the various ways mathematics acts as a bridge connecting theoretical frameworks to practical applications. It explores the utilization of mathematical models, algorithms, and statistical techniques to uncover hidden patterns, trends, and correlations within complex datasets. Furthermore, it investigates the role of mathematics in enhancing predictive modeling, optimization, and risk assessment methodologies for improved decision-making in diverse fields such as finance, healthcare, engineering, and social sciences. The abstract also emphasizes the need for interdisciplinary collaboration between mathematicians, statisticians, computer scientists, and domain experts to tackle the challenges posed by the data-driven landscape. By fostering synergies between these disciplines, novel approaches can be developed to address complex problems and make data-driven insights accessible and actionable. Moreover, this abstract underscores the importance of robust mathematical foundations for ensuring the reliability and validity of data analysis. Rigorous mathematical frameworks not only provide a solid basis for understanding and interpreting results but also contribute to the development of innovative methodologies and techniques. In summary, this abstract advocates for the pivotal role of mathematics in bridging theory and applications in a data-driven world. By harnessing mathematical principles, researchers can unlock the transformative potential of data analysis, paving the way for evidence-based decision-making, optimized processes, and innovative solutions to the challenges of our rapidly evolving society.Keywords: mathematics, bridging theory and applications, data-driven world, mathematical models
Procedia PDF Downloads 75
6004 Bi-objective Network Optimization in Disaster Relief Logistics
Authors: Katharina Eberhardt, Florian Klaus Kaiser, Frank Schultmann
Abstract:
Last-mile distribution is one of the most critical parts of a disaster relief operation. Various uncertainties, such as infrastructure conditions, resource availability, and fluctuating beneficiary demand, render last-mile distribution challenging in disaster relief operations. The need to balance critical performance criteria like response time, meeting demand and cost-effectiveness further complicates the task. The occurrence of disasters cannot be controlled, and the magnitude is often challenging to assess. In summary, these uncertainties create a need for additional flexibility, agility, and preparedness in logistics operations. As a result, strategic planning and efficient network design are critical for an effective and efficient response. Furthermore, the increasing frequency of disasters and the rising cost of logistical operations amplify the need to provide robust and resilient solutions in this area. Therefore, we formulate a scenario-based bi-objective optimization model that integrates pre-positioning, allocation, and distribution of relief supplies extending the general form of a covering location problem. The proposed model aims to minimize underlying logistics costs while maximizing demand coverage. Using a set of disruption scenarios, the model allows decision-makers to identify optimal network solutions to address the risk of disruptions. We provide an empirical case study of the public authorities’ emergency food storage strategy in Germany to illustrate the potential applicability of the model and provide implications for decision-makers in a real-world setting. Also, we conduct a sensitivity analysis focusing on the impact of varying stockpile capacities, single-site outages, and limited transportation capacities on the objective value. The results show that the stockpiling strategy needs to be consistent with the optimal number of depots and inventory based on minimizing costs and maximizing demand satisfaction. The strategy has the potential for optimization, as network coverage is insufficient and relies on very high transportation and personnel capacity levels. As such, the model provides decision support for public authorities to determine an efficient stockpiling strategy and distribution network and provides recommendations for increased resilience. However, certain factors have yet to be considered in this study and should be addressed in future works, such as additional network constraints and heuristic algorithms.Keywords: humanitarian logistics, bi-objective optimization, pre-positioning, last mile distribution, decision support, disaster relief networks
Procedia PDF Downloads 79
6003 Prediction of Coronary Artery Stenosis Severity Based on Machine Learning Algorithms
Authors: Yu-Jia Jian, Emily Chia-Yu Su, Hui-Ling Hsu, Jian-Jhih Chen
Abstract:
The coronary arteries are the major suppliers of myocardial blood flow. When fat and cholesterol are deposited in the coronary arterial wall, narrowing and stenosis of the artery occur, which may lead to myocardial ischemia and eventually infarction. According to the World Health Organization (WHO), an estimated 7.4 million people died of coronary heart disease in 2015. According to statistics from the Ministry of Health and Welfare in Taiwan, heart disease (excluding hypertensive diseases) ranked second among the top 10 causes of death from 2013 to 2016, and it still shows a growing trend. According to the American Heart Association (AHA), the risk factors for coronary heart disease include age (> 65 years), sex (men to women at a 2:1 ratio), obesity, diabetes, hypertension, hyperlipidemia, smoking, family history, lack of exercise and others. We collected a dataset of 421 patients from a hospital in northern Taiwan who underwent coronary computed tomography (CT) angiography. There were 300 males (71.26%) and 121 females (28.74%), with ages ranging from 24 to 92 years and a mean age of 56.3 years. Prior to coronary CT angiography, basic patient data, including age, gender, body mass index (BMI), diastolic blood pressure, systolic blood pressure, diabetes, hypertension, hyperlipidemia, smoking, family history of coronary heart disease and exercise habits, were collected and used as input variables. The output variable of the prediction module is the degree of coronary artery stenosis. In this study, the dataset was randomly divided into 80% as the training set and 20% as the test set. Four machine learning algorithms, namely logistic regression, stepwise regression, neural network and decision tree, were used to generate predictions. Using the area under the curve (AUC) and accuracy (Acc.) to compare the four models, the best model was the neural network, followed by stepwise logistic regression, decision tree and logistic regression, with 0.68 / 79%, 0.68 / 74%, 0.65 / 78%, and 0.65 / 74%, respectively. The sensitivity of the neural network was 27.3% and its specificity was 90.8%; stepwise logistic regression had a sensitivity of 18.2% and specificity of 92.3%; the decision tree had a sensitivity of 13.6% and specificity of 100%; and logistic regression had a sensitivity of 27.3% and specificity of 89.2%. Based on these results, we hope in future work to improve accuracy by refining the model parameters or other methods, and to address the low sensitivity by adjusting the imbalanced proportion of positive and negative samples.
Keywords: decision support, computed tomography, coronary artery, machine learning
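A compact scikit-learn sketch of the modelling pipeline described above (80/20 split, several classifier families, AUC and accuracy comparison) follows. A synthetic, imbalanced dataset stands in for the clinical data, which are not public, and stepwise regression is omitted because it is not a standard scikit-learn estimator.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score, accuracy_score

# Synthetic stand-in for the clinical features (age, BMI, blood pressure, ...)
# and a binary label for significant coronary stenosis (positives are the minority).
X, y = make_classification(n_samples=421, n_features=11, n_informative=6,
                           weights=[0.75, 0.25], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "neural network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                    random_state=0),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    proba = clf.predict_proba(X_te)[:, 1]
    pred = clf.predict(X_te)
    print(f"{name:20s} AUC={roc_auc_score(y_te, proba):.2f} "
          f"Acc={accuracy_score(y_te, pred):.2f}")
```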
Procedia PDF Downloads 229
6002 Structure and Properties of Meltblown Polyetherimide as High Temperature Filter Media
Authors: Gajanan Bhat, Vincent Kandagor, Daniel Prather, Ramesh Bhave
Abstract:
Polyetherimide (PEI), an engineering plastic with very high glass transition temperature and excellent chemical and thermal stability, has been processed into a controlled porosity filter media of varying pore size, performance, and surface characteristics. A special grade of the PEI was processed by melt blowing to produce microfiber nonwovens suitable as filter media. The resulting microfiber webs were characterized to evaluate their structure and properties. The fiber webs were further modified by hot pressing, a post processing technique, which reduces the pore size in order to improve the barrier properties of the resulting membranes. This ongoing research has shown that PEI can be a good candidate for filter media requiring high temperature and chemical resistance with good mechanical properties. Also, by selecting the appropriate processing conditions, it is possible to achieve desired filtration performance from this engineering plastic.Keywords: nonwovens, melt blowing, polyehterimide, filter media, microfibers
Procedia PDF Downloads 315
6001 Drying of Agro-Industrial Wastes Using a Cabinet Type Solar Dryer
Authors: N. Metidji, O. Badaoui, A. Djebli, H. Bendjebbas, R. Sellami
Abstract:
The agro-industry is considered as one of the most waste producing industrial fields as a result of food processing. Upgrading and reuse of these wastes as animal or poultry food seems to be a promising alternative. Combined with the use of clean energy resources, the recovery process would contribute more to the environment protection. It is in this framework that a new solar dryer has been designed in the Unit of Solar Equipment Development. Direct solar drying has, also, many advantages compared to natural sun drying. In fact, the first does not cause product degradation as it is protected by the drying chamber from direct sun, insects and exterior environment. The aim of this work is to study the drying kinetics of waste, generated during the processing of pepper, by using a direct natural convection solar dryer at 35◦C and 55◦C. The rate of moisture removal from the product to be dried has been found to be directly related to temperature, humidity and flow rate. The characterization of these parameters has allowed the determination of the appropriate drying time for this product namely peppers waste.Keywords: solar energy, solar dryer, energy conversion, pepper drying, forced convection solar dryer
Procedia PDF Downloads 411
6000 Working Memory and Phonological Short-Term Memory in the Acquisition of Academic Formulaic Language
Authors: Zhicheng Han
Abstract:
This study examines the correlation between knowledge of formulaic language, working memory (WM), and phonological short-term memory (PSTM) in Chinese L2 learners of English. This study investigates if WM and PSTM correlate differently to the acquisition of formulaic language, which may be relevant for the discourse around the conceptualization of formulas. Connectionist approaches have lead scholars to argue that formulas are form-meaning connections stored whole, making PSTM significant in the acquisitional process as it pertains to the storage and retrieval of chunk information. Generativist scholars, on the other hand, argued for active participation of interlanguage grammar in the acquisition and use of formulaic language, where formulas are represented in the mind but retain the internal structure built around a lexical core. This would make WM, especially the processing component of WM an important cognitive factor since it plays a role in processing and holding information for further analysis and manipulation. The current study asked L1 Chinese learners of English enrolled in graduate programs in China to complete a preference raking task where they rank their preference for formulas, grammatical non-formulaic expressions, and ungrammatical phrases with and without the lexical core in academic contexts. Participants were asked to rank the options in order of the likeliness of them encountering these phrases in the test sentences within academic contexts. Participants’ syntactic proficiency is controlled with a cloze test and grammar test. Regression analysis found a significant relationship between the processing component of WM and preference of formulaic expressions in the preference ranking task while no significant correlation is found for PSTM or syntactic proficiency. The correlational analysis found that WM, PSTM, and the two proficiency test scores have significant covariates. However, WM and PSTM have different predictor values for participants’ preference for formulaic language. Both storage and processing components of WM are significantly correlated with the preference for formulaic expressions while PSTM is not. These findings are in favor of the role of interlanguage grammar and syntactic knowledge in the acquisition of formulaic expressions. The differing effects of WM and PSTM suggest that selective attention to and processing of the input beyond simple retention play a key role in successfully acquiring formulaic language. Similar correlational patterns were found for preferring the ungrammatical phrase with the lexical core of the formula over the ones without the lexical core, attesting to learners’ awareness of the lexical core around which formulas are constructed. These findings support the view that formulaic phrases retain internal syntactic structures that are recognized and processed by the learners.Keywords: formulaic language, working memory, phonological short-term memory, academic language
Procedia PDF Downloads 63
5999 Different Motor Inhibition Processes in Action Selection Stage: A Study with Spatial Stroop Paradigm
Authors: German Galvez-Garcia, Javier Albayay, Javiera Peña, Marta Lavin, George A. Michael
Abstract:
The aim of this research was to investigate whether the selection of the actions needs different inhibition processes during the response selection stage. In Experiment 1, we compared the magnitude of the Spatial Stroop effect, which occurs in response selection stage, in two motor actions (lifting vs reaching) when the participants performed both actions in the same block or in different blocks (mixed block vs. pure blocks).Within pure blocks, we obtained faster latencies when lifting actions were performed, but no differences in the magnitude of the Spatial Stroop effect were observed. Within mixed block, we obtained faster latencies as well as bigger-magnitude for Spatial Stroop effect when reaching actions were performed. We concluded that when no action selection is required (the pure blocks condition), inhibition works as a unitary system, whereas in the mixed block condition, where action selection is required, different inhibitory processes take place within a common processing stage. In Experiment 2, we investigated this common processing stage in depth by limiting participants’ available resources, requiring them to engage in a concurrent auditory task within a mixed block condition. The Spatial Stroop effect interacted with Movement as it did in Experiment 1, but it did not significantly interact with available resources (Auditory task x Spatial Stroop effect x Movement interaction). Thus, we concluded that available resources are distributed equally to both inhibition processes; this reinforces the likelihood of there being a common processing stage in which the different inhibitory processes take place.Keywords: inhibition process, motor processes, selective inhibition, dual task
Procedia PDF Downloads 392
5998 How Rational Decision-Making Mechanisms of Individuals Are Corrupted under the Presence of Others and the Reflection of This on Financial Crisis Management Situations
Authors: Gultekin Gurcay
Abstract:
It is known that the most crucial influence of the psychological, social and emotional factors that affect any human behavior is to corrupt the rational decision making mechanism of the individuals and cause them to display irrational behaviors. In this regard, the social context of human beings influences the rationality of our decisions, and people tend to display different behaviors when they were alone compared to when they were surrounded by others. At this point, the interaction and interdependence of the behavioral finance and economics with the area of social psychology comes, where intentions and the behaviors of the individuals are being analyzed in the actual or implied presence of others comes into prominence. Within the context of this study, the prevalent theories of behavioral finance, which are The Prospect Theory, The Utility Theory Given Uncertainty and the Five Axioms of Choice under Uncertainty, Veblen’s Hidden Utility Theory, and the concept of ‘Overreaction’ has been examined and demonstrated; and the meaning, existence and validity of these theories together with the social context has been assessed. Finally, in this study the behavior of the individuals in financial crisis situations where the majority of the society is being affected from the same negative conditions at the same time has been analyzed, by taking into account how individual behavior will change according to the presence of the others.Keywords: conditional variance coefficient, financial crisis, garch model, stock market
Procedia PDF Downloads 240
5997 A Framework for Early Differential Diagnosis of Tropical Confusable Diseases Using the Fuzzy Cognitive Map Engine
Authors: Faith-Michael E. Uzoka, Boluwaji A. Akinnuwesi, Taiwo Amoo, Flora Aladi, Stephen Fashoto, Moses Olaniyan, Joseph Osuji
Abstract:
The overarching aim of this study is to develop a soft-computing system for the differential diagnosis of tropical diseases. These conditions are of concern to health bodies, physicians, and the community at large because of their mortality rates, and difficulties in early diagnosis due to the fact that they present with symptoms that overlap, and thus become ‘confusable’. We report on the first phase of our study, which focuses on the development of a fuzzy cognitive map model for early differential diagnosis of tropical diseases. We used malaria as a case disease to show the effectiveness of the FCM technology as an aid to the medical practitioner in the diagnosis of tropical diseases. Our model takes cognizance of manifested symptoms and other non-clinical factors that could contribute to symptoms manifestations. Our model showed 85% accuracy in diagnosis, as against the physicians’ initial hypothesis, which stood at 55% accuracy. It is expected that the next stage of our study will provide a multi-disease, multi-symptom model that also improves efficiency by utilizing a decision support filter that works on an algorithm, which mimics the physician’s diagnosis process.Keywords: medical diagnosis, tropical diseases, fuzzy cognitive map, decision support filters, malaria differential diagnosis
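For readers unfamiliar with fuzzy cognitive maps, the sketch below shows the standard iterative FCM inference step: concept activations (symptoms and non-clinical factors) are propagated through a signed weight matrix and squashed with a sigmoid until the map stabilises. The concepts and weights are invented for illustration and are not the study's calibrated malaria model.

```python
import numpy as np

def fcm_infer(weights, initial_state, steps=30, lam=1.0):
    """Iterate a fuzzy cognitive map until (approximate) convergence.

    weights[i, j] is the causal influence of concept i on concept j in [-1, 1];
    states are squashed to (0, 1) with a logistic function.
    """
    state = initial_state.astype(float).copy()
    for _ in range(steps):
        nxt = 1.0 / (1.0 + np.exp(-lam * (state + state @ weights)))
        if np.allclose(nxt, state, atol=1e-5):
            break
        state = nxt
    return state

# Hypothetical concepts: fever, chills, headache, recent travel to an endemic area,
# and an output concept "malaria likelihood".
concepts = ["fever", "chills", "headache", "travel_endemic", "malaria"]
W = np.array([
    #  fev  chi  hea  tra  mal
    [0.0, 0.0, 0.0, 0.0, 0.6],   # fever -> malaria
    [0.0, 0.0, 0.0, 0.0, 0.5],   # chills -> malaria
    [0.0, 0.0, 0.0, 0.0, 0.3],   # headache -> malaria
    [0.0, 0.0, 0.0, 0.0, 0.7],   # recent travel -> malaria
    [0.0, 0.0, 0.0, 0.0, 0.0],   # malaria (output) influences nothing here
])

patient = np.array([1.0, 1.0, 0.0, 1.0, 0.0])   # observed symptom activations
final = fcm_infer(W, patient)
print(dict(zip(concepts, final.round(2))))
```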
Procedia PDF Downloads 320
5996 A Study on the Different Components of a Typical Back-Scattered Chipless RFID Tag Reflection
Authors: Fatemeh Babaeian, Nemai Chandra Karmakar
Abstract:
A chipless RFID system is a wireless tracking and identification system that uses passive tags to encode data. The advantage of a chipless RFID tag is that it is a planar tag printable on low-cost materials such as paper and plastic, so the printed tag can be attached to different items at the labelling stage. Since the price of a chipless RFID tag can be as low as a fraction of a cent, this technology has the potential to compete with conventional optical barcode labels. However, because of the passive structure of the tag, processing of the reflected signal is a crucial challenge. The captured signal reflected from a tag attached to an item consists of different components: the reflection from the reader antenna, the reflection from the item, the structural-mode RCS component of the tag, and the antenna-mode RCS of the tag. All of these components are summed in both the time and frequency domains. The reflections from the item and the structural-mode RCS component can distort or saturate the frequency-domain signal and make it difficult to extract the desired component, the antenna-mode RCS. It is therefore necessary to study the tag reflection in both the time and frequency domains to better understand the nature of the captured chipless RFID signal. Further benefits of this study are finding an optimised encoding technique at the tag design level and identifying the best processing algorithm for the chipless RFID signal at the decoding level. In this paper, the reflection from a typical backscattered chipless RFID tag with six resonances is analysed, and the different components of the signal are separated in both the time and frequency domains; in addition, the time-domain signal corresponding to each resonator of the tag is studied. The data for this analysis were obtained from simulation in CST Microwave Studio 2017. The outcome of this study is an understanding of the different components of a measured signal in a chipless RFID system and the identification of a research gap, namely the need for an optimum detection algorithm for tag ID extraction.
Keywords: antenna mode RCS, chipless RFID tag, resonance, structural mode RCS
Procedia PDF Downloads 200
5995 Wobbled Laser Beam Welding for Macro-to Micro-Fabrication Process
Authors: Farzad Vakili-Farahani, Joern Lungershausen, Kilian Wasmer
Abstract:
Wobbled laser beam welding, i.e., fast oscillation of a tiny laser beam along a designed path (the weld geometry) during the laser pulse illumination, opens new possibilities for improving macro- to micro-manufacturing processes. The present work introduces wobbled laser beam welding as a robust welding strategy for improving macro- to micro-fabrication, e.g., laser processing for gap bridging and for the packaging industry. The typical requisites and relevant equipment for developing a wobbled laser processing unit are addressed, including a suitable laser source, light delivery system, optics, a proper beam deflection system and the design geometry. In addition, experiments have been carried out on titanium plates to compare the results of wobbled laser welding with conventional pulsed laser welding. Compared with pulsed laser welding, wobbled laser welding offers a much greater fusion area (i.e., additional molten material) while minimizing the heat-affected zone (HAZ) and providing better confinement of the microstructural changes in the material.
Keywords: wobbled laser beam welding, wobbling function, beam oscillation, micro welding
Procedia PDF Downloads 328
5994 Consumer Behaviour Model for Apparel E-Tailers Using Structural Equation Modelling
Authors: Halima Akhtar, Abhijeet Chandra
Abstract:
The paper analyzes the factors that influence consumer behaviour when purchasing apparel over the Internet. Intentions to buy apparel online were examined in terms of user style, orientation, size and reputation of the merchant, social influence, perceived information utility, perceived ease of use, perceived pleasure and attractiveness, and perceived trust and risk. The Technology Acceptance Model was used as the basic framework to explain apparel acceptance. A survey was conducted to gather data from 200 people. The measures and hypotheses were analyzed using correlation testing and will be further validated with structural equation modelling. The implications of the findings for theory and practice can be used by marketers of online apparel websites. Based on the values obtained, we conclude that factors such as social influence, perceived information utility, attractiveness and trust influence a user's decision to buy apparel online. The major factors found to influence an online apparel buying decision are ease of use, the attractiveness a website can offer, and the trust a user places in the website.
Keywords: E-tailers, consumer behaviour, technology acceptance model, structural modelling
Procedia PDF Downloads 186
5993 Moral Decision-Making in the Criminal Justice System: The Influence of Gruesome Descriptions
Authors: Michel Patiño-Sáenz, Martín Haissiner, Jorge Martínez-Cotrina, Daniel Pastor, Hernando Santamaría-García, Maria-Alejandra Tangarife, Agustin Ibáñez, Sandra Baez
Abstract:
It has been shown that gruesome descriptions of harm can increase the punishment given to a transgressor. This biasing effect is mediated by negative emotions, which are elicited upon the presentation of gruesome descriptions. However, there is a lack of studies inquiring the influence of such descriptions on moral decision-making in people involved in the criminal justice system. Such populations are of special interest since they have experience dealing with gruesome evidence, but also formal education on how to assess evidence and gauge the appropriate punishment according to the law. Likewise, they are expected to be objective and rational when performing their duty, because their decisions can impact profoundly people`s lives. Considering these antecedents, the objective of this study was to explore the influence gruesome written descriptions on moral decision-making in this group of people. To that end, we recruited attorneys, judges and public prosecutors (Criminal justice group, CJ, n=30) whose field of specialty is criminal law. In addition, we included a control group of people who did not have a formal education in law (n=30), but who were paired in age and years of education with the CJ group. All participants completed an online, Spanish-adapted version of a moral decision-making task, which was previously reported in the literature and also standardized and validated in the Latin-American context. A series of text-based stories describing two characters, one inflicting harm on the other, were presented to participants. Transgressor's intentionality (accidental vs. intentional harm) and language (gruesome vs. plain) used to describe harm were manipulated employing a within-subjects and a between-subjects design, respectively. After reading each story, participants were asked to rate (a) the harmful action's moral adequacy, (b) the amount of punishment deserving the transgressor and (c) how damaging was his behavior. Results showed main effects of group, intentionality and type of language on all dependent measures. In both groups, intentional harmful actions were rated as significantly less morally adequate, were punished more severely and were deemed as more damaging. Moreover, control subjects deemed more damaging and punished more severely any type of action than the CJ group. In addition, there was an interaction between intentionality and group. People in the control group rated harmful actions as less morally adequate than the CJ group, but only when the action was accidental. Also, there was an interaction between intentionality and language on punishment ratings. Controls punished more when harm was described using gruesome language. However, that was not the case of people in the CJ group, who assigned the same amount of punishment in both conditions. In conclusion, participants with job experience in the criminal justice system or criminal law differ in the way they make moral decisions. Particularly, it seems that they are less sensitive to the biasing effect of gruesome evidence, which is probably explained by their formal education or their experience in dealing with such evidence. Nonetheless, more studies are needed to determine the impact this phenomenon has on the fulfillment of their duty.Keywords: criminal justice system, emotions, gruesome descriptions, intentionality, moral decision-making
Procedia PDF Downloads 188
5992 The Antecedent Variables of Government Financial Accounting System (SAKD) Implementation and Its Consequences: Empirical Study on the Device of Regional Coordinating Agency for Development of Cross County, City Region III Central Java Province, Indonesia
Authors: Dona Primasari
Abstract:
This study examines the antecedent variables of Government Financial Accounting System (SAKD) implementation and its consequences. The antecedent variables are: decentralization of decision making, adaptation, and manager support. The consequences are officer satisfaction and performance. This research represents an empirical test which used convenience sampling techniques in data collection. The data were collected from 167 officers of local government in the Regional Coordinating Agency for Development of Cross County/City Region III Central Java Province. Data analysis used Structural Equation Modeling (SEM) with the AMOS 18.0 program. The results of hypothesis testing indicate that six of the proposed hypotheses are accepted and two are rejected.Keywords: decentralization of decision making, officer adaptation, manager support, implementation of Government Financial Accounting System (SAKD), officer satisfaction and performance
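The abstract reports the analysis only at a high level (SEM estimated in AMOS 18.0). Below is a minimal sketch of a comparable path model in the open-source semopy package; the construct names, path structure, and data file are assumptions for illustration and do not reproduce the authors' actual AMOS model.

# Minimal sketch (hypothetical specification, not the authors' AMOS model):
# antecedents -> SAKD implementation -> officer satisfaction and performance.
import pandas as pd
import semopy

model_desc = """
implementation ~ decentralization + adaptation + manager_support
satisfaction   ~ implementation
performance    ~ implementation
"""

# A data frame with one row per officer (n = 167) and composite scores per
# construct is assumed; the CSV file name here is hypothetical.
df = pd.read_csv("sakd_survey_scores.csv")

model = semopy.Model(model_desc)
model.fit(df)
print(model.inspect())   # path coefficients, standard errors, p-values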
Procedia PDF Downloads 389
5991 Strategic Innovation of Nanotechnology: Novel Applications of Biomimetics and Microfluidics in Food Safety
Authors: Boce Zhang
Abstract:
Strategic innovation of nanotechnology to promote food safety has drawn tremendous attention among research groups, including the need for research support during the implementation of the Food Safety Modernization Act (FSMA) in the United States. There are urgent demands and knowledge gaps in understanding a) the food-water-bacteria interface, specifically how pathogens persist and transmit during food processing and storage; and b) the minimum processing requirements needed to prevent pathogen cross-contamination in the food system. These knowledge gaps are of critical importance to the food industry. However, filling them is largely hindered by the limitations of current research tools. Our group recently developed two novel engineering systems based on biomimetics and microfluidics as a holistic approach to hazard analysis and risk mitigation, which provides unprecedented research opportunities to study pathogen behavior, in particular contamination and cross-contamination, at the critical food-water-pathogen interface. First, biomimetically-patterned surfaces (BPS) were developed to replicate the identical surface topography and chemistry of a natural food surface. We demonstrated that BPS is a superior research tool that empowers the study of a) how pathogens persist through sanitizer treatment, b) how to apply fluidic shear force and surface tension to increase the vulnerability of bacterial cells by detaching them from a protected area, etc. Secondly, microfluidic devices were designed and fabricated to study bactericidal kinetics in the sub-second time frame (0.1~1 second). The sub-second kinetics are critical because the cross-contamination process, which includes detachment, migration, and reattachment, can occur in a very short timeframe. With this microfluidic device, we were able to simulate and study these sub-second cross-contamination scenarios, and to further investigate the minimum sanitizer concentration needed to sufficiently prevent pathogen cross-contamination during food processing. We anticipate that the findings from these studies will provide critical insight into bacterial behavior at the food-water-cell interface and into the kinetics of bacterial inactivation under a broad range of sanitizers and processing conditions, thus facilitating the development and implementation of science-based food safety regulations and practices to mitigate food safety risks.Keywords: biomimetic materials, microbial food safety, microfluidic device, nanotechnology
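The abstract does not name a kinetic model for the sub-second inactivation measurements. As a minimal sketch, a first-order Chick-Watson model (log10(N/N0) = -k * C^n * t) is one common way to relate sanitizer concentration, contact time, and log reduction; the rate constant, dilution coefficient, and target values below are hypothetical and serve only to show how a minimum sanitizer concentration could be estimated for sub-second contact times.

# Minimal sketch (assumed Chick-Watson kinetics, not taken from the abstract):
# predicted log10 reduction over 0.1-1 s exposures and the minimum sanitizer
# concentration for a target reduction. All parameter values are illustrative.
k = 0.2      # hypothetical rate constant, L/(mg*s) for n = 1
n = 1.0      # hypothetical dilution coefficient
C = 50.0     # hypothetical sanitizer concentration, mg/L

def log_reduction(conc_mg_per_l, time_s):
    """log10 reduction predicted by the Chick-Watson model."""
    return k * conc_mg_per_l**n * time_s

# Predicted reductions across the sub-second window studied (0.1-1 s).
for t in (0.1, 0.25, 0.5, 1.0):
    print(f"t = {t:4.2f} s -> {log_reduction(C, t):.1f} log10 reduction")

# Minimum concentration for a 5-log reduction within a 0.5 s contact time.
target_log, contact_time = 5.0, 0.5
c_min = (target_log / (k * contact_time)) ** (1.0 / n)
print(f"Minimum concentration for {target_log}-log in {contact_time} s: {c_min:.1f} mg/L")

Fitting k and n to the measured sub-second survival data would be the corresponding empirical step; the sketch only illustrates the arithmetic of the model.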
Procedia PDF Downloads 359