Search results for: midpoint rule
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 799


619 Combat Capability Improvement Using Sleep Analysis

Authors: Gabriela Kloudova, Miloslav Stehlik, Peter Sos

Abstract:

The quality of sleep can affect combat performance, where vigilance, accuracy and reaction time are decisive factors. In the present study, airborne and special units are measured on duty using an actigraphy fingerprint scoring algorithm and QEEG (quantitative EEG). The actigraphic variables of interest are: mean nightly sleep duration, mean napping duration, mean 24-h sleep duration, mean sleep latency, mean sleep maintenance efficiency, mean sleep fragmentation index, mean sleep onset time, mean sleep offset time and mean midpoint time. To determine the individual somnotype of each subject, data such as sleep pattern, chronotype (morning and evening lateness), biological need for sleep (daytime and anytime sleepability) and trototype (daytime and anytime wakeability) will be extracted. Subsequently, a series of recommendations will be included in the training plan, based on daily routine, timing of day and night activities, duration of sleep and the number of sleeping blocks in a defined time. The aim of these modifications to the training plan is to reduce daytime sleepiness; improve vigilance, attention, accuracy and speed on conducted tasks; and optimize energy supplies. Regular improvement of the training is expected to have long-term neurobiological consequences, including changes in neuronal activity measured by QEEG. That, in turn, should enhance cognitive functioning, as assessed by digital cognitive test batteries, and improve the subjects' overall performance.
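
As an illustration of the "mean midpoint time" variable above, the midpoint of a sleep episode can be computed from its onset and offset clock times, rolling past midnight when necessary. The sketch below is illustrative only; the function name and time format are assumptions, not part of the study's actigraphy pipeline.

```python
from datetime import datetime, timedelta

def sleep_midpoint(onset: str, offset: str) -> str:
    """Clock-time midpoint of a sleep episode given "HH:MM" onset/offset.

    Episodes that cross midnight are handled by rolling the offset
    forward one day before averaging.
    """
    fmt = "%H:%M"
    t_on = datetime.strptime(onset, fmt)
    t_off = datetime.strptime(offset, fmt)
    if t_off <= t_on:                      # slept past midnight
        t_off += timedelta(days=1)
    mid = t_on + (t_off - t_on) / 2
    return mid.strftime(fmt)

# A 23:30 onset with a 07:00 offset gives a 03:15 midpoint.
```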

Keywords: sleep quality, combat performance, actigraph, somnotype

Procedia PDF Downloads 134
618 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering

Authors: Sara Hasani

Abstract:

This research focuses on natural sudden onset disasters, characterised as ‘occurring with little or no warning and often causing excessive injuries far surpassing the national response capacities’. Based on a panel analysis of the historical record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to estimate the human impact of a disaster (fatalities, injured, homeless) with less than 3% error. The geographical dispersion of the disasters includes every country where the data were available and cross-examined from various humanitarian sources. The records were then filtered into 4,252 records of disasters in which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure was designed as a combination of pattern recognition techniques and rule-based clustering for prediction, with discrimination analysis to further validate the results. The results indicate that there is a relationship between the human impact of a disaster and the five socio-economic characteristics of the affected country mentioned above. As a result, a framework was put forward that can predict a disaster’s human impact based on its severity rank in the early hours after the disaster strikes. The predictions in this model are outlined as worst- and best-case scenarios, which respectively give the upper and lower bounds of the prediction. The need for such a predictive framework is highlighted by the fact that, despite the existing research in the literature, a framework for predicting the human impact and estimating needs at the time of a disaster has yet to be developed. The framework can further be used to allocate resources in the response phase of the disaster, when data are scarce.
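
The best/worst-case prediction described above can be sketched as a rule-based grouping of historical records followed by a range prediction for new cases. Everything below — the thresholds, the toy records, and the function names — is a hypothetical illustration, not the paper's actual rule base or data.

```python
# Hypothetical historical records: (disaster_type, HDI, population_density, fatalities)
records = [
    ("flood", 0.45, 300, 1200), ("flood", 0.48, 280, 900),
    ("earthquake", 0.80, 150, 40), ("earthquake", 0.82, 160, 25),
]

def severity_rank(hdi: float, density: float) -> str:
    """Toy rule base: low HDI plus high population density -> higher severity."""
    return "high" if hdi < 0.55 and density > 200 else "low"

def predict_impact_range(hdi: float, density: float):
    """Best- and worst-case fatality range from records sharing the same rank."""
    rank = severity_rank(hdi, density)
    cluster = [f for (_, h, d, f) in records if severity_rank(h, d) == rank]
    return min(cluster), max(cluster)   # (best case, worst case)

# A low-HDI, dense country falls into the 'high' cluster: range (900, 1200).
```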

Keywords: disaster management, natural disaster, pattern recognition, prediction

Procedia PDF Downloads 128
617 Revisiting Dispute Resolution Mechanisms in the Southern African Development Community: A Proposal for Synchronization

Authors: Tapiwa Shumba, Nyaradzo D. T. Karubwa

Abstract:

Dispute resolution is the plinth of regional integration initiatives anchored on the rule of law and compliance with obligations. Without effective and reliable dispute resolution mechanisms, it may be difficult to foster deeper integration. Within the Southern African Development Community (SADC) legal and institutional framework there exists an apparent recognition that dispute resolution is an integral part of regional integration. Almost all SADC legal instruments include some provision for dispute resolution. Institutionally, the now somewhat defunct SADC Tribunal is meant to be the fulcrum for resolving disputes that arise under SADC instruments. However, on closer analysis of the substance of these legal provisions and the attendant procedural mechanisms for addressing disputes, an argument can be made that dispute resolution in SADC is scant, fragmented and neglected. In most instruments, the common provision on dispute resolution appears to be a ‘midnight clause’. In other instruments, which have specialised provisions and procedures, questions of practicality and ingenuity cannot be avoided. Worse still, there now appears to be a lack of harmony between the substantive provisions in the various instruments and the role of the transformed Tribunal. This scant, fragmented and neglected dispute resolution system may have an impact on the observance of the rule of law and compliance with obligations in the rules-based SADC system. This, in turn, has an effect on the common agenda for deeper regional integration. This article seeks to expose this scant, fragmented and neglected SADC dispute resolution system and to propose a harmonised system that addresses these challenges. A ‘one-stop shop’ system under a strengthened SADC Tribunal is proposed as a responsive solution.

Keywords: regional integration, harmonisation, SADC tribunal, dispute resolution

Procedia PDF Downloads 167
616 Analyzing Speed Disparity in Mixed Vehicle Technologies on Horizontal Curves

Authors: Tahmina Sultana, Yasser Hassan

Abstract:

Vehicle technologies are evolving rapidly owing to their multifaceted advantages. Deploying different vehicle technologies, such as connectivity and automation, on the same roads as conventional vehicles controlled by human drivers may increase speed disparity in such mixed traffic. Identifying relationships between the speed distribution measures of different vehicles and road geometry can serve as an indicator of speed disparity in mixed technologies. Previous studies have shown that speed disparity measures and traffic accidents are inextricably related. Horizontal curves from three geographic areas were selected based on relevant criteria, and speed data were collected at the midpoint of the preceding tangent and at the starting, middle, and ending points of the curve. Multiple linear mixed-effect models (LME) were developed using instantaneous speed measures, representing the speed of vehicles at different points on the horizontal curves, to identify relationships between speed variance (standard deviation) and road geometry. A simulation-based (Monte Carlo) framework was introduced to examine speed disparity on horizontal curves in mixed vehicle technologies, accounting for the interactions among connected vehicles (CVs), autonomous vehicles (AVs), and non-connected vehicles (NCVs). The Monte Carlo method was used in the simulation to randomly sample values for the various parameters from their respective distributions. The results show that NCVs had higher speed variation than CVs and AVs. In addition, AVs and CVs helped reduce speed disparity in mixed vehicle technologies at all penetration rates.
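
The Monte Carlo step above can be sketched as follows. The speed distributions assigned to each vehicle class are purely illustrative assumptions (the study derives its inputs from field data and LME models); only the mechanism — sampling a mixed fleet and reporting the speed standard deviation — is shown.

```python
import random
from statistics import stdev

def simulate_speed_sd(p_cv: float, p_av: float, n: int = 10_000) -> float:
    """Monte Carlo sketch: sample curve speeds (km/h) for a mixed fleet and
    return their standard deviation as a speed-disparity measure.

    Assumed (illustrative) speed distributions: NCVs are the most variable,
    AVs the least; all classes share a 70 km/h mean on the curve.
    """
    speeds = []
    for _ in range(n):
        r = random.random()
        if r < p_av:                         # autonomous vehicle
            speeds.append(random.gauss(70, 2))
        elif r < p_av + p_cv:                # connected vehicle
            speeds.append(random.gauss(70, 4))
        else:                                # non-connected, human-driven
            speeds.append(random.gauss(70, 9))
    return stdev(speeds)

random.seed(42)
# Raising CV/AV penetration shrinks the fleet-wide speed standard deviation.
all_ncv = simulate_speed_sd(0.0, 0.0)
mixed = simulate_speed_sd(0.4, 0.4)
```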

Keywords: autonomous vehicles, connected vehicles, non-connected vehicles, speed variance

Procedia PDF Downloads 110
615 Fatigue Life Prediction under Variable Loading Based on a Non-Linear Energy Model

Authors: Aid Abdelkrim

Abstract:

A method of fatigue damage accumulation based upon the energy parameters of the fatigue process is proposed in this paper. The model is simple to use: it has no parameters to be determined and requires only knowledge of the W–N curve (W: strain energy density; N: number of cycles at failure) derived from the experimental Wöhler curve. To examine the performance of the proposed nonlinear models in estimating the fatigue damage and fatigue life of components under random loading, a batch of specimens made of 6082-T6 aluminium alloy was studied, and some of the results are reported in the present paper. The paper describes an algorithm and suggests a cumulative fatigue damage model, especially for random loading. This work contains the results of uniaxial random-load fatigue tests with different mean and amplitude values performed on 6082-T6 aluminium alloy specimens. The proposed model is formulated to take into account the damage evolution at different load levels, and it captures the effect of the loading sequence by means of a recurrence formula derived for multilevel loading, considering complex load sequences. It is concluded that the ‘damaged stress interaction damage rule’ proposed here allows better fatigue damage prediction than the widely used Palmgren–Miner rule, and that the formula derived for random fatigue can easily be used to predict fatigue damage and fatigue lifetime. The results obtained with the model are compared with the experimental results and with those calculated by the fatigue damage model most widely used in fatigue analysis (Miner’s model). The comparison shows that the proposed model gives a good estimation of the experimental results; moreover, its error is smaller than that of Miner’s model.
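
For context, the Palmgren–Miner baseline against which the energy model is compared sums damage fractions linearly and ignores load-sequence effects. A minimal sketch (the load-block numbers below are invented for illustration, not the paper's test data):

```python
def miner_damage(blocks):
    """Palmgren-Miner linear damage sum D = sum(n_i / N_i); failure at D >= 1.

    blocks: iterable of (n_applied, N_failure) pairs, one per load level.
    """
    return sum(n / N for n, N in blocks)

# Hypothetical two-level loading: 40,000 cycles at a 100,000-cycle life,
# then 30,000 cycles at a 1,000,000-cycle life.
D = miner_damage([(40_000, 100_000), (30_000, 1_000_000)])   # 0.4 + 0.03 = 0.43
# Note: D is the same regardless of block order -- exactly the sequence
# insensitivity that the proposed nonlinear energy model aims to correct.
```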

Keywords: damage accumulation, energy model, damage indicator, variable loading, random loading

Procedia PDF Downloads 369
614 The Roman Fora in North Africa: Towards a Supportive Protocol to the Decision for the Morphological Restitution

Authors: Dhouha Laribi Galalou, Najla Allani Bouhoula, Atef Hammouda

Abstract:

This research delves into the fundamental question of the morphological restitution of built archaeology, in order to place it in its paradigmatic context and to seek answers to it. Indeed, understanding the object of study, analysing it, and devising a methodology for solving the morphological problem posed are manageable only by means of a thoughtful strategy that draws on well-defined epistemological scaffolding. In this stream, the crisis of natural reasoning in archaeology has generated multiple changes in the field, ranging from the use of new tools to the integration of archaeological information systems, where urbanization involves the interplay of several disciplines. The built archaeological object is also an architectural and morphological one: a set of articulated elementary data whose understanding can be approached from a logicist point of view. Morphological restitution is no exception to the rule, and the interchange between the different disciplines uses the capacity of each to frame reflection on the incomplete elements of a given architecture, or on its different phases and multiple states of existence. The logicist sequence is furnished by the set of scattered or destroyed elements found, but also by what can be called a rule base, which contains the set of rules for the architectural construction of the object. The knowledge base, built from the archaeological literature, also provides a reference that enters into the search for forms and articulations. The choice of the Roman fora in North Africa is justified by the strong urban and architectural character of these entities. Research on the forum draws on a fairly large knowledge base and also provides the researcher with material to study, from a morphological and architectural point of view, from the scale of the city down to the architectural detail.
The experimentation with the knowledge deduced at the paradigmatic level, as well as the deduction of an analysis model, is then carried out on the basis of a well-defined context, which grounds the experimentation from the elaboration of the morphological information container attached to the rule base and the knowledge base. The use of logicist analysis and artificial intelligence has allowed us first to question the aspects already known, in order to measure the credibility of our system, which remains above all a decision-support tool for the morphological restitution of the Roman fora in North Africa. This paper presents a first experiment with the model elaborated during this research, a model framed by a paradigmatic discussion that tries to position the research in relation to existing paradigmatic and experimental knowledge on the issue.

Keywords: classical reasoning, logicist reasoning, archaeology, architecture, Roman forum, morphology, calculation

Procedia PDF Downloads 116
613 Investigating the Environmental Impact of Additive Manufacturing Compared to Conventional Manufacturing through Life Cycle Assessment

Authors: Gustavo Menezes De Souza Melo, Arnaud Heitz, Johannes Henrich Schleifenbaum

Abstract:

Additive manufacturing is a growing market that is gaining ground in many industries, as it offers numerous advantages such as new design possibilities, weight-saving solutions, ease of manufacture, and simplification of assemblies. These are all unquestionable technical or financial assets. As to the environmental aspect, it is often debated whether additive manufacturing is the best solution to decarbonize our industries or whether conventional manufacturing remains cleaner. This work presents a life cycle assessment (LCA) comparison based on the technological case of a motorbike swing-arm. We compare the original equipment manufacturer part, made with conventional manufacturing (CM) methods, to an additive manufacturing (AM) version printed using the laser powder bed fusion process. The AM version has been modified and optimized to achieve better dynamic performance without any regard to weight saving. Lightweighting not being a priority in the creation of the 3D-printed part gives this study a unique perspective. To carry out the LCA, we use the open-source life cycle and sustainability software OpenLCA combined with the ReCiPe 2016 method at the midpoint and endpoint levels. This allows the calculation and presentation of the results through indicators such as global warming, water use, and resource scarcity. The results then show the relative impact of the AM version compared to the CM one and provide a key to understanding and answering questions about the environmental sustainability of additive manufacturing.

Keywords: additive manufacturing, environmental impact, life cycle assessment, laser powder bed fusion

Procedia PDF Downloads 225
612 The Jury System in the Courts in Nineteenth Century Assam: Power Negotiations and Politics in an Institutional Rubric of a Colonial Regime

Authors: Jahnu Bharadwaj

Abstract:

In the third decade of the 19th century, the political landscape of the Brahmaputra valley changed at many levels. The establishment of the East India Company’s authority in ‘Assam’ was completed with the Treaty of Yandaboo. The annexation of Assam into the British Indian Empire led to several administrative reorganizations and reforms under the new regime. British colonial rule was distinguished by new systems and institutions of governance. This paper broadly examines the historical proceedings of the introduction of the rule of law and a new legal structure in the region of ‘Assam’. Drawing on extensive archival data, it chiefly examines the trajectory of an important element of the new legal apparatus, i.e., the jury, in the British criminal courts introduced in the newly annexed region. Right from the beginning of colonial legal innovation, with the establishment of the panchayats and the parallel courts in Assam, the jury became an important element in the structure of the judicial system. In both civil and criminal courts, the jury was to be formed from the learned members of ‘native’ society. In the working of the criminal courts, the jury became significantly powerful and influential. The structure meant that the judge, or the British authority, was ultimately under no compulsion to obey the verdict of the jury. However, the structure also provided that the jury had a considerable say in court proceedings, and its verdict carried significant weight. This study examines certain important criminal cases of the nineteenth century and the functioning of the jury in those cases. The power play between British officials, judges and members of the jury helps highlight the important deliberations and politics at work in the functioning of the British criminal legal apparatus in colonial Assam.
The workings and politics of the members of the jury exerted considerable influence on court proceedings in many cases, and the negotiations of the British officials and judges offer vital insights. By reflecting on the difficulty that British officials and judges felt with the considerable space for opinion and dissent afforded to important members of local society, this paper seeks to locate, with evidence, the racial politics at play within the official formulations of the legal apparatus under colonial rule in Assam. It argues that despite rhetorical claims of legal equality within the Empire, racial considerations and racial politics were a reality even in the making of the structure itself. This, in turn, helps enrich our understanding of the racial elements at work in the numerous layers sustaining the colonial regime.

Keywords: criminal courts, colonial regime, jury, race

Procedia PDF Downloads 149
611 Optimal Operation of Bakhtiari and Roudbar Dam Using Differential Evolution Algorithms

Authors: Ramin Mansouri

Abstract:

Given the contrast between river discharge regimes and water demands, one of the best ways to use water resources is to regulate the natural flow of rivers and to construct dams to supply water needs. In the optimal utilization of reservoirs, consideration of multiple important goals at the same time is of very high importance. To analyze this method, statistical data for the Bakhtiari and Roudbar dams over 46 years (1955 to 2001) are used. Initially, an appropriate objective function was specified and, using the DE algorithm, the rule curve was developed. Next, the operation policy using rule curves was compared to the standard operation policy. The proposed method distributed the deficit over the whole year, and the lowest damage was inflicted on the system. The standard deviation of the monthly shortfall of each year with the proposed algorithm deviated less than with the other two methods. The results show that median values of the coefficients F and Cr provide the optimum situation and prevent the DE algorithm from being trapped in a local optimum. The most optimal values are 0.6 and 0.5 for the F and Cr coefficients, respectively. After finding the best combination of values for the coefficients F and Cr, the algorithm was examined for independent populations. For this purpose, populations of 4, 25, 50, 100, 500 and 1000 members were studied over two generation counts (G = 50 and 100). The results indicate that a generation count of 200 is suitable for optimization. Runtime increases almost linearly with the number of members, which indicates the effect of population size on the algorithm’s runtime; hence, specifying a suitable population size to obtain optimal results is very important. The standard operation policy had a better reversibility percentage but inflicted severe vulnerability on the system. The results obtained in years of low rainfall were very good compared with the other comparative methods.
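
A minimal DE/rand/1/bin loop using the study's best-performing coefficients (F = 0.6, Cr = 0.5) is sketched below on a generic test function; the reservoir objective function and its constraints are not reproduced here.

```python
import random

def differential_evolution(f, bounds, F=0.6, Cr=0.5, NP=25, gens=200):
    """Minimal DE/rand/1/bin sketch using the coefficient values (F = 0.6,
    Cr = 0.5) that the study reports as optimal."""
    lo, hi = zip(*bounds)
    dim = len(bounds)
    pop = [[random.uniform(lo[d], hi[d]) for d in range(dim)] for _ in range(NP)]
    for _ in range(gens):
        for i in range(NP):
            a, b, c = random.sample([x for j, x in enumerate(pop) if j != i], 3)
            jrand = random.randrange(dim)          # guarantee one mutated gene
            trial = [
                min(max(a[d] + F * (b[d] - c[d]), lo[d]), hi[d])
                if (random.random() < Cr or d == jrand) else pop[i][d]
                for d in range(dim)
            ]
            if f(trial) < f(pop[i]):               # greedy selection
                pop[i] = trial
    return min(pop, key=f)

random.seed(7)
# Sphere test function: the optimum is at the origin.
best = differential_evolution(lambda x: sum(v * v for v in x), [(-5, 5)] * 2)
```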

Keywords: reservoirs, differential evolution, dam, optimal operation

Procedia PDF Downloads 51
610 PM Air Quality of Windsor: Regional Scale Transport’s Impact and Climate Change

Authors: Moustafa Osman Mohammed

Abstract:

This paper applies an air quality model to the engineering of industrial systems that are ultimately utilized in a wide range of energy systems, distributed resources, and end-user technologies. The model determines the contribution of long-range transport patterns as an area source, which can either be traced with a 48-hour backward trajectory model or described remotely from background measurement data on those days. The trajectory model is run under stable conditions, with fairly constant atmospheric pressure parameters for most of the year. Air parcel trajectories are necessary for estimating the long-range transport of pollutants and other chemical species, and they provide a better understanding of airflow patterns. Since a large amount of meteorological data and a great number of calculations are required to derive a trajectory, it is very useful to apply the HYSPLIT model to locate the areas and boundaries that influence air quality at the regional location of Windsor. Two-day backward trajectories are modelled for high and low concentration measurements below and above the benchmark that delimits the areas influencing measured air quality levels. The benchmark level is taken as 30 μg/m3, the moderate level for the Ontario region. The air quality model thereby incorporates a midpoint concept between biotic and abiotic components to broaden the scope of impact quantification. The resulting theories of environmental obligation ultimately suggest either recommendations or decisions on what legislation should achieve in measures to mitigate the impact of air emissions.

Keywords: air quality, management systems, environmental impact assessment, industrial ecology, climate change

Procedia PDF Downloads 211
609 Life Cycle Assessment of Almond Processing: Off-ground Harvesting Scenarios

Authors: Jessica Bain, Greg Thoma, Marty Matlock, Jeyam Subbiah, Ebenezer Kwofie

Abstract:

The environmental impact and particulate matter (PM) emissions associated with the production and packaging of 1 kg of almonds were evaluated using life cycle assessment (LCA). The assessment began at the point of readiness to harvest, with a cradle-to-gate system boundary ending at almond packaging in California. The assessment covered three scenarios of off-ground harvesting of almonds, with variations: harvested almonds solar-dried on a paper tarp in the orchard, harvested almonds solar-dried on the floor of a separate lot, and harvested almonds dried mechanically. The life cycle inventory (LCI) data for almond production were based on previously published literature and data provided by the Almond Board of California (ABC). The ReCiPe 2016 method was used to calculate the midpoint impacts. Using a consequential LCA model, the global warming potential (GWP) for the three harvesting scenarios is 2.90, 2.86, and 3.09 kg CO2 eq/kg of packaged almonds for scenarios 1, 2a, and 3a, respectively; the GWP for the conventional harvesting method is 2.89 kg CO2 eq/kg of packaged almonds. The particulate matter emissions per hectare are 77.14, 9.56, 66.86, and 8.75 for conventional harvesting and scenarios 1, 2, and 3, respectively. The most significant contribution to the overall emissions came from almond production: farm-gate almond production had a global warming potential of 2.12 kg CO2 eq/kg of packaged almonds, approximately 73% of the overall emissions. Comparing the GWP and PM emissions, scenario 2a offered the best trade-off between GHG and PM production.
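
The ~73% production share quoted above can be verified with a two-line arithmetic check on the reported figures:

```python
# Reported cradle-to-gate GWP values (kg CO2 eq per kg of packaged almonds)
gwp_total_conventional = 2.89   # conventional harvesting, whole chain
gwp_farm_gate = 2.12            # farm-gate almond production stage only

# Production's share of the conventional total, matching the ~73% in the text
share = gwp_farm_gate / gwp_total_conventional
```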

Keywords: life cycle assessment, low moisture foods, sustainability, LCA

Procedia PDF Downloads 52
608 Using Printouts as Social Media Evidence and Its Authentication in the Courtroom

Authors: Chih-Ping Chang

Abstract:

Unlike traditional physical evidence, social media evidence has its own characteristics: it is easily tampered with, it is recoverable, and it cannot be read without other devices (such as a computer). Simply taking a screenshot from a social network site invites questions about its original identity. When the police search and seize digital information, a common practice is to directly print out the digital data obtained and ask the parties present to sign the printout, without taking the original digital data back. Beyond the issue of original identity, this way of obtaining evidence may have two further consequences. First, it invites allegations that the evidence was tampered with, i.e., that the police framed the suspect and falsified evidence. Second, it makes it difficult to discover hidden information. The core evidence associated with a crime may not appear in the contents of the files: only through discovery of the original file can data related to it, such as the original producer, creation time, modification date, and even GPS location, be revealed from hidden information. Therefore, how to present this kind of evidence in the courtroom is arguably the most important task in ruling on social media evidence. This article first introduces forensic software such as EnCase, TCT, and FTK, and analyzes their function of proving identity with other digital data. Returning to the court, the second part of this article discusses the legal standard for the authentication of social media evidence and the application of such forensic software in the courtroom. In conclusion, this article offers a rethinking: what kind of authenticity is this rule of evidence chasing? Does the legal system automatically transcribe scientific knowledge? Or does it, furthermore, seek to better render justice, not only on the basis of scientific fact but through multivariate debate?

Keywords: federal rule of evidence, internet forensic, printouts as evidence, social media evidence, United States v. Vayner

Procedia PDF Downloads 268
607 Optimized Techniques for Reducing the Reactive Power Generation in Offshore Wind Farms in India

Authors: Pardhasaradhi Gudla, Imanual A.

Abstract:

The electrical power generated offshore needs to be transmitted to the onshore grid using subsea cables. Long subsea cables produce reactive power, which must be compensated in order to limit transmission losses, to optimize the transmission capacity, and to keep the grid voltage within safe operational limits. The installation cost of a wind farm includes the structural design cost and the electrical system cost. India has targeted 175 GW of renewable energy capacity by 2022, including offshore wind power generation. Because sea depths off India are greater, the installation cost will be higher than in European countries, where offshore wind energy is already being generated successfully; innovations are therefore required to reduce the cost of offshore wind power projects. This paper presents optimized techniques to reduce the installation cost of an offshore wind farm with respect to the electrical transmission system. It provides techniques for increasing the current-carrying capacity of the subsea cable by decreasing its reactive power generation (capacitance effect). Many methods of reactive power compensation in wind power plants are already in use, and the main driver of the need for compensation is the capacitance effect of the subsea cable. If the cable capacitance is diminished, the requirement for reactive power compensation is reduced or optimized, avoiding an intermediate substation at the midpoint of the transmission network.
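
The cable's reactive power generation scales with its shunt capacitance as Q = 2πf·C·U², so halving the per-kilometre capacitance halves the compensation requirement. The sketch below uses illustrative cable parameters, not figures from the paper.

```python
import math

def cable_charging_mvar(c_uF_per_km: float, length_km: float,
                        voltage_kV: float, freq_hz: float = 50.0) -> float:
    """Reactive power generated by a cable's shunt capacitance: Q = 2*pi*f*C*U^2.

    c_uF_per_km: capacitance per km (uF/km); voltage_kV: line-to-line voltage.
    The example values below are assumptions for illustration only.
    """
    C = c_uF_per_km * 1e-6 * length_km              # total capacitance, F
    U = voltage_kV * 1e3                            # voltage, V
    return 2 * math.pi * freq_hz * C * U ** 2 / 1e6  # Mvar

# e.g. an assumed 0.2 uF/km cable, 100 km long, at 220 kV and 50 Hz:
q = cable_charging_mvar(0.2, 100, 220)
```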

Keywords: offshore wind power, optimized techniques, power system, subsea cable

Procedia PDF Downloads 158
606 Design of a Fuzzy Expert System for the Impact of Diabetes Mellitus on Cardiac and Renal Impediments

Authors: E. Rama Devi Jothilingam

Abstract:

Diabetes mellitus is now one of the most common non-communicable diseases globally. India leads the world with the largest number of diabetic subjects, earning it the title of ‘diabetes capital of the world’. In order to reduce the mortality rate, a fuzzy expert system is designed to predict the severity of the cardiac and renal problems of diabetic patients using fuzzy logic. Since uncertainty is inherent in medicine, fuzzy logic is used in this research work to remove the inherent fuzziness of linguistic concepts and the uncertain status of diabetes mellitus, which is the prime cause of cardiac arrest and renal failure. In this work, the controllable risk factors (blood sugar, insulin, ketones, lipids, obesity, blood pressure and protein/creatinine ratio) are considered as input parameters, and the ‘stages of cardiac impediment’ (SOC) and ‘stages of renal impediment’ (SORD) are considered as the output parameters. Triangular membership functions are used to model the input and output parameters. The rule base for the proposed expert system is constructed from the knowledge of medical experts. A Mamdani inference engine is used to infer information from the rule base to take the major decisions in diagnosis. The mean-of-maximum method is used to obtain a non-fuzzy control action that best represents the possibility distribution of an inferred fuzzy control action. The proposed system also classifies patients into high-risk and low-risk groups using fuzzy c-means clustering, so that high-risk patients are treated immediately. The system is validated in MATLAB and serves as a tracking system with accuracy and robustness.
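
The triangular membership functions and the Mamdani min operator for rule strength can be sketched as follows; the breakpoints chosen for 'high blood sugar' and 'high blood pressure' are illustrative guesses, not the system's clinically derived values.

```python
def trimf(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets (breakpoints are illustrative assumptions)
sugar_normal = lambda x: trimf(x, 60, 90, 120)    # fasting sugar, mg/dL
sugar_high = lambda x: trimf(x, 100, 160, 220)
bp_high = lambda x: trimf(x, 120, 150, 180)       # systolic BP, mmHg

# One Mamdani-style rule: IF sugar is high AND bp is high THEN SOC is severe.
# Rule strength is the min of the antecedent memberships.
strength = min(sugar_high(170), bp_high(150))
```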

Keywords: diabetes mellitus, fuzzy expert system, Mamdani, MATLAB

Procedia PDF Downloads 263
605 Design Components and Reliability Aspects of Municipal Waste Water and SEIG Based Micro Hydro Power Plant

Authors: R. K. Saket

Abstract:

This paper presents design aspects and a probabilistic approach to generation reliability evaluation for an alternative resource: a municipal waste water based micro hydro power generation system. Annual and daily flow duration curves have been obtained for the design, installation, development, scientific analysis and reliability evaluation of the micro hydro power plant (MHPP). The hydro potential of the waste water flowing through the sewage system of the BHU campus has been determined, and annual and daily flow duration curves were produced by ordering the recorded water flows from maximum to minimum values. Design pressure, the roughness of the pipe’s interior surface, method of joining, weight, ease of installation, accessibility of the sewage system, design life, maintenance, weather conditions, availability of material, related cost and likelihood of structural damage have been considered in the design of a particular penstock for reliable operation of the MHPP. A micro hydro power generation system based on municipal waste water and a self-excited induction generator (SEIG) was designed, developed, and practically implemented to provide reliable electric energy to a suitable load on the campus of the Banaras Hindu University, Varanasi (UP), India. The generation reliability evaluation of the developed MHPP, using a Gaussian distribution approach, the safety factor concept, peak load consideration and Simpson’s 1/3rd rule, is presented in this paper.
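
Simpson's 1/3rd rule, used here in the reliability evaluation, approximates an integral by fitting parabolas over pairs of subintervals. A generic composite implementation (not the paper's specific reliability integrand):

```python
def simpson_13(f, a: float, b: float, n: int) -> float:
    """Composite Simpson's 1/3 rule on [a, b] with n subintervals (n even)."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))   # odd nodes
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))   # even interior nodes
    return s * h / 3

# Simpson's rule is exact for cubics: the integral of x^3 on [0, 2] is 4.
area = simpson_13(lambda x: x ** 3, 0.0, 2.0, 4)
```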

Keywords: self-excited induction generator, annual and daily flow duration curves, sewage system, municipal waste water, reliability evaluation, Gaussian distribution, Simpson’s 1/3rd rule

Procedia PDF Downloads 534
604 Finding the Association Rule between Nursing Interventions and Early Evaluation Results of In-Hospital Cardiac Arrest to Improve Patient Safety

Authors: Wei-Chih Huang, Pei-Lung Chung, Ching-Heng Lin, Hsuan-Chia Yang, Der-Ming Liou

Abstract:

Background: In-Hospital Cardiac Arrest (IHCA) threatens the lives of inpatients and seriously affects patient safety, the quality of inpatient care and hospital service. Health providers must identify the signs of IHCA early to avoid its occurrence. This study considers the potential association between early signs of IHCA and the essence of patient care provided by nurses and other professionals before an IHCA occurs. The aim of this study is to identify significant associations between nursing interventions and abnormal early evaluation results of IHCA that can assist health care providers in monitoring inpatients at risk of IHCA, increasing the opportunities for early detection and prevention of IHCA. Materials and Methods: This study used a data mining technique called association rule mining to compute associations between nursing interventions and abnormal early evaluation results of IHCA. The nursing interventions and abnormal early evaluation results of IHCA were considered to be co-occurring if nursing interventions were provided within 24 hours of the abnormal early evaluation results of IHCA last being observed. The rule-based methods utilized 23.6 million electronic medical records (EMRs) from a medical center in Taipei, Taiwan. This dataset includes 733 concepts of nursing interventions, coded with clinical care classification (CCC) codes, and 13 early evaluation results of IHCA with binary codes. The values of interestingness and lift were computed as Q values to measure the co-occurrence and strength of association between all in-hospital patient care measures and abnormal early evaluation results of IHCA. The associations were evaluated by comparing the resulting Q values and verified by medical experts. Results and Conclusions: The results show that there are 4195 pairs of associations between nursing interventions and abnormal early evaluation results of IHCA with their Q values.
There are 203 pairs with a positive association, having Q values greater than 5. Inpatients with a high blood sugar level (hyperglycemia) have a positive association with a heart rate lower than 50 beats per minute or higher than 120 beats per minute (Q value 6.636). Inpatients with a temporary pacemaker (TPM) have a significant association with a high risk of IHCA (Q value 47.403). There is a significant positive correlation between inpatients with hypovolemia and the occurrence of abnormal heart rhythms (arrhythmias) (Q value 127.49). The results of this study can help prevent IHCA by enabling health care providers to recognize inpatients at risk of IHCA early, assist with monitoring patients to provide quality care, and improve IHCA surveillance and the quality of in-hospital care.
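The abstract does not give the exact formula behind its Q values, but the standard lift measure used in association rule mining can be sketched as below; the counts are hypothetical, for illustration only:

```python
def lift(n_total, n_a, n_b, n_ab):
    """Lift = P(A and B) / (P(A) * P(B)); values > 1 indicate co-occurrence above chance."""
    p_a = n_a / n_total    # support of antecedent A
    p_b = n_b / n_total    # support of consequent B
    p_ab = n_ab / n_total  # joint support
    return p_ab / (p_a * p_b)

# Hypothetical counts: 1000 records, intervention A in 100,
# abnormal early sign B in 50, both together in 20
q = lift(1000, 100, 50, 20)  # 4.0: the pair co-occurs 4x more often than independence predicts
```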

Keywords: in-hospital cardiac arrest, patient safety, nursing intervention, association rule mining

Procedia PDF Downloads 245
603 Auditory and Visual Perceptual Category Learning in Adults with ADHD: Implications for Learning Systems and Domain-General Factors

Authors: Yafit Gabay

Abstract:

Attention deficit hyperactivity disorder (ADHD) has been associated with suboptimal functioning of both the striatum and the prefrontal cortex. Such abnormalities may impede the acquisition of perceptual categories, which are important for fundamental abilities such as object recognition and speech perception. Indeed, prior research has supported this possibility, demonstrating that children with ADHD show visual category learning performance similar to that of their neurotypical peers but use suboptimal learning strategies. However, much less is known about category learning processes in the auditory domain or among adults with ADHD, in whom prefrontal functions are more mature than in children. Here, we investigated auditory and visual perceptual category learning in adults with ADHD and neurotypical individuals. Specifically, we examined learning of rule-based categories – presumed to be optimally learned by frontal cortex-mediated hypothesis testing – and information-integration categories – hypothesized to be optimally learned by a striatally mediated reinforcement learning system. Consistent with the striatal and prefrontal cortical impairments observed in ADHD, our results show that, across sensory modalities, both rule-based and information-integration category learning is impaired in adults with ADHD. Computational modeling analyses revealed that individuals with ADHD were slower to shift to optimal strategies than neurotypicals, regardless of category type or modality. Taken together, these results suggest that both explicit, frontally mediated and implicit, striatally mediated category learning are impaired in young adults with ADHD, with impairments that extend across sensory modalities and likely arise from domain-general mechanisms.

Keywords: ADHD, category learning, modality, computational modeling

Procedia PDF Downloads 6
602 The Possible Application of Artificial Intelligence in Hungarian Court Practice

Authors: László Schmidt

Abstract:

In the context of artificial intelligence, we need to pay primary and particular attention to ethical principles, not only in the design process but also during application. According to the European Commission's Ethics Guidelines, AI must have three main characteristics: it must be lawful, ethical and robust. We must never lose sight of ethical principles, because otherwise we risk that this new technology will not help democratic decision-making under the rule of law but will, on the contrary, destroy it. The rapid spread and use of artificial intelligence poses an enormous challenge to both lawmaking and law enforcement. To legislation, because AI permeates many areas of our daily lives that the legislator must regulate: we can see how challenging it is to regulate, e.g., self-driving cars, taxis and vans, not to mention, more recently, cryptocurrencies and ChatGPT, whose use also requires legislative intervention, from copyright to scientific use and even the law of succession. Artificial intelligence also poses an extraordinary challenge to law enforcement. In criminal cases, police and prosecutors can make great use of AI in investigations, e.g., in forensics, DNA sampling, reconstruction and identification; it can also be of great help in detecting crimes committed in cyberspace. In criminal or civil court proceedings, AI can likewise play a major role in the evaluation of evidence and proof. For example, a photo, video or audio recording could be immediately revealed as genuine or fake; likewise, the authenticity or falsification of a document could be determined much more quickly and cheaply than with current procedures (expert witnesses). Neither the current Hungarian Civil Procedure Act nor the Criminal Procedure Act allows the use of artificial intelligence in the evidentiary process. However, this should change, as using this technology in court proceedings would be very useful.
The procedures would be faster, simpler, and therefore cheaper. Artificial intelligence could also replace much of the work of expert witnesses. Its introduction into judicial procedures would certainly be justified, but with due respect for human rights, the right to a fair trial and other democratic and rule-of-law guarantees.

Keywords: artificial intelligence, judiciary, Hungarian, court practice

Procedia PDF Downloads 48
601 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models

Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin

Abstract:

Coordinating service interactions is a vital part of developing distributed applications that are built up as networks of autonomous participants, e.g., software components, web services and online resources, and that involve collaboration between a diverse number of participant services on different providers. The complexity of coordinating service interactions reflects how important suitable techniques and approaches are for designing and coordinating the interaction between participant services, so that the overall goal of the collaboration is achieved. The objective of this research is to develop the capability of steering a complex service interaction towards a desired outcome. Therefore, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using the service choreographies approach, focusing on a declarative approach and advocating an Object Management Group (OMG) standard, Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which can be especially useful when coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model, with the Alloy Analyzer used to verify it. The transformation of SBVR into Alloy allows the corresponding coordination of service interactions (service choreography) to be generated automatically, producing an immediate instance of execution that satisfies the constraints of the specification and verifying whether a specific request can be realised in the generated choreography.

Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR

Procedia PDF Downloads 118
600 Organ Donation after Medical Aid in Dying: A Critical Study of Clinical Processes and Legal Rules in Place

Authors: Louise Bernier

Abstract:

Under some jurisdictions (including Canada), eligible patients can request and receive medical assistance in dying (MAiD) through lethal injections, inducing their cardiocirculatory death. Those same patients may also wish to donate their organs in the process. If they qualify as organ donors, a clinical and ethical rule called the 'dead donor rule' (DDR) requires the transplant teams to wait until cardiocirculatory death is confirmed, followed by a 'no touch' period (5 minutes in Canada), before they can proceed with organ removal. The medical procedures (lethal injections) as well as the delays associated with the DDR can damage organs (mostly thoracic organs) due to prolonged anoxia. Yet, strong scientific evidence demonstrates that operating differently and reconsidering the DDR would result in more organs of better quality being available for transplant. This idea generates discomfort and resistance, but it is also worth considering, especially in a context of chronic shortage of available organs. One option that could be examined for MAiD patients who wish and are able to be organ donors would be to remove vital organs while the patients are still alive (and under sedation). This would imply accepting that the patient's death occurs through organ donation instead of the lethal injections required under MAiD legal rules. It would also mean that patients requesting MAiD and wishing to be organ donors could aspire to donate better quality organs, including their heart, an altruistic gesture that carries important symbolic value for many donors and their families. Following a patient-centered approach, our hypothesis is that preventing vital organ donation from a living donor in all circumstances is neither perfectly coherent with how legal mentalities have evolved lately in the field of fundamental rights nor compatible with the clinical and ethical frameworks that shape the landscape in which those complex medical decisions unfold.
Through a study of the legal, ethical, and clinical rules in place, both at the national and international levels, this analysis raises questions on the numerous inconsistencies associated with respecting the DDR with patients who have chosen to die through MAiD. We will begin with an assessment of the erosion of certain national legal frameworks that pertain to the sacred nature of the right to life which now also includes the right to choose how one wishes to die. We will then study recent innovative clinical protocols tested in different countries to help address acute organ shortage problems in creative ways. We will conclude this analysis with an ethical assessment of the situation, referring to principles such as justice, autonomy, altruism, beneficence, and non-malfeasance. This study will build a strong argument in favor of starting to allow vital organ donations from living donors in countries where MAiD is already permitted.

Keywords: altruism, autonomy, dead donor rule, medical assistance in dying, non-malfeasance, organ donation

Procedia PDF Downloads 151
599 Moral Wrongdoers: Evaluating the Value of Moral Actions Performed by War Criminals

Authors: Jean-Francois Caron

Abstract:

This text explores the value of moral acts performed by war criminals, and the extent to which they should alleviate the punishment these individuals ought to receive for violating the rules of war. Without neglecting the necessity of retribution in war crimes cases, it argues from an ethical perspective that we should not rule out the possibility of considering lesser punishments for war criminals who decide to perform a moral act, as it might produce significant positive moral outcomes. This text also analyzes how such a norm could be justified from a moral perspective.

Keywords: war criminals, pardon, amnesty, retribution

Procedia PDF Downloads 250
598 Uncontrollable Inaccuracy in Inverse Problems

Authors: Yu Menshikov

Abstract:

In this paper, the influence of errors in function derivatives at the initial time, obtained by experiment (uncontrollable inaccuracy), on the results of the inverse problem solution was investigated. It was shown that these errors distort the inverse problem solution, as a rule, near the beginning of the interval where the solution is analyzed. Several methods for removing the influence of uncontrollable inaccuracy have been suggested.

Keywords: inverse problems, filtration, uncontrollable inaccuracy

Procedia PDF Downloads 484
597 Impact of FACTS Devices on Power Networks Reliability

Authors: Alireza Alesaadi

Abstract:

Flexible AC transmission system (FACTS) devices play an important role in expanded electrical transmission networks. In this paper, the effect of these devices on the reliability of electrical networks is studied, and it is shown that the use of FACTS devices can significantly improve the reliability of power networks.

Keywords: FACTS devices, power networks, reliability

Procedia PDF Downloads 384
596 An Analysis of the Dominance of Migrants in the South African Spaza and Retail market: A Relationship-Based Network Perspective

Authors: Meron Okbandrias

Abstract:

The South African formal economy is a rule-based economy, unlike most African and Asian markets, and it has a highly developed financial market. In such a market, foreign migrants have come to dominate the small or spaza shops that serve the poor. They are highly competitive and capture significant market share in South Africa. This paper analyses the factors that have given foreign migrants a competitive edge. It does so by interviewing Somali, Bangladeshi, and Ethiopian shop owners in Cape Town and analysing the data through narrative analysis; it also analyses the 2019 South African consumer report. The three migrant nationalities mentioned above dominate the spaza shop business and have significant distribution networks. The findings of the paper indicate that family, ethnic, and nationality-based networks, in that order of importance, form the basis for a relationship-based business network that has trust as its mainstay. This network ensures the pooling of resources and adherence to certain principles outside the South African rule-based system. The research identified practices such as bulk buying within a community of traders, sharing information, buying from within-community distribution businesses, community-based transportation systems, and providing seed capital for people from the community to start a business, all grounded in that relationship-based system. The consequences of not abiding by the rules of these networks are social and economic exclusion. In addition, these networks have their own commercial and social conflict resolution mechanisms aside from the South African justice system. Network theory and relationship-based systems theory form the theoretical foundations of this paper.

Keywords: migrant, spaza shops, relationship-based system, South Africa

Procedia PDF Downloads 99
595 Generation of Knowledge with Self-Learning Methods for Ophthalmic Data

Authors: Klaus Peter Scherer, Daniel Knöll, Constantin Rieder

Abstract:

Problem and Purpose: Intelligent systems are available and helpful for supporting human decision processes, especially when complex surgical eye interventions are necessary and must be performed. Normally, such a decision support system consists of a knowledge-based module, which is responsible for the real assistance power, provided through explanation and logical reasoning processes. The interview-based acquisition and generation of the complex knowledge itself is crucial, because there are different correlations between the complex parameters. So, in this project, (semi-)automated self-learning methods are researched and developed to enhance the quality of such a decision support system. Methods: For ophthalmic data sets of real patients in a hospital, advanced data mining procedures seem to be very helpful. In particular, subgroup analysis methods are developed, extended and used to analyze and identify the correlations and conditional dependencies between the structured patient data. After finding causal dependencies, a ranking must be performed for the generation of rule-based representations. For this, anonymous patient data are transformed into a special machine language format. The imported data are used as input for conditional probability algorithms to calculate the parameter distributions with respect to a given goal parameter. Results: In the field of knowledge discovery, advanced methods and applications could be performed to produce operation- and patient-related correlations. New knowledge was generated by finding causal relations between the operational equipment, the medical instances and the patient-specific history through a dependency ranking process. After transformation into association rules, logically based representations were available for the clinical experts to evaluate the new knowledge. The structured data sets take account of about 80 parameters as special characteristic features per patient.
For differently sized patient groups (100, 300, 500), both single-target and multi-target values were set for the subgroup analysis, so the newly generated hypotheses could be interpreted with regard to their dependence on, or independence of, patient number. Conclusions: The aim and the advantage of such a semi-automated self-learning process are the extension of the knowledge base by finding new parameter correlations. The discovered knowledge is transformed into association rules and serves as the rule-based representation of the knowledge in the knowledge base. Moreover, more than one goal parameter of interest can be considered by the semi-automated learning process. With ranking procedures, the strongest premises and also conjunctively associated conditions can be found to conclude the goal parameter of interest. In this way, the knowledge hidden in structured tables or lists can be extracted as a rule-based representation. This is a real assistance power for the communication with the clinical experts.

Keywords: expert system, knowledge-based support, ophthalmic decision support, self-learning methods

Procedia PDF Downloads 230
594 Association Rules Mining Task Using Metaheuristics: Review

Authors: Abir Derouiche, Abdesslem Layeb

Abstract:

Association Rule Mining (ARM) is one of the most popular data mining tasks, and it is widely used in various areas. The search for association rules is an NP-complete problem, which is why metaheuristics have been widely applied to it. The present paper formulates ARM as an optimization problem and surveys the metaheuristic-based approaches proposed in the literature.

Keywords: optimization, metaheuristics, data mining, association rules mining

Procedia PDF Downloads 134
593 Developing Index of Democratic Institutions' Vulnerability

Authors: Kamil Jonski

Abstract:

Last year vividly demonstrated that populism and political instability can endanger democratic institutions in countries regarded as democratic transition champions (Poland) or cornerstones of the liberal order (UK, US). So-called 'illiberal democracy' is winning the hearts and minds of voters keen to believe that the rule of a strongman is a viable alternative to the perceived decay of western values and institutions. These developments pose a serious threat to democratic institutions (including the rule of law), proven critical for both personal freedom and economic development. Although scholars have proposed some structural explanations of the illiberal wave (notably focusing on inequality, stagnant incomes and the drawbacks of globalization), these seem to have little predictive value. Indeed, events like Trump's victory, Brexit or the Polish shift towards populist nationalism always came as a surprise. Intriguingly, in the case of the US election, simple rules like the 'Bread and Peace' model gauged the prospects of Trump's victory better than pundits and pollsters. This paper attempts to compile a set of indicators to gauge the vulnerability of various democracies to populism, instability and the pursuit of 'illiberal' projects. Among them, it identifies the gap between consensus assessments of institutional performance (as measured by WGI indicators) and citizens' subjective assessments (survey-based confidence in institutions). Plotting these variables against each other reveals three clusters of countries: 'predictable' (good institutions and high confidence, or poor institutions and low confidence), 'blind' (poor institutions, high confidence, e.g. Uzbekistan or Azerbaijan) and 'disillusioned' (good institutions, low confidence, e.g. Spain, Chile, Poland and the US).
It seems that this clustering, carried out separately for various institutions (such as the legislature, executive and courts) and blended with economic indicators like inequality and living standards (using PCA), offers a reasonably good watchlist of countries that should 'expect the unexpected'.

Keywords: illiberal democracy, populism, political instability, political risk measurement

Procedia PDF Downloads 175
592 Transfigurative Changes of Governmental Responsibility

Authors: Ákos Cserny

Abstract:

The unequivocal expansion of the executive power's area of operation can happen through the appearance of new areas to be influenced and their integration into that power, or at the expense of the scopes of other organs with public authority. The extension of the executive can only be accepted within the framework of the rule of law if, parallel with this process, we obtain constitutional guarantees that the exercise of power is kept within a constitutional framework. Failure to do so, however, may result in a democratic deficit and may cause an overwhelming dominance of the executive power. Therefore, the aim of this paper is to present executive power and responsibility in the context of different dimensions.

Keywords: confidence, constitution, executive power, liability, parliamentarism

Procedia PDF Downloads 372
591 Fuzzy Control and Pertinence Functions

Authors: Luiz F. J. Maia

Abstract:

This paper presents an approach to fuzzy control, with the use of new pertinence functions, applied to the case of an inverted pendulum. Appropriate definitions of the pertinence functions of fuzzy sets make it possible to implement the controller with only one control rule, resulting in a smooth control surface. The fuzzy control system can be implemented with analog devices, affording true real-time performance.
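The paper's own pertinence functions are not reproduced in the abstract. As a hedged sketch of the general idea, a standard triangular pertinence (membership) function in Python, here evaluated for a hypothetical pendulum-angle fuzzy set centred on the upright position:

```python
def tri(x, a, b, c):
    """Triangular pertinence function: 0 outside [a, c], rising to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Degree to which an angle of 0.5 rad belongs to a fuzzy set peaking at 0 rad
mu = tri(0.5, -1.0, 0.0, 1.0)  # 0.5
```

The controller's single rule would map such a degree of membership to a control output; the set bounds here are illustrative, not the paper's.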

Keywords: control surface, fuzzy control, inverted pendulum, pertinence functions

Procedia PDF Downloads 413
590 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images

Authors: Amit Kumar Happy

Abstract:

This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visual image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (infrared) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal infrared camera acquires the thermal source image. In this paper, some image fusion algorithms based upon multi-scale transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes the implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of levels for the MST and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the suggested method's validity. Experiments show that the proposed approach is capable of producing good fusion results. While deploying our image fusion approaches, we observed several challenges with popular image fusion methods: although their high computational cost and complex processing steps provide accurate fused results, they also make them hard to deploy in systems and applications that require real-time operation, high flexibility, and low computational resources. The methods presented in this paper offer good results with minimum time complexity.
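The exact coefficient fusion rule compared in the paper is not spelled out in the abstract; a common baseline in MST-based fusion is choose-max by absolute value over the transform coefficients. A minimal sketch under that assumption, operating on flattened coefficient lists with hypothetical values:

```python
def fuse_max_abs(coeffs_a, coeffs_b):
    """Per position, keep the source coefficient with the larger magnitude."""
    return [a if abs(a) >= abs(b) else b for a, b in zip(coeffs_a, coeffs_b)]

# Hypothetical detail coefficients from the visible and IR decompositions
fused = fuse_max_abs([1.0, -5.0, 2.0], [3.0, 4.0, -1.0])  # [3.0, -5.0, 2.0]
```

In a full pipeline this rule would be applied to each MST subband, followed by the consistency verification step the abstract mentions and an inverse transform to reconstruct the fused image.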

Keywords: image fusion, IR thermal imager, multi-sensor, multi-scale transform

Procedia PDF Downloads 84