Search results for: complex event processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9520

9010 A Simulation of Patient Queuing System on Radiology Department at Tertiary Specialized Referral Hospital in Indonesia

Authors: Yonathan Audhitya Suthihono, Ratih Dyah Kusumastuti

Abstract:

The radiology department of a tertiary referral hospital faces service operation challenges such as large and highly variable patient arrivals, which increase the probability of patient queuing. During the COVID-19 pandemic, it is mandatory to apply the social distancing protocol in the radiology department, so a strategy to prevent the accumulation of patients at one spot is required. The aim of this study is to identify an alternative solution that can reduce patients' waiting time in the radiology department. Discrete event simulation (DES) is used in this study by constructing several improvement scenarios with Arena simulation software. Statistical analysis is used to test the validity of the base case scenario model and to investigate the performance of the improvement scenarios. The result of this study shows that the selected scenario is able to reduce patient waiting time significantly, which leads to more efficient services in the radiology department, enables it to serve patients more effectively, and thus increases patient satisfaction. The result of the simulation can be used by the hospital management to improve the operational performance of the radiology department.
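As an illustration of the kind of discrete event simulation used here, the sketch below models a single-queue radiology service with exponential inter-arrival and service times using the SimPy library; the arrival rate, service rate and number of modalities are hypothetical placeholders, not the parameters of the authors' Arena model.

```python
# Minimal DES sketch of a radiology queue (illustrative only; the paper uses Arena).
# Arrival/service rates and capacity below are hypothetical placeholders.
import random
import simpy

ARRIVAL_RATE = 1 / 6.0   # patients per minute (assumed)
SERVICE_RATE = 1 / 10.0  # examinations per minute per modality (assumed)
N_MODALITIES = 2         # number of imaging stations (assumed)

waiting_times = []

def patient(env, name, modalities):
    arrival = env.now
    with modalities.request() as req:          # join the queue
        yield req                              # wait for a free modality
        waiting_times.append(env.now - arrival)
        yield env.timeout(random.expovariate(SERVICE_RATE))  # examination

def generator(env, modalities):
    i = 0
    while True:
        yield env.timeout(random.expovariate(ARRIVAL_RATE))
        i += 1
        env.process(patient(env, f"patient-{i}", modalities))

env = simpy.Environment()
modalities = simpy.Resource(env, capacity=N_MODALITIES)
env.process(generator(env, modalities))
env.run(until=8 * 60)  # one 8-hour shift

print(f"Average waiting time: {sum(waiting_times)/len(waiting_times):.1f} min")
```

Improvement scenarios can then be compared by re-running the model with, for example, a different number of modalities or arrival schedule and comparing the resulting waiting-time statistics.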

Keywords: discrete event simulation, hospital management, patient queuing model, radiology department services

Procedia PDF Downloads 112
9009 Embedded Acoustic Signal Processing System Using OpenMP Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

In this paper, Altera DE1-SoC FPGA board technology is utilized as a distinguished tool for the nondestructive characterization of an aluminum circular cylindrical shell of radius ratio b/a (a: outer radius; b: inner radius). The acoustic backscattered signal processing system has been developed using the OpenMP architecture. The design is built in three blocks; it is implemented per functional block in a heterogeneous Intel-Altera system running under Linux. The reference data used to assess the performance of the SoC FPGA are computed by the analytical method. The exploitation of the SoC FPGA has made it possible to obtain the backscattering form function and resonance spectra. The A0 and S0 modes of propagation in the tube are shown. The findings are then compared to those obtained from the Matlab simulation of the analytical method, and good agreement has been noted. Moreover, the detailed SoC FPGA-based system has shown that acoustic spectra are computed up to 5 times faster than with the Matlab implementation using almost the same data. This FPGA-based implementation of the processing algorithms achieves a correlation coefficient R of about 0.962 and an absolute error of about 5×10⁻⁵.

Keywords: OpenMP, signal processing system, acoustic backscattering, nondestructive characterization, thin tubes

Procedia PDF Downloads 88
9008 Applied Complement of Probability and Information Entropy for Prediction in Student Learning

Authors: Kennedy Efosa Ehimwenma, Sujatha Krishnamoorthy, Safiya Al‑Sharji

Abstract:

The probability of an event lies in the interval [0, 1], a value determined by the number of outcomes of the event in a sample space S. The probability Pr(A) that an event A will never occur is 0, and the probability Pr(B) that an event B will certainly occur is 1; this makes the non-occurrence of A and the occurrence of B both certainties. Furthermore, the sum of probabilities Pr(E₁) + Pr(E₂) + … + Pr(Eₙ) of a finite set of events in a given sample space S equals 1. Conversely, the difference of the sum of two probabilities that will certainly occur is 0. This paper first discusses Bayes' rule, the complement of probability, and the difference of probabilities for occurrences of learning events before applying them to the prediction of learning objects in student learning. Given that probabilities sum to 1, and to make a recommendation for student learning, this paper proposes that the difference between argmax Pr(S) and the probability of student performance quantifies the weight of learning objects for students. Using a skill-set dataset, the computational procedure demonstrates: i) the probability of skill-set events that have occurred and that would lead to higher-level learning; ii) the probability of the events that have not occurred and that require subject-matter relearning; iii) the accuracy of the decision tree in predicting student performance into class labels; and iv) the information entropy of the skill-set data and its implications for student cognitive performance and the recommendation of learning.
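A small sketch of the quantities described above, under assumed skill probabilities: the complements, the weight computed as the difference between the maximum probability and a student's performance probability, and the Shannon entropy of the skill-set distribution. The numbers are illustrative, not taken from the paper's dataset.

```python
# Illustrative sketch of complement, difference-based weight, and entropy
# (probability values below are assumed, not the paper's skill-set data).
import math

# Assumed probabilities of passing each skill-set pre-assessment
pr = {"skill_A": 0.8, "skill_B": 0.55, "skill_C": 0.3}

# Complement: probability that the skill event does NOT occur (relearning needed)
complement = {s: 1.0 - p for s, p in pr.items()}

# Weight of a learning object: difference between the maximum Pr(S) and the
# student's performance probability for that skill
p_max = max(pr.values())
weight = {s: p_max - p for s, p in pr.items()}

# Shannon entropy (in bits) of the normalised skill-set distribution
total = sum(pr.values())
dist = [p / total for p in pr.values()]
entropy = -sum(p * math.log2(p) for p in dist if p > 0)

print(complement)  # e.g. {'skill_A': 0.2, ...}
print(weight)      # larger weight -> stronger recommendation to (re)learn
print(f"entropy = {entropy:.3f} bits")
```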

Keywords: complement of probability, Bayes’ rule, prediction, pre-assessments, computational education, information theory

Procedia PDF Downloads 156
9007 Thermo-Mechanical Processing Scheme to Obtain Micro-Duplex Structure Favoring Superplasticity in an As-Cast and Homogenized Medium Alloyed Nickel Base Superalloy

Authors: K. Sahithya, I. Balasundar, Pritapant, T. Raghua

Abstract:

A Ni-based superalloy with a nominal composition of Ni-14% Cr-11% Co-5.8% Mo-2.4% Ti-2.4% Nb-2.8% Al-0.26% Fe-0.032% Si-0.069% C (all in wt%) is used for turbine discs in a variety of aero engines. Like any other superalloy, the primary processing of the as-cast material poses a major challenge due to its complex alloy chemistry. The challenge was circumvented by characterizing the different phases present in the material, optimizing the homogenization treatment, and identifying a suitable thermomechanical processing window using dynamic materials modeling. The as-cast material was subjected to homogenization at 1200°C for a soaking period of 8 hours and quenched using different media. Water quenching (WQ) after homogenization resulted in very fine spherical γꞌ precipitates of sizes 30-50 nm, whereas furnace cooling (FC) after homogenization resulted in a bimodal distribution of precipitates (primary gamma prime of size 300 nm and secondary gamma prime of size 5-10 nm). MC-type primary carbides that are stable up to the melting point of the material were found in both WQ and FC samples. The deformation behaviour of both materials below (1000-1100°C) and above the gamma prime solvus (1100-1175°C) was evaluated by subjecting the material to a series of compression tests at different constant true strain rates (0.0001/s to 1/s). A detailed examination of the precipitate-dislocation interaction mechanisms carried out using TEM revealed precipitate shearing and Orowan looping as the mechanisms governing deformation in WQ and FC, respectively. Incoherent/semi-coherent gamma prime precipitates in the FC material facilitate better workability, whereas the coherent precipitates in the WQ material contribute to its higher resistance to deformation. Both materials exhibited discontinuous dynamic recrystallization (DDRX) above the gamma prime solvus temperature. The recrystallization kinetics were slower in the WQ material: very fine grain boundary carbides (≤ 300 nm) retarded the recrystallization kinetics in WQ, while coarse carbides (1-5 µm) facilitate particle-stimulated nucleation in the FC material. The FC material was cogged (primary hot working) at 1120°C and 0.03/s, resulting in significant grain refinement, i.e., from 3000 μm to 100 μm. The primary processed material was subsequently subjected to intensive thermomechanical deformation, reducing the temperature by 50°C in each processing step with intermittent heterogenization treatment at selected temperatures, aimed at simultaneous coarsening of the gamma prime precipitates and refinement of the gamma matrix grains. The heterogeneous annealing treatment carried out resulted in gamma grains of 10 μm and gamma prime precipitates of 1-2 μm. Further thermomechanical processing of the material was carried out at 1025°C to increase the homogeneity of the obtained micro-duplex structure.

Keywords: superalloys, dynamic material modeling, nickel alloys, dynamic recrystallization, superplasticity

Procedia PDF Downloads 116
9006 Investigating the Relationship between Bank and Cloud Provider

Authors: Hatim Elhag

Abstract:

Banking and financial service institutions are possibly the most advanced in terms of technology adoption and use it as a key differentiator. With high levels of business process automation, maturity in the functional portfolio, straight-through processing and proven technology outsourcing benefits, the banking sector stands to benefit significantly from cloud computing capabilities. Additionally, with complex compliance and regulatory policies, combined with expansive product and geographic coverage, the business impact is even greater. While the benefits are substantial, there are also significant challenges in adopting this model, including legal, security, performance, reliability, transformation complexity, operating control and governance issues, and, most importantly, proof of the promised cost benefits. However, a new architecture should be designed and implemented to support this approach.

Keywords: security, cloud, banking sector, cloud computing

Procedia PDF Downloads 496
9005 Typology of the Physico-Chemical Quality of the Water of the Touggourt Area, Case: Aquifers of the Intercalary Continental and the Terminal Complex, S-E of Algeria

Authors: Habes Sameh, Bettahar Asma, Nezli Imad Eddine

Abstract:

The region of Touggourt is situated in the southern part of Algeria and receives significant quantities of water extracted from fossil aquifers (the Intercalary Continental and the Terminal Complex). The mineralization of the waters of the Terminal Complex is between 3 and 6.5 g/l, and that of the waters of the Intercalary Continental between 1.8 and 8.7 g/l, which constitutes an obstacle to their use. To highlight the origins of this mineralization, we used hydrochemical tools. The chemical analyses at our disposal were processed with the software Statistica, which allowed us to carry out a principal component analysis (ACP). For the Terminal Complex, the analysis showed a competition between sodic or magnesian chloride waters and calcic bicarbonate waters rich in potassium, while for the Intercalary Continental there is a competition between sodic or calcic chloride waters and magnesian sulphate waters. The thermodynamic simulation showed saturation indices that do not exceed zero for the waters of both aquifers (TC and IC), indicating an undersaturation of the waters with respect to the minerals and highlighting the influence of the outcropping geological formations on water quality. We note, however, that these waters remain acceptable for irrigation but must be treated before being consumed by humans.
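For reference, the saturation index mentioned above is conventionally defined from the ion activity product (IAP) of a mineral and its solubility constant; the relation below is the standard definition, not a formula quoted from the paper, and a negative value indicates undersaturation, consistent with the values reported here.

```latex
% Conventional definition of the mineral saturation index (SI)
SI = \log\left(\frac{IAP}{K_s}\right), \qquad
\begin{cases}
SI < 0 & \text{undersaturated (dissolution favoured)}\\
SI = 0 & \text{equilibrium}\\
SI > 0 & \text{supersaturated (precipitation favoured)}
\end{cases}
```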

Keywords: ACP, intercalary, continental, mineralization, SI, Terminal Complex

Procedia PDF Downloads 524
9004 The Impact of Artificial Intelligence on Food Industry

Authors: George Hanna Abdelmelek Henien

Abstract:

Quality and safety issues are common in Ethiopia's food processing industry, and they can negatively impact consumers' health and livelihoods. The country is known for various agricultural products that are important to its economy. However, weak food quality and safety policies and management practices in the food processing industry have led to many health problems, foodborne illnesses and economic losses. This article aims to show the causes and consequences of food safety and quality problems in the food processing industry in Ethiopia and to discuss possible solutions. One of the main reasons for poor food quality and safety in Ethiopia's food processing industry is the lack of adequate regulation and enforcement mechanisms. Inadequate food safety and quality policies have led to inefficiencies in food production. Additionally, the failure to monitor and enforce existing regulations has created an opportunity for unscrupulous companies to engage in harmful practices that endanger the lives of citizens. The impact on food quality and safety is significant, including loss of life, high medical costs, and loss of consumer confidence in the food processing industry. Foodborne diseases such as diarrhoea, typhoid and cholera are common in Ethiopia, and food quality and safety play an important role in their prevention. Additionally, food recalls due to contamination often cause significant economic losses in the food processing industry. To solve these problems, the Ethiopian government has begun taking measures to improve food quality and safety in the food processing industry. One of the most prominent initiatives is the Ethiopian Food and Drug Administration (EFDA), which was established in 2010 to monitor and control the quality and safety of food and beverage products in the country. The EFDA has implemented many measures to improve food safety, such as carrying out routine inspections, monitoring the import of food products and implementing labeling requirements. Another solution that can improve food quality and safety in the food processing industry in Ethiopia is the implementation of a food safety management system (FSMS). An FSMS is a set of procedures and policies designed to identify, assess and control food safety risks during food processing. Implementing an FSMS can help companies in the food processing industry identify and address potential risks before they harm consumers, and it can help companies comply with current safety regulations. Consequently, improving the food safety policy and management system in Ethiopia's food processing industry is important to protect people's health and improve the country's economy. This requires addressing the root causes of food quality and safety problems and implementing practical solutions that can improve overall food safety and quality in the country, such as strengthening regulatory bodies and implementing food safety management systems.

Keywords: food quality, food safety, policy, management system, food processing industry, food traceability, industry 4.0, internet of things, blockchain, best worst method, MARCOS

Procedia PDF Downloads 56
9003 Absorbed Dose Estimation of 68Ga-EDTMP in Human Organs

Authors: S. Zolghadri, H. Yousefnia, A. R. Jalilian

Abstract:

Bone metastases are observed in a wide range of cancers, leading to intolerable pain. Since early detection can help physicians decide on the type of treatment, various radiopharmaceuticals based on phosphonates, such as ⁶⁸Ga-EDTMP, have been developed. In this work, due to the importance of absorbed dose, the human absorbed dose of this new agent was calculated for the first time based on biodistribution data in wild-type rats. ⁶⁸Ga was obtained from a ⁶⁸Ge/⁶⁸Ga generator with radionuclidic and radiochemical purity higher than 99%. The radiolabeled complex was prepared under optimized conditions. The radiochemical purity of the radiolabeled complex was checked by the instant thin-layer chromatography (ITLC) method using Whatman No. 2 paper and saline; the results indicated a radiochemical purity higher than 99%. The radiolabeled complex was injected into wild-type rats and its biodistribution was studied up to 120 min. As expected, major accumulation was observed in the bone. The absorbed dose for each human organ was calculated from the biodistribution in rats using the RADAR method. The bone surface and bone marrow, with 0.112 and 0.053 mSv/MBq, respectively, received the highest absorbed doses. According to these results, the radiolabeled complex is a suitable and safe option for PET bone imaging.
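For context, the RADAR dose estimation scheme referred to above computes the absorbed dose to a target organ from the time-integrated activity (number of disintegrations) in each source organ and tabulated dose factors; the relation below is the generic form of that calculation, not the authors' specific organ-level computation.

```latex
% Generic RADAR-style absorbed dose relation
D_{\text{target}} = \sum_{\text{source}} \tilde{N}_{\text{source}} \times DF(\text{target} \leftarrow \text{source})
```

Here Ñ is the number of disintegrations occurring in the source organ per unit administered activity (derived from the biodistribution data) and DF is the tabulated dose factor from source to target organ.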

Keywords: absorbed dose, EDTMP, ⁶⁸Ga, rats

Procedia PDF Downloads 192
9002 Subjective Probability and the Intertemporal Dimension of Probability to Correct the Misrelation Between Risk and Return of a Financial Asset as Perceived by Investors. Extension of Prospect Theory to Better Describe Risk Aversion

Authors: Roberta Martino, Viviana Ventre

Abstract:

From a theoretical point of view, the relationship between the risk associated with an investment and its expected value is directly proportional, in the sense that the market allows a greater return to those who are willing to take a greater risk. However, empirical evidence shows that this relationship is distorted in the minds of investors and is perceived as exactly the opposite. To understand the discrepancy between the actual actions of investors and the theoretical predictions, this paper analyzes the essential parameters used for the valuation of financial assets, with particular attention to two elements: probability and the passage of time. Although these may seem at first glance to be two distinct elements, they are closely related. In particular, the error in the theoretical description of the relationship between risk and return lies in the failure to consider the impatience that is generated in the decision-maker when the decision context involves events that have not yet happened. In this context, probability loses its objective meaning and, in relation to the psychological aspects of the investor, can only be understood as the degree of confidence that the investor has in the occurrence or non-occurrence of an event. Moreover, the concept of objective probability does not consider the intertemporality that characterizes financial activities, nor the limited cognitive capacity of the decision maker. Cognitive psychology has made it possible to understand that the mind acts with a compromise between quality and effort when faced with very complex choices. To evaluate an event that has not yet happened, it is necessary to imagine it happening in one's head. This projection into the future requires a cognitive effort and is what differentiates choices under conditions of risk from choices under conditions of uncertainty. In fact, since the receipt of the outcome in choices under risk conditions is imminent, the mechanism of self-projection into the future is not necessary to imagine the consequence of the choice, and decision makers dwell on the objective analysis of possibilities. Financial activities, on the other hand, develop over time, and objective probability is too static to capture the anticipatory emotions that the self-projection mechanism generates in the investor. Assuming that uncertainty is inherent in valuations of events that have not yet occurred, the focus must shift from risk management to uncertainty management. Only in this way can the intertemporal dimension of the decision-making environment and the haste generated by the financial market be accounted for. The work considers an extension of prospect theory with a temporal component, with the aim of providing a description of the attitude towards risk with respect to the passage of time.
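The abstract does not give the functional form of the proposed temporal extension; for orientation only, the block below restates the standard prospect-theory value and probability-weighting functions that such an extension would presumably build on.

```latex
% Standard prospect-theory components (the paper's temporal extension is not specified here)
v(x) =
\begin{cases}
x^{\alpha} & x \ge 0\\
-\lambda (-x)^{\beta} & x < 0
\end{cases}
\qquad
w(p) = \frac{p^{\gamma}}{\left(p^{\gamma} + (1-p)^{\gamma}\right)^{1/\gamma}}
```

Here λ > 1 captures loss aversion, α and β govern the curvature of the value function, and w(p) replaces objective probabilities with subjective decision weights.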

Keywords: impatience, risk aversion, subjective probability, uncertainty

Procedia PDF Downloads 106
9001 A Fuzzy TOPSIS Based Model for Safety Risk Assessment of Operational Flight Data

Authors: N. Borjalilu, P. Rabiei, A. Enjoo

Abstract:

A Flight Data Monitoring (FDM) program assists an operator in the aviation industry in identifying, quantifying, assessing and addressing operational safety risks in order to improve the safety of flight operations. FDM is a powerful tool for an aircraft operator when integrated into the operator's Safety Management System (SMS), allowing it to detect, confirm, and assess safety issues associated with human errors and to check the effectiveness of corrective actions. This article proposes a model for assessing the safety risk level of flight data for different categories of events, based on fuzzy set values. It permits evaluating the operational safety level from the point of view of flight activities. The main advantage of this method is the proposed qualitative safety analysis of flight data. This research gathers the opinions of aviation experts through a number of questionnaires related to flight data in four categories of occurrences that can take place during an accident or an incident: Runway Excursions (RE), Controlled Flight Into Terrain (CFIT), Mid-Air Collision (MAC), and Loss of Control in Flight (LOC-I). By weighting each category (by F-TOPSIS) and applying the weights to the number of risks of the event, the safety risk of each related event can be obtained.
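As a simplified illustration of the TOPSIS step (crisp rather than fuzzy, and with made-up expert scores), the sketch below ranks the four occurrence categories by their closeness to the ideal solution.

```python
# Crisp TOPSIS sketch for ranking occurrence categories (illustrative only;
# the paper uses fuzzy TOPSIS on real expert questionnaires).
import numpy as np

categories = ["RE", "CFIT", "MAC", "LOC-I"]
# Assumed decision matrix: rows = categories, columns = risk criteria
# (e.g. severity, likelihood, detectability), expert scores on a 1-9 scale.
X = np.array([[7, 6, 4],
              [9, 3, 5],
              [8, 2, 6],
              [9, 5, 7]], dtype=float)
w = np.array([0.5, 0.3, 0.2])          # assumed criterion weights

R = X / np.linalg.norm(X, axis=0)      # vector-normalise each criterion
V = R * w                              # weighted normalised matrix

ideal_best = V.max(axis=0)             # all criteria treated as "benefit" criteria here
ideal_worst = V.min(axis=0)

d_best = np.linalg.norm(V - ideal_best, axis=1)
d_worst = np.linalg.norm(V - ideal_worst, axis=1)
closeness = d_worst / (d_best + d_worst)

for cat, c in sorted(zip(categories, closeness), key=lambda t: -t[1]):
    print(f"{cat}: closeness = {c:.3f}")
```

In the fuzzy variant, the crisp scores are replaced by triangular fuzzy numbers aggregated over the expert questionnaires before the distance computation.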

Keywords: F-topsis, fuzzy set, flight data monitoring (FDM), flight safety

Procedia PDF Downloads 163
9000 Radar-Based Classification of Pedestrian and Dog Using High-Resolution Raw Range-Doppler Signatures

Authors: C. Mayr, J. Periya, A. Kariminezhad

Abstract:

In this paper, we developed a learning framework for the classification of vulnerable road users (VRU) by their range-Doppler signatures. The frequency-modulated continuous-wave (FMCW) radar raw data is first pre-processed to obtain robust object range-Doppler maps per coherent time interval. The complex-valued range-Doppler maps captured from our outdoor measurements are then fed into a convolutional neural network (CNN) to learn the classification. This CNN has gone through a hyperparameter optimization process for improved learning. By learning VRU range-Doppler signatures, the three classes 'pedestrian', 'dog', and 'noise' are classified with an average accuracy of almost 95%. Interestingly, this classification accuracy holds for combined longitudinal and lateral object trajectories.
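A minimal sketch of the kind of CNN classifier described, assuming the complex range-Doppler map is fed to the network as a two-channel (real/imaginary) image; the layer sizes and map dimensions are placeholders, not the hyperparameter-optimised network from the paper.

```python
# Minimal CNN sketch for 3-class VRU classification from complex range-Doppler
# maps (layer sizes and map dimensions are assumed placeholders).
import torch
import torch.nn as nn

class RangeDopplerCNN(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1),  # 2 channels = real + imag
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        self.classifier = nn.Linear(32 * 8 * 8, n_classes)  # pedestrian / dog / noise

    def forward(self, x):                 # x: (batch, 2, range_bins, doppler_bins)
        return self.classifier(self.features(x).flatten(1))

# Example: a batch of 4 maps with 128 range bins and 64 Doppler bins
rd_complex = torch.randn(4, 128, 64, dtype=torch.cfloat)
x = torch.stack((rd_complex.real, rd_complex.imag), dim=1)
logits = RangeDopplerCNN()(x)
print(logits.shape)  # torch.Size([4, 3])
```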

Keywords: machine learning, radar, signal processing, autonomous driving

Procedia PDF Downloads 236
8999 Empirical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand and then detecting the times at which each selected appliance changes its state. In order to fit the capabilities of existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data is simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical software tool that uses behaviour simulation of the people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling. In addition to this, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of the selected appliance falls, along with a time vector of the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical and statistical features. Afterwards, these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
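For illustration, a minimal DTW distance computation between two power signatures is sketched below in plain NumPy; the signals are synthetic, and the paper's full pipeline (event detection, feature extraction, model tuning) is not reproduced.

```python
# Minimal DTW sketch for comparing two appliance power signatures
# (synthetic signals; not the paper's full NILM pipeline).
import numpy as np

def dtw_distance(a, b):
    """Classical O(len(a)*len(b)) dynamic-programming DTW distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two synthetic power signatures sampled at 1/60 Hz (one value per minute, in W)
fridge_template = np.array([0, 120, 115, 118, 0, 0], dtype=float)
observed_event  = np.array([0, 0, 118, 116, 119, 0], dtype=float)

print(f"DTW distance: {dtw_distance(observed_event, fridge_template):.1f}")
```

A small DTW distance to a tuned appliance model then counts as a match for that appliance in the identification step.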

Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques

Procedia PDF Downloads 76
8998 Effective Emergency Response and Disaster Prevention: A Decision Support System for Urban Critical Infrastructure Management

Authors: M. Shahab Uddin, Pennung Warnitchai

Abstract:

Currently, more than half of the world's population is living in cities, and the number and sizes of cities are growing faster than ever. Cities rely on the effective functioning of complex and interdependent critical infrastructure networks to provide public services, enhance the quality of life, and protect the community from hazards and disasters. At the same time, the complex connectivity and interdependency among urban critical infrastructures bring management challenges and make the urban system prone to the domino effect. Unplanned rapid growth, increased connectivity and interdependency among the infrastructures, resource scarcity, and many other socio-political factors are affecting the typical state of an urban system and making it susceptible to numerous sorts of disruption. In addition to internal vulnerabilities, urban systems consistently face external threats from natural and man-made hazards. Cities are not just complex, interdependent systems; they also make up hubs of the economy, politics, culture, education, etc. For survival and sustainability, complex urban systems need to manage their vulnerabilities and hazardous incidents more wisely and more interactively. Coordinated management in such systems offers huge potential for absorbing negative effects in case some components function improperly. On the other hand, ineffective management during overall disorder caused by hazard devastation may make the system more fragile and push it towards ultimate collapse. Accordingly, the current research hypothesizes that a hazardous event starts its journey as an emergency, and the system's internal vulnerability and response capacity determine its destination. Connectivity and interdependency among the urban critical infrastructures during this stage may transform its vulnerabilities into a dynamic damaging force. An emergency may turn into a disaster in the absence of effective management; similarly, mismanagement or lack of management may lead the situation towards a catastrophe. Situation awareness and factual decision-making are the keys to winning this battle. The current research proposes a contextual decision support system for an urban critical infrastructure system that integrates three models: 1) a damage cascade model, which demonstrates damage propagation among the infrastructures through their connectivity and interdependency; 2) a restoration model, a dynamic restoration process for each individual infrastructure based on the facility damage state and the overall disruptions in the surrounding support environment; and 3) an optimization model that ensures optimized utilization and distribution of the available resources in and among the facilities. All three models are tightly connected and mutually interdependent, and together they can assess the situation and forecast the dynamic outputs of every input. Moreover, this integrated model will enable disaster managers and decision makers to check all the alternative decisions before any implementation and will support producing the maximum possible outputs from the available limited inputs. The proposed model will not only help reduce the extent of the damage cascade but will also ensure priority restoration and optimize resource utilization through adaptive and collaborative management. Complex systems predictably fail, but in unpredictable ways; system understanding, situation awareness, and factual decisions may significantly help an urban system to survive and sustain itself.
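As a toy illustration of the first of the three models (the damage cascade), the sketch below propagates failures over an assumed directed dependency graph of infrastructures; the network, thresholds and failure rule are hypothetical, not the paper's model.

```python
# Toy damage-cascade sketch over an assumed infrastructure dependency graph
# (nodes, dependencies and the failure rule are hypothetical illustrations).
from collections import deque

# Directed dependencies: each value depends on its key (the key failing stresses the value)
dependents = {
    "power":     ["water", "telecom", "hospital"],
    "water":     ["hospital"],
    "telecom":   ["hospital", "transport"],
    "transport": [],
    "hospital":  [],
}
# Assumed rule: a facility fails once this many of its suppliers have failed
failure_threshold = {"water": 1, "telecom": 1, "hospital": 2, "transport": 1}

def cascade(initially_failed):
    failed = set(initially_failed)
    hits = {node: 0 for node in dependents}
    queue = deque(initially_failed)
    while queue:
        node = queue.popleft()
        for dep in dependents[node]:
            if dep in failed:
                continue
            hits[dep] += 1
            if hits[dep] >= failure_threshold.get(dep, 1):
                failed.add(dep)
                queue.append(dep)   # failure propagates further downstream
    return failed

print(cascade({"power"}))  # e.g. {'power', 'water', 'telecom', 'hospital', 'transport'}
```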

Keywords: disaster prevention, decision support system, emergency response, urban critical infrastructure system

Procedia PDF Downloads 218
8997 The Differences and Similarities in Neurocognitive Deficits in Mild Traumatic Brain Injury and Depression

Authors: Boris Ershov

Abstract:

Depression is the most common mood disorder experienced by patients who have sustained a traumatic brain injury (TBI) and is associated with poorer cognitive and functional outcomes. However, in some cases, similar cognitive impairments can also be observed in depression alone. There is not enough information about the features of the cognitive deficit in patients with TBI in relation to patients with depression. TBI patients without depressive symptoms (TBInD, n = 25), TBI patients with depressive symptoms (TBID, n = 31), and 28 patients with bipolar II disorder (BP) were included in the study. There were no significant differences between participants with respect to age, handedness and educational level. The patients' clinical status was determined using the Montgomery-Asberg Depression Rating Scale (MADRS). All participants completed a cognitive battery (the Brief Assessment of Cognition in Affective Disorders, BAC-A). Additionally, the Rey-Osterrieth Complex Figure (ROCF) was used to assess visuospatial construction abilities and visual memory, as well as planning and organizational skills. Compared to BP, TBInD and TBID showed significant impairments in visuomotor abilities and verbal and visual memory. There were no significant differences between the BP and TBID groups in working memory, speed of information processing, or problem solving. The interference effect (cognitive inhibition) was significantly greater in TBInD and TBID compared to BP. Memory bias towards mood-related information in BP and TBID was greater in comparison with TBInD. These results suggest that depressive symptoms are associated with impairments of some executive functions in combination with a decrease in the speed of information processing.

Keywords: bipolar II disorder, depression, neurocognitive deficits, traumatic brain injury

Procedia PDF Downloads 344
8996 Rewritten Oedipus Complex: Huo Datong’s Complex of Generation

Authors: Xinyu Chen

Abstract:

This article reviews the Chinese psychoanalytic theorist Dr. Huo Datong's notion of the complex of generation, around which Huo conceptualizes a localized framework to recapitulate the unconscious structure of the Chinese people. Psychoanalysis has undergone constant localization, shaped by the socio-cultural milieu and carried forward by scholars trained in different psychoanalytic schools. Dr. Huo Datong is one of the representatives with a Sino-French background of psychoanalytic training, and his enterprise has demonstrated psychoanalysis's cultural and ideological accommodability. Insufficient academic attention has been paid to this concept as the core of Huo's reframing. The notion is put forward through a Western psychoanalytic reading of Chinese mythologies to contour the Chinese unconscious. Given Huo's interpretation of the Chinese kinship network as the basis for proposing an omnipotent symbolic mother rather than an Oedipal father, this article reviews the notion in terms of its mythological roots in order to evaluate its theoretical practicality.

Keywords: psychoanalysis, China, Huo Datong, mythology

Procedia PDF Downloads 248
8995 Study on Optimization of Air Infiltration at Entrance of a Commercial Complex in Zhejiang Province

Authors: Yujie Zhao, Jiantao Weng

Abstract:

In the past decade, with the rapid development of China's economy, the purchasing power and physical demands of residents have increased, which has resulted in the widespread emergence of public buildings such as large shopping malls. However, architects usually focus on the internal functions and circulation of these buildings, ignoring the impact of the environment on the subjective feelings of building users. In Zhejiang province alone, the infiltration of cold air in winter frequently occurs at the entrances of sizeable commercial complex buildings in operation, which affects the environmental comfort of the building lobby and internal public spaces. At present, to reduce these adverse effects, active equipment is usually added, such as air curtains to block air exchange or additional heating air conditioners. From the perspective of energy consumption, the infiltration of cold air at the entrance increases the heat consumption of indoor heating equipment, which indirectly causes considerable economic losses over the whole winter heating season. Therefore, it is of considerable significance to explore entrance forms suitable for improving the environmental comfort of commercial buildings and saving energy. In this paper, a commercial complex with an apparent cold air infiltration problem in Hangzhou is selected as the research object to establish a model. The environmental parameters of the building entrance, including temperature, wind speed, and infiltration air volume, are obtained by Computational Fluid Dynamics (CFD) simulation, from which the heat consumption caused by natural air infiltration in winter and its potential economic loss are estimated as the objective metrics. This study finally obtains the optimization direction for the building entrance form of the commercial complex by comparing the simulation results with those of other local commercial complex projects with different entrance forms. The conclusions will guide the entrance design of the same type of commercial complex in this area.
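The heat consumption attributable to infiltration is conventionally estimated from the infiltration air volume flow obtained in the CFD simulation; the relation below is the standard sensible-heat form presumably underlying such an estimate, not a formula quoted from the paper.

```latex
% Standard sensible heat loss due to air infiltration
Q_{\text{inf}} = \rho \, c_p \, \dot{V} \, (T_{\text{in}} - T_{\text{out}})
```

Here ρ ≈ 1.2 kg/m³ is the air density, c_p ≈ 1005 J/(kg·K) the specific heat of air, V̇ the infiltration volume flow rate, and T_in and T_out the indoor and outdoor temperatures; integrating over the heating season and multiplying by the energy price gives the associated economic loss.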

Keywords: air infiltration, commercial complex, heat consumption, CFD simulation

Procedia PDF Downloads 130
8994 Poincare Plot for Heart Rate Variability

Authors: Mazhar B. Tayel, Eslam I. AlSaba

Abstract:

The heart is one of the most important organs in any organism. It affects, and is affected by, many factors in the body and is therefore a good indicator of the body's condition. Since the heart signal is non-stationary, its variability should be studied. Heart rate variability (HRV) has thus attracted considerable attention in psychology and medicine and has become an important dependent measure in psychophysiology and behavioral medicine. The quantification and interpretation of heart rate variability, however, remain complex issues fraught with pitfalls. This paper presents one of the non-linear techniques used to analyze HRV. It discusses what the Poincaré plot is, how it works, the benefits of its use, especially for HRV, the limitations of the Poincaré plot caused by the standard descriptors SD1 and SD2, and how to overcome these limitations by using the complex correlation measure (CCM). The CCM is more sensitive to changes in the temporal structure of the Poincaré plot than SD1 and SD2.
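A short sketch of the standard Poincaré descriptors discussed above, computed from an RR-interval series; the data are synthetic, and CCM itself is not reproduced here since its windowed formulation goes beyond the simple variance-based SD1/SD2.

```python
# Standard Poincaré descriptors SD1/SD2 from an RR-interval series (ms).
# Synthetic data for illustration; CCM itself is not implemented here.
import numpy as np

rr = np.array([812, 790, 805, 840, 825, 810, 795, 830, 820, 815], dtype=float)

x1, x2 = rr[:-1], rr[1:]                 # successive RR pairs (the Poincaré cloud)
sd1 = np.sqrt(np.var(x2 - x1) / 2.0)     # dispersion perpendicular to the identity line
sd2 = np.sqrt(2.0 * np.var(rr) - np.var(x2 - x1) / 2.0)  # dispersion along it

print(f"SD1 = {sd1:.1f} ms, SD2 = {sd2:.1f} ms, SD1/SD2 = {sd1/sd2:.2f}")
```

SD1 and SD2 summarize only the overall shape of the scatter cloud, which is why a temporal measure such as CCM is needed to capture how consecutive points are ordered within it.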

Keywords: heart rate variability, chaotic system, poincare, variance, standard deviation, complex correlation measure

Procedia PDF Downloads 396
8993 Spatial-Temporal Characteristics of Bacterioplankton in the Upper Part of Taktakorpu Water Complex

Authors: Fidan Z. Aliyeva

Abstract:

In the presented article, the formation of the microbiological regime in the Takhtakorpu water complex, as well as spatial-temporal changes in the quantitative indicators of bacterioplankton, were studied. The Takhtakorpu water complex was built as a continuation of the reconstruction and expansion project of the Samur-Absheron irrigation system in Shabran, on the northeastern slope of our republic. It should be noted that with the implementation of the project, the water supply of up to 150 thousand ha of useful land in the northern region has been improved; the drinking, technical, and irrigation water needs of the population of Baku, Sumgayit, the Absheron Peninsula, and the surrounding industrial and agricultural areas are better met; new land areas have been brought into agricultural circulation; the Takhtakorpu reservoir with a volume of 238.4 million m³ and the associated Valvalachay-Takhtakorpu and Takhtakorpu-Jeyranbatan canals have been created; and conditions have been created to increase the resources of the Jeyranbatan reservoir. Special attention is paid to the study of saprophytic bacteria in order to determine the development dynamics and biochemical activity of the microbiological regime in the Takhtakorpu water complex, which is of great strategic importance for our republic, to evaluate changes under the influence of anthropogenic factors, and to assess the self-purification properties and the mineralization of organic substances of allochthonous and autochthonous origin. One of the main goals of our research is to determine the main structural indicators of bacterioplankton in the upper part of the Takhtakorpu water complex at the first three stations and to analyze their quantitative values over time.

Keywords: water, irrigation, sewage, wastewater

Procedia PDF Downloads 69
8992 Fire Safe Medical Oxygen Delivery for Aerospace Environments

Authors: M. A. Rahman, A. T. Ohta, H. V. Trinh, J. Hyvl

Abstract:

Atmospheric pressure and oxygen (O2) concentration are critical life support parameters for human-occupied aerospace vehicles and habitats. Various medical conditions may require medical O2; for example, the American Medical Association has determined that commercial air travel exposes passengers to altitude-related hypoxia and gas expansion. It may cause some passengers to experience significant symptoms and medical complications during the flight, requiring supplemental medical-grade O2 to maintain adequate tissue oxygenation and prevent hypoxemic complications. Although supplemental medical grade O2 is a successful lifesaver for respiratory and cardiac failure, O2-enriched exhaled air can contain more than 95 % O2, increasing the likelihood of a fire. In an aerospace environment, a localized high concentration O2 bubble forms around a patient being treated for hypoxia, increasing the cabin O2 beyond the safe limit. To address this problem, this work describes a medical O2 delivery system that can reduce the O2 concentration from patient-exhaled O2-rich air to safe levels while maintaining the prescribed O2 administration to the patient. The O2 delivery system is designed to be a part of the medical O2 kit. The system uses cationic multimetallic cobalt complexes to reversibly, selectively, and stoichiometrically chemisorb O2 from the exhaled air. An air-release sub-system monitors the exhaled air, and as soon the O2 percentage falls below 21%, the air is released to the room air. The O2-enriched exhaled air is channeled through a layer of porous, thin-film heaters coated with the cobalt complex. The complex absorbs O2, and when saturated, the complex is heated to 100°C using the thin-film heater. Upon heating, the complex desorbs O2 and is once again ready to absorb or remove the excess O2 from exhaled air. The O2 absorption is a sub-second process, and desorption is a multi-second process. While heating at 0.685 °C/sec, the complex desorbs ~90% O2 in 110 sec. These fast reaction times mean that a simultaneous absorb/desorb process in the O2 delivery system will create a continuous absorption of O2. Moreover, the complex can concentrate O2 by a factor of 160 times that in air and desorb over 90% of the O2 at 100°C. Over 12 cycles of thermogravimetry measurement, less than 0.1% decrease in reversibility in O2 uptake was observed. The 1 kg complex can desorb over 20L of O2, so simultaneous O2 desorption by 0.5 kg of complex and absorption by 0.5 kg of complex can potentially continuously remove 9L/min O2 (~90% desorbed at 100°C) from exhaled air. The complex is synthesized and characterized for reversible O2 absorption and efficacy. The complex changes its color from dark brown to light gray after O2 desorption. In addition to thermogravimetric analysis, the O2 absorption/desorption cycle is characterized using optical imaging, showing stable color changes over ten cycles. The complex was also tested at room temperature in a low O2 environment in its O2 desorbed state, and observed to hold the deoxygenated state under these conditions. The results show the feasibility of using the complex for reversible O2 absorption in the proposed fire safe medical O2 delivery system.

Keywords: fire risk, medical oxygen, oxygen removal, reversible absorption

Procedia PDF Downloads 100
8991 Atomic Force Microscopy Studies of DNA Binding Properties of the Archaeal Mini Chromosome Maintenance Complex

Authors: Amna Abdalla Mohammed Khalid, Pietro Parisse, Silvia Onesti, Loredana Casalis

Abstract:

Basic cellular processes such as DNA replication are crucial to cell life. Understanding at the molecular level the mechanisms that govern DNA replication in proliferating cells is fundamental to understanding diseases connected to genomic instability, such as genetic diseases and cancer. A key step for DNA replication to take place is the unwinding of the DNA double helix, which is carried out by proteins called helicases. The archaeal MCM (minichromosome maintenance) complex from Methanothermobacter thermautotrophicus has been studied using atomic force microscopy (AFM), imaging in air and in liquid (physiological environment). The accurate analysis of AFM topographic images allowed us to understand the static conformations as well as the interaction dynamics of MCM and the DNA double helix in the presence of ATP.

Keywords: DNA, protein-DNA interaction, MCM (minichromosome maintenance) complex, atomic force microscopy (AFM)

Procedia PDF Downloads 304
8990 Two-Level Graph Causality to Detect and Predict Random Cyber-Attacks

Authors: Van Trieu, Shouhuai Xu, Yusheng Feng

Abstract:

Tracking attack trajectories can be difficult with limited information about the nature of the attack, and even more so when attack information is collected by Intrusion Detection Systems (IDSs), since current IDSs have limitations in identifying malicious and anomalous traffic. Moreover, IDSs only point out suspicious events; they do not show how the events relate to each other or which event possibly causes another event to happen. Because of this, it is important to investigate new methods capable of tracking attack trajectories quickly, with less attack information and less dependency on IDSs, in order to prioritize actions during incident response. This paper proposes a two-level graph causality framework for tracking attack trajectories in internet networks by leveraging observable malicious behaviors to detect which attack events are most likely to cause another event to occur in the system. Technically, given a time series of malicious events, the framework extracts events with useful features, such as attack time and port number, and applies conditional independence tests to detect the relationships between attack events. Using academic datasets collected by IDSs, experimental results show that the framework can quickly detect causal pairs that offer meaningful insights into the nature of the internet network, given only reasonable restrictions on network size and structure. Without the framework's guidance, these insights could not be discovered by existing tools, such as IDSs, and would cost expert human analysts significant time, if they could be obtained at all. The computational results from the proposed two-level graph network model reveal clear patterns and trends. In fact, more than 85% of the causal pairs have an average time difference between the cause and effect events within 5 minutes, in both the computed and observed data. This result can be used as a preventive measure against future attacks. Although the forecast horizon may be short, from 0.24 seconds to 5 minutes, it is long enough to be used to design a prevention protocol to block those attacks.
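A toy sketch of the pair-extraction idea is given below: for timestamped alert events, it counts how often one event type is followed by another within a 5-minute window and reports the average lag. This is only a co-occurrence heuristic for illustration, not the paper's conditional-independence testing on richer features.

```python
# Toy extraction of candidate cause->effect alert pairs within a 5-minute window
# (co-occurrence heuristic for illustration; the paper uses conditional
# independence tests on richer features such as port numbers).
from collections import defaultdict
from itertools import product

# Assumed alert log: (timestamp in seconds, event type)
events = [(0, "scan"), (40, "bruteforce"), (95, "scan"),
          (130, "bruteforce"), (400, "exfil"), (460, "exfil")]

WINDOW = 300  # 5 minutes
by_type = defaultdict(list)
for t, etype in events:
    by_type[etype].append(t)

lags = defaultdict(list)
for cause, effect in product(by_type, by_type):
    if cause == effect:
        continue
    for tc in by_type[cause]:
        later = [te - tc for te in by_type[effect] if 0 < te - tc <= WINDOW]
        if later:
            lags[(cause, effect)].append(min(later))  # nearest following effect event

for (cause, effect), deltas in lags.items():
    print(f"{cause} -> {effect}: {len(deltas)} pairs, "
          f"avg lag {sum(deltas)/len(deltas):.0f}s")
```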

Keywords: causality, multilevel graph, cyber-attacks, prediction

Procedia PDF Downloads 155
8989 Finding the Reaction Constant between Humic Acid and Aluminum Ion by Fluorescence Quenching Effect

Authors: Wen Po Cheng, Chen Zhao Feng, Ruey Fang Yu, Lin Jia Jun, Lin Ji Ye, Chen Yuan Wei

Abstract:

Humic acid was used as the removal target for evaluating coagulation efficiency in this study. When coagulant ions are mixed with a humic acid solution, a fluorescence quenching effect may be observed under certain conditions. This effect can be described by the linear Stern-Volmer equation, which can be used to quantify the quenching value (Kq) of the fluorescence quenching effect. In addition, a complex-formation titration (CFT) analysis was conducted, and the result was used to explain the charge-neutralization capability of the coagulant (AlCl₃) at different pH. The results indicated that when the pH of the AlCl₃ solution was between 6 and 8, the fluorescence quenching effect clearly occurred. The maximum Kq value was found to be 102,524 at pH 6. This means that the higher the Kq value, the better the complexation reaction between humic acid and aluminum salts. Through the Kq value study, the optimum pH can be quantified for coagulating the humic acid solution with aluminum ions.
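For reference, the linear Stern-Volmer relation used to extract the quenching value Kq from the fluorescence intensities has the standard form below.

```latex
% Stern-Volmer relation: F_0 is the fluorescence intensity without quencher,
% F the intensity at quencher (coagulant) concentration [Q]
\frac{F_0}{F} = 1 + K_q \, [Q]
```

Kq is therefore obtained as the slope of F0/F plotted against the coagulant concentration [Q].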

Keywords: humic acid, fluorescence quenching effect, complex reaction, titration

Procedia PDF Downloads 575
8988 Cognitive Dysfunctioning and the Fronto-Limbic Network in Bipolar Disorder Patients: An fMRI Meta-Analysis

Authors: Rahele Mesbah, Nic Van Der Wee, Manja Koenders, Erik Giltay, Albert Van Hemert, Max De Leeuw

Abstract:

Introduction: Patients with bipolar disorder (BD), characterized by depressive and manic episodes, often suffer from cognitive dysfunction. An up-to-date meta-analysis of functional Magnetic Resonance Imaging (fMRI) studies examining cognitive function in BD is lacking. Objective: The aim of the current fMRI meta-analysis is to investigate the brain functioning of bipolar patients compared with healthy subjects within three domains: emotion processing, reward processing, and working memory. Method: Differences in brain region activation were tested in a whole-brain analysis using the activation likelihood estimation (ALE) method. Separate analyses were performed for each cognitive domain. Results: A total of 50 fMRI studies were included: 20 studies used an emotion processing task (316 BD and 369 HC), 9 studies a reward processing task (215 BD and 213 HC), and 21 studies a working memory task (503 BD and 445 HC). During emotion processing, BD patients hyperactivated parts of the left amygdala and hippocampus as compared to HCs, but showed hypoactivation in the inferior frontal gyrus (IFG). Regarding reward processing, BD patients showed hyperactivation in part of the orbitofrontal cortex (OFC). During working memory, BD patients showed increased activity in the prefrontal cortex (PFC) and anterior cingulate cortex (ACC). Conclusions: This meta-analysis revealed evidence for activity disturbances in several brain areas involved in the cognitive functioning of BD patients. Furthermore, most of the identified regions are part of the so-called fronto-limbic network, which is hypothesized to be affected as a result of the expression of BD candidate genes.

Keywords: cognitive functioning, fMRI analysis, bipolar disorder, fronto-limbic network

Procedia PDF Downloads 455
8987 Characterisation of Wind-Driven Ventilation in Complex Terrain Conditions

Authors: Daniel Micallef, Damien Bounaudet, Robert N. Farrugia, Simon P. Borg, Vincent Buhagiar, Tonio Sant

Abstract:

The physical effects of upstream flow obstructions such as vegetation on the cross-ventilation of a building are important for issues such as indoor thermal comfort. Modelling such effects in Computational Fluid Dynamics (CFD) simulations may also be challenging. The aim of this work is to establish the cross-ventilation jet behaviour in such complex terrain conditions as well as to provide guidelines on the implementation of CFD numerical simulations in order to model complex terrain features such as vegetation in an efficient manner. The methodology consists of on-site measurements on a test cell coupled with numerical simulations. It was found that the cross-ventilation flow is highly turbulent despite the very low velocities encountered inside the test cell. While no direct measurement of the jet direction was made, the measurements indicate that the flow tends to be reversed from the leeward to the windward side. Modelling such a phenomenon proves challenging and is strongly influenced by how the vegetation is modelled: a solid vegetation model tends to predict the direction and magnitude of the flow better than a porous vegetation approach. A simplified terrain model was also shown to provide good agreement with observations. The findings have important implications for the study of cross-ventilation in complex terrain conditions, since the flow direction does not remain trivial, as it does in the traditional isolated building case.

Keywords: complex terrain, cross-ventilation, wind driven ventilation, wind resource, computational fluid dynamics, CFD

Procedia PDF Downloads 393
8986 Performance Evaluation of Production Schedules Based on Process Mining

Authors: Kwan Hee Han

Abstract:

The external environment of enterprises is changing rapidly, driven mainly by global competition, cost reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity under the careful consideration of many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate a realistic production schedule and to overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining has only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of a production scheduling software system using process mining techniques, since every software system generates event logs for further use such as security investigation, auditing and debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems. Using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the workload balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach to evaluate the performance of generated production schedules, the quality of the production schedules of manufacturing enterprises can be improved.
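To make the workstation-utilization criterion concrete, the sketch below computes per-workstation utilization directly from a toy event log with start/complete timestamps; it is a hand-rolled illustration rather than the paper's process mining toolchain.

```python
# Workstation utilization from a toy scheduling event log
# (hand-rolled illustration; real analyses would use a process mining toolkit).
from datetime import datetime

fmt = "%Y-%m-%d %H:%M"
# Assumed event log: one record per executed operation
event_log = [
    {"case": "order-1", "station": "WS1", "start": "2024-01-08 08:00", "end": "2024-01-08 09:30"},
    {"case": "order-2", "station": "WS1", "start": "2024-01-08 09:45", "end": "2024-01-08 11:00"},
    {"case": "order-1", "station": "WS2", "start": "2024-01-08 09:40", "end": "2024-01-08 10:10"},
    {"case": "order-2", "station": "WS2", "start": "2024-01-08 11:05", "end": "2024-01-08 11:20"},
]
SHIFT_MINUTES = 8 * 60  # assumed available capacity per workstation

busy = {}
for e in event_log:
    start = datetime.strptime(e["start"], fmt)
    end = datetime.strptime(e["end"], fmt)
    busy[e["station"]] = busy.get(e["station"], 0) + (end - start).total_seconds() / 60

for station, minutes in sorted(busy.items()):
    print(f"{station}: utilization = {minutes / SHIFT_MINUTES:.0%}")
```

Stations whose utilization approaches 100% over many time buckets are candidates for the bottleneck and workload-balance criteria mentioned above.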

Keywords: data mining, event log, process mining, production scheduling

Procedia PDF Downloads 278
8985 Spatial Audio Player Using Musical Genre Classification

Authors: Jun-Yong Lee, Hyoung-Gook Kim

Abstract:

In this paper, we propose a smart music player that combines musical genre classification and spatial audio processing. The musical genre is classified based on a content analysis of the musical segment detected from the audio stream. In parallel with the classification, spatial audio quality is achieved by adding an artificial reverberation, corresponding to a virtual acoustic space, to the input mono sound. Thereafter, the spatial sound is boosted at playback with frequency gains chosen according to the musical genre. Experiments measured the accuracy of detecting the musical segment from the audio stream and of its musical genre classification. A listening test was performed on the spatial audio processing based on the virtual acoustic space.
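A minimal sketch of the playback stage described above: the mono input is convolved with a (here synthetic) room impulse response and then boosted with genre-dependent band gains. The gain table, filter settings and impulse response are placeholders, not the paper's tuned values.

```python
# Minimal sketch of reverberation + genre-based boosting for a mono signal
# (synthetic impulse response and gain table; not the paper's tuned values).
import numpy as np
from scipy.signal import fftconvolve, butter, sosfilt

fs = 44100
mono = np.random.randn(fs * 2) * 0.1            # stand-in for a 2 s music segment

# Synthetic exponentially decaying impulse response ("virtual acoustic space")
t = np.arange(int(0.3 * fs)) / fs
ir = np.random.randn(t.size) * np.exp(-t / 0.08)
reverberant = fftconvolve(mono, ir)[: mono.size]

# Assumed genre-dependent boost: e.g. +6 dB below 200 Hz for "rock"
genre_gain_db = {"rock": 6.0, "classical": 0.0, "jazz": 3.0}
gain = 10 ** (genre_gain_db["rock"] / 20)
sos = butter(4, 200, btype="lowpass", fs=fs, output="sos")
low_band = sosfilt(sos, reverberant)
output = reverberant + (gain - 1.0) * low_band   # boost only the low band

print(output.shape, float(np.max(np.abs(output))))
```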

Keywords: automatic equalization, genre classification, music segment detection, spatial audio processing

Procedia PDF Downloads 424
8984 Co-Participation: Towards the Sustainable Micro-Rural Complex in China

Authors: Danhua Xu, Zhenlan Qian, Zhu Wang, Jiayan Fu, Ling Wang

Abstract:

A new business mode called the rural complex has been proposed by the Chinese government to promote the development of the economy in rural areas. However, given current national conditions, including the great number of farmers owning small-scale farmland and the uncertain enthusiasm of enterprises, it is challenging to develop large-scale rural complexes. In response to these dilemmas, this paper puts forward the micro-rural complex to boost small-scale farms through co-participation in a bottom-up mode. By analyzing the potential opportunities to find a suitable mode and exploring interdisciplinary and interdepartmental co-participation between different actors, beyond architectural design and spatial planning, the paper tries to establish a complete process towards the sustainable micro-rural complex and conducts an ongoing practice to optimize it, bringing new insights and references to rural development. Following the transformation of the economy, the micro-rural complex develops in two phases, both of which can be discussed in three parts: the economic mode, the spatial support, and the cooperating mechanism. The first stage is agricultural co-participation based on the rise of community-supported agriculture (CSA), in which consumers buy organically grown products directly from farmers at a higher price to support small-scale agriculture and overcome food safety issues. The following stage sets up agritourism catering to citizens, with restaurants, inns and other tourist service facilities to be planned and designed. Throughout the process, interdisciplinary co-participation plays an important role in providing guidelines and consultation from agronomists, architects and rural planners to the farmers. This mode has been applied to an ongoing farm project, through which it is being explored in a more practical way. In conclusion, the micro-rural complex aims at creating a balanced urban-rural relationship through co-participation that takes advantage of the different actors. The spatial development is considered from the perspectives of the economic mode and social organization. The integration of the mode based on small-scale agriculture will contribute to sustainable growth and realize long-run development in rural areas.

Keywords: micro-rural complex, co-participation, sustainable development, China

Procedia PDF Downloads 259
8983 Socioeconomic Burden of Life Long Disease: A Case of Diabetes Care in Bangladesh

Authors: Samira Humaira Habib

Abstract:

Diabetes has profound effects on individuals and their families. If diabetes is not well monitored and managed, it leads to long-term complications and a large and growing cost to the health care system. The prevalence and socioeconomic burden of diabetes, and the relative return on investment from eliminating or reducing that burden, are therefore important to quantify. Studies of the socioeconomic cost burden of diabetes are well explored in developed countries but are almost absent in developing countries like Bangladesh. The main objective of the study is to estimate the total socioeconomic burden of diabetes. It is a prospective, longitudinal, analytical follow-up study. Primary and secondary data were collected from patients undergoing treatment for diabetes at the out-patient department of the Bangladesh Institute of Research & Rehabilitation in Diabetes, Endocrine & Metabolic Disorders (BIRDEM). Of the 2115 diabetic subjects, females constitute around 50.35% of the study subjects and the rest are male (49.65%). Among the subjects, 1323 have controlled and 792 have uncontrolled diabetes. The cost analysis of the 2115 diabetic patients shows that the total cost of diabetes management and treatment is US$ 903,018, with an average of US$ 426.95 per patient. Among the direct costs, investigations and medical treatment at the hospital constitute most of the cost of diabetes. The average hospital cost is US$ 311.79, which is an alarming figure for diabetic patients. Among the indirect costs, the cost of productivity loss (US$ 51,110.1) is the largest item; together the indirect items constitute a total indirect cost of US$ 69,215.7. The incremental cost of intensive management of uncontrolled diabetes is US$ 101.54 per patient, the event-free time gained in this group is 0.55 years, and the life years gained are 1.19 years; the incremental cost per event-free year gained is US$ 198.12. The incremental cost of intensive management in the controlled group is US$ 89.54 per patient, the event-free time gained is 0.68 years, and the life years gained are 1.12 years; the incremental cost per event-free year gained is US$ 223.34. The EuroQoL difference between the groups is found to be 64.04. The cost-effectiveness ratio is found to be US$ 1.64 per unit of effect in the case of controlled diabetes and US$ 1.69 per unit of effect in the case of uncontrolled diabetes, so the management of diabetes is highly cost-effective. The costs for young type 1 diabetic patients were associated with the upper socioeconomic class, and the cost increased with the duration of diabetes. The dietary pattern showed that macronutrient intake and cost are significantly higher in the uncontrolled group than in their counterparts. Proper management and control of diabetes can decrease the long-term cost of care.
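For orientation, the "incremental cost per event-free year gained" used above corresponds to an incremental cost-effectiveness ratio of the standard form below, comparing intensive with standard management.

```latex
% Incremental cost-effectiveness ratio (ICER)
\text{ICER} = \frac{\Delta C}{\Delta E}
            = \frac{C_{\text{intensive}} - C_{\text{standard}}}{E_{\text{intensive}} - E_{\text{standard}}}
```

Here ΔC is the incremental cost per patient and ΔE the incremental effect, expressed in this study as event-free years gained.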

Keywords: cost, cost-effective, chronic diseases, diabetes care, burden, Bangladesh

Procedia PDF Downloads 146
8982 Production of High Purity Cellulose Products from Sawdust Waste Material

Authors: Simiksha Balkissoon, Jerome Andrew, Bruce Sithole

Abstract:

Approximately half of the wood processed in the Forestry, Timber, Pulp and Paper (FTPP) sector is accumulated as waste. The concept of a "green economy" encourages industries to employ revolutionary, transformative technologies to eliminate waste generation by exploring the development of new value chains. The transition towards an almost paperless world, driven by the rise of digital media, has resulted in a decline in traditional paper markets, prompting the FTPP sector to reposition itself and expand its product offerings by unlocking the potential of value-adding opportunities from renewable resources such as wood to generate revenue and mitigate its environmental impact. The production of valuable products from wood waste such as sawdust has been extensively explored in recent years. Wood components such as lignin, cellulose and hemicelluloses, which can be extracted selectively by chemical processing, are suitable candidates for producing numerous high-value products. In this study, a novel approach to produce high-value cellulose products, such as dissolving wood pulp (DWP), from sawdust was developed. DWP is a high-purity cellulose product used in several applications in the pharmaceutical, textile, food, and paint and coatings industries. The proposed approach demonstrates the potential to eliminate several complex processing stages, such as pulping and bleaching, which are associated with traditional commercial processes for producing high-purity cellulose products such as DWP, making it less chemical-, energy- and water-intensive. The developed process followed the path of experimentally designed lab tests evaluating typical processing conditions such as residence time, chemical concentrations, liquid-to-solid ratios and temperature, followed by the application of suitable purification steps. Characterization of the product from the initial stage was conducted using commercially available DWP grades as reference materials. The chemical characteristics of the products thus far have shown properties similar to commercial products, making the proposed process a promising and viable option for the production of DWP from sawdust.

Keywords: biomass, cellulose, chemical treatment, dissolving wood pulp

Procedia PDF Downloads 181
8981 Optimization of Real Time Measured Data Transmission, Given the Amount of Data Transmitted

Authors: Michal Kopcek, Tomas Skulavik, Michal Kebisek, Gabriela Krizanova

Abstract:

The operation of nuclear power plants involves continuous monitoring of the environment in their area. This monitoring is performed using a complex data acquisition system, which collects status information about the system itself and the values of many important physical variables, e.g., temperature, humidity, dose rate, etc. This paper describes a proposal for, and optimization of, the communication that takes place in a teledosimetric system between the central control server, which is responsible for data processing and storage, and the decentralized measuring stations, which measure the physical variables. Analyses of the ongoing communication were performed, and consequently the system architecture and communication were optimized.

Keywords: communication protocol, transmission optimization, data acquisition, system architecture

Procedia PDF Downloads 514