Search results for: transit electronic portal imaging device
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5089

3469 Process Flows and Risk Analysis for the Global e-SCM

Authors: Taeho Park, Ming Zhou, Sangryul Shim

Abstract:

With the emergence of the global economy, today's business environment is more competitive than ever, and many supply chain (SC) strategies and operations have been significantly altered over the past decade to cope with the greater complexity and risk of global business. First, offshoring and outsourcing are increasingly adopted as operational strategies, and manufacturing continues to move to locations that enhance competitiveness. Second, international operations challenge a company's SC system. Third, the products traded in a SC system are not only physical goods but also digital goods (e.g., software, e-books, music, and video). Three main flows are involved in fulfilling SC activities: physical, information, and financial. Advances in the Internet and electronic communication technologies have enabled companies to perform these flows electronically, giving rise to the electronic supply chain management (e-SCM) system. A SC system for digital goods differs somewhat from that for physical goods, but it involves many similar or identical activities and flows; for example, as with physical goods, many third parties participate in producing digital goods, supplying components and even final products. This research aims to identify the process flows of both physical and digital goods in a SC system and then to investigate the risk elements involved in the physical, information, and financial flows during the fulfilment of SC activities. Many risks are inherent in an e-SCM system: some can severely impact a company's business, while others occur frequently but are not detrimental enough to jeopardize it. Companies should therefore assess the impact and frequency of these risks and prioritize them by severity, frequency, budget, and time. We grouped the risks involved in the global trading of physical and digital goods into four categories: environmental, strategic, technological, and operational. The significance of these risks was then investigated through a survey that asked companies about the frequency and severity of the identified risks and whether they had faced them in the past. Since the characteristics and supply chain flows of digital goods vary by industry and country, it is more meaningful and useful to analyze risks by industry and country; to this end, more data from each industry sector and country should be collected in future research.

Keywords: digital goods, e-SCM, risk analysis, supply chain flows

Procedia PDF Downloads 422
3468 Optimization the Conditions of Electrophoretic Deposition Fabrication of Graphene-Based Electrode to Consider Applications in Electro-Optical Sensors

Authors: Sepehr Lajevardi Esfahani, Shohre Rouhani, Zahra Ranjbar

Abstract:

Graphene has gained much attention owing to its unique optical and electrical properties. Charge carriers in graphene sheets (GS) obey a linear dispersion relation near the Fermi energy and behave as massless Dirac fermions, giving rise to unusual attributes such as the quantum Hall effect and the ambipolar electric field effect. Graphene also exhibits nondispersive transport with an extremely high electron mobility (15,000 cm²/(V·s)) at room temperature. Recently, considerable progress has been made in fabricating single- or multilayer GS for functional devices in optoelectronics, such as field-effect transistors, ultrasensitive sensors, and organic photovoltaic cells. Beyond device applications, graphene can also serve as a reinforcement that enhances the mechanical, thermal, or electrical properties of composite materials. Electrophoretic deposition (EPD) is an attractive method for producing various coatings and films; it is readily applied to any powdered solid that forms a stable suspension, and the deposition parameters can be controlled to obtain various thicknesses. In this study, the graphene electrodeposition conditions were optimized. Results were obtained from SEM, sheet resistance measurements, and AFM characterization. The minimum sheet resistance of the electrodeposited reduced graphene oxide layers was achieved at 2 V for 10 s, followed by annealing at 200 °C for 1 minute.
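For EPD at constant field, the deposited mass is commonly estimated with Hamaker's law, which relates the deposit to sticking efficiency, electrophoretic mobility, suspension concentration, field strength, electrode area, and time. A minimal sketch follows; the parameter values are illustrative assumptions, not figures from this study:

```python
# Hamaker's law for electrophoretic deposition (EPD):
#   m = f * mu * C * E * A * t
# f:  sticking efficiency (0..1)
# mu: electrophoretic mobility (m^2/(V*s))
# C:  suspension concentration (kg/m^3)
# E:  electric field (V/m)
# A:  electrode area (m^2)
# t:  deposition time (s)

def epd_deposited_mass(f, mu, C, E, A, t):
    """Deposited mass in kg (SI units throughout)."""
    return f * mu * C * E * A * t

# Illustrative values (NOT from the study): GO suspension at 1 g/L,
# mobility 2e-8 m^2/(V*s), 2 V across a 10 mm gap, 1 cm^2 electrode, 10 s.
m = epd_deposited_mass(f=1.0, mu=2e-8, C=1.0, E=2 / 0.01, A=1e-4, t=10)
print(f"deposited mass ~ {m * 1e9:.1f} micrograms")
```

The linear time dependence holds only for short depositions at constant field; over longer runs, concentration depletion and the resistance of the growing deposit reduce the rate.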

Keywords: electrophoretic deposition (EPD), graphene oxide (GO), electrical conductivity, electro-optical devices

Procedia PDF Downloads 190
3467 Analysis of the CO2 Emissions of Public Passenger Transport in Tianjin City of China

Authors: Tao Zhao, Xianshuo Xu

Abstract:

Low-carbon public passenger transport is an important part of a low-carbon city. The CO2 emissions of public passenger transport in Tianjin from 1995 to 2010 are estimated with the IPCC CO2 accounting method, which shows that the total CO2 emissions of Tianjin's public passenger transport have gradually stabilized at 1,425.1 thousand tons. The CO2 emissions of buses, taxis, and rail transit are then calculated separately; with 829.9 thousand tons, taxis are the largest CO2 emission source among Tianjin's public passenger transport modes. Combining these figures with passenger volume, this paper analyzes the CO2 emission shares of buses, taxis, and rail transit, compares each mode's share of passenger transport with its share of CO2 emissions, and examines CO2 emissions per 10,000 passengers. Buses carry 72.62% of the passenger volume of the three modes, much higher than their 36.01% share of CO2 emissions, and have the lowest CO2 emissions per 10,000 passengers, at 4.90 tons. The countermeasures proposed to reduce the CO2 emissions of public passenger transport in Tianjin are to develop rail transit, update vehicles, and use alternative-fuel vehicles.
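The fuel-based IPCC approach referenced above multiplies each fuel's consumption by its emission factor and sums across fuels and modes. A minimal sketch with hypothetical consumption figures; the emission factors shown are typical order-of-magnitude values, not the exact IPCC defaults used in the study:

```python
# IPCC fuel-based accounting: CO2 = sum_i (fuel consumption_i * emission factor_i)

# Hypothetical annual fuel use by mode (tonnes of fuel) and illustrative
# CO2 emission factors (tonnes CO2 per tonne of fuel) -- NOT the study's data.
fuel_use = {
    "bus_diesel": 120_000,
    "taxi_gasoline": 260_000,
    "rail_electricity_coal_eq": 90_000,
}
emission_factor = {
    "bus_diesel": 3.1,
    "taxi_gasoline": 2.9,
    "rail_electricity_coal_eq": 2.5,
}

total_co2 = sum(fuel_use[k] * emission_factor[k] for k in fuel_use)
share = {k: fuel_use[k] * emission_factor[k] / total_co2 for k in fuel_use}

print(f"total: {total_co2 / 1000:.1f} thousand tonnes CO2")
for mode, s in share.items():
    print(f"  {mode}: {100 * s:.1f}%")
```

Comparing each mode's emission share with its passenger share, as the paper does, then only requires dividing the two dictionaries entry by entry.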

Keywords: public passenger transport, carbon emissions, countermeasures, China

Procedia PDF Downloads 429
3466 A Study on Abnormal Behavior Detection in BYOD Environment

Authors: Dongwan Kang, Joohyung Oh, Chaetae Im

Abstract:

The advancement of communication technologies and smart devices is leading to integrated wired and wireless communication environments. Businesses began introducing mobile devices into their operations early on to improve productivity, and the closed corporate environment gradually shifted to an open structure. Recently, individual users' interest in working with their own mobile devices has increased, and a new corporate working environment based on the concept of BYOD is drawing attention. BYOD (bring your own device) means that individuals bring and use their own devices in business activities. Through BYOD, businesses can expect improved productivity as well as reduced device purchasing costs. However, because of security threats caused by the frequent loss and theft of personal devices and by corporate data leaks due to weak device security, companies remain reluctant to adopt BYOD. In addition, existing network-based security equipment, which does not account for the diversity of devices and connection environments, is limited in its ability to detect abnormal behaviors such as information leaks. This study proposes a method that detects abnormal behavior based on individual behavioral patterns, rather than the existing signature-based malicious behavior detection, and discusses its application in a BYOD environment.
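The per-user behavioral baseline described above can be sketched as a simple statistical profile: learn the mean and spread of each behavioral feature per user, then flag observations whose z-score exceeds a threshold. This is a toy illustration of the general idea, not the authors' actual detection model, and the feature names are invented:

```python
import statistics

def build_profile(history):
    """Per-feature (mean, stdev) learned from a user's past observations."""
    features = history[0].keys()
    return {f: (statistics.mean([h[f] for h in history]),
                statistics.stdev([h[f] for h in history]))
            for f in features}

def is_anomalous(profile, obs, threshold=3.0):
    """Flag the observation if any feature deviates > threshold sigmas."""
    for f, (mu, sigma) in profile.items():
        if sigma > 0 and abs(obs[f] - mu) / sigma > threshold:
            return True
    return False

# Toy history: a user who normally transfers ~50 MB around 9-10 am.
history = [{"mb_transferred": 48, "login_hour": 9},
           {"mb_transferred": 55, "login_hour": 10},
           {"mb_transferred": 50, "login_hour": 9},
           {"mb_transferred": 52, "login_hour": 10}]
profile = build_profile(history)

print(is_anomalous(profile, {"mb_transferred": 51, "login_hour": 9}))   # typical
print(is_anomalous(profile, {"mb_transferred": 900, "login_hour": 3}))  # leak-like
```

A production system would use richer features, per-device profiles, and a model robust to drifting baselines, but the contrast with signature matching is the same: the alert comes from deviation, not from a known pattern.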

Keywords: BYOD, security, anomaly behavior detection, security equipment, communication technologies

Procedia PDF Downloads 324
3465 Assessment of Hepatosteatosis Among Diabetic and Nondiabetic Patients Using Biochemical Parameters and Noninvasive Imaging Techniques

Authors: Tugba Sevinc Gamsiz, Emine Koroglu, Ozcan Keskin

Abstract:

Aim: Nonalcoholic fatty liver disease (NAFLD) is considered the most common chronic liver disease in the general population. The higher mortality and morbidity among NAFLD patients and the lack of symptoms make early detection and management important. In our study, we aimed to evaluate the relationship between noninvasive imaging and biochemical markers in diabetic and nondiabetic patients diagnosed with NAFLD. Materials and Methods: The study was conducted from September 2017 to December 2017 on adults admitted to the Internal Medicine and Gastroenterology outpatient clinics with hepatic steatosis reported on ultrasound or transient elastography within the previous six months; patients with other liver diseases or alcohol abuse were excluded. The data were collected and analyzed retrospectively. The Number Cruncher Statistical System (NCSS) 2007 program was used for statistical analysis. Results: 116 patients were included in this study. Diabetic patients had significantly higher Controlled Attenuation Parameter (CAP), Liver Stiffness Measurement (LSM), and fibrosis values than nondiabetics. Hypertension, hepatomegaly, high BMI, hypertriglyceridemia, hyperglycemia, high HbA1c, and hyperuricemia were also found to be risk factors for NAFLD progression to fibrosis. Advanced fibrosis (F3, F4) was present in 18.6% of all patients: 35.8% of diabetic and 5.7% of nondiabetic patients diagnosed with hepatic steatosis. Conclusion: Transient elastography is now used in daily clinical practice as an accurate noninvasive tool for the follow-up of patients with fatty liver. Early diagnosis of the stage of liver fibrosis improves the monitoring and management of patients, especially those meeting metabolic syndrome criteria.

Keywords: diabetes, elastography, fatty liver, fibrosis, metabolic syndrome

Procedia PDF Downloads 152
3464 A Radiomics Approach to Predict the Evolution of Prostate Imaging Reporting and Data System Score 3/5 Prostate Areas in Multiparametric Magnetic Resonance

Authors: Natascha C. D'Amico, Enzo Grossi, Giovanni Valbusa, Ala Malasevschi, Gianpiero Cardone, Sergio Papa

Abstract:

Purpose: To characterize, through a radiomic approach, the nature of areas classified as PI-RADS (Prostate Imaging Reporting and Data System) 3/5 in multiparametric prostate magnetic resonance imaging with T2-weighted (T2w), diffusion, and paramagnetic-contrast perfusion sequences. Methods and Materials: 24 patients undergoing multiparametric prostate MR and biopsy were admitted to this pilot study. The clinical outcome of the PI-RADS 3/5 areas was established by biopsy, which found 8 malignant tumours. The analysed images were acquired on a Philips Achieva 1.5T scanner with a CE T2-weighted sequence in the axial plane. Semi-automatic tumour segmentation was carried out on the MR images using the 3D Slicer image analysis software. 45 shape-based, intensity-based, and texture-based features were extracted and served as the input for preprocessing. An evolutionary algorithm (a TWIST system based on the KNN algorithm) was used to subdivide the dataset into training and testing sets and to select the features yielding the maximal amount of information. After this preprocessing, 20 input variables were selected, and different machine learning systems were used to develop a predictive model based on a training-testing crossover procedure. Results: The best machine learning system (a three-layer feed-forward neural network) obtained a global accuracy of 90% (80% sensitivity and 100% specificity) with an area under the ROC curve of 0.82. Conclusion: Machine learning systems coupled with radiomics show promising potential in distinguishing benign from malignant tumours in PI-RADS 3/5 areas.
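The TWIST preprocessing named above is built on a KNN classifier, whose core step is a majority vote among the nearest training vectors. The following toy sketch uses two invented features standing in for the 20 selected radiomic features; it illustrates the classification step only, not the authors' full pipeline:

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Majority vote among the k nearest training vectors (Euclidean distance)."""
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

# Toy radiomic feature vectors (e.g., normalized texture energy and entropy);
# values and labels are invented for illustration.
train = [(0.20, 0.10), (0.30, 0.20), (0.25, 0.15),
         (0.80, 0.90), (0.70, 0.80), (0.75, 0.85)]
labels = ["benign", "benign", "benign",
          "malignant", "malignant", "malignant"]

print(knn_predict(train, labels, (0.22, 0.12)))  # lies in the benign cluster
print(knn_predict(train, labels, (0.78, 0.88)))  # lies in the malignant cluster
```

In the study, an evolutionary search wraps this classifier to pick the train/test split and feature subset that maximize information; the sketch shows only the distance-and-vote kernel being optimized.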

Keywords: machine learning, MR prostate, PI-Rads 3, radiomics

Procedia PDF Downloads 188
3463 A Simulation-Based Study of Dust Ingression into Microphone of Indoor Consumer Electronic Devices

Authors: Zhichao Song, Swanand Vaidya

Abstract:

Nowadays, most portable (e.g., smartphones) and wearable (e.g., smartwatches and earphones) consumer hardware is designed to be dustproof to IP5X or IP6X ratings, to ensure the product can handle potentially dusty outdoor environments. By contrast, the design guidelines are relatively vague for indoor devices (e.g., smart displays and speakers). While the indoor environment is generally believed to be much less dusty, in certain circumstances dust ingression can still cause functional failures, such as microphone frequency response shifts and camera black spots, or cosmetic dissatisfaction, mainly dust build-up in visible pockets and gaps that is hard to clean. In this paper, we develop a simulation methodology to analyze dust settlement and ingression into the known ports of a device. A closed system is initialized with dust particles whose sizes follow a Weibull distribution fitted to data collected in a user study, and dust particle movement is approximated as settlement in a stationary fluid, governed by Stokes' law. Following this method, we simulated dust ingression into a MEMS microphone through the acoustic port and protective mesh. Various design and environmental parameters were evaluated, including mesh pore size, acoustic port depth-to-diameter ratio, mass density of the dust material, and inclination angle of the microphone port. The dependence of dust resistance on each of these parameters is monotonic: a smaller mesh pore size, a larger acoustic depth-to-opening ratio, and a more inclined microphone placement (towards the horizontal) improve dust resistance, but these preferences may entail trade-offs in audio performance and compromises in industrial design. The simulation results suggest the quantitative ranges of these parameters in which the improvement in dust resistance is most pronounced. Based on the simulation results, we propose several design guidelines intended to achieve an overall balanced design across audio performance, dust resistance, and flexibility in industrial design.
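The two ingredients of the simulation above, Weibull-distributed particle sizes and Stokes-law settling, can be sketched directly. The parameter values below (Weibull scale and shape, dust density) are illustrative assumptions, not those fitted in the study:

```python
import random

def stokes_settling_velocity(d, rho_p, rho_f=1.2, mu=1.8e-5, g=9.81):
    """Terminal settling velocity (m/s) of a sphere of diameter d (m) in a
    stationary fluid, per Stokes' law: v = (rho_p - rho_f) * g * d^2 / (18 * mu).
    Defaults: air density 1.2 kg/m^3, air viscosity 1.8e-5 Pa*s."""
    return (rho_p - rho_f) * g * d**2 / (18 * mu)

random.seed(0)
# Draw particle diameters (micrometres) from a Weibull distribution using
# the stdlib's weibullvariate(scale, shape); values here are illustrative.
diameters_um = [random.weibullvariate(10.0, 1.5) for _ in range(1000)]

# Settling velocity of each particle, assuming a dust density of 2000 kg/m^3.
velocities = [stokes_settling_velocity(d * 1e-6, rho_p=2000.0) for d in diameters_um]

mean_d = sum(diameters_um) / len(diameters_um)
v10 = stokes_settling_velocity(10e-6, rho_p=2000.0)
print(f"mean diameter: {mean_d:.1f} um; a 10 um particle settles at {v10 * 1000:.2f} mm/s")
```

The quadratic dependence on diameter is why coarse particles settle out quickly while fine dust stays airborne long enough to drift into ports; Stokes' law is valid only at low particle Reynolds numbers, which holds for the micron-scale sizes considered here.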

Keywords: dust settlement, numerical simulation, microphone design, Weibull distribution, Stokes' law

Procedia PDF Downloads 107
3462 Remote BioMonitoring of Mothers and Newborns for Temperature Surveillance Using a Smart Wearable Sensor: Techno-Feasibility Study and Clinical Trial in Southern India

Authors: Prem K. Mony, Bharadwaj Amrutur, Prashanth Thankachan, Swarnarekha Bhat, Suman Rao, Maryann Washington, Annamma Thomas, N. Sheela, Hiteshwar Rao, Sumi Antony

Abstract:

The disease burden among mothers and newborns is caused mostly by a handful of avoidable conditions occurring around the time of childbirth and within the first month after delivery. Real-time monitoring of the vital parameters of mothers and neonates offers a potential opportunity to improve both access to and quality of care in vulnerable populations. We describe the design, development, and testing of an innovative wearable device for remote biomonitoring (RBM) of body temperature in mothers and neonates in a hospital in southern India. The architecture consists of: [1] a low-cost wearable sensor tag; [2] a gateway device providing a real-time communication link; [3] piggy-backing on a commercial GSM communication network; and [4] an algorithm-based data analytics system. The requirements for the device were: a battery life of up to 28 days (at a sampling frequency of 5/hr); robustness; IP68 hermetic sealing; and human-centric design. We undertook pre-clinical laboratory testing followed by clinical trial phases I and IIa for evaluation of safety and efficacy, in the following sequence: seven healthy adult volunteers; 18 healthy mothers; and three sets of babies: 3 healthy babies, 10 stable babies in the Neonatal Intensive Care Unit (NICU), and 1 baby with hypoxic ischaemic encephalopathy (HIE). The pebble-shaped sensor, about the thickness of three stacked coins and weighing about 8 g, was secured onto the abdomen for babies and over the upper arm for adults. In the laboratory setting, the response time of the sensor device to attain thermal equilibrium with the surroundings was 4 minutes, versus 3 minutes for the precision-grade digital thermometer used as the reference standard. The accuracy was within ±0.1°C of the reference standard over the temperature range of 25-40°C. The adult volunteers, aged 20 to 45 years, contributed a total of 345 hours of readings over a 7-day period, and the postnatal mothers provided a total of 403 paired readings. The mean skin temperature measured by the sensor in adults was about 2°C lower than the axillary temperature reading (sensor = 34.1 vs digital = 36.1); this difference was statistically significant (t = 13.8; p < 0.001). The healthy neonates provided a total of 39 paired readings; the mean temperature difference was 0.13°C (sensor = 36.9 vs digital = 36.7; p = 0.2). The neonates in the NICU provided a total of 130 paired readings; their mean skin temperature measured by the sensor was 0.6°C lower than that measured by the radiant warmer probe (sensor = 35.9 vs warmer probe = 36.5; p < 0.001). The neonate with HIE provided a total of 25 paired readings, with the mean sensor reading not differing from the radiant warmer probe reading (sensor = 33.5 vs warmer probe = 33.5; p = 0.8). No major adverse events were noted in either adults or neonates; four adult volunteers reported mild sweating under the device/arm band, and one volunteer developed a mild skin allergy. This proof-of-concept study shows that real-time monitoring of temperature is technically feasible and that this innovation appears promising in terms of both safety and accuracy (with appropriate calibration) for improved maternal and neonatal health.
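The sensor-versus-reference comparisons reported above follow the standard paired t-test on the per-reading differences. A minimal sketch with made-up readings (not the study's data):

```python
import math
import statistics

def paired_t(sensor, reference):
    """Mean paired difference and t statistic: t = d_bar / (s_d / sqrt(n))."""
    diffs = [s - r for s, r in zip(sensor, reference)]
    d_bar = statistics.mean(diffs)
    s_d = statistics.stdev(diffs)  # sample standard deviation of the differences
    t = d_bar / (s_d / math.sqrt(len(diffs)))
    return d_bar, t

# Hypothetical paired skin-vs-axillary readings in degrees C (NOT study data),
# chosen to mimic the ~2 C offset reported for adults.
sensor    = [34.0, 34.2, 33.9, 34.1, 34.3, 34.0]
reference = [36.1, 36.0, 36.2, 36.1, 35.9, 36.0]

d_bar, t = paired_t(sensor, reference)
print(f"mean difference: {d_bar:.2f} C, t = {t:.1f}")
```

A systematic offset like this does not preclude clinical use: as the abstract notes, a stable skin-to-core difference can be removed by calibration, which is why the paired analysis (per-subject differences) is the right test here.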

Keywords: public health, remote biomonitoring, temperature surveillance, wearable sensors, mothers and newborns

Procedia PDF Downloads 208
3461 Convergence of Media in the New Era

Authors: Mohamad Reza Asariha

Abstract:

The development and expansion of modern communication technologies at an extraordinary speed has caused fundamental changes in all economic, social, cultural, and political spheres of the world. The growth of satellite and cable technologies, in addition to meeting the increasing production and distribution needs of worldwide programming, made the economic case for it more appealing. The shift in developed countries from an industrial economy to an information and service economy brought unprecedented developments in the norms of world trade; as a result, media organizations expanded internationally, economic investment grew in many Asian countries, and global demand for media goods created new markets. Media, both in the domestic arena of nations and in the international field, are of great economic importance and have an effective and influential presence in the equation of gaining, maintaining, and expanding economic power and wealth in the world. Moreover, technological advances and technological integration are critical factors in media structural change. This structural change took place under the influence of digitalization, the process that broke the boundaries between electronic media services. Until now, the regulation of mass media was entirely dependent on the particular modes of information transmission that were commonly used. Digitization made it possible for any content to be easily transmitted through different electronic transmission modes, and this media convergence has had clear effects on media policies and the way mass media are regulated.

Keywords: media, digital era, digital ages, media convergence

Procedia PDF Downloads 74
3460 Using Biopolymer Materials to Enhance Sandy Soil Behavior

Authors: Mohamed Ayeldeen, Abdelazim Negm

Abstract:

Nowadays, the strength characteristics of soils are of increasing importance due to growing building loads. In some projects, the geotechnical properties of soils are improved using man-made materials ranging from cement-based to chemical-based products. These materials have proven successful in improving engineering properties of the soil such as shear strength, compressibility, permeability, and bearing capacity. However, the use of these artificial injection formulas often modifies the pH level of the soil and contaminates the soil and groundwater, owing to their toxic and hazardous characteristics. Recently, an environmentally friendly soil treatment approach, the Biological Treatment Method (BTM), was developed to bond the particles of loose sandy soils. This paper presents preliminary results on the use of biopolymers for strengthening cohesionless soil. Xanthan gum was identified for further study over a range of concentrations from 0.25% to 2.00%. Xanthan gum is a polysaccharide secreted by the bacterium Xanthomonas campestris; it is used as a food additive and is nontoxic. A series of direct shear, unconfined compressive strength, and permeability tests were carried out to investigate the behavior of sandy soil treated with Xanthan gum at different concentration ratios and curing times. Laser microscopy imaging was also conducted to study the microstructure of the treated sand. The experimental results demonstrated the suitability of Xanthan gum for improving the geotechnical properties of sandy soil. Depending on its concentration, the biopolymer effectively increased the cohesion intercept and stiffness of the treated sand and reduced its permeability. The microscopy imaging indicates that the cross-links of the biopolymers through and over the soil particles increase with biopolymer concentration.
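The cohesion intercept reported above comes from fitting direct-shear results to the Mohr-Coulomb failure envelope, tau = c + sigma * tan(phi). A minimal least-squares sketch with hypothetical test points (not the paper's data):

```python
import math

def mohr_coulomb_fit(normal_stress, shear_strength):
    """Least-squares fit of tau = c + sigma * tan(phi).
    Returns (cohesion c, friction angle phi in degrees)."""
    n = len(normal_stress)
    mean_s = sum(normal_stress) / n
    mean_t = sum(shear_strength) / n
    slope = (sum((s - mean_s) * (t - mean_t)
                 for s, t in zip(normal_stress, shear_strength))
             / sum((s - mean_s) ** 2 for s in normal_stress))
    c = mean_t - slope * mean_s
    return c, math.degrees(math.atan(slope))

# Hypothetical direct-shear results for a treated sand (stresses in kPa);
# values invented for illustration.
sigma = [50.0, 100.0, 200.0]
tau = [55.0, 85.0, 145.0]

c, phi = mohr_coulomb_fit(sigma, tau)
print(f"cohesion ~ {c:.0f} kPa, friction angle ~ {phi:.0f} deg")
```

For an untreated clean sand the fitted c would be near zero; a biopolymer treatment shifts the envelope upward (larger c) while the friction angle reflects the grain-to-grain contact behavior.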

Keywords: biopolymer, direct shear, permeability, sand, shear strength, Xanthan gum

Procedia PDF Downloads 277
3459 Modeling of an Insulin Micropump

Authors: Ahmed Slami, Med El Amine Brixi Nigassa, Nassima Labdelli, Sofiane Soulimane, Arnaud Pothier

Abstract:

Many people suffer from diabetes, a disease marked by abnormal levels of sugar in the blood; 285 million people, 6.6% of the world's adult population, had diabetes in 2010, according to the International Diabetes Federation. Insulin is a medication designed to be injected into the body, and the injection generally requires the patient to perform it manually. In many cases, however, the patient is unable to inject the drug, given that weakness of the whole body is among the side effects of hyperglycemia. Researchers have therefore designed medical devices that inject insulin autonomously using micropumps. Many micropump concepts have been investigated over the last two decades for injecting molecules into the blood or the body. However, all of these micropumps are intended for slow drug infusion (a few microliters per minute). The challenge now is to develop micropumps for fast injections (1 microliter in 10 seconds) with an accuracy on the order of a microliter. Recent studies have shown that only piezoelectric actuators can achieve this performance, and few systems at the microscopic scale have been presented. These reasons lead us to design new smart drug-injection microsystems. Many technological advances are still required, from improving the materials to characterizing and modeling their actuation mechanisms, and the integration of the piezoelectric micropump into the microfluidic platform remains to be studied in order to explore and evaluate the performance of these new microdevices. In this work, we propose a new micropump model based on piezoelectric actuation with a new design, using a finite element model built in the COMSOL software. The device is composed of two pumping chambers, two diaphragms, and two actuators (piezoelectric disks); the actuators apply a periodic mechanical force on the membranes, and the membrane deformation pumps the fluid, producing the suction and discharge of the liquid. We present the modeling results as a function of the device geometry, film thicknesses, and material properties, and demonstrate that fast injection can be achieved. The results of these simulations provide quantitative performance figures for the micropump, concerning actuation displacement and flow rate, and allow optimization of the fabrication process in terms of materials and integration steps.
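The fast-injection target above (1 microliter in 10 s) ties the required actuation frequency directly to the membrane stroke volume: for a dual-chamber pump, each cycle delivers twice the per-chamber stroke volume. A back-of-envelope sketch, with the stroke volume assumed purely for illustration (it is not stated in the abstract):

```python
def required_frequency(target_volume_ul, time_s, stroke_volume_nl, n_chambers=2):
    """Actuation frequency (Hz) needed to deliver target_volume_ul in time_s,
    assuming each chamber displaces stroke_volume_nl per cycle."""
    flow_nl_per_s = target_volume_ul * 1000 / time_s        # nL/s
    return flow_nl_per_s / (n_chambers * stroke_volume_nl)  # cycles per second

# Target from the abstract: 1 microliter in 10 s.
# Assumed (illustrative) stroke volume: 5 nL per chamber per cycle.
f = required_frequency(target_volume_ul=1.0, time_s=10.0, stroke_volume_nl=5.0)
print(f"required actuation frequency ~ {f:.0f} Hz")
```

This kind of first-pass arithmetic bounds the design space before the finite element model is run: a larger membrane deflection (bigger stroke) relaxes the frequency requirement, while a stiffer, thinner film pushes it up.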

Keywords: COMSOL software, piezoelectric, micro-pump, microfluidic

Procedia PDF Downloads 342
3458 Design and Control of a Brake-by-Wire System Using a Permanent Magnet Synchronous Motor

Authors: Daniel S. Gamba, Marc Sánchez, Javier Pérez, Juan J. Castillo, Juan A. Cabrera

Abstract:

The conventional hydraulic braking system operates through the activation of a master cylinder and solenoid valves that distribute and regulate brake fluid flow, adjusting the pressure at each wheel to prevent locking during sudden braking. In recent years, however, there has been a significant increase in the integration of electronic units into vehicle control systems. In this context, one of the most recently researched technologies is the brake-by-wire system, which combines electronic, hydraulic, and mechanical technologies to manage braking. This paper presents the design and control of a brake-by-wire system that will be part of a fully electric, teleoperated vehicle with independent four-wheel drive, braking, and steering systems. The vehicle will be operated by embedded controllers programmed on a Speedgoat test system, which supports Simulink programming and real-time execution. The braking system comprises the mechanical and electrical components, a vehicle control unit (VCU), and an electronic control unit (ECU). The mechanical and electrical components include a permanent magnet synchronous motor from ODrive and its inverter, the mechanical transmission system responsible for converting torque into pressure, and the hydraulic system that transmits this pressure to the brake caliper. The VCU controls the pressure and communicates with the other components through the CAN protocol, minimizing response times. The ECU, in turn, transmits the readings of a sensor installed in the caliper to the central computer, enabling the control loop to continuously regulate pressure by controlling the motor's speed and current. To achieve this, three controllers are used, operating in a nested configuration for effective control. Since the computer allows programming in Simulink, a digital model of the braking system has been developed in Simscape, which makes it possible to reproduce different operating conditions, faithfully simulate the performance of alternative brake control systems, and compare the results with data obtained in various real tests. These tests involve evaluating the system's response to sinusoidal and square-wave inputs at different frequencies, with the results compared to those obtained from conventional braking systems.
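The three nested controllers described above form a classic cascade: an outer pressure loop commands a speed loop, which commands an inner current loop. A minimal discrete-time sketch, with toy first-order plants and hand-picked gains (not the real system's dynamics or tuning):

```python
class PID:
    """Textbook discrete PID controller with a fixed timestep."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

dt = 1e-3
pressure_loop = PID(2.0, 2.0, 0.0, dt)  # outer: pressure error -> speed demand
speed_loop = PID(1.5, 1.0, 0.0, dt)     # middle: speed error -> current demand
current_loop = PID(3.0, 1.0, 0.0, dt)   # inner: current error -> motor voltage

# Toy first-order plant states (all values in arbitrary consistent units).
pressure = speed = current = 0.0
for _ in range(10_000):  # simulate 10 s
    speed_cmd = pressure_loop.update(30.0, pressure)  # setpoint: 30 bar
    current_cmd = speed_loop.update(speed_cmd, speed)
    voltage = current_loop.update(current_cmd, current)
    # Crude first-order responses with illustrative time constants:
    # current settles fastest, pressure slowest, as in a real drivetrain.
    current += (voltage - current) * dt / 0.01
    speed += (current - speed) * dt / 0.05
    pressure += (speed - pressure) * dt / 0.1

print(f"pressure after 10 s: {pressure:.1f} bar")
```

The point of the nesting is bandwidth separation: the inner current loop is tuned fastest, the speed loop slower, and the pressure loop slowest, so each outer loop can treat the loop beneath it as an approximately ideal actuator.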

Keywords: braking, CAN protocol, permanent magnet motor, pressure control

Procedia PDF Downloads 19
3457 FEDBD Plasma: A Promising Approach for Skin Rejuvenation

Authors: P. Charipoor, M. Khani, H. Mahmoudi, E. Ghasemi, P. Akbartehrani, B. Shokri

Abstract:

Cold air plasma can have a variety of effects on cells and living organisms and has shown good results in medical and cosmetic applications. Herein, a floating electrode dielectric barrier discharge (FEDBD) plasma device was designed for skin rejuvenation purposes. It is safe and easy to use in clinics, laboratories, and homes. The effects of this device were investigated on rat skin, with vitamin C ointment combined with plasma as a new method to improve the FEDBD results. In this study, 20 Wistar rats were evaluated in four groups: the first group received high-dose plasma, the second group received moderate-dose plasma (with vitamin C cream), the third group received low-dose plasma (with vitamin C cream) for 6 minutes, and the fourth group received only vitamin C cream. This treatment was applied 3 times a week for 4 weeks. Skin temperature was monitored to evaluate the thermal effect of the plasma, and the presence of reactive species was demonstrated using optical spectroscopy. Mechanical assays were performed to evaluate the effect of plasma and vitamin C on the mechanical strength of the tissue, and showed a positive effect of plasma on the treated tissue compared to the control group. Pathological and skin biometry tests showed increases in collagen levels, epidermal thickness, fibroblast count, and skin elasticity in the rat skin. This study demonstrated the positive effect of the FEDBD plasma device on the parameters relevant to skin rejuvenation.

Keywords: plasma, skin rejuvenation, collagen, epidermal thickness

Procedia PDF Downloads 258
3456 Security Practices of the European Union on Migration: An Analysis of Frontex within the Framework of Biopolitics

Authors: Gizem Ertürk, Nursena Dinç

Abstract:

The Aegean has always been an important transit point for migration; however, the establishment of the European Union gave further impetus to the migration phenomenon and increased the significance of the area in this context. Migration waves have become more visible in the area in recent decades, and particularly after the 2015 migration crisis, the issue has been subject to further securitization in the EU. In this conjuncture, Frontex, an agency set up by the EU in 2005 to manage and coordinate border control efforts, has become more active in the area, but at the same time has taken some actions that are questionable in the context of human rights. This paper problematizes the rationality behind the existence and practices of such a structure and offers a political and legal analysis of the European Union's security practices against migration within a framework based on the biopolitics approaches of Michel Foucault and Giorgio Agamben. Taking the agency as a case, the paper draws its data from the existing literature on the EU's security policies, the relevant official texts of the Union, and Frontex reports on migration practices in and around the Aegean Sea.

Keywords: migration, biopolitics, Frontex, security, European Union, securitization

Procedia PDF Downloads 138
3455 Evaluation of the Appropriateness of Common Oxidants for Ruthenium (II) Chemiluminescence in a Microfluidic Detection Device Coupled to Microbore High Performance Liquid Chromatography for the Analysis of Drugs in Formulations and Biological Fluids

Authors: Afsal Mohammed Kadavilpparampu, Haider A. J. Al Lawati, Fakhr Eldin O. Suliman, Salma M. Z. Al Kindy

Abstract:

In this work, we evaluated the appropriateness of various oxidants that could potentially be used with the Ru(bipy)3²⁺ CL system when performing CL detection in a microfluidic device, using eight common active pharmaceutical ingredients: ciprofloxacin, hydrochlorothiazide, norfloxacin, buspirone, fexofenadine, cetirizine, codeine, and dextromethorphan. Because microfluidic channels have very small volumes and residence times are very short, a highly efficient oxidant is required for on-chip CL detection to obtain analytically acceptable CL emission. Three common oxidants were evaluated: lead dioxide, cerium ammonium sulphate, and ammonium peroxydisulphate. The results showed that ammonium peroxydisulphate is the most appropriate oxidant for a microfluidic setup, with all the tested analytes giving strong CL emission. We also found that Ru(bipy)3³⁺ generated off-line by oxidizing [Ru(bipy)3]Cl2.6H2O in acetonitrile under acidic conditions with lead dioxide was stable for more than 72 h. A highly sensitive microbore HPLC-CL method using ammonium peroxydisulphate as the oxidant in microfluidic on-chip CL detection has been developed for the analysis of fixed-dose combinations of pseudoephedrine (PSE), fexofenadine (FEX), and cetirizine (CIT) in biological fluids and pharmaceutical formulations with minimal sample pretreatment.

Keywords: oxidants, microbore High Performance Liquid Chromatography, chemiluminescence, microfluidics

Procedia PDF Downloads 449
3454 The Role of Libraries in the Context of Indian Knowledge Based Society

Authors: Sanjeev Sharma

Abstract:

We are living in the information age. Information is important not only to individuals but also to researchers, scientists, academicians and all others working in their respective fields. The 21st century, also known as the electronic era, has brought several changes to the mechanisms of libraries and their working environment. In the present scenario, the acquisition of information resources and the implementation of new strategies have brought a revolution in libraries' structures and principles. In the digital era, the role of the library has become more important, as new information arrives every minute. The knowledge society wants to seek information at its desk. Libraries are managing electronic services and web-based information sources constantly in a democratic way. The basic objective of every library is to save the user's time, which depends on the quality and user-orientation of its services. With the advancement of information and communication technology, libraries should pay more attention to the development trends of the information society, which would help them adjust their development strategies to the information needs of the knowledge society. The knowledge-based society demands a re-definition of the position and objectives of all institutions that work with information, knowledge, and culture. The situation in the era of Digital India is changing at a fast pace. Everyone wants information 24x7, and libraries have been recognized as one of the key elements of open access to information, which is crucial not only to individuals but also to a democratic, knowledge-based information society. Libraries are especially important nowadays, as the whole concept of education focuses more and more on independent e-learning. The citizens of India must be able to find and use the relevant information.
Here libraries enter the stage: the essential function of libraries is to acquire, organize, store, retrieve for use, and preserve publicly available material, in print as well as non-print form, in such a way that, when it is needed, it can be found and put to use.

Keywords: knowledge, society, libraries, culture

Procedia PDF Downloads 140
3453 Unlocking Health Insights: Studying Data for Better Care

Authors: Valentina Marutyan

Abstract:

Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of and approach to providing healthcare. Healthcare data mining is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. The field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories, using advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. It enables doctors to intervene early to prevent problems or improve outcomes, and it assists in early disease detection and customized treatment planning for each person. Doctors can tailor a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Beyond helping patients, it improves the efficiency of hospitals, for example by helping them determine the number of beds or doctors they require for the number of patients they expect. This project used models such as logistic regression, random forests, and neural networks for predicting diseases and analyzing medical images. Clustering algorithms such as k-means grouped patients, and association rule mining identified connections between treatments and patient responses. Time series techniques supported resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment.
Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. In short, data mining in healthcare helps medical professionals and hospitals make better decisions and treat patients more efficiently. It ultimately comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
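As a concrete illustration of the kind of predictive model the abstract mentions, the sketch below fits a logistic regression by gradient descent on synthetic data. It is not the project's actual model; the two features (standing in for, say, age and blood pressure) and all numbers are invented for demonstration.

```python
import numpy as np

# Toy illustration: predict a binary disease label from two
# hypothetical standardized features with logistic regression.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))
# Synthetic ground truth: risk rises with both features, plus noise.
y = (X @ np.array([1.5, 1.0]) + rng.normal(scale=0.5, size=n) > 0).astype(float)

# Add an intercept column.
Xb = np.hstack([np.ones((n, 1)), X])
w = np.zeros(3)

# Batch gradient descent on the logistic loss.
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w -= 0.1 * Xb.T @ (p - y) / n

pred = (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(float)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In practice, such a model would be trained on de-identified EHR features and validated on held-out patients before informing any clinical decision.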

Keywords: data mining, healthcare, big data, large amounts of data

Procedia PDF Downloads 76
3452 How Validated Nursing Workload and Patient Acuity Data Can Promote Sustained Change and Improvements within District Health Boards: The New Zealand Experience

Authors: Rebecca Oakes

Abstract:

In the New Zealand public health system, work has been taking place to use electronic systems to convey data from the ‘floor to the board’ that makes patient needs, and therefore nursing work, visible. For nurses, these developments in health information technology put us in a very new and exciting position of being able to articulate the work of nursing through a language understood at all levels of an organisation: the language of acuity. Nurses increasingly have a considerable stake in patient acuity data. Patient acuity systems, when used well, can greatly assist in demonstrating how much work is required, the type of work, and when it will be required. The New Zealand Safe Staffing Unit is supporting New Zealand nurses to create a culture of shared governance, where nursing data informs policies, staffing methodologies and forecasting within their organisations. Assisting organisations to understand their acuity data, strengthening user confidence in electronic patient acuity systems, and ensuring nursing and midwifery workload is accurately reflected are critical to the success of the safe staffing programme. Nurses and midwives have the capacity, via an acuity tool, to become key informers of organisational planning. Quality patient care, best use of health resources and a quality work environment are essential components of a safe, resilient and well-resourced organisation, and nurses are the key informers of this information. In New Zealand, a national-level approach is paving the way for significant changes to the understanding and use of patient acuity and nursing workload information.

Keywords: nursing workload, patient acuity, safe staffing, New Zealand

Procedia PDF Downloads 382
3451 A Comprehensive Planning Model for Amalgamation of Intensification and Green Infrastructure

Authors: Sara Saboonian, Pierre Filion

Abstract:

The dispersed-suburban model has been the dominant one across North America for the past seventy years, characterized by automobile reliance, low density, and land-use specialization. Two planning models have emerged as possible alternatives to address the ills inflicted by this development pattern. First is intensification, which promotes efficient infrastructure by connecting high-density, multi-functional, and walkable nodes with public transit services within the suburban landscape. Second is green infrastructure, which supports environmental health and human well-being by preserving and restoring ecosystem services. This research studies the incompatibilities between, and the possibility of amalgamating, the two alternatives in an attempt to develop a comprehensive alternative to the suburban model: one that advocates density, multi-functionality and transit- and pedestrian-conduciveness, with measures capable of mitigating the adverse environmental impacts of compactness. The research investigates three Canadian urban growth centres where intensification is the current planning practice and awareness of green infrastructure benefits is on the rise. The three centres are contrasted by their development stage, the presence or absence of protected natural land, their environmental approach, and their adverse environmental consequences according to the planning canons of different periods. The methods include reviewing the literature on green infrastructure planning, critiquing the Ontario provincial plans for intensification, surveying residents' preferences for alternative models, and interviewing officials who deal with local planning for the centres. Moreover, the research draws on the debates between New Urbanism and Landscape/Ecological Urbanism. The case studies expose the difficulties in creating urban growth centres that accommodate green infrastructure while adhering to intensification principles.
First, the dominant status of intensification and the obstacles confronting it have monopolized planners' concerns. Second, the tension between green infrastructure and intensification explains the absence of green infrastructure typologies that correspond to intensification-compatible forms and dynamics. Finally, the lack of highlighted socio-economic benefits of green infrastructure reduces residents' participation. The results also provide insight into the predominating urbanization theories, New Urbanism and Landscape/Ecological Urbanism. Understanding the political, planning, and ecological dynamics of such a blending requires dexterous, context-specific planning. Findings suggest the influence of the following factors on amalgamating intensification and green infrastructure. First, producing ecosystem-services-based justifications for green infrastructure development in the intensification context provides an expert-driven backbone for implementation programs; this knowledge base should be communicated effectively to the different urban stakeholders. Moreover, given the limited greenfields in intensified areas, the spatial distribution and development of multi-level corridors, such as pedestrian-hospitable settings and transportation networks, alongside green infrastructure measures are required. Finally, to ensure the long-term integrity of implemented green infrastructure measures, significant investment in public engagement and education, as well as clarification of management responsibilities, is essential.

Keywords: ecosystem services, green infrastructure, intensification, planning

Procedia PDF Downloads 355
3450 Mini-Open Repair Using Ring Forceps Show Similar Results to Repair Using Achillon Device in Acute Achilles Tendon Rupture

Authors: Chul Hyun Park

Abstract:

Background: Repair using the Achillon device is a representative mini-open repair technique; however, the limitations of this technique include the need for special instruments and decreased repair strength. A modified mini-open repair using ring forceps might overcome these limitations. Purpose: This study was performed to compare the Achillon device with ring forceps in mini-open repairs of acute Achilles tendon rupture. Study Design: This was a retrospective cohort study, and the level of evidence was 3. Methods: Fifty patients (41 men and 9 women), with acute Achilles tendon rupture on one foot, were consecutively treated using mini-open repair techniques. The first 20 patients were treated using the Achillon device (Achillon group) and the subsequent 30 patients were treated using ring forceps (Forceps group). Clinical, functional, and isokinetic results, and postoperative complications, were compared between the two groups at the last follow-up. Clinical evaluations were performed using the American Orthopedic Foot and Ankle Society (AOFAS) score, Achilles tendon Total Rupture Score (ATRS), length of incision, and operation time. Functional evaluations included active range of motion (ROM) of the ankle joint, maximum calf circumference (MCC), hopping test, and single limb heel-rise (SLHR) test. Isokinetic evaluations were performed using the isokinetic test for ankle plantar flexion. Results: The AOFAS score (p=0.669), ATRS (p=0.753), and length of incision (p=0.305) were not significantly different between the groups. Operative times in the Achillon group were significantly shorter than those in the Forceps group (p<0.001). The maximum height of SLHR (p=0.023) and number of SLHRs (p=0.045) in the Forceps group were significantly greater than those in the Achillon group. No significant differences in the mean peak torques for plantar flexion at angular speeds of 30°/s (p=0.219) and 120°/s (p=0.656) were detected between the groups.
There was no significant difference in the occurrence of postoperative complications between the groups (p=0.093). Conclusion: The ring forceps technique is comparable with the Achillon technique with respect to clinical, functional, and isokinetic results and postoperative complications. Given that no special instrument is required, the ring forceps technique could be a better option for acute Achilles tendon rupture repair.

Keywords: achilles tendon, acute rupture, repair, mini-open

Procedia PDF Downloads 81
3449 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients

Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho

Abstract:

Multiple sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of the information it provides, is the gold-standard exam for diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, far beyond the limits of normal aging. Thus, brain volume quantification becomes an essential task for analysis of the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to manually extract important information. This manual analysis is prone to errors and is time-consuming due to various intra- and inter-operator variabilities. Nowadays, computerized methods for MRI segmentation are extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. The purpose of this work was therefore to evaluate the brain volume in MRI of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and quantification of brain volume. The first image processing step was brain extraction by skull stripping from the original image. In the skull stripper for brain MRI, the algorithm registers a grayscale atlas image to the grayscale patient image. The associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level sets (edge of the brain-skull border with dedicated expansion, curvature, and advection terms).
In the second step, brain volume was quantified by counting the voxels belonging to the segmentation mask and converting the count to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for brain extraction and brain volume quantification in MRI. The development and use of computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
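The voxel-counting step described above can be sketched in a few lines: the volume is the number of mask voxels times the voxel volume, divided by 1000 to convert mm³ to cc. The mask and voxel spacing below are illustrative placeholders, not the study's data.

```python
import numpy as np

# Illustrative binary "brain mask" standing in for the segmentation
# produced by skull stripping (random data, not a real scan).
rng = np.random.default_rng(42)
mask = rng.random((30, 256, 256)) > 0.8  # 30 slices, as in the study

# Assumed voxel spacing in mm (slice thickness x in-plane resolution).
voxel_dims_mm = (5.0, 0.9, 0.9)

voxel_volume_mm3 = np.prod(voxel_dims_mm)
volume_cc = mask.sum() * voxel_volume_mm3 / 1000.0  # 1 cc = 1000 mm^3
print(f"estimated volume: {volume_cc:.1f} cc")
```

With a real segmentation mask and the scan's true voxel spacing, this computation yields the per-patient brain volumes that are averaged in the study.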

Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper

Procedia PDF Downloads 146
3448 Study of the Complexes (CO)3Ti(PHND) and CpV(PHND) (PHND = Phenanthridine)

Authors: Akila Tayeb-Benmachiche, Saber-Mustapha Zendaoui, Salah-Eddine Bouaoud, Bachir Zouchoune

Abstract:

The variation of the metal coordination site in π-coordinated polycyclic aromatic hydrocarbons (PAHs) corresponds to haptotropic rearrangement or haptotropic migration, in which the metal fragment MLn is considered the moveable moiety shifted between two rings of polycyclic or heteropolycyclic ligands. These structural characteristics and dynamical properties give this category of transition metal complexes considerable interest. We have investigated the coordination and the haptotropic shifts of the (CO)3Ti and CpV moieties over the phenanthridine aromatic system, according to the nature of the metal atom. The optimization of (CO)3Ti(PHND) and CpV(PHND), using the Amsterdam Density Functional (ADF) program without symmetry restriction, gives an η6 coordination mode of the C6 and C5N rings, which in turn gives rise to six low-lying electron-deficient 16-MVE structures for each of (CO)3Ti(PHND) and CpV(PHND) (three singlet and three triplet state structures for the Ti complexes, and three triplet and three quintet state structures for the V complexes). The η6-η6 haptotropic migration of the metal fragment MLn from the terminal C6 ring to the central C5N ring is achieved with a loss of energy, whereas its η6-η6 haptotropic migration from the central C5N ring to the terminal C6 rings is accomplished with a gain of energy. These results show the capability of the phenanthridine ligand to adapt itself to the electronic demand of the metal, in agreement with the nature of the metal-ligand bonding, and demonstrate that this theoretical approach can also be applied to large fused π-systems.

Keywords: electronic structure, bonding analysis, density functional theory, coordination chemistry, haptotropic migration

Procedia PDF Downloads 301
3447 Subsurface Exploration for Soil Geotechnical Properties and its Implications for Infrastructure Design and Construction in Victoria Island, Lagos, Nigeria

Authors: Sunday Oladele, Joseph Oluwagbeja Simeon

Abstract:

Subsurface exploration, integrating geotechnical and geophysical methods, of a planned construction site in the coastal city of Lagos, Nigeria has been carried out with the aim of characterizing the soil properties and their implications for the proposed infrastructural development. Six Standard Penetration Tests (SPT), fourteen Dutch Cone Penetrometer Tests (DCPT) and 2D electrical resistivity imaging employing dipole-dipole and pole-dipole arrays were implemented on the site. The topsoil (0-4 m) consists of highly compacted sandy lateritic clay (10 to 5595 Ωm) to 1.25 m in some parts and dense sand in other parts to 5.50 m depth. This topsoil was characterized as a material of very high shear strength (≤ 150 kg/m2) with an allowable bearing pressure of 54 kN/m2 to 85 kN/m2 at a safety factor of 2.5. Soft amorphous peat/peaty clay (0.1 to 11.4 Ωm), 3-6 m thick, underlies the lateritic clay to about 18 m depth. Grey, medium dense to very dense sand (0.37 to 2387 Ωm) with occasional gravels underlies the peaty clay down to 30 m depth. Within this layer, the freshwater-bearing zones are characterized by a high resistivity response (83 to 2387 Ωm), while the clayey sand/saline-water-intruded sand produced a subdued resistivity output (0.37 to 40 Ωm). The overall ground-bearing pressure for the proposed structure would be 225 kN/m2. A bored/cast-in-place pile at 18.00 m depth with any of these diameters and respective safe working loads, 600 mm/1,140 kN, 800 mm/2,010 kN and 1000 mm/3,150 kN, is recommended for the proposed multi-story structure.
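As a worked illustration of the safety factor cited above: allowable bearing pressure is the ultimate bearing capacity divided by the factor of safety. The ultimate capacities below are back-calculated assumptions chosen to reproduce the reported 54-85 kN/m² range, not values taken from the site investigation.

```python
# Illustrative only: relate ultimate bearing capacity to allowable
# bearing pressure via the factor of safety used in the study (2.5).
factor_of_safety = 2.5

# Hypothetical ultimate capacities (kN/m^2) that would yield the
# reported allowable range of 54-85 kN/m^2.
for q_ult in (135.0, 212.5):
    q_allow = q_ult / factor_of_safety
    print(f"q_ult = {q_ult} kN/m^2 -> q_allow = {q_allow} kN/m^2")
```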

Keywords: subsurface exploration, geotechnical properties, resistivity imaging, pile

Procedia PDF Downloads 93
3446 Towards Printed Green Time-Temperature Indicator

Authors: Mariia Zhuldybina, Ahmed Moulay, Mirko Torres, Mike Rozel, Ngoc-Duc Trinh, Chloé Bois

Abstract:

To reduce the global waste of perishable goods, a solution for monitoring and tracing their environmental conditions is needed. Temperature is the most controllable environmental parameter determining the kinetics of physical, chemical, and microbial spoilage in food products. To store time-temperature information, a time-temperature indicator (TTI) is a promising solution. Printed electronics (PE) has shown great potential to produce customized electronic devices using flexible substrates and inks with different functionalities. We propose to fabricate a hybrid printed TTI using environmentally friendly materials. The real-time TTI profile can be stored and transmitted to a smartphone via Near Field Communication (NFC). To ensure environmental performance, the Canadian Green Electronics NSERC Network is developing green materials for the formulation of inks with different functionalities. In terms of substrate, paper-based electronics has gained great interest for use in a wide range of electronic systems because of its low setup costs and eco-friendly fabrication technologies. The main objective is to deliver a prototype of the TTI using small-scale printing techniques under typical printing conditions. All sub-components of the smart label, including a memristor, a battery, an antenna compatible with the NFC protocol, and a circuit compatible with integration performed by an offsite supplier, will be fully printed with flexography or flat-bed screen printing.

Keywords: NFC, printed electronics, time-temperature indicator, hybrid electronics

Procedia PDF Downloads 163
3445 Right Cerebellar Stroke with a Right Vertebral Artery Occlusion Following an Embolization of the Right Glomus Tympanicum Tumor

Authors: Naim Izet Kajtazi

Abstract:

Context: Although rare, glomus tumors (i.e., nonchromaffin chemodectomas and paragangliomas) are the most common middle ear tumors, with a female predominance. Pre-operative embolization is often required to devascularize the hypervascular tumor for better surgical outcomes. Process: A 35-year-old female presented with episodes of frequent dizziness, ear fullness, and right ear tinnitus for 12 months. Head imaging revealed a right glomus tympanicum tumor. She underwent pre-operative endovascular embolization of the tumor with surgical cyanoacrylate-based glue. Immediately after the procedure, she developed drowsiness and severe pain in the right temporal region. Further investigations revealed a right cerebellar stroke in the posterior inferior cerebellar artery territory. She was treated with intravenous heparin, followed by one year of oral anticoagulation. With rehabilitation, she significantly recovered from her post-embolization stroke. The tumor was later resected at another institution. Ten years later, follow-up imaging indicated a gradual increase in the size of the glomus jugulare tumor, compressing nearby critical vascular structures, and she subsequently received radiation therapy for the residual tumor. Outcome: Currently, she has no neurological deficit, but mild dizziness, right ear tinnitus, and hearing impairment persist. Relevance: This case highlights the complex nature of these tumors, which often pose challenges to patients as well as treatment teams. A multi-disciplinary team approach is necessary to tailor the management plan to the individual tumor. Although embolization is a safe procedure, careful attention and thorough anatomical knowledge of dangerous anastomoses are essential to avoid devastating complications. Complications occur due to encountered vessel anomalies, new anastomoses formed during gluing, and changes in hemodynamics.

Keywords: stroke, embolization, MRI brain, cerebral angiogram

Procedia PDF Downloads 71
3444 Comparison of Support Vector Machines and Artificial Neural Network Classifiers in Characterizing Threatened Tree Species Using Eight Bands of WorldView-2 Imagery in Dukuduku Landscape, South Africa

Authors: Galal Omer, Onisimo Mutanga, Elfatih M. Abdel-Rahman, Elhadi Adam

Abstract:

Threatened tree species (TTS) play a significant role in ecosystem functioning and services, land use dynamics, and other socio-economic aspects. Such aspects include ecological, economic, livelihood, security-based, and well-being benefits. The development of techniques for mapping and monitoring TTS is thus critical for understanding the functioning of ecosystems. The advent of advanced imaging systems and supervised learning algorithms has provided an opportunity to classify TTS over a fragmenting landscape. Recently, vegetation maps have been produced using advanced imaging systems such as WorldView-2 (WV-2) and robust classification algorithms such as support vector machines (SVM) and artificial neural networks (ANN). However, the delineation of TTS in a fragmenting landscape using high-resolution imagery has remained elusive due to the complexity of the species structure and distribution. The objective of the current study was therefore to examine the utility of advanced WV-2 data for mapping TTS in the fragmenting Dukuduku indigenous forest of South Africa using the SVM and ANN classification algorithms. The results showed the robustness of the two machine learning algorithms, with an overall accuracy (OA) of 77.00% (total disagreement = 23.00%) for SVM and 75.00% (total disagreement = 25.00%) for ANN using all eight bands of WV-2 (8B). This study concludes that the SVM and ANN classification algorithms with WV-2 8B have the potential to classify TTS in the Dukuduku indigenous forest, offering relatively accurate information that is important for forest managers making informed decisions regarding management and conservation protocols for TTS.
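Overall accuracy and total disagreement, as reported above, are complementary quantities read off a confusion matrix: OA is the trace over the total, and disagreement is its complement. The sketch below uses toy labels, not the study's WV-2 classifications.

```python
import numpy as np

# Toy reference and predicted class labels for a handful of pixels
# (illustrative; three arbitrary land-cover classes).
reference = np.array([0, 0, 1, 1, 2, 2, 2, 1, 0, 2])
predicted = np.array([0, 1, 1, 1, 2, 2, 0, 1, 0, 2])

n_classes = 3
confusion = np.zeros((n_classes, n_classes), dtype=int)
for r, p in zip(reference, predicted):
    confusion[r, p] += 1  # rows: reference, columns: prediction

overall_accuracy = np.trace(confusion) / confusion.sum()
total_disagreement = 1.0 - overall_accuracy
print(f"OA = {overall_accuracy:.2%}, disagreement = {total_disagreement:.2%}")
```

The same arithmetic applied to the study's validation pixels yields the 77.00%/23.00% (SVM) and 75.00%/25.00% (ANN) figures quoted in the abstract.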

Keywords: artificial neural network, threatened tree species, indigenous forest, support vector machines

Procedia PDF Downloads 515
3443 Mapping Iron Content in the Brain with Magnetic Resonance Imaging and Machine Learning

Authors: Gabrielle Robertson, Matthew Downs, Joseph Dagher

Abstract:

Iron deposition in the brain has been linked with a host of neurological disorders such as Alzheimer’s, Parkinson’s, and Multiple Sclerosis. While some treatment options exist, there are no objective measurement tools that allow for the monitoring of iron levels in the brain in vivo. An emerging Magnetic Resonance Imaging (MRI) method has been recently proposed to deduce iron concentration through quantitative measurement of magnetic susceptibility. This is a multi-step process that involves repeated modeling of physical processes via approximate numerical solutions. For example, the last two steps of this Quantitative Susceptibility Mapping (QSM) method involve I) mapping magnetic field into magnetic susceptibility and II) mapping magnetic susceptibility into iron concentration. Process I involves solving an ill-posed inverse problem by using regularization via injection of prior belief. The end result from Process II highly depends on the model used to describe the molecular content of each voxel (type of iron, water fraction, etc.) Due to these factors, the accuracy and repeatability of QSM have been an active area of research in the MRI and medical imaging community. This work aims to estimate iron concentration in the brain via a single step. A synthetic numerical model of the human head was created by automatically and manually segmenting the human head on a high-resolution grid (640x640x640, 0.4mm³) yielding detailed structures such as microvasculature and subcortical regions as well as bone, soft tissue, Cerebral Spinal Fluid, sinuses, arteries, and eyes. Each segmented region was then assigned tissue properties such as relaxation rates, proton density, electromagnetic tissue properties and iron concentration. These tissue property values were randomly selected from a Probability Distribution Function derived from a thorough literature review. 
In addition to having unique tissue property values, different synthetic head realizations also possess unique structural geometry, created by morphing the boundary regions of different areas within normal physical constraints. This model of the human brain is then used to create synthetic MRI measurements. This is repeated thousands of times, for different head shapes, volumes, tissue properties and noise realizations. Collectively, this constitutes a training set that is similar to in vivo data, but larger than datasets available from clinical measurements. A 3D convolutional U-Net neural network architecture was then used to train data-driven deep learning models to solve for iron concentrations from raw MRI measurements. The performance was tested on both synthetic data not used in training and real in vivo data. Results showed that the model trained on synthetic MRI measurements is able to learn iron concentrations in areas of interest directly, more effectively than existing QSM reconstruction methods. For comparison, models trained on random geometric shapes (as proposed in the Deep QSM method) are less effective than models trained on realistic synthetic head models. Such an accurate method for the quantitative measurement of iron deposits in the brain would be of important value in clinical studies aiming to understand the role of iron in neurological disease.
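For context on why the field-to-susceptibility step is ill-posed, conventional QSM relies on the k-space dipole kernel, which vanishes on a cone and so cannot be stably inverted. Below is a minimal numpy sketch of the forward model only (synthetic susceptibility cube, not the authors' pipeline or their U-Net approach).

```python
import numpy as np

# Forward QSM model: the field perturbation is the susceptibility
# map filtered by the unit dipole kernel in k-space,
#   D(k) = 1/3 - kz^2 / |k|^2,
# which is zero on the magic-angle cone (the source of ill-posedness).
n = 32
chi = np.zeros((n, n, n))
chi[12:20, 12:20, 12:20] = 1e-6  # synthetic susceptibility inclusion

k = np.fft.fftfreq(n)
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
k2 = kx**2 + ky**2 + kz**2
k2[0, 0, 0] = 1.0  # avoid division by zero at the DC term
D = 1.0 / 3.0 - kz**2 / k2
D[0, 0, 0] = 0.0

field = np.real(np.fft.ifftn(D * np.fft.fftn(chi)))
print("peak field perturbation:", field.max())
```

Classical QSM must regularize the inversion of this kernel; the deep learning approach described above instead learns the mapping from field (or raw measurements) to iron concentration directly from the synthetic training set.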

Keywords: magnetic resonance imaging, MRI, iron deposition, machine learning, quantitative susceptibility mapping

Procedia PDF Downloads 136
3442 Factors Affecting Slot Machine Performance in an Electronic Gaming Machine Facility

Authors: Etienne Provencal, David L. St-Pierre

Abstract:

A facility exploiting only electronic gambling machines (EGMs) opened in 2007 in Quebec City, Canada under the name of Salons de Jeux du Québec (SdjQ). This facility is one of the first worldwide to rely on that business model. This paper models the performance of such EGMs. The interest from a managerial point of view is to identify the variables that can be controlled or influenced, so that a comprehensive model can help improve the overall performance of the business. The individual EGM performance model contains eight different variables under study (Game Title, Progressive jackpot, Bonus Round, Minimum Coin-in, Maximum Coin-in, Denomination, Slant Top and Position). Using data from Quebec City's SdjQ, a linear regression analysis explains 90.80% of the variance in EGM performance. Moreover, the results show behavior slightly different from that of a casino. The addition of GameTitle as a factor to predict EGM performance is one of the main contributions of this paper; the choice of game is very important. Games in better positions do not perform significantly better than games located elsewhere on the gaming floor. Progressive jackpots have a positive and significant effect on the individual performance of EGMs. The impact of BonusRound on the dependent variable is significant but negative, and the effect of Denomination is significant but weakly negative. As expected, the language of an EGM does not impact its individual performance. This paper highlights possible improvements by indicating which features perform well, and recommendations are given to increase EGM performance.
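The regression setup described above mixes numeric factors with a categorical one (GameTitle), which enters the model through dummy (one-hot) coding. A minimal sketch with synthetic data follows; the effect sizes and the three game titles are invented for illustration, not the SdjQ estimates.

```python
import numpy as np

# Sketch of an EGM performance regression: numeric factors plus a
# one-hot encoded categorical factor (GameTitle), fit by least squares.
rng = np.random.default_rng(1)
n = 120
denomination = rng.choice([0.01, 0.05, 0.25], size=n)
progressive = rng.integers(0, 2, size=n).astype(float)
game_title = rng.integers(0, 3, size=n)   # 3 hypothetical titles
one_hot = np.eye(3)[game_title]

# Synthetic daily performance with assumed effects plus noise.
y = (200 + 500 * denomination + 40 * progressive
     + one_hot @ np.array([0.0, 25.0, -15.0]) + rng.normal(0, 5, n))

# Drop the first dummy column to avoid collinearity with the intercept.
X = np.column_stack([np.ones(n), denomination, progressive, one_hot[:, 1:]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

r2 = 1 - ((y - X @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"R^2 = {r2:.3f}")
```

The R² printed here plays the same role as the 90.80% reported in the paper: the share of performance variance the fitted factors explain.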

Keywords: EGM, linear regression, model prediction, slot operations

Procedia PDF Downloads 255
3441 Radio Frequency Identification Device Based Emergency Department Critical Care Billing: A Framework for Actionable Intelligence

Authors: Shivaram P. Arunachalam, Mustafa Y. Sir, Andy Boggust, David M. Nestler, Thomas R. Hellmich, Kalyan S. Pasupathy

Abstract:

Emergency departments (EDs) provide urgent care to patients throughout the day in a complex and chaotic environment. Real-time location systems (RTLS) are increasingly being utilized in healthcare settings and have been shown to improve safety, reduce cost, and increase patient satisfaction. Radio Frequency Identification Device (RFID) data in an ED have been used to compute variables such as patient-provider contact time, which is associated with patient outcomes such as 30-day hospitalization. These variables can provide avenues for improving ED operational efficiency. A major challenge with ED financial operations is under-coding of critical care services due to physicians’ difficulty reporting accurate times for critical care provided under Current Procedural Terminology (CPT) codes 99291 and 99292. In this work, the authors propose a framework to optimize ED critical care billing using RFID data. RFID-estimated physician-patient contact times could accurately quantify direct critical care services, supporting a data-driven approach to ED critical care billing. This paper describes the framework and provides insights into opportunities to prevent under-coding as well as over-coding to avoid insurance audits. Future work will focus on data analytics to demonstrate the feasibility of the framework described.
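The time-to-code mapping behind these CPT codes can be sketched as a small function. Under the common AMA convention, CPT 99291 covers the first 30-74 minutes of critical care and CPT 99292 is reported for each additional 30-minute block; payer rules vary, so this is an illustrative simplification, not billing guidance, and is not the authors' framework:

```python
def critical_care_codes(minutes: int) -> list[str]:
    """Illustrative mapping from total critical-care minutes to CPT codes.

    Follows the common AMA time table: 99291 covers the first 30-74
    minutes; each additional 30-minute block adds one unit of 99292.
    A sketch only; payer-specific rules (e.g. CMS) differ in detail.
    """
    if minutes < 30:
        return []                 # under 30 minutes: not billable as critical care
    codes = ["99291"]
    extra = minutes - 74
    if extra > 0:
        # one 99292 unit per additional 30-minute block (ceiling division)
        codes += ["99292"] * -(-extra // 30)
    return codes

print(critical_care_codes(95))    # e.g. ['99291', '99292']
```

In the proposed framework, the `minutes` argument would come from RFID-estimated physician-patient contact time rather than physician self-report, which is where the under-coding problem originates.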

Keywords: critical care billing, CPT codes, emergency department, RFID

Procedia PDF Downloads 131
3440 Neural Graph Matching for Modification Similarity Applied to Electronic Document Comparison

Authors: Po-Fang Hsu, Chiching Wei

Abstract:

In this paper, we present a novel neural graph matching approach applied to document comparison. Document comparison is a common task in the legal and financial industries. In some cases, the most important differences are the addition or omission of words, sentences, clauses, or paragraphs. Detecting them is challenging, however, when the editing process was not recorded or traced. Under such uncertainty, we explore the potential of our approach to approximate an accurate comparison and determine which element blocks are related by an edit. First, we apply a document layout analysis that combines traditional and modern techniques to segment layouts appropriately into blocks of various types. We then cast the problem as layout graph matching with textual awareness. Graph matching is a long-studied problem with a broad range of applications; however, unlike previous works that focus on visual images or structural layout, we also bring textual features into our model to adapt it to this domain. Specifically, we introduce an encoder that decodes the visual presentation of the electronic document from PDF. Additionally, because modifications can make document layout analysis inconsistent between document versions, and blocks can be merged or split, Sinkhorn divergence is adopted in our neural graph approach to handle both issues through many-to-many block matching. We demonstrate this on two categories of layouts, legal agreements and scientific articles, collected from our real-case datasets.

Keywords: document comparison, graph matching, graph neural network, modification similarity, multi-modal

Procedia PDF Downloads 179