Search results for: faster RCNN


413 Comparison of Two Anesthetic Methods during Interventional Neuroradiology Procedure: Propofol versus Sevoflurane Using Patient State Index

Authors: Ki Hwa Lee, Eunsu Kang, Jae Hong Park

Abstract:

Background: Interventional neuroradiology (INR) has been a rapidly growing and evolving part of neurosurgery over the past few decades. Sevoflurane and propofol are both suitable anesthetics for INR procedures. Monitoring of the depth of anesthesia is used very widely. The SEDLine™ monitor, a 4-channel processed EEG monitor, uses a proprietary algorithm to analyze the raw EEG signal and displays Patient State Index (PSI) values. Only a few studies have examined the PSI in neuro-anesthesia. We aimed to investigate the differences in PSI values and hemodynamic variables between sevoflurane and propofol anesthesia during INR procedures. Methods: We retrospectively reviewed the medical records of patients scheduled to undergo embolization of a non-ruptured intracranial aneurysm by a single operator from May 2013 to December 2014. Sixty-five patients were categorized into two groups: sevoflurane (n = 33) and propofol (n = 32). PSI values, hemodynamic variables, and the use of hemodynamic drugs were analyzed. Results: Significant differences were seen between PSI values obtained during different perioperative stages in both groups (P < 0.0001). PSI values in the propofol group were lower than those in the sevoflurane group during the INR procedure (P < 0.01). Patients in the propofol group had a longer time to extubation and required more phenylephrine than those in the sevoflurane group (p < 0.05). Anti-hypertensive drugs were administered more often during extubation in the sevoflurane group (p < 0.05). Conclusions: The PSI can detect the depth of anesthesia and changes in anesthetic concentration during INR procedures. Extubation was faster in the sevoflurane group, but recovery was smoother in the propofol group.

Keywords: interventional neuroradiology, patient state index, propofol, sevoflurane

Procedia PDF Downloads 172
412 A Randomized Comparative Evaluation of Efficacy of Ultrasound Guided Costoclavicular and Supraclavicular Approaches of Brachial Plexus Block for Upper Limb Surgeries

Authors: Anshul, Rajni Kalia, Sachin Kumar

Abstract:

Introduction: The costoclavicular approach, a modification of the infraclavicular approach, has been described for anesthesia for upper limb surgeries. Materials and Methods: In this randomized, single-blind study, forty patients undergoing emergency or elective upper limb surgery were allocated to two groups. Groups C and S received an ultrasound-guided costoclavicular block and supraclavicular block, respectively, with 20 ml of 0.5% ropivacaine with 8 mg dexamethasone under strict asepsis. The primary outcome was the total duration of sensory and motor block in the postoperative period. Secondary outcomes were the time taken to perform the procedure, block characteristics in terms of onset of motor and sensory blockade, efficacy of analgesia with respect to the time of administration of the first rescue analgesic dose, and side effects pertaining to either block. Results: The mean total duration of sensory and motor blockade was longer in group C than in group S (p=0.002 and 0.024, respectively). The mean time to perform the block was longer in group S than in group C (p=0.012). The mean onset times of sensory and motor blockade were longer in group S than in group C (p<0.001 for both). Conclusion: The costoclavicular approach is better than the supraclavicular approach in terms of rapid execution, faster onset of sensory-motor blockade, and prolonged postoperative analgesia, with a similar PONV and safety profile.

Keywords: costoclavicular, supraclavicular, ropivacaine, dexamethasone

Procedia PDF Downloads 61
411 An Investigation into Computer Vision Methods to Identify Material Other Than Grapes in Harvested Wine Grape Loads

Authors: Riaan Kleyn

Abstract:

Mass wine production companies across the globe are provided with grapes from winegrowers that predominantly utilize mechanical harvesting machines to harvest wine grapes. Mechanical harvesting accelerates the rate at which grapes are harvested, allowing grapes to be delivered faster to meet the demands of wine cellars. The disadvantage of the mechanical harvesting method is the inclusion of material-other-than-grapes (MOG) in the harvested wine grape loads arriving at the cellar, which degrades the quality of the wine that can be produced. Currently, wine cellars do not have a method to determine the amount of MOG present within wine grape loads. This paper seeks to find an optimal computer vision method capable of detecting the amount of MOG within a wine grape load. A MOG detection method would encourage winegrowers to deliver MOG-free wine grape loads to avoid penalties, which would indirectly enhance the quality of the wine to be produced. Traditional image segmentation methods were compared to deep learning segmentation methods based on images of wine grape loads captured at a wine cellar. The Mask R-CNN model with a ResNet-50 convolutional neural network backbone emerged as the optimal method in this study to determine the amount of MOG in an image of a wine grape load. Furthermore, a statistical analysis was conducted to determine how the MOG on the surface of a grape load relates to the mass of MOG within the corresponding grape load.
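
As an illustration of the kind of model named above, the sketch below shows how a Mask R-CNN with a ResNet-50 backbone could be applied to an image of a grape load using torchvision; the pretrained COCO weights, the confidence threshold and the image path are assumptions for illustration, not the authors' fine-tuned model or data.

```python
# Minimal sketch (not the authors' code) of running Mask R-CNN / ResNet-50 on a
# grape-load image; the image path and 0.5 thresholds are illustrative assumptions.
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

# Pre-trained COCO weights; in the study the model would be fine-tuned on
# annotated grape-load images with "grape" and "MOG" classes.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

img = convert_image_dtype(read_image("grape_load.jpg"), torch.float)  # hypothetical file
with torch.no_grad():
    pred = model([img])[0]

keep = pred["scores"] > 0.5                 # confidence threshold (assumption)
masks = pred["masks"][keep, 0] > 0.5        # binary instance masks
detected_area_fraction = masks.any(dim=0).float().mean().item()
# Separating grape vs. MOG instances would use the class labels of the fine-tuned model.
print(f"Fraction of image covered by detected instances: {detected_area_fraction:.3f}")
```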

Keywords: computer vision, wine grapes, machine learning, machine harvested grapes

Procedia PDF Downloads 86
410 Comparison of Corneal Curvature Measurements Conducted with Tomey AO-2000® and the Current Standard Biometer IOL Master®

Authors: Mohd Radzi Hilmi, Khairidzan Mohd Kamal, Che Azemin Mohd Zulfaezal, Ariffin Azrin Esmady

Abstract:

Purpose: Corneal curvature (CC) is an important anterior segment parameter. This study compared CC measurements conducted with two optical devices in phakic eyes. Methods: Sixty phakic eyes of 30 patients were enrolled in this study. CC was measured three times with the optical biometer and topography-keratometer Tomey AO-2000 (Tomey Corporation, Nagoya, Japan) and then with the standard partial coherence interferometry (PCI) biometer IOL Master (Carl Zeiss Meditec, Dublin, CA), and the data were statistically analysed. Results: The measurements yielded a mean CC of 43.86 ± 1.57 D with the Tomey AO-2000 and 43.84 ± 1.55 D with the IOL Master. The data were normally distributed, and no significant difference in CC values was detected between the two devices (P = 0.952). The correlation between CC measurements was highly significant (r = 0.99; P < 0.0001). The mean difference in CC values between devices was 0.017 D, and the 95% limits of agreement were -0.088 to 0.12 D. The duration of measurement with the standard biometer IOL Master was longer (55.17 ± 2.24 seconds) than with the Tomey AO-2000 in automatic mode (39.88 ± 2.38 seconds). The duration of measurement with the Tomey AO-2000 in manual mode was the shortest (28.57 ± 2.71 seconds). Conclusion: In phakic eyes, CC measured with the Tomey AO-2000 and the IOL Master showed similar values, and a high correlation was observed between the two devices, showing that they can be used interchangeably. The Tomey AO-2000 is faster to operate and has its own topography system.
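
The agreement statistics reported above (mean difference and 95% limits of agreement) follow the standard Bland-Altman calculation; a minimal sketch is shown below with hypothetical paired readings, not the study's data.

```python
# Sketch of the agreement statistics for paired corneal-curvature readings;
# the arrays below are hypothetical placeholders, not the study's measurements.
import numpy as np

tomey = np.array([43.50, 44.25, 42.75, 45.00, 43.80])      # CC values (D), hypothetical
iolmaster = np.array([43.48, 44.30, 42.70, 44.95, 43.85])  # paired CC values (D), hypothetical

r = np.corrcoef(tomey, iolmaster)[0, 1]                    # Pearson correlation
diff = tomey - iolmaster
mean_diff = diff.mean()
sd_diff = diff.std(ddof=1)
loa_low, loa_high = mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff
print(f"r = {r:.3f}, mean difference = {mean_diff:.3f} D, "
      f"95% limits of agreement: {loa_low:.3f} to {loa_high:.3f} D")
```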

Keywords: corneal topography, corneal curvature, IOL Master, Tomey AO2000

Procedia PDF Downloads 378
409 Electrochemical Sensing of L-Histidine Based on Fullerene-C60 Mediated Gold Nanocomposite

Authors: Sanjeeb Sutradhar, Archita Patnaik

Abstract:

Histidine is one of the naturally occurring proteinogenic amino acids and an essential amino acid, existing as two enantiomers, L-histidine and D-histidine. D-histidine is biologically inert, while L-histidine is bioactive because it is converted to the neurotransmitter/neuromodulator histamine in both the brain and the central nervous system. Deficiency of L-histidine causes serious conditions such as Parkinson's disease, epilepsy, and failure of normal erythropoiesis. Gold nanocomposites are attractive materials due to their excellent biocompatibility and their ease of adsorption on the electrode surface. In the present investigation, hydrophobic fullerene-C60 was functionalized with homocysteine via a nucleophilic addition reaction to make it hydrophilic and subsequently to form a nanocomposite with gold nanoparticles prepared in situ using ascorbic acid as the reducing agent. Electronic structure calculations of the AuNPs@Hcys-C60 nanocomposite showed a drastic reduction of the HOMO-LUMO gap compared to the corresponding molecules of interest, indicating enhanced electron transportability to the electrode surface. In addition, the electrostatic potential map of the nanocomposite showed the charge distributed over either end of the nanocomposite, evidencing faster direct electron transfer from the nanocomposite to the electrode surface. The nanocomposite showed catalytic activity; the nanocomposite-modified glassy carbon electrode showed a tenfold higher electron transfer rate constant (kₑt) than the bare glassy carbon electrode. Significant improvement in its sensing behavior by square wave voltammetry was noted.

Keywords: fullerene-C60, gold nanocomposites, L-Histidine, square wave voltammetry

Procedia PDF Downloads 244
408 The Need for a Consistent Regulatory Framework for CRISPR Gene-Editing in the European Union

Authors: Andrew Thayer, Courtney Rondeau, Paraskevi Papadopoulou

Abstract:

The Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR) gene-editing technologies have generated considerable discussion about the applications and ethics of their use. However, no consistent guidelines for using CRISPR technologies have been developed, nor has common legislation related to gene editing been passed in the European Union, especially as it connects to genetically modified organisms (GMOs). The recent announcement that the first babies with CRISPR-edited genes were born, along with new studies exploring CRISPR's applications in treating thalassemia, sickle-cell anemia, cancer, and certain forms of blindness, has demonstrated that the technology is developing faster than the policies needed to control it. A reasonable and coherent regulatory framework for the use of CRISPR in human somatic and germline cells is therefore necessary to ensure the ethical use of the technology in future years. The European Union is a unique region of interconnected countries without a standard set of regulations or legislation for CRISPR gene-editing. We posit that the EU is a suitable setting for comparing the legislation of its member countries in order to understand the practicality and effectiveness of adopting majority-approved practices. Additionally, we present a proposed set of guidelines which could serve as a basis for developing a consistent regulatory framework for EU countries to implement, and also act as a good example for other countries to follow. Finally, an additional, multidimensional framework of smart solutions is proposed in which all stakeholders are engaged to become better-informed citizens.

Keywords: CRISPR, ethics, regulatory framework, European legislation

Procedia PDF Downloads 130
407 Thermography Evaluation on Facial Temperature Recovery after Elastic Gum

Authors: A. Dionísio, L. Roseiro, J. Fonseca, P. Nicolau

Abstract:

Thermography is a non-radiating, contact-free technology which can be used to monitor skin temperature. The efficiency and safety of thermography make it a useful tool for detecting and locating thermal changes on the skin surface, characterized by increases or decreases in temperature. This work intends to be a contribution to the use of thermography as a methodology for evaluating skin temperature in the context of orofacial biomechanics. The study aims to identify oscillations of skin temperature in the left and right hemiface regions over the masseter muscle, during and after a thermal stimulus, and to estimate the time required to restore the initial temperature after the application of the stimulus. Using a FLIR T430sc camera, a data acquisition protocol was followed with a group of eight volunteers, aged between 22 and 27 years. The tests were performed in a controlled environment with the volunteers in a comfortable static position. The thermal stimulus involved an ice volume with controlled size and contact surface. Skin surface temperature was recorded in two distinct situations, namely without any further stimulus and with the addition of a mechanical stimulus obtained by chewing gum. The data obtained were processed using FLIR ResearchIR Max software. The time required to recover the initial temperature ranged from 20 to 52 minutes when no stimulus was added and varied between 8 and 26 minutes with the chewing gum stimulus. These results show that recovery is faster with the addition of the stimulus and may guide clinicians regarding pre- and post-operative ice therapy times, in the presence or absence of a mechanical stimulus that increases muscle function (e.g. phonetics or mastication).

Keywords: thermography, orofacial biomechanics, skin temperature, ice therapy

Procedia PDF Downloads 248
406 The Impact of Data Science on Geography: A Review

Authors: Roberto Machado

Abstract:

We conducted a systematic review using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses methodology, analyzing 2,996 studies and synthesizing 41 of them to explore the evolution of data science and its integration into geography. By employing optimization algorithms, we accelerated the review process, significantly enhancing the efficiency and precision of literature selection. Our findings indicate that data science has developed over five decades, facing challenges such as the diversified integration of data and the need for advanced statistical and computational skills. In geography, the integration of data science underscores the importance of interdisciplinary collaboration and methodological innovation. Techniques like large-scale spatial data analysis and predictive algorithms show promise in natural disaster management and transportation route optimization, enabling faster and more effective responses. These advancements highlight the transformative potential of data science in geography, providing tools and methodologies to address complex spatial problems. The relevance of this study lies in the use of optimization algorithms in systematic reviews and the demonstrated need for deeper integration of data science into geography. Key contributions include identifying specific challenges in combining diverse spatial data and the necessity for advanced computational skills. Examples of connections between these two fields encompass significant improvements in natural disaster management and transportation efficiency, promoting more effective and sustainable environmental solutions with a positive societal impact.

Keywords: data science, geography, systematic review, optimization algorithms, supervised learning

Procedia PDF Downloads 15
405 Risk Factors for Postoperative Fever in Patients Undergoing Lumbar Fusion

Authors: Bang Haeyong

Abstract:

Purpose: The objectives of this study were to determine the prevalence, incidence, and risk factors of postoperative fever after lumbar fusion. Methods: This study was a retrospective chart review of 291 patients who underwent lumbar fusion between March 2015 and February 2016 at the Asan Medical Center. Information was extracted from electronic medical records. Postoperative fever was measured at Tmax > 37.7 ℃ and Tmax > 38.3 ℃. The presence of postoperative fever, blood cultures, urine cultures, and/or chest x-rays were evaluated, and patients were assessed for infection after lumbar fusion. Results: We found that 222 patients (76.3%) had a postoperative temperature above 37.7 ℃, and 162 patients (55.7%) had a postoperative temperature of 38.3 ℃ or higher. The percentage of febrile patients trended down after a mean of 1.8 days (from the first to the seventh postoperative day). Infections occurred in 9 patients (3.1%): respiratory virus (1.7%), urinary tract infection (0.3%), phlebitis (0.3%), and surgical site infection (1.4%). There was no correlation between Tmax > 37.7 ℃ or Tmax > 38.3 ℃ and the timing of fever, positive blood or urine cultures, pneumonia, or surgical site infection. Risk factors for postoperative fever were confirmed to be delayed defecation (OR=1.37, p=.046) and time to drain removal (OR=0.66, p=.037). Conclusions: The incidence of fever after lumbar fusion was 76.3%, and drains were removed earlier in patients with fever; it is thought that blood absorbed at the operation site caused the fever. Fever was more prevalent in patients with a longer time to defecation. Clinical symptoms should be considered, because infection cannot be determined from fever alone, as fever and infection were not significantly associated.
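
The odds ratios quoted above are the kind of output produced by a logistic regression on a fever indicator; the sketch below illustrates that calculation with statsmodels on synthetic placeholder data (the variable names and values are assumptions, not the study's records).

```python
# Hedged illustration (not the study's analysis code) of obtaining odds ratios from a
# logistic regression on a fever indicator; the toy data below are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 291
df = pd.DataFrame({
    "defecation_delay_days": rng.integers(1, 6, n),   # hypothetical predictor
    "drain_days": rng.integers(1, 8, n),              # hypothetical predictor
})
# synthetic outcome generated from an assumed logistic relationship
logit = -1.0 + 0.3 * df["defecation_delay_days"] - 0.4 * df["drain_days"]
df["fever"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["defecation_delay_days", "drain_days"]])
res = sm.Logit(df["fever"], X).fit(disp=False)
odds_ratios = np.exp(res.params)          # exponentiated coefficients are the ORs
print(odds_ratios)
```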

Keywords: lumbar surgery, fever, postoperative, risk factor

Procedia PDF Downloads 242
404 Antioxidant Face Mask from Purple Sweet Potato (Ipomea Batatas) with Oleum Cytrus

Authors: Lilis Kistriyani, Dine Olisvia, Lutfa Rahmawati

Abstract:

A facial mask is an important part of every beauty treatment because it gives a smooth and gentle effect on the face. This research was conducted to make an edible film to be applied as a face mask. The main ingredient of this edible film is purple sweet potato powder, with the addition of glycerol as a plasticizer. One of the constituents of purple sweet potato is a flavonoid compound. The purpose of this study was to determine the effect of increasing the amount of glycerol on flavonoid release and on the physical and biological properties of the edible film produced. The stages of this research are the making of the edible film, followed by several analyses: UV-Vis spectrophotometric analysis to determine how much flavonoid can be released onto facial skin, tensile strength and elongation at break analysis, biodegradability analysis, and microbiological analysis. The edible films were varied by glycerol volume: 1 ml, 2 ml, and 3 ml. The UV-Vis spectrophotometric results showed that the highest flavonoid release concentration was 20.33 ppm, obtained with the 2 ml glycerol variation. The best tensile strength value was 8.502 N, and the greatest elongation at break was 14%, both for the 1 ml glycerol variation. In the biodegradability test, the more glycerol added, the faster the edible film degraded. The results of the microbiological analysis showed that purple sweet potato extract has the ability to inhibit the growth of Propionibacterium acnes, as seen from an inhibition zone of 18.9 mm.

Keywords: face mask, edible film, plasticizer, flavonoid

Procedia PDF Downloads 166
403 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services

Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme

Abstract:

Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
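
A schematic sketch of the multi-step decomposition described above is given below, with each stage as a composable function; all bodies are placeholders and the schema is hypothetical, so this is an architectural outline rather than the production pipeline.

```python
# Schematic outline (placeholders only, not the production system) of the stages named
# above: ingestion, table parsing, text analysis, schema-based record extraction, and
# user-feedback incorporation. The schema and field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Document:
    raw_bytes: bytes
    tables: list = field(default_factory=list)
    entities: list = field(default_factory=list)
    records: list = field(default_factory=list)

def ingest(path: str) -> Document:
    # read the unstructured source (PDF, scan, filing, ...) into memory
    with open(path, "rb") as f:
        return Document(raw_bytes=f.read())

def parse_tables(doc: Document) -> Document:
    # placeholder for a computer-vision table detection / structure recognition model
    doc.tables = []
    return doc

def analyze_text(doc: Document) -> Document:
    # placeholder for entity detection and disambiguation (NLP models)
    doc.entities = []
    return doc

def extract_records(doc: Document, schema: dict) -> Document:
    # placeholder: map detected tables and entities onto the target schema
    doc.records = [{name: None for name in schema}]
    return doc

def apply_feedback(doc: Document, corrections: list) -> Document:
    # placeholder: fold user corrections back into training data or extraction rules
    return doc

# ingest("filing.pdf") would normally start the chain; a byte string stands in here
doc = Document(raw_bytes=b"%PDF-1.7 ...")
doc = extract_records(analyze_text(parse_tables(doc)), {"issuer": str, "coupon": float})
print(doc.records)
```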

Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing

Procedia PDF Downloads 103
402 Evaluation of DNA Microarray System in the Identification of Microorganisms Isolated from Blood

Authors: Merih Şimşek, Recep Keşli, Özgül Çetinkaya, Cengiz Demir, Adem Aslan

Abstract:

Bacteremia is a clinical entity with high morbidity and mortality rates when immediate diagnosis or treatment cannot be achieved. Microorganisms that can cause sepsis or bacteremia are easily isolated from blood cultures. Fifty-five positive blood cultures were included in this study. Microorganisms in the 55 blood cultures were isolated by conventional microbiological methods and then identified phenotypically with the Vitek-2 system. The same microorganisms in all blood culture samples were also identified genotypically with a multiplex-PCR DNA low-density microarray system. At the end of the identification process, the DNA microarray system's identification performance was evaluated against the Vitek-2 system. The Vitek-2 system and the DNA microarray system identified the same microorganisms in 53 samples; different microorganisms were identified in the remaining 2 blood cultures by the DNA microarray system. The microorganisms identified by the Vitek-2 system were thus identical to 96.4% of those identified by the DNA microarray system. In addition to the bacteria identified by Vitek-2, the presence of a second bacterium was detected in 5 blood cultures by the DNA microarray system. Eighteen of the 55 positive blood cultures were identified as E. coli strains by both the Vitek-2 and DNA microarray systems. The corresponding identification counts were 6 and 8 for Acinetobacter baumannii, 10 and 10 for K. pneumoniae, 5 and 5 for S. aureus, 7 and 11 for Enterococcus spp., 5 and 5 for P. aeruginosa, and 2 and 2 for C. albicans, respectively. According to these results, the DNA microarray system requires both a technical device and experienced staff support, and it requires more expensive kits than Vitek-2; it should therefore be used in conjunction with conventional microbiological methods. In this way, large microbiology laboratories will produce faster, more sensitive and more successful results in the identification of cultured microorganisms.

Keywords: microarray, Vitek-2, blood culture, bacteremia

Procedia PDF Downloads 344
401 Technological Ensuring of the Space Reflector Antennas Manufacturing Process from Carbon Fiber Reinforced Plastics

Authors: Pyi Phyo Maung

Abstract:

This study presents calculations of the permeability coefficient and of the volume and porosity of a unit cell of a woven fabric before and after deformation, based on its geometrical parameters. Two types of carbon woven fabric structures were investigated: a standard type, whose filaments have a cylindrical cross section, and a spread-tow type, which has a rectangular cross section. The space antenna reflector, whose distinctive feature is a surface of double curvature, is the object of the research. Modeling of the kinetics of the impregnation process of the reflector for the two types of carbon fabric unit cell structures was performed using the RAM-RTM software. This work also investigated the influence of the grid angle between the warp and weft of the unit cell on the duration of the impregnation process. The results showed that decreasing the angle between the warp and weft of the unit cell decreased the permeability values. Based on the calculation results, reflector samples were assessed and their quality was determined, and the theoretical and experimental results were compared. Comparison of the two textile structures (standard and spread tow) showed that the standard textile with a circular cross section was impregnated faster than the spread tow, which has a rectangular cross section.

Keywords: vacuum assisted resin infusion, impregnation time, shear angle, reflector and modeling

Procedia PDF Downloads 268
400 Vehicle Routing Problem Considering Alternative Roads under Triple Bottom Line Accounting

Authors: Onur Kaya, Ilknur Tukenmez

Abstract:

In this study, we consider vehicle routing problems on networks with alternative direct links between nodes, and we analyze a multi-objective problem considering the financial, environmental and social objectives in this context. In real life, there might exist several alternative direct roads between two nodes, and these roads might have differences in terms of their lengths and durations. For example, a road might be shorter than another but might require longer time due to traffic and speed limits. Similarly, some toll roads might be shorter or faster but require additional payment, leading to higher costs. We consider such alternative links in our problem and develop a mixed integer linear programming model that determines which alternative link to use between two nodes, in addition to determining the optimal routes for different vehicles, depending on the model objectives and constraints. We consider the minimum cost routing as the financial objective for the company, minimizing the CO2 emissions and gas usage as the environmental objectives, and optimizing the driver working conditions/working hours, and minimizing the risks of accidents as the social objectives. With these objective functions, we aim to determine which routes, and which alternative links should be used in addition to the speed choices on each link. We discuss the results of the developed vehicle routing models and compare their results depending on the system parameters.
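
A toy sketch of the alternative-link decision is shown below as a small mixed integer linear program in PuLP: for each leg of a fixed route, exactly one of the parallel roads is chosen to minimize a weighted cost-plus-CO2 objective under a driver working-hours limit. The data, weights and route are illustrative assumptions, not the authors' model.

```python
# Toy MILP sketch (not the authors' model) of choosing among alternative direct roads
# on a fixed route; costs, emissions, durations and weights are illustrative assumptions.
import pulp

# legs of a fixed route; each leg has alternative roads with (cost, co2_kg, hours)
alternatives = {
    ("A", "B"): {"toll": (30.0, 8.0, 1.0), "free": (18.0, 11.0, 1.6)},
    ("B", "C"): {"highway": (25.0, 9.0, 1.2), "local": (15.0, 12.0, 2.0)},
}
w_cost, w_co2, max_hours = 1.0, 2.0, 3.0           # objective weights and hours limit

prob = pulp.LpProblem("alternative_links", pulp.LpMinimize)
x = {(leg, r): pulp.LpVariable(f"x_{leg[0]}{leg[1]}_{r}", cat="Binary")
     for leg, roads in alternatives.items() for r in roads}

# weighted financial + environmental objective
prob += pulp.lpSum(x[leg, r] * (w_cost * c + w_co2 * e)
                   for leg, roads in alternatives.items()
                   for r, (c, e, t) in roads.items())

for leg, roads in alternatives.items():            # pick exactly one road per leg
    prob += pulp.lpSum(x[leg, r] for r in roads) == 1

prob += pulp.lpSum(x[leg, r] * t                   # driver working-hours limit (social)
                   for leg, roads in alternatives.items()
                   for r, (c, e, t) in roads.items()) <= max_hours

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for (leg, r), var in x.items():
    if var.value() == 1:
        print(f"{leg[0]} -> {leg[1]}: take {r}")
```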

Keywords: vehicle routing, alternative links between nodes, mixed integer linear programming, triple bottom line accounting

Procedia PDF Downloads 400
399 Development and Characterization of a Film Based on Hydroxypropyl Methyl Cellulose Incorporated by a Phenolic Extract of Fennel and Reinforced by Magnesium Oxide: In Vivo - in Vitro

Authors: Mazouzi Nourdjihane, K. Boutemak, A. Haddad, Y. Chegreouche

Abstract:

In recent decades, biodegradable polymers have been considered one of the most popular options for the delivery of drugs and various conventional dosage forms. Film-forming systems (FFS) can be used in topical, transdermal, ophthalmic, oral and gastric applications. Recently, this type of system has focused on improving drug delivery, as it can promote drug release. In this context, the aim of this study is to create polymeric film-forming systems for the stomach and to evaluate and test their gastroprotective effects, comparing the effects of changes in composition on the film characteristics. A polyphenol extract obtained from fennel is used to provide anti-inflammatory activity in the film. The films are made from hydroxypropyl methylcellulose polymer and different plasticizers, glycerol and polyethylene glycol. The FFS characterization shows that the MgO- and glycerol-reinforced hydroxypropyl methylcellulose film (HPMC-MgO-Gly) performs better than the MgO- and PEG-based film (HPMC-MgO-PEG): it is durable, dries faster and allows maximum recovery. Water vapor barrier properties, swelling rate and the other measured properties show a further advantage of HPMC-MgO-Gly over HPMC-MgO-PEG, indicating good adhesion between the substrate and the formed film. In this study, the gastroprotective effect of the fennel phenolic extract was confirmed, showing that this plant material has a gastroprotective effect on ulcers and that the film can incorporate the active substance.

Keywords: film forming system, hydroxypropyl methylcellulose, magnesium oxide, in vivo

Procedia PDF Downloads 62
398 A Hybrid Classical-Quantum Algorithm for Boundary Integral Equations of Scattering Theory

Authors: Damir Latypov

Abstract:

A hybrid classical-quantum algorithm to solve boundary integral equations (BIE) arising in problems of electromagnetic and acoustic scattering is proposed. The quantum speed-up is due to a Quantum Linear System Algorithm (QLSA). The original QLSA of Harrow et al. provides an exponential speed-up over the best-known classical algorithms, but only in the case of sparse systems. Due to the non-local nature of integral operators, the matrices arising from discretization of BIEs are, however, dense. A QLSA for dense matrices was introduced in 2017. Its runtime as a function of the system size N is bounded by O(√Npolylog(N)). The runtime of the best-known classical algorithm for an arbitrary dense matrix scales as O(N².³⁷³). Instead of the exponential speed-up obtained for sparse matrices, here we have only a polynomial speed-up. Nevertheless, the sufficiently high power of this polynomial, ~4.7, should make the QLSA an appealing alternative. Unfortunately for the QLSA, the asymptotic separability of the Green's function leads to high compressibility of the BIE matrices. Classical fast algorithms such as the Multilevel Fast Multipole Method (MLFMM) take advantage of this fact and reduce the runtime to O(Nlog(N)), i.e., the QLSA is only quadratically faster than the MLFMM. To be truly impactful for computational electromagnetics and acoustics engineers, the QLSA must provide a more substantial advantage than that. We propose a computational scheme which combines elements of the classical fast algorithms with the QLSA to achieve the required performance.
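
To make the comparison concrete, the asymptotic claims above can be summarized side by side (a restatement of the abstract's scalings, not a new result; the ~4.7 exponent follows from 2.373/0.5):

```latex
% Asymptotic runtimes quoted in the abstract (N = system size), restated for comparison.
\[
T_{\mathrm{QLSA}}(N) = O\!\left(\sqrt{N}\,\operatorname{polylog}(N)\right), \qquad
T_{\mathrm{dense}}(N) = O\!\left(N^{2.373}\right), \qquad
T_{\mathrm{MLFMM}}(N) = O\!\left(N \log N\right).
\]
\[
\frac{T_{\mathrm{dense}}}{T_{\mathrm{QLSA}}} \sim N^{1.873}
\;\;\Bigl(\text{equivalently } T_{\mathrm{dense}} \sim T_{\mathrm{QLSA}}^{\,2.373/0.5\,\approx\,4.7}\Bigr),
\qquad
\frac{T_{\mathrm{MLFMM}}}{T_{\mathrm{QLSA}}} \sim \sqrt{N}
\;\;\text{(up to polylog factors).}
\]
```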

Keywords: quantum linear system algorithm, boundary integral equations, dense matrices, electromagnetic scattering theory

Procedia PDF Downloads 145
397 Evaluation of Real-Time Background Subtraction Technique for Moving Object Detection Using Fast-Independent Component Analysis

Authors: Naoum Abderrahmane, Boumehed Meriem, Alshaqaqi Belal

Abstract:

Background subtraction is a widely used technique for detecting moving objects in video surveillance, extracting the foreground objects from a reference background image. A good background subtraction algorithm must handle many challenges, such as changes in illumination, dynamic backgrounds (swinging leaves, rain, snow), and changes in the background itself, for example, vehicles that move and then stop. In this paper, we propose an efficient and accurate background subtraction method for moving object detection in video surveillance. The main idea is to use a developed fast independent component analysis (fast-ICA) algorithm to separate background, noise, and foreground masks from an image sequence in practical environments. The fast-ICA algorithm is adapted and adjusted through matrix calculation and a search for an optimal non-quadratic function, making it faster and more robust. Moreover, in order to estimate the de-mixing matrix and the denoising de-mixing matrix parameters, we propose converting all images to the YCrCb color space, where the luma component Y (the brightness of the color) gives suitable results. The proposed technique has been verified on the publicly available CDnet 2012 and CDnet 2014 datasets, and experimental results show that our algorithm can detect moving objects competently and accurately in challenging conditions, compared to other methods in the literature, in terms of quantitative and qualitative evaluations, at a real-time frame rate.
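
A condensed sketch of the underlying idea is given below: frames are converted to YCrCb, the luma channels are stacked, and FastICA separates them into components from which a foreground mask is thresholded. The scikit-learn FastICA (rather than the authors' adapted variant), the component-selection heuristic, the frame count and the video path are all assumptions for illustration.

```python
# Sketch (assumptions throughout, not the authors' adapted algorithm): YCrCb luma
# frames stacked as mixed observations, separated with scikit-learn's FastICA.
import cv2
import numpy as np
from sklearn.decomposition import FastICA

cap = cv2.VideoCapture("surveillance.avi")   # hypothetical video file
frames = []
while len(frames) < 30:
    ok, frame = cap.read()
    if not ok:
        break
    y = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)[:, :, 0]   # luma component Y
    frames.append(cv2.resize(y, (160, 120)).astype(np.float32).ravel())
cap.release()

X = np.array(frames)                           # shape: (n_frames, n_pixels)
ica = FastICA(n_components=3, random_state=0)  # background / noise / foreground
S = ica.fit_transform(X.T)                     # one value per pixel per component
components = S.T.reshape(3, 120, 160)

# Crude stand-in for the selection step: take the component with the largest spatial
# variation as the candidate foreground map, then threshold it into a binary mask.
fg = components[np.argmax(np.abs(components).std(axis=(1, 2)))]
mask = (np.abs(fg) > np.abs(fg).mean() + 2 * np.abs(fg).std()).astype(np.uint8) * 255
```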

Keywords: background subtraction, moving object detection, fast-ICA, de-mixing matrix

Procedia PDF Downloads 91
396 Contribution of Automated Early Warning Score Usage to Patient Safety

Authors: Phang Moon Leng

Abstract:

The automated early warning score is a newly developed clinical decision tool used to streamline and improve the process of obtaining a patient's vital signs so that a clinical decision can be made at an earlier stage to prevent the patient from deteriorating further. The technology provides an immediate update on the score and the clinical decision to be taken based on the outcome. This paper aims to study whether an automated early warning score system has assisted the hospital in early detection and escalation of clinical conditions and improved patient outcomes. The hospital adopted the Modified Early Warning Score (MEWS) scoring system and the MEWS clinical response in Philips IntelliVue Guardian automated early warning score equipment, and studied whether the process has been leaned, whether the use of the technology improved the nurses' usage and experience, and whether the technology has improved patient care and outcomes. It was found that the steps required to obtain vital signs have been significantly reduced and that vital signs are obtained more frequently. The number of deaths and the length of stay have significantly decreased, as clinical decisions can be made and escalated more quickly with the automated EWS. The automated early warning score equipment has helped improve work efficiency by removing the need for manual documentation in the patient's EMR. The technology streamlines clinical decision-making and allows faster care and intervention to be carried out, improving overall patient outcomes, which translates to better care for patients.
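
For reference, the sketch below computes a Modified Early Warning Score from a set of vital signs using one commonly published MEWS banding; the exact thresholds and escalation rules used by the hospital and by the Philips IntelliVue Guardian system are not given in the abstract, so this banding is an assumption.

```python
# MEWS calculation sketch using one commonly published banding (an assumption, not
# the hospital's configured thresholds or the vendor's implementation).
def mews(resp_rate, heart_rate, systolic_bp, temperature_c, avpu):
    score = 0
    # respiratory rate (breaths/min)
    if resp_rate < 9: score += 2
    elif resp_rate <= 14: score += 0
    elif resp_rate <= 20: score += 1
    elif resp_rate <= 29: score += 2
    else: score += 3
    # heart rate (beats/min)
    if heart_rate < 40: score += 2
    elif heart_rate <= 50: score += 1
    elif heart_rate <= 100: score += 0
    elif heart_rate <= 110: score += 1
    elif heart_rate <= 129: score += 2
    else: score += 3
    # systolic blood pressure (mmHg)
    if systolic_bp <= 70: score += 3
    elif systolic_bp <= 80: score += 2
    elif systolic_bp <= 100: score += 1
    elif systolic_bp <= 199: score += 0
    else: score += 2
    # temperature (degrees C)
    if temperature_c < 35.0: score += 2
    elif temperature_c <= 38.4: score += 0
    else: score += 2
    # neurological response (AVPU scale)
    score += {"alert": 0, "voice": 1, "pain": 2, "unresponsive": 3}[avpu]
    return score

print(mews(resp_rate=22, heart_rate=115, systolic_bp=95, temperature_c=38.6, avpu="alert"))
# -> 7: a score this high would trigger the configured clinical response / escalation
```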

Keywords: automated early warning score, clinical quality and safety, patient safety, medical technology

Procedia PDF Downloads 174
395 Importance of Developing a Decision Support System for Diagnosis of Glaucoma

Authors: Murat Durucu

Abstract:

Glaucoma is a condition leading to irreversible blindness; early diagnosis and appropriate interventions enable patients to retain their sight for a longer time. This study addresses the importance of developing a decision support system for glaucoma diagnosis. Glaucoma occurs when elevated pressure around the eyes damages the optic nerve and causes deterioration of vision. The disease progresses through different levels, up to blindness. Diagnosis at an early stage allows a chance for therapies that slow the progression of the disease. In recent years, imaging technologies such as Heidelberg Retinal Tomography (HRT), Stereoscopic Disc Photography (SDP) and Optical Coherence Tomography (OCT) have been used for the diagnosis of glaucoma. With its better accuracy and faster imaging, OCT has become the most common method used by experts. Despite the precision and speed of OCT and HRT imaging, difficulties and mistakes still occur in the diagnosis of glaucoma, especially in the early stages, and it is difficult for doctors to obtain objective results for diagnosis and staging. It therefore seems very important to develop an objective decision support system for diagnosing and grading glaucoma. By using OCT images and pattern recognition systems, it is possible to develop a support system that helps doctors make their decisions on glaucoma. Thus, in this study, we develop an evaluation and support system for doctors' use. Pattern-recognition-based computer software would help doctors make an objective evaluation of their patients. It is intended that, after the development and evaluation of the software, the system will serve doctors in different hospitals.

Keywords: decision support system, glaucoma, image processing, pattern recognition

Procedia PDF Downloads 291
394 Leasing Revisited: Mastering the Digital Transformation with Traditional Financing

Authors: Tobias Huttche, Marco Canipa-Valdez, Corinne Mühlebach

Abstract:

This article discusses the role of leasing in the digital transformation process of companies and the corresponding economic effects. Based on the traditional mechanisms of leasing, the article focuses in particular on the benefits of leasing as a financing instrument with regard to the innovation potential of companies. Practical examples demonstrate how leasing can become an integral part of new business models. Especially with regard to the digital transformation and the corresponding investments in know-how and infrastructure, leasing can play an important role. Furthermore, findings of an empirical survey on the usage of leasing in Switzerland in an international context are presented. The survey not only shows the benefits of leasing against the backdrop of digital transformation, but also gives guidance on how other countries can benefit from promoting leasing in their legislation and economy. Based on a simulation model for Switzerland, the economic effect of an increase in leasing volume is calculated. Again, the results underline the substantial growth potential. This holds true especially for economies where asset-based lending is rarely used because of a lack of entrepreneurial or private security of the borrower (cash-based financing in developing and emerging countries). Overall, the authors found that companies that use leasing are more productive and tend to grow faster than companies that use little or no leasing. The positive effects of leasing on emerging digital challenges for companies and entire economies should encourage other countries to facilitate access to leasing as a financing instrument by reducing legal, tax and accounting-related requirements in the respective jurisdiction.

Keywords: cash-based financing, digital transformation, financing instruments, growth, innovation, leasing

Procedia PDF Downloads 251
393 Sentiment Analysis of Chinese Microblog Comments: Comparison between Support Vector Machine and Long Short-Term Memory

Authors: Xu Jiaqiao

Abstract:

Text sentiment analysis is an important branch of natural language processing. The technology is widely used in public opinion analysis and web content recommendation. At present, mainstream sentiment analysis methods fall into three categories: sentiment analysis based on a sentiment dictionary, on traditional machine learning, and on deep learning. This paper analyzes and compares the advantages and disadvantages of the SVM method from traditional machine learning and the Long Short-Term Memory (LSTM) method from deep learning in the field of Chinese sentiment analysis, using Chinese comments on Sina Microblog as the dataset. First, the original comment dataset obtained by a web crawler is classified and labeled, and Jieba word segmentation is then applied to the dataset, with stop words removed. After that, text feature vectors are extracted and document word vectors are built to facilitate the training of the models. Finally, the SVM and LSTM models are trained. The accuracy of the LSTM model is 85.80%, while the accuracy of the SVM is 91.07%; at the same time, the LSTM needs only 2.57 seconds to run, whereas the SVM model needs 6.06 seconds. This paper therefore concludes that, compared with the SVM model, the LSTM model is less accurate but faster.
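
A compact sketch of the SVM branch described above (Jieba segmentation, stop-word removal, TF-IDF features, linear SVM) is shown below; the toy comments, labels and stop-word list are placeholders rather than the crawled Sina Microblog dataset, and the LSTM branch is omitted.

```python
# Sketch of the SVM pipeline (Jieba segmentation, stop-word removal, TF-IDF, SVM);
# the comments, labels and stop words below are toy placeholders, not the study data.
import jieba
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

comments = ["这部电影太好看了", "服务很差，再也不来了", "非常满意，推荐", "质量糟糕，很失望"]
labels = [1, 0, 1, 0]                       # 1 = positive, 0 = negative (toy data)
stop_words = {"了", "很", "，"}              # placeholder stop-word list

def tokenize(text):
    # Jieba word segmentation followed by stop-word removal
    return [w for w in jieba.lcut(text) if w not in stop_words]

X_train, X_test, y_train, y_test = train_test_split(
    comments, labels, test_size=0.5, random_state=0, stratify=labels)
vec = TfidfVectorizer(tokenizer=tokenize, token_pattern=None)
clf = SVC(kernel="linear")
clf.fit(vec.fit_transform(X_train), y_train)
pred = clf.predict(vec.transform(X_test))
print("accuracy:", accuracy_score(y_test, pred))
```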

Keywords: sentiment analysis, support vector machine, long short-term memory, Chinese microblog comments

Procedia PDF Downloads 84
392 Recent Progress in Wave Rotor Combustion

Authors: Mohamed Razi Nalim, Shahrzad Ghadiri

Abstract:

With current concerns regarding global warming, demand for a society with greater environmental awareness is increasing significantly. Alongside the gradual development of hybrid and electric vehicles and the availability of renewable energy resources, increasing the efficiency of fossil-fuel combustion engines seems a faster route toward sustainability and reducing greenhouse gas emissions. This paper aims to provide a comprehensive review of recent progress in the wave rotor combustor, one of the combustion concepts with considerable potential to improve power output and emission standards. A wave rotor is an oscillatory flow device that uses unsteady gas dynamics to transfer energy by generating pressure waves. From a thermodynamic point of view, unlike conventional gas turbine combustors, which follow the constant-pressure Brayton cycle, wave rotors offer higher cycle efficiency due to the pressure gain during the combustion process, following the Humphrey cycle. First, the paper covers recent and ongoing computational and experimental studies around the world, with a quick look at the milestones in the history of wave rotor development. Second, the main similarities and differences between the ignition systems of wave rotors and piston engines are considered. A comparison is also made with another pressure-gain device, the rotating detonation engine. Finally, the main challenges and research needs for wave rotor combustor commercialization are discussed.

Keywords: wave rotor combustor, unsteady gas dynamic, pre-chamber jet ignition, pressure gain combustion, constant-volume combustion

Procedia PDF Downloads 75
391 An Experimental Study of Self-Regulated Learning with High School Gifted Pupils

Authors: Prakash Singh

Abstract:

Research studies affirm the view that gifted pupils are endowed with unique personality traits, enabling them to study at higher levels of thinking, at a faster pace, and with a greater degree of autonomy than their average counterparts. The focus of this study was whether high school gifted pupils are capable of studying an advanced level curriculum on their own by employing self-regulated learning (SRL) strategies. To be self-regulated, pupils are required to be metacognitively, motivationally, and behaviourally active participants in their own learning processes so that they are able to initiate and direct their personal curriculum efforts to acquire cognitive skills and knowledge, instead of being solely reliant on their teachers. Researchers working with gifted populations concede that limited studies have been conducted thus far to examine gifted pupils’ expertise in using SRL strategies to assume ownership of their learning. In order to conduct this investigation, an enriched module in Accounting for specifically gifted grade eleven pupils was developed, incorporating advanced level content, and use was made of the Post-test-Only Control Group Design to accomplish this research objective. The results emanating from this empirical study strongly suggest that SRL strategies can be employed to overcome a narrow, rigid approach that limits the education of gifted pupils in the regular classroom of the high school. SRL can meaningfully offer an alternative way to implement an advanced level curriculum for the gifted in the mainstream of education. This can be achieved despite the limitations of differentiation in the regular classroom.

Keywords: advanced level curriculum, high school gifted pupils, self-regulated learning, teachers’ professional competencies

Procedia PDF Downloads 399
390 Cache Analysis and Software Optimizations for Faster on-Chip Network Simulations

Authors: Khyamling Parane, B. M. Prabhu Prasad, Basavaraj Talawar

Abstract:

Fast simulations are critical in reducing time to market for CMPs and SoCs. Several simulators have been used to evaluate the performance and power consumption of Networks-on-Chip, and researchers and designers rely upon these simulators for design space exploration of NoC architectures. Our experiments show that simulating large NoC topologies takes hours to several days to complete. To speed up the simulations, it is necessary to investigate and optimize the hotspots in the simulator source code. Among the several simulators available, we chose Booksim2.0, as it is extensively used in the NoC community. In this paper, we analyze the cache and memory system behaviour of Booksim2.0 to accurately monitor input-dependent performance bottlenecks. Our measurements show that cache and memory usage patterns vary widely based on the input parameters given to Booksim2.0. Based on these measurements, the cache configuration with the fewest misses has been identified. To further reduce cache misses, we use software optimization techniques such as removal of unused functions, loop interchange, and replacing the post-increment operator with the pre-increment operator for non-primitive data types. These techniques reduced cache misses by 18.52%, 5.34% and 3.91%, respectively. We also employ thread parallelization and vectorization to improve the overall performance of Booksim2.0: the OpenMP programming model and SIMD are used to parallelize and vectorize the more time-consuming portions of Booksim2.0. Speedups of 2.93x and 3.97x were observed for the Mesh topology with a 30 × 30 network size by employing thread parallelization and vectorization, respectively.

Keywords: cache behaviour, network-on-chip, performance profiling, vectorization

Procedia PDF Downloads 189
389 Fluoranthene Removal in Wastewater Using Biological and Physico-Chemical Methods

Authors: Angelica Salmeron Alcocer, Deifilia Ahuatzi Chacon, Felipe Rodriguez Casasola

Abstract:

Polycyclic aromatic hydrocarbons (PAHs) are produced both naturally (forest fires, volcanic eruptions) and by human activity (burning fossil fuels). Concern about PAHs is due to their toxic, mutagenic and carcinogenic effects, so they pose a potential risk to human health and ecology. They are considered the most toxic components of oil and are highly hydrophobic, which makes them easily deposited in soil, air and water. One method of removing PAHs from contaminated soil uses surfactants such as Tween 80, which has been reported to be less toxic and to increase PAH solubility more than other surfactants. Fluoranthene is a PAH with molecular formula C16H10; its name derives from the fluorescence it presents under UV light. In this paper, a study of the removal of fluoranthene solubilized with Tween 80 in synthetic wastewater, using a microbial community (isolated from soil of coffee plantations in the state of Veracruz, Mexico) and the Fenton oxidation method, was performed. The microbial community was able to use both Tween 80 and fluoranthene as carbon sources for growth. When the biological treatment was applied in batch culture, 100% of the fluoranthene was mineralized, but only at an initial concentration of 100 ppm; as the initial concentration of fluoranthene increases, the removal efficiencies decay and the degradation time increases due to the accumulation of more toxic or less biodegradable byproducts. However, when Fenton oxidation was applied prior to the biological treatment, the removal of fluoranthene improved, as it was consumed approximately 2.4 times faster.

Keywords: fluoranthene, polycyclic aromatic hydrocarbons, biological treatment, fenton oxidation

Procedia PDF Downloads 234
388 Embedded System of Signal Processing on FPGA: Underwater Application Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

The purpose of this paper is to study the phenomenon of acoustic scattering by using a new method. Signal processing (the fast Fourier transform (FFT), the inverse fast Fourier transform (iFFT), and Bessel functions) is widely applied to obtain information with high precision and accuracy, and is commonly implemented on general-purpose processors. Our interest focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to minimize the computational complexity of a single-processor architecture, accelerate the processing on the FPGA, and meet real-time and energy-efficiency requirements, since general-purpose processors are not efficient for this signal processing. We implemented the acoustic backscattered signal processing model on the Altera DE-SoC board and compared it to the Odroid XU4. By comparison, the computing latencies of the Odroid XU4 and the FPGA are 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system has shown that acoustic spectra are computed up to 20 times faster than with the Odroid XU4 implementation. The FPGA-based implementation of the processing algorithms is realized with an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where it is possible to obtain information related to the detection and characterization of submerged cells. We have thus achieved good experimental results in terms of real-time performance and energy efficiency.
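
As a host-side, floating-point reference for the signal chain named above (FFT, iFFT, Bessel functions), the sketch below filters a toy echo in the frequency domain and evaluates a Bessel function of the first kind; the signal, sampling rate and band limits are illustrative assumptions, and this is neither the FPGA design nor the paper's form-function computation.

```python
# Host-side floating-point reference sketch (assumptions throughout, not the FPGA design):
# FFT of a toy echo, band-limiting in the frequency domain, iFFT back, plus an example
# Bessel-function evaluation of the kind used in scattering form functions.
import numpy as np
from scipy.special import jv           # Bessel function of the first kind, J_n

fs = 1.0e6                             # sampling rate (Hz), illustrative
t = np.arange(0, 2e-3, 1 / fs)
signal = np.exp(-1e4 * t) * np.sin(2 * np.pi * 150e3 * t)   # toy backscattered echo

spectrum = np.fft.rfft(signal)                               # FFT block
freqs = np.fft.rfftfreq(signal.size, 1 / fs)
spectrum[(freqs < 100e3) | (freqs > 200e3)] = 0              # crude band-pass filter
filtered = np.fft.irfft(spectrum, n=signal.size)             # iFFT block

max_change = np.max(np.abs(filtered - signal))
print(f"J0(10) = {jv(0, 10):.4f}, max change introduced by the filter = {max_change:.3e}")
```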

Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing

Procedia PDF Downloads 72
387 The Dietary Behavior of Eating Alone in Middle-Aged Populations by Body Mass Index (BMI)

Authors: Pil Kyoo Jo, Youngmee Lee, Jee Young Kim, Yu Jin Oh, Sohyun Park, Young Ha Joo, Hye Suk Kim, Semi Kang

Abstract:

A growing number of people are living alone and eating alone. People may behave differently when eating alone than when eating with others, and this can influence their weight and health. The purpose of this study was to investigate the dietary behavior of eating alone in middle-aged populations in South Korea. We used nationally representative data from the 5th Korea National Health and Nutrition Examination Survey (KNHANES), 2010-2012, and a cross-sectional survey on eating behaviors among adults (N=1318; 530 men, 788 women) aged 20 to 54 years. Results showed that the 'underweight' group ate a greater amount of food when eating with others than when eating alone, whereas the 'overweight' and 'obesity' groups showed the opposite pattern (p<0.05). When having a meal alone, the 'underweight' group ate only until they no longer felt hungry, while the 'overweight' and 'obesity' groups ate leftover food even when they felt full (p<0.01). The 'overweight' and 'obesity' groups ate alone more often than the 'underweight' group did (p<0.05). All groups had shorter meal times when eating alone than when eating with others and usually ate processed foods for convenience when eating alone. Younger people, aged 10-30, ate more processed food than older people did. South Koreans obtain nearly 45% of their total food consumption from processed foods. This research was supported by the National Research Foundation of Korea through the 2011 Korea-Japan Basic Scientific Cooperation Program (NRF-2011B00003). This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2015S1A5B6037369).

Keywords: BMI, dietary behavior, eating alone, middle-aged populations

Procedia PDF Downloads 267
386 Emptiness Downlink and Uplink Proposal Using Space-Time Equation Interpretation

Authors: Preecha Yupapin and Somnath

Abstract:

From the emptiness, vibration induces the fractal, and the strings are formed, from which the first elementary particle groups, known as quarks, were established. The neutrino and the electron are created by them, and more elementary particles and life are formed by organic and inorganic substances. The universe is constructed, from which the multi-universe has formed in the same way. It is assumed that the intense energy has escaped from the singularity cone of the multi-universes. Initially, the single mass energy is confined, after which it is disturbed by the space-time distortion. It splits into an entangled pair, where circular motion is established. We consider one side of the entangled pair, where the fusion energy of the strong coupling force has formed. The growth of the fusion energy exhibits quantum physics phenomena, in which the particle moves along the circumference at a speed faster than light. This introduces the wave-particle duality aspect, which is saturated at the stopping point. It is re-run again and again without limitation, which suggests that the universe has been created and has expanded. The Bose-Einstein condensate (BEC) is released through the singularity by the wormhole and is condensed to become a mass comparable to the Sun's size, which circulates (orbits) around the Sun. The uncertainty principle is then applied, from which breath control follows the uncertainty condition ∆p∆x = ∆E∆t ~ ℏ. The flow of air into and out of the body via the nose applies momentum and energy control with respect to movement and time, with the target that the distortion of space-time will vanish. Finally, the body is clean and can go to the next procedure, where the mind can escape from the body at the speed of light. However, the borderline between contemplation and being an Arahant is a vacuum, which will be explained.

Keywords: space-time, relativity, enlightenment, emptiness

Procedia PDF Downloads 61
385 Quorum Quenching Activities of Bacteria Isolated from Red Sea Sediments

Authors: Zahid Rehman, TorOve Leiknes

Abstract:

Quorum sensing (QS) is the process by which bacteria communicate with each other through small signaling molecules, such as N-acylhomoserine lactones (AHLs). Certain bacteria also have the ability to degrade AHL molecules, a process referred to as quorum quenching (QQ); QQ can therefore be used to control bacterial infections and biofilm formation. In this study, we aimed to identify new species of bacteria with QQ activity. To achieve this, sediments from the Red Sea were collected either in the close vicinity of seagrass or from areas with no vegetation. From these samples, we isolated 72 bacterial strains and tested their ability to degrade or inactivate AHL molecules. A Chromobacterium violaceum-based bioassay was used for the initial screening of isolates for QQ activity. The QQ activity of the positive isolates was further confirmed and quantified by liquid chromatography and mass spectrometry. These analyses showed that the isolated bacterial strains could degrade AHL molecules of different acyl chain lengths and modifications. Sequencing of the 16S rRNA genes of the positive isolates revealed that they belong to three different genera: two isolates belong to the genus Erythrobacter, four to Labrenzia, and one to Bacterioplanes. A time-course experiment showed that the isolate belonging to the genus Erythrobacter could degrade AHLs faster than the other isolates. Furthermore, these isolates were tested for their ability to inhibit biofilm formation and to degrade the 3-oxo-C12 AHLs produced by P. aeruginosa PAO1. Our results showed that isolate VG12 is better at controlling biofilm formation, which aligns with its ability to cause at least a 10-fold reduction in the amount of the different AHLs tested.

Keywords: quorum sensing, biofilm, quorum quenching, anti-biofouling

Procedia PDF Downloads 162
384 Performance Evaluation of Parallel Surface Modeling and Generation on Actual and Virtual Multicore Systems

Authors: Nyeng P. Gyang

Abstract:

Even though past, current and future trends suggest that multicore and cloud computing systems are increasingly prevalent, this class of parallel systems is nonetheless underutilized in general, and barely used for research on employing parallel Delaunay triangulation for parallel surface modeling and generation in particular. The performances of actual (physical) and virtual (cloud) multicore machines at executing various algorithms, which implement various parallelization strategies for the incremental insertion technique of the Delaunay triangulation algorithm, were evaluated. T-tests were run on the collected data in order to determine whether differences in various performance metrics (including execution time, speedup and efficiency) were statistically significant. Results show that the actual machine is approximately twice as fast as the virtual machine at executing the same programs for the various parallelization strategies. The results, which furnish the scalability behaviors of the various parallelization strategies, also show that some of the differences between the performances of these systems, across different runs of the algorithms, were statistically significant. A few pseudo-superlinear speedup results computed from the raw data are not true superlinear speedup values; these pseudo-superlinear values, which arise from one particular way of computing speedups, disappear and give way to asymmetric speedups, which are the accurate kind of speedups that occur in the experiments performed.
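
The evaluation arithmetic described above reduces to computing speedup and efficiency from serial and parallel execution times and testing timing differences for significance; a brief sketch with hypothetical placeholder timings (not the collected data) is shown below.

```python
# Sketch of the evaluation arithmetic: speedup and efficiency from serial/parallel
# execution times, and a t-test on repeated runs from the two machines. All timing
# values are hypothetical placeholders, not the data collected in the study.
import numpy as np
from scipy import stats

n_cores = 8
t_serial = 120.0                                   # seconds, hypothetical
t_parallel = 18.5                                  # seconds, hypothetical
speedup = t_serial / t_parallel
efficiency = speedup / n_cores
print(f"speedup = {speedup:.2f}x, efficiency = {efficiency:.2f}")

# repeated-run execution times (s) on the actual vs. virtual machine (hypothetical)
actual = np.array([18.1, 18.6, 18.3, 18.9, 18.4])
virtual = np.array([36.2, 37.1, 35.8, 36.9, 36.4])
t_stat, p_value = stats.ttest_ind(actual, virtual, equal_var=False)   # Welch's t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}  (significant if p < 0.05)")
```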

Keywords: cloud computing systems, multicore systems, parallel Delaunay triangulation, parallel surface modeling and generation

Procedia PDF Downloads 200