Search results for: multiple detection
4958 Methodology for the Determination of Triterpenic Compounds in Apple Extracts
Authors: Mindaugas Liaudanskas, Darius Kviklys, Kristina Zymonė, Raimondas Raudonis, Jonas Viškelis, Norbertas Uselis, Pranas Viškelis, Valdimaras Janulis
Abstract:
Apples are among the most commonly consumed fruits in the world. Based on data from 2014, approximately 84.63 million tons of apples are grown per annum. Apples are widely used in the food industry to produce various products and drinks (juice, wine, and cider); they are also consumed unprocessed. Apples are an important dietary source of different groups of biologically active compounds that can contribute to the prevention of various diseases. They contain various biologically active substances, especially vitamins, organic acids, micro- and macroelements, pectins, and phenolic, triterpenic, and other compounds. Triterpenic compounds, which are characterized by versatile biological activity, are among the most promising and most significant apple constituents for human health. A specific analytical procedure, including sample preparation and High Performance Liquid Chromatography (HPLC) analysis, was developed, optimized, and validated for the detection of triterpenic compounds in samples of whole apples, apple peel, and apple flesh from the widespread cultivars 'Aldas', 'Auksis', 'Connel Red', 'Ligol', 'Lodel', and 'Rajka' grown under Lithuanian climatic conditions. The conditions for triterpenic compound extraction were optimized: the extraction solvent was 100% (v/v) acetone, and the extraction was performed in an ultrasound bath for 10 min. Isocratic elution (88% solvent A and 12% solvent B) was used for rapid separation of the triterpenic compounds. The methodology was validated on the basis of the ICH recommendations; the validation characteristics evaluated were selectivity (specificity), precision, the detection and quantitation limits of the analytes, and linearity. The obtained parameter values confirm the suitability of the methodology for the analysis of triterpenic compounds.
Using the optimized and validated HPLC technique, four triterpenic compounds were separated and identified, and the specificity of their determination was confirmed. These compounds were corosolic acid, betulinic acid, oleanolic acid, and ursolic acid. Ursolic acid was the dominant compound in all the tested apple samples, while betulinic acid was detected in the lowest amounts of all the identified triterpenic compounds. The greatest amounts of triterpenic compounds were detected in the whole-apple and peel samples of the 'Lodel' cultivar; apples and apple extracts of this cultivar are therefore potentially valuable for use in medical practice, for the prevention of various diseases, for adjunct therapy, for the isolation of individual compounds with a specific biological effect, and for the development and production of dietary supplements and functional food enriched in biologically active compounds. Acknowledgements: This work was supported by a grant from the Research Council of Lithuania, project No. MIP-17-8.
Keywords: apples, HPLC, triterpenic compounds, validation
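The ICH validation characteristics named here (linearity, detection limit, quantitation limit) come down to calibration-line arithmetic. A minimal Python sketch with invented calibration data (the concentrations, peak areas, and units are illustrative only, not values from the study; the 3.3σ/S and 10σ/S formulas follow the ICH Q2 guideline):

```python
import numpy as np

# Hypothetical calibration data: standard concentrations (ug/mL) vs. peak areas.
conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])
area = np.array([12.1, 24.3, 60.2, 121.0, 240.5, 482.0])

# Ordinary least-squares fit of the calibration line.
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
residual_sd = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))

# ICH Q2-style estimates based on the residual standard deviation of the line.
lod = 3.3 * residual_sd / slope   # limit of detection
loq = 10.0 * residual_sd / slope  # limit of quantitation

ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"LOD={lod:.2f} ug/mL, LOQ={loq:.2f} ug/mL, R^2={r_squared:.4f}")
```

A linearity check then reduces to verifying R² over the working range, with LOD and LOQ reported in the same concentration units as the standards.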
Procedia PDF Downloads 173
4957 The Value of Store Choice Criteria on Perceived Patronage Intentions
Authors: Susana Marques
Abstract:
Research on how store environment cues influence consumers' store choice decision criteria, such as store operations, product quality, monetary price, store image, and sales promotion, is sparse. Especially absent is research on the simultaneous impact of multiple store environment cues. The authors propose a comprehensive store choice model that includes three types of store environment cues as exogenous constructs, various store choice criteria as possible mediating constructs, and store patronage intentions as an endogenous construct. Tested with a sample of 561 hypermarket customers using structural equation modelling, the model is partially supported.
Keywords: store choice, store patronage, structural equation modelling, retailing
Procedia PDF Downloads 272
4956 Chest Pain as a Predictor for Heart Issues in Geriatrics
Authors: Leila Kargar, Homa Abri, Golsa Safai
Abstract:
The occurrence of chest pain among geriatric patients could be considered a predictor of heart issues, and this pain deserves attention in this population. This review paper collects recent data on chest pain among geriatric patients. It focused on the keywords chest pain, heart issues, and geriatrics in papers published from 2015 to 2020; Scopus, Web of Science, and PubMed were searched. After importing the related papers into EndNote, an independent researcher checked the abstracts, and papers with unclear methods or written in a language other than English were excluded. Finally, seven papers were included in this review. Their findings showed that chest pain can be a predictor of heart issues and that there is a direct relationship between chest pain and heart issues among geriatric patients. Early detection and accurate decision-making could therefore help prevent heart issues in this population.
Keywords: pain, heart issue, geriatrics, health
Procedia PDF Downloads 218
4955 Insect Cell-Based Models: Australian Sheep Blowfly Lucilia cuprina Embryo Primary Cell Line Establishment and Transfection
Authors: Yunjia Yang, Peng Li, Gordon Xu, Timothy Mahony, Bing Zhang, Neena Mitter, Karishma Mody
Abstract:
Sheep flystrike is one of the most economically important diseases affecting the Australian sheep and wool industry (>356M annually). Currently, control of Lucilia cuprina relies almost exclusively on chemical controls, and the parasite has developed resistance to nearly all control chemicals used in the past. It is, therefore, critical to develop an alternative solution for the sustainable control and management of flystrike. RNA interference (RNAi) technologies have been successfully explored in multiple animal industries for developing parasite controls, and this research project aims to develop an RNAi-based biological control for sheep blowfly. Double-stranded RNA (dsRNA) has already proven successful against viruses, fungi, and insects. However, the environmental instability of dsRNA is a major bottleneck for successful RNAi. Bentonite polymer (BenPol) technology can overcome this problem, as it can be tuned for the controlled release of dsRNA in the challenging pH environment of the blowfly larval gut, prolonging the dsRNA's exposure time to, and uptake by, target cells. To investigate the potential of BenPol technology for dsRNA delivery, four different BenPol carriers were tested for their dsRNA loading capabilities, and three of them were found to keep dsRNA stable at multiple temperatures (4°C, 22°C, 40°C, 55°C) in sheep serum. Based on these stability results, dsRNA targeting candidate genes was loaded onto BenPol carriers and tested in larval feeding assays, with three genes showing knockdown. Meanwhile, a primary blowfly embryo cell line (BFEC) derived from L. cuprina embryos was successfully established, with the aim of providing an effective insect cell model for preliminary assessment and screening of RNAi efficacy. The results of this study establish that dsRNA is stable when loaded on BenPol particles, unlike naked dsRNA, which is rapidly degraded in sheep serum.
The stable nanoparticle delivery system offered by BenPol technology can protect and increase the inherent stability of dsRNA molecules at higher temperatures in a complex biological fluid like serum, providing promise for its future use in enhancing animal protection.
Keywords: lucilia cuprina, primary cell line establishment, RNA interference, insect cell transfection
Procedia PDF Downloads 73
4954 Rodriguez Diego, Del Valle Martin, Hargreaves Matias, Riveros Jose Luis
Authors: Nathainail Bashir, Neil Anderson
Abstract:
The objective of this study was to investigate the current state of practice with regard to karst detection methods and to recommend the best method and pattern of arrays to acquire the desired results. Proper site investigation in karst-prone regions is extremely valuable in determining the location of possible voids. Two geophysical techniques were employed: multichannel analysis of surface waves (MASW) and electrical resistivity tomography (ERT). The MASW data were acquired at each test location using different array lengths and different array orientations (to increase the probability of obtaining interpretable data in karst terrain). The ERT data were acquired using a dipole-dipole array consisting of 168 electrodes. The MASW data were interpreted (i.e., the estimated depth to the physical top of rock) and used to constrain and verify the interpretation of the ERT data. Poorer-quality MASW data were acquired in areas where the ERT data indicated significant local variation in the depth to the top of rock.
Keywords: dipole-dipole, ERT, karst terrains, MASW
Procedia PDF Downloads 315
4953 Sustainable Tourism from a Multicriteria Analysis Perspective
Authors: Olga Blasco-Blasco, Vicente Liern
Abstract:
The development of tourism since the mid-20th century has raised problems of overcrowding, indiscriminate construction in seaside areas, and gentrification. Increasingly, the World Tourism Organisation and public institutions are promoting policies that encourage sustainability. From the perspective of sustainability, three types of tourism can be distinguished: traditional tourism, sustainable tourism, and sustainable-impact tourism. Measuring sustainability is complex because its multiple dimensions differ in relative importance and in nature. To address this problem and to identify the benefits of policies that promote sustainable tourism, a decision-making analysis is carried out through the application of a multicriteria analysis method. The proposal is applied to hotel reservations and to the evaluation and management of tourism sustainability in the Spanish Autonomous Communities.
Keywords: sustainable tourism, multicriteria analysis, flexible optimization, composite indicators
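Aggregating sustainability criteria of differing relative importance into a single ranking can be sketched with a standard multicriteria method such as TOPSIS (the abstract does not specify the paper's own method or data; the decision matrix and weights below are invented for illustration):

```python
import numpy as np

# Hypothetical decision matrix: 4 destinations scored on 3 sustainability
# criteria (environmental, economic, social); all treated as benefit criteria.
X = np.array([
    [7.0, 5.0, 8.0],
    [6.0, 8.0, 6.0],
    [9.0, 4.0, 5.0],
    [5.0, 7.0, 7.0],
])
w = np.array([0.5, 0.3, 0.2])  # assumed relative importance of the criteria

# TOPSIS: vector-normalize each column, weight it, then rank alternatives
# by relative closeness to the ideal point.
V = w * X / np.linalg.norm(X, axis=0)
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus = np.linalg.norm(V - ideal, axis=1)   # distance to ideal solution
d_minus = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal solution
closeness = d_minus / (d_plus + d_minus)
ranking = np.argsort(-closeness)             # best alternative first
```

The closeness scores play the role of a composite indicator: one number per alternative that folds together all weighted criteria.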
Procedia PDF Downloads 311
4952 Retina Registration for Biometrics Based on Characterization of Retinal Feature Points
Authors: Nougrara Zineb
Abstract:
The unique structure of the blood vessels in the retina has been used for biometric identification. The retinal blood vessel pattern is unique to each individual, and it is almost impossible to forge that pattern in a false individual. The advantages of retina biometrics include the high distinctiveness, universality, and stability over time of the blood vessel pattern. Once the creases have been extracted from the images, a registration stage is necessary, since the position of the retinal vessel structure can change between acquisitions due to movements of the eye. Image registration consists of the following steps: feature detection, feature matching, transform model estimation, and image resampling and transformation. In this paper, we present a registration algorithm based on the characterization of retinal feature points. For the experiments, retinal images from the DRIVE database were tested. The proposed methodology achieves good results for registration in general.
Keywords: fovea, optic disc, registration, retinal images
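Of the four registration steps listed, the transform model estimation step can be sketched directly: given matched feature points (e.g., vessel bifurcations) in the two images, a 2D affine model is fit by least squares. The points and the transform below are synthetic, not data from the paper:

```python
import numpy as np

# Hypothetical matched feature points in two retinal images; the second set
# is the first under a known affine map (slight rotation/scale + shift).
src = np.array([[10.0, 20.0], [40.0, 15.0], [25.0, 50.0], [60.0, 45.0]])
A_true = np.array([[0.98, -0.05], [0.05, 0.98]])
t_true = np.array([3.0, -2.0])
dst = src @ A_true.T + t_true

# Transform model estimation: solve dst ≈ src @ A.T + t by least squares
# using the homogeneous design matrix [x, y, 1].
n = len(src)
M = np.hstack([src, np.ones((n, 1))])
params, *_ = np.linalg.lstsq(M, dst, rcond=None)
A_est, t_est = params[:2].T, params[2]

# The resampling/transformation step would then warp one image with (A_est, t_est).
residual = np.abs(src @ A_est.T + t_est - dst).max()
```

With noise-free correspondences the fit is exact; with real matched points the residual gives a direct quality measure for the estimated model.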
Procedia PDF Downloads 266
4951 Clinical Nursing Experience in Managing a Uterine Cancer Patient with Psychogenic Shock During the Extracorporeal Membrane Oxygenation Weaning Process
Authors: Syue-Wen Lin
Abstract:
Objective: This article discusses the nursing experience of caring for a uterine cancer patient who experienced cardiogenic shock and was weaned off ECMO. The patient was placed on ECMO due to cardiogenic shock and initially struggled with anxiety caused by the physical discomfort from the disease and multiple medical devices, as well as the isolation in the ICU and restrictions on physical activity. Over time, the patient was able to wean off ECMO and perform daily activities and rehabilitation independently. Methods: The nursing period was from January 6 to January 9. Through observation, direct care, interviews, physical assessments, and case reviews, the intensive care team and bypass personnel conducted a comprehensive assessment using Gordon's 11 functional health patterns. The assessment identified three main nursing health problems: pain, anxiety, and decreased cardiac tissue perfusion. Results: The author consulted a psychologist to employ open communication techniques and empathetic care to build a trusting nurse-patient relationship. A patient-centered intensive cancer care plan was developed. Pain was assessed using a pain scale, and pain medications were adjusted in consultation with a pharmacist. Lavender essential oil therapy, light music, and pillows were used to distract and alleviate pain. The patient was encouraged to express feelings and family members were invited to increase visits and provide companionship to reduce the uncertainty caused by cancer and illness. Vital signs were closely monitored, and nursing interventions were provided to maintain adequate myocardial perfusion. Post-ECMO, the patient was encouraged to engage in rehabilitation and cardiopulmonary training. Conclusion: A key takeaway from the care process is the importance of observing not only the patient's vital signs but also their psychological state, especially when dealing with cancer patients on ECMO. 
The patient's greatest source of comfort was the presence of family, which helped alleviate anxiety. Healthcare providers play multiple critical roles as advocates, coordinators, educators, and counselors, listening to and accepting the patient's emotional responses. The report aims to provide clinical cancer nurses with a reference to improve the quality of care and alleviate cancer-related discomfort.
Keywords: ECMO, uterine cancer, palliative care, Gordon's 11 functional health patterns
Procedia PDF Downloads 30
4950 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics
Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee
Abstract:
Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably so-called "hallucinations": outputs that are not grounded in the input data, which hinder adoption into production. A common practice to mitigate the hallucination problem is a Retrieval Augmented Generation (RAG) system that grounds LLMs' responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts using vector similarity between the user's query and the documents, and then generates a response based not only on the model's pre-trained knowledge but also on the specific information from the retrieved documents. However, the RAG approach is not suitable for tabular data and subsequent data analysis tasks, for multiple reasons such as information loss, data format, and the retrieval mechanism. In this study, we explored a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously decompose a complex analytical task into simpler sub-tasks and requirements and then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When deployed as a beta version on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results: it was able to serve market insight and data visualization needs with high accuracy and extensive coverage, abstracting the complexities away from real-estate agents and developers without a programming background. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding market understanding and enhancement without the need for programming skills.
The implication extends beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru
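The decompose-generate-execute loop described in the abstract can be shown in miniature. In this sketch the planner and code generator are hard-coded stand-ins for LLM calls (the sub-tasks, snippets, and data are all hypothetical); only the control flow — plan, generate code per sub-task, execute in a shared environment, assemble the answer — is the point:

```python
# Minimal sketch of the planning-and-execution pattern with a stubbed "LLM".

def plan(task: str) -> list[str]:
    # Planner agent: decompose the analytical task into ordered sub-tasks.
    # A real system would prompt an LLM with `task`; here the plan is fixed.
    return ["load data", "compute median price per district", "format answer"]

def generate_code(subtask: str) -> str:
    # Code-generation agent: emit an executable snippet for one sub-task.
    snippets = {
        "load data": "rows = [('A', 300), ('A', 310), ('A', 320), ('B', 500)]",
        "compute median price per district": (
            "from statistics import median\n"
            "result = {d: median(p for dd, p in rows if dd == d) "
            "for d in {d for d, _ in rows}}"
        ),
        "format answer": "answer = ', '.join(f'{d}: {v}' for d, v in sorted(result.items()))",
    }
    return snippets[subtask]

def run(task: str) -> str:
    env: dict = {}
    for subtask in plan(task):             # execute the plan step by step
        exec(generate_code(subtask), env)  # run generated code in a shared env
    return env["answer"]

summary = run("median condo price per district")
```

The shared execution environment is what lets later sub-tasks build on earlier ones, mirroring how the generated code segments compose into one analysis.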
Procedia PDF Downloads 87
4949 Comparison of the Amount of Microplastics in Plant- and Animal-Based Milks
Authors: Melisa Aşci, Berk Kiliç, Emine Ulusoy
Abstract:
Ingestion of microplastics by humans has been increasing rapidly, as such hazardous materials are abundant in multiple food products, particularly milks. With increasing consumption rates, humans ingest microplastics on a daily basis, which can be toxic and can even cause disruption of intracellular pathways, liver cell damage, and eventually tissue and organ damage. In this experiment, different milk types (animal-based and plant-based) were tested for microplastics. Results showed that animal-based milks contained a higher concentration of microplastics than plant-based milks. Research has shown that in addition to causing health issues in humans, microplastics can also affect livestock animals and plants.
Keywords: microplastics, plant-based milks, animal-based milks, preventive nutrition
Procedia PDF Downloads 28
4948 PET/CT Patient Dosage Assay
Authors: Gulten Yilmaz, A. Beril Tugrul, Mustafa Demir, Dogan Yasar, Bayram Demir, Bulent Buyuk
Abstract:
Positron Emission Tomography (PET) is a radioisotope imaging technique that images the organs and metabolism of the human body. The technique is based on the simultaneous detection of the 511 keV annihilation photons produced when positrons, emitted by positron-emitting radioisotopes incorporated into biologically active molecules in the body, annihilate with electrons. This study was conducted on ten patients as part of an effort to carry out patient-related experimental studies. Dose monitoring for the bladder, the organ that received the highest dose during PET applications, was conducted for 24 hours. An assessment based on measuring urination activity after injection was also part of this study. The MIRD method was used for the dose calculations on the results obtained from the experimental studies, and the experimental and theoretical results were assessed comparatively.
Keywords: PET/CT, TLD, MIRD, dose measurement, patient doses
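The MIRD formalism referenced here computes the absorbed dose as D = Ã × S, where the cumulated activity Ã is obtained from the effective half-life of the activity in the source organ. A sketch with assumed numbers (the administered activity, biological half-life, and S value below are illustrative, not the study's measurements; only the F-18 physical half-life is a known constant):

```python
import math

# Illustrative MIRD-style estimate for the bladder: absorbed dose D = A_tilde * S.
A0 = 370.0      # activity reaching the bladder, MBq (assumed)
t_phys = 109.8  # physical half-life of F-18, minutes
t_bio = 120.0   # assumed biological half-life (urinary excretion), minutes

# Effective half-life combines physical decay and biological clearance.
t_eff = (t_phys * t_bio) / (t_phys + t_bio)

# Cumulated activity for mono-exponential clearance integrated to infinity:
# A_tilde = A0 * t_eff / ln(2), in MBq*min.
a_tilde = A0 * t_eff / math.log(2)

S = 1.2e-5      # hypothetical S value, mGy per MBq*min
dose_mGy = a_tilde * S
```

In the study itself, Ã would come from the measured 24-hour bladder monitoring and urination activity rather than an assumed biological half-life.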
Procedia PDF Downloads 521
4947 An Optimization Tool-Based Design Strategy Applied to Divide-by-2 Circuits with Unbalanced Loads
Authors: Agord M. Pinto Jr., Yuzo Iano, Leandro T. Manera, Raphael R. N. Souza
Abstract:
This paper describes an optimization tool-based design strategy for a Current Mode Logic (CML) divide-by-2 circuit. Representing a building block for output frequency generation in an RFID protocol-based frequency synthesizer, the circuit was designed to minimize the power consumption for driving multiple unbalanced loads (at the transceiver level). Implemented in XFAB XC08 180 nm technology, the circuit was optimized with the MunEDA WiCkeD tool in the Cadence Virtuoso Analog Design Environment (ADE).
Keywords: divide-by-2 circuit, CMOS technology, PLL phase locked-loop, optimization tool, CML current mode logic, RF transceiver
Procedia PDF Downloads 464
4946 Revolutionizing Accounting: Unleashing the Power of Artificial Intelligence
Authors: Sogand Barghi
Abstract:
The integration of artificial intelligence (AI) in accounting practices is reshaping the landscape of financial management. This paper explores the innovative applications of AI in the realm of accounting, emphasizing its transformative impact on efficiency, accuracy, decision-making, and financial insights. By harnessing AI's capabilities in data analysis, pattern recognition, and automation, accounting professionals can redefine their roles, elevate strategic decision-making, and unlock unparalleled value for businesses. This paper delves into AI-driven solutions such as automated data entry, fraud detection, predictive analytics, and intelligent financial reporting, highlighting their potential to revolutionize the accounting profession. Artificial intelligence has swiftly emerged as a game-changer across industries, and accounting is no exception. This paper seeks to illuminate the profound ways in which AI is reshaping accounting practices, transcending conventional boundaries, and propelling the profession toward a new era of efficiency and insight-driven decision-making. One of the most impactful applications of AI in accounting is automation. Tasks that were once labor-intensive and time-consuming, such as data entry and reconciliation, can now be streamlined through AI-driven algorithms. This not only reduces the risk of errors but also allows accountants to allocate their valuable time to more strategic and analytical tasks. AI's ability to analyze vast amounts of data in real time enables it to detect irregularities and anomalies that might go unnoticed by traditional methods. Fraud detection algorithms can continuously monitor financial transactions, flagging any suspicious patterns and thereby bolstering financial security. AI-driven predictive analytics can forecast future financial trends based on historical data and market variables. 
This empowers organizations to make informed decisions, optimize resource allocation, and develop proactive strategies that enhance profitability and sustainability. Traditional financial reporting often involves extensive manual effort and data manipulation. With AI, reporting becomes more intelligent and intuitive. Automated report generation not only saves time but also ensures accuracy and consistency in financial statements. While the potential benefits of AI in accounting are undeniable, there are challenges to address. Data privacy and security concerns, the need for continuous learning to keep up with evolving AI technologies, and potential biases within algorithms demand careful attention. The convergence of AI and accounting marks a pivotal juncture in the evolution of financial management. By harnessing the capabilities of AI, accounting professionals can transcend routine tasks, becoming strategic advisors and data-driven decision-makers. The applications discussed in this paper underline the transformative power of AI, setting the stage for an accounting landscape that is smarter, more efficient, and more insightful than ever before. The future of accounting is here, and it's driven by artificial intelligence.
Keywords: artificial intelligence, accounting, automation, predictive analytics, financial reporting
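The continuous transaction monitoring described above can be reduced to its simplest possible form, a z-score screen on transaction amounts (production fraud-detection systems use far richer features and models; every number here is invented):

```python
import numpy as np

# Toy anomaly screen: flag transactions whose amount deviates strongly
# from the account's recent history (a 3-sigma z-score rule).
history = np.array([120.0, 95.0, 130.0, 110.0, 105.0, 125.0, 98.0, 115.0])
new_transactions = np.array([118.0, 2400.0, 101.0])

mu, sigma = history.mean(), history.std(ddof=1)
z = np.abs(new_transactions - mu) / sigma
flagged = new_transactions[z > 3.0]  # conventional 3-sigma threshold
```

Running this flags only the 2400.0 transaction; amounts consistent with the account's history pass through unflagged.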
Procedia PDF Downloads 71
4945 Deformulation and Comparative Analysis of Apparently Similar Polymers Using Multiple Modes of Pyrolysis-GC/MS
Authors: Athena Nguyen, Rojin Belganeh
Abstract:
Detecting and identifying differences between similar polymer materials are key steps in deformulation, comparative analysis, and reverse engineering. Pyrolysis-GC/MS is an easy solid-sample introduction technique that expands the application areas of gas chromatography and mass spectrometry. The micro-furnace pyrolyzer is directly interfaced with the GC injector, preventing any potential cold spots, carryover, and cross-contamination. This presentation demonstrates the study of two similar polymers by performing different modes of operation in the same system: evolved gas analysis (EGA), flash pyrolysis, thermal desorption analysis, and heart-cutting analysis. Unknown polymer materials and their chemical compositions are identified.
Keywords: gas chromatography/mass spectrometry, pyrolysis, pyrolyzer, thermal desorption-GC/MS
Procedia PDF Downloads 264
4944 Literature Review on Text Comparison Techniques: Analysis of Text Extraction, Main Comparison and Visual Representation Tools
Authors: Andriana Mkrtchyan, Vahe Khlghatyan
Abstract:
The choice of a profession is one of the most important decisions people make in their lives. With the development of modern science, technology, and all the spheres of the modern world, more and more professions are arising, complicating the choice even further. Hence, there is a need for a guiding platform to help people choose a profession and the right career path based on their interests, skills, and personality. This review analyzes existing methods of comparing PDF documents and suggests a three-stage approach to the comparison: 1. text extraction from PDF documents; 2. comparison of the extracted text via NLP algorithms; 3. representation of the comparison using a shape and color psychology methodology.
Keywords: color psychology, data acquisition/extraction, data augmentation, disambiguation, natural language processing, outlier detection, semantic similarity, text-mining, user evaluation, visual search
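Stage 2 of the proposed pipeline, comparing extracted text via NLP algorithms, can be illustrated with the simplest such measure, a bag-of-words cosine similarity (a minimal stand-in for the heavier semantic-similarity models such a review would survey):

```python
from collections import Counter
import math

# Compare two extracted texts by the cosine of their word-count vectors.
def cosine_similarity(text_a: str, text_b: str) -> float:
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

score = cosine_similarity(
    "career choice depends on interests and skills",
    "choice of a career depends on skills",
)
```

The score lands between 0 (no shared vocabulary) and 1 (identical word distributions); semantic models replace the raw word counts with embeddings but keep the same comparison shape.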
Procedia PDF Downloads 76
4943 Joubert Syndrome: A Rare Genetic Disorder Reported in Kurdish Family
Authors: Aran Abd Al Rahman
Abstract:
Joubert syndrome is a congenital cerebellar ataxia, most often inherited in an autosomal recessive manner. The disease is diagnosed by brain imaging, which shows the so-called molar tooth sign. Neurological signs are present from the neonatal period and include hypotonia progressing to ataxia, global developmental delay, ocular motor apraxia, and breathing dysregulation. These signs are variably associated with multiorgan involvement, mainly of the retina, kidneys, skeleton, and liver. Thirty causative genes have been identified so far, all of which encode proteins of the primary cilium or its apparatus. The purpose of our project was to detect the mutant gene (the INPP5E gene) that causes Joubert syndrome. Several diagnostic methods were used, including MRI and CT scans, together with molecular diagnosis by ARMS-PCR for detection of the mutant gene, which we used in this research project. In the reported family, comprising two children and their parents, the two children were affected and the parents were carriers.
Keywords: Joubert syndrome, genetic disease, Kurdistan region, Sulaimani
Procedia PDF Downloads 141
4942 Fault Diagnosis of Squirrel-Cage Induction Motor by a Neural Network Multi-Models
Authors: Yahia Kourd, N. Guersi, D. Lefebvre
Abstract:
In this paper, we propose to study fault diagnosis in the squirrel-cage induction motor using MLP neural networks. We use neural models of healthy and faulty behavior in order to detect and isolate faults in the machine. In the first part of this work, we created a neural model of the healthy state using Matlab and a motor located in the LGEB laboratory, by acquiring input and output data from this machine. We then detected faults in the machine by residual generation. These residuals are not sufficient to isolate the existing faults, so we proposed additional neural networks to represent the faulty behaviors. From the analysis of these residuals and the choice of a threshold, we propose a method capable of performing the detection and diagnosis of some faults in asynchronous machines with a squirrel-cage rotor.
Keywords: faults diagnosis, neural networks, multi-models, squirrel-cage induction motor
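The residual-generation step described here has a simple shape: subtract the healthy model's prediction from the measurement and compare the residual against a threshold. A synthetic sketch (the signals and threshold are invented; the paper's healthy model is an MLP network, not the sine stand-in used here):

```python
import numpy as np

# Residual generation: residual = |measured - healthy-model prediction|;
# a residual exceeding the threshold anywhere signals a fault.
def detect_fault(y_measured, y_healthy_model, threshold=0.5):
    residual = np.abs(y_measured - y_healthy_model)
    return residual, bool(np.any(residual > threshold))

t = np.linspace(0, 1, 100)
y_healthy = np.sin(2 * np.pi * t)            # healthy-model prediction (stand-in)
y_ok = y_healthy + 0.05 * np.cos(7 * t)      # measurement with noise only
y_broken = y_healthy + 0.8 * (t > 0.5)       # fault appears mid-signal

_, fault_ok = detect_fault(y_ok, y_healthy)
_, fault_broken = detect_fault(y_broken, y_healthy)
```

Isolation then requires more than this single residual, which is why the paper adds per-fault models: each faulty-behavior model produces its own residual pattern, and comparing the patterns points to the specific fault.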
Procedia PDF Downloads 637
4941 Big Data Analysis with RHadoop
Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim
Abstract:
It is almost impossible to store or analyze big data, which is increasing exponentially, with traditional technologies; Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop. With RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis with different sizes of real data. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of our RHadoop with the lm function and the biglm package available for big memory. The results showed that our RHadoop was faster than the other packages, owing to parallel processing that increases the number of map tasks as the size of the data increases.
Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop
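The map-reduce structure behind a parallel linear regression can be sketched without a cluster: each map task computes the sufficient statistics X'X and X'y on its chunk of rows, and the reduce step sums them and solves the normal equations once. Plain in-process chunking stands in for Hadoop here, and the data are simulated:

```python
import numpy as np

# Simulated regression data: intercept plus two predictors, known coefficients.
rng = np.random.default_rng(0)
X = np.hstack([np.ones((1000, 1)), rng.normal(size=(1000, 2))])
beta_true = np.array([1.0, 2.0, -3.0])
y = X @ beta_true + 0.01 * rng.normal(size=1000)

def map_chunk(Xc, yc):
    # "Map" step: sufficient statistics for one chunk of rows.
    return Xc.T @ Xc, Xc.T @ yc

chunks = [(X[i:i + 250], y[i:i + 250]) for i in range(0, 1000, 250)]
stats = [map_chunk(Xc, yc) for Xc, yc in chunks]

# "Reduce" step: sum the per-chunk statistics, then solve the normal equations.
xtx = sum(s[0] for s in stats)
xty = sum(s[1] for s in stats)
beta_hat = np.linalg.solve(xtx, xty)
```

Because X'X and X'y are additive over rows, the chunks (and hence the map tasks on a Hadoop cluster) can be processed independently and in any order, which is what makes the regression scale with the number of data nodes.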
Procedia PDF Downloads 437
4940 Detection and Identification of Antibiotic Resistant UPEC Using FTIR-Microscopy and Advanced Multivariate Analysis
Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel
Abstract:
Antimicrobial drugs have played an indispensable role in controlling illness and death associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global healthcare problem. Many antibiotics have lost their effectiveness since the beginning of the antibiotic era because many bacteria have adapted defenses against them. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for the optimal antimicrobial therapy of infected patients and in many cases can save lives. The conventional methods for susceptibility testing require the isolation of the pathogen from a clinical specimen by culturing on the appropriate media (this first culturing stage lasts 24 h). Chosen colonies are then grown on media containing antibiotic(s), using micro-diffusion discs (a second 24 h culturing step), in order to determine the bacterial susceptibility. Other approaches, including genotyping methods, the E-test, and automated methods, have also been developed for testing antimicrobial susceptibility, but most of them are expensive and time-consuming. Fourier transform infrared (FTIR) microscopy is a rapid, safe, effective, and low-cost method that has been widely and successfully used in different studies for the identification of various biological samples, including bacteria; nonetheless, its true potential in routine clinical diagnosis has not yet been established. Modern infrared (IR) spectrometers with high spectral resolution enable the measurement of unprecedented biochemical information from cells at the molecular level. Moreover, new bioinformatics analyses combined with IR spectroscopy make a powerful technique that enables the detection of structural changes associated with resistance.
The main goal of this study is to evaluate the potential of FTIR microscopy in tandem with machine learning algorithms for rapid and reliable identification of bacterial susceptibility to antibiotics within a few minutes. The UTI E. coli samples, which were identified at the species level by MALDI-TOF and examined for their susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories at Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 700 E. coli samples, are promising and show that by using infrared spectroscopy together with multivariate analysis, it is possible to classify the tested bacteria as sensitive or resistant with a success rate higher than 90% for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing FTIR microscopy as a rapid and reliable method for identifying antibiotic susceptibility.
Keywords: antibiotics, E.coli, FTIR, multivariate analysis, susceptibility, UTI
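The classification task, assigning a measured spectrum to a sensitive or resistant class, can be caricatured with synthetic spectra and a nearest-centroid rule (a minimal stand-in for the advanced multivariate analysis used in the study; every spectrum below is simulated, and the absorption bands are invented):

```python
import numpy as np

# Synthetic "spectra": both classes share a broad band; the resistant class
# carries an extra small absorption feature, plus measurement noise.
rng = np.random.default_rng(1)
wavenumbers = np.linspace(900, 1800, 200)
base = np.exp(-((wavenumbers - 1650) / 80.0) ** 2)
feature = 0.3 * np.exp(-((wavenumbers - 1080) / 30.0) ** 2)

def spectrum(resistant: bool) -> np.ndarray:
    noise = 0.02 * rng.normal(size=wavenumbers.size)
    return base + (feature if resistant else 0.0) + noise

# "Training": average spectra of labeled samples give one centroid per class.
train_s = np.array([spectrum(False) for _ in range(20)])
train_r = np.array([spectrum(True) for _ in range(20)])
centroids = {"sensitive": train_s.mean(axis=0), "resistant": train_r.mean(axis=0)}

# Classification: assign a new spectrum to the nearest class centroid.
def classify(s: np.ndarray) -> str:
    return min(centroids, key=lambda k: np.linalg.norm(s - centroids[k]))

pred = classify(spectrum(True))
```

Real FTIR data demand the heavier multivariate machinery the study uses, since the class-separating spectral changes are far subtler than the toy feature injected here.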
Procedia PDF Downloads 173
4939 The Roman Fora in North Africa: Towards a Supportive Protocol to the Decision for the Morphological Restitution
Authors: Dhouha Laribi Galalou, Najla Allani Bouhoula, Atef Hammouda
Abstract:
This research delves into the fundamental question of the morphological restitution of built archaeology, in order to place it in its paradigmatic context and to seek answers to it. Indeed, understanding the object of study, analyzing it, and devising a methodology for solving the morphological problem posed are manageable only by means of a thoughtful strategy that draws on well-defined epistemological scaffolding. In this vein, the crisis of natural reasoning in archaeology has generated multiple changes in the field, ranging from the use of new tools to the integration of archaeological information systems, where urbanization involves the interplay of several disciplines. The built archaeological topic is also an architectural and morphological object: a set of articulated elementary data whose understanding can be approached from a logicist point of view. Morphological restitution is no exception to the rule, and the exchange between the different disciplines uses the capacity of each to frame reflection on the incomplete elements of a given architecture, or on its different phases and multiple states of existence. The logicist sequence is furnished by the set of scattered or destroyed elements found, but also by what can be called a rule base, which contains the rules for the architectural construction of the object. The knowledge base, built from the archaeological literature, also provides a reference that enters into the search for forms and articulations. The choice of the Roman forum in North Africa is justified by the great urban and architectural characteristics of this entity. Research on the forum involves a fairly large knowledge base and also provides the researcher with material to study, from a morphological and architectural point of view, starting from the scale of the city down to the architectural detail.
The experimentation of the knowledge deduced at the paradigmatic level, as well as the deduction of an analysis model, is then carried out on the basis of a well-defined context, which grounds the experimentation in the elaboration of the morphological information container attached to the rule base and the knowledge base. The use of logicist analysis and artificial intelligence allowed us first to question aspects already known, in order to measure the credibility of our system, which remains above all a decision support tool for the morphological restitution of Roman fora in North Africa. This paper presents a first experimentation of the model elaborated during this research, a model framed by a paradigmatic discussion that seeks to position the research in relation to existing paradigmatic and experimental knowledge on the issue.
Keywords: classical reasoning, logicist reasoning, archaeology, architecture, Roman forum, morphology, calculation
Procedia PDF Downloads 147
4938 Investigating Best Practice Energy Efficiency Policies and Programs, and Their Replication Potential for Residential Sector of Saudi Arabia
Authors: Habib Alshuwaikhat, Nahid Hossain
Abstract:
The residential sector consumes more than half of the electricity produced in Saudi Arabia, and fossil fuel is the main source of energy for meeting the growing household electricity demand in the Kingdom. Several studies have forecast and expressed concern that unless domestic energy demand growth is controlled, it will reduce Saudi Arabia's crude oil export capacity within a decade, and the Kingdom is likely to be incapable of exporting crude oil within the next three decades. Though the Saudi government has begun to address domestic energy demand growth, its demand-side energy management policies and programs are focused on the industrial and commercial sectors. There is thus an urgent need to develop a comprehensive energy efficiency strategy addressing efficient energy use in the Kingdom's residential sector. As Saudi Arabia is at an early stage in addressing energy efficiency in its residential sector, there is scope for the Kingdom to learn from global energy efficiency practices and design its own energy efficiency policies and programs. To do so sustainably, however, it is essential to address the local contexts of energy efficiency and to find the policies and programs that fit those contexts. The objective of this study was therefore to identify globally best practice energy efficiency policies and programs in the residential sector that have replication potential in Saudi Arabia. To this end, two sets of multi-criteria decision analysis matrices were developed to evaluate the energy efficiency policies and programs. The first matrix was used to evaluate global energy efficiency policies and programs, and the second to evaluate the replication potential of globally best practice policies and programs for Saudi Arabia.
The Wuppertal Institute's guidelines for energy efficiency policy evaluation were used to develop the matrices, and the attributes of the matrices were set through a review of the available literature. The study reveals that the best practice energy efficiency policies and programs with good replication potential for Saudi Arabia are those which have multiple components addressing energy efficiency and are diversified in their characteristics. The study also indicates that the more diversified the components included in a policy or program, the greater its replication potential for the Kingdom. This finding is consistent with other studies, which observe that successful energy efficiency practice requires introducing multiple policy components as a cluster rather than concentrating on a single policy measure. The multi-criteria decision analysis matrices developed here could be used to assess the replication potential of other globally best practice energy efficiency policies and programs for the residential sector of the Kingdom. In addition, they can guide Saudi policy makers in adopting and formulating energy efficiency policies and programs for Saudi Arabia.
Keywords: Saudi Arabia, residential sector, energy efficiency, policy evaluation
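A multi-criteria decision analysis matrix of the kind described above can be sketched as a weighted-sum scoring exercise. The criteria, weights, policy names, and 1-5 scores below are all invented for illustration; the study's actual attributes come from the Wuppertal Institute guidelines and the literature review.

```python
# Illustrative sketch: weighted-sum multi-criteria scoring of policies.
criteria_weights = {"local_fit": 0.4, "cost_effectiveness": 0.3,
                    "diversity_of_components": 0.3}

policies = {
    "appliance_labelling": {"local_fit": 4, "cost_effectiveness": 5, "diversity_of_components": 2},
    "building_codes":      {"local_fit": 5, "cost_effectiveness": 3, "diversity_of_components": 4},
    "rebate_program":      {"local_fit": 3, "cost_effectiveness": 4, "diversity_of_components": 3},
}

def score(policy_scores, weights):
    """Weighted sum of criterion scores (1-5 scale)."""
    return sum(weights[c] * s for c, s in policy_scores.items())

# Rank policies by replication potential, highest score first.
ranking = sorted(policies, key=lambda p: score(policies[p], criteria_weights), reverse=True)
print(ranking)
```

Note how the weight on "diversity_of_components" directly rewards the multi-component policies the study found most replicable.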
Procedia PDF Downloads 496
4937 Systematic Identification of Noncoding Cancer Driver Somatic Mutations
Authors: Zohar Manber, Ran Elkon
Abstract:
Accumulation of somatic mutations (SMs) in the genome is a major driving force of cancer development. Most SMs in the tumor's genome are functionally neutral; however, some cause damage to critical processes and provide the tumor with a selective growth advantage (termed cancer driver mutations). Current research on functional significance of SMs is mainly focused on finding alterations in protein coding sequences. However, the exome comprises only 3% of the human genome, and thus, SMs in the noncoding genome significantly outnumber those that map to protein-coding regions. Although our understanding of noncoding driver SMs is very rudimentary, it is likely that disruption of regulatory elements in the genome is an important, yet largely underexplored mechanism by which somatic mutations contribute to cancer development. The expression of most human genes is controlled by multiple enhancers, and therefore, it is conceivable that regulatory SMs are distributed across different enhancers of the same target gene. Yet, to date, most statistical searches for regulatory SMs have considered each regulatory element individually, which may reduce statistical power. The first challenge in considering the cumulative activity of all the enhancers of a gene as a single unit is to map enhancers to their target promoters. Such mapping defines for each gene its set of regulating enhancers (termed "set of regulatory elements" (SRE)). Considering multiple enhancers of each gene as one unit holds great promise for enhancing the identification of driver regulatory SMs. However, the success of this approach is greatly dependent on the availability of comprehensive and accurate enhancer-promoter (E-P) maps. To date, the discovery of driver regulatory SMs has been hindered by insufficient sample sizes and statistical analyses that often considered each regulatory element separately. 
In this study, we analyzed more than 2,500 whole-genome sequence (WGS) samples provided by The Cancer Genome Atlas (TCGA) and The International Cancer Genome Consortium (ICGC) in order to identify such driver regulatory SMs. Our analyses took into account the combinatorial aspect of gene regulation by considering all the enhancers that control the same target gene as one unit, based on E-P maps from three genomics resources. The identification of candidate driver noncoding SMs is based on their recurrence. We searched for SREs of genes that are "hotspots" for SMs (that is, they accumulate SMs at a significantly elevated rate). To test the statistical significance of recurrence of SMs within a gene's SRE, we used both global and local background mutation rates. Using this approach, we detected - in seven different cancer types - numerous "hotspots" for SMs. To support the functional significance of these recurrent noncoding SMs, we further examined their association with the expression level of their target gene (using gene expression data provided by the ICGC and TCGA for samples that were also analyzed by WGS).
Keywords: cancer genomics, enhancers, noncoding genome, regulatory elements
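The recurrence test above can be sketched as a one-sided tail probability: given a background mutation rate, how surprising is the number of SMs observed across a gene's whole SRE? The rate, SRE length, and counts below are invented; a Poisson tail is used as a standard approximation to the binomial for rare per-base events.

```python
# Sketch: is this SRE a mutation "hotspot" relative to the background rate?
from math import exp, factorial

def poisson_tail(k, lam):
    """P(X >= k) for X ~ Poisson(lam): chance of k or more SMs by chance."""
    return 1.0 - sum(exp(-lam) * lam**i / factorial(i) for i in range(k))

background_rate = 2e-6   # SMs per base per sample (an assumed local rate)
sre_length = 5000        # total bases across the gene's enhancers (toy value)
n_samples = 2500         # WGS samples, as in the cohort size above
observed_sms = 60        # SMs observed within this SRE across the cohort

expected = background_rate * sre_length * n_samples   # = 25 expected SMs
p_value = poisson_tail(observed_sms, expected)
print(f"expected {expected:.0f}, observed {observed_sms}, p = {p_value:.2g}")
```

Pooling all enhancers of the gene into one test is what raises the power relative to testing each element separately.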
Procedia PDF Downloads 104
4936 A Computer-Aided System for Detection and Classification of Liver Cirrhosis
Authors: Abdel Hadi N. Ebraheim, Eman Azomi, Nefisa A. Fahmy
Abstract:
This paper designs and implements a computer-aided system (CAS) to help detect and diagnose liver cirrhosis in patients with Chronic Hepatitis C. Our system reduces the tests the patient is asked to undergo to a minimal, most informative subset while maintaining a diagnostic accuracy above 99%, thereby saving both time and costs. As classifiers we use a Support Vector Machine (SVM) with cross-validation, a Multilayer Perceptron Neural Network (MLP), and a Generalized Regression Neural Network (GRNN), which employs a basis of radial functions for functional approximation. Our system was tested on 199 subjects, 99 of them with Chronic Hepatitis C. The subjects were selected from the outpatient clinic of the National Hepatology and Tropical Medicine Research Institute (NHTMRI).
Keywords: liver cirrhosis, artificial neural network, support vector machine, multi-layer perceptron, classification, accuracy
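One simple way to reduce a feature set to a minimal informative subset, as described above, is greedy forward selection: keep adding the test that improves the score most until nothing helps. The test names and the toy scoring function below are invented stand-ins; the paper does not specify its selection procedure.

```python
# Sketch: greedy forward selection of an informative subset of clinical tests.
def forward_select(features, evaluate):
    """Repeatedly add the feature that raises the score most; stop at no gain."""
    selected, best = [], evaluate([])
    while True:
        candidates = [f for f in features if f not in selected]
        if not candidates:
            break
        gains = {f: evaluate(selected + [f]) - best for f in candidates}
        top = max(gains, key=gains.get)
        if gains[top] <= 0:
            break  # adding any remaining test no longer helps
        selected.append(top)
        best += gains[top]
    return selected, best

# Toy scorer: an assumed per-test contribution with diminishing returns.
contribution = {"ALT": 0.30, "AST": 0.25, "platelets": 0.35, "albumin": 0.05}
def evaluate(subset):
    return 0.5 + sum(contribution[f] for f in subset) * (0.9 ** max(0, len(subset) - 1))

selected, score_value = forward_select(list(contribution), evaluate)
print(selected)
```

In the paper's setting, `evaluate` would be the cross-validated accuracy of the SVM, MLP, or GRNN on the chosen subset.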
Procedia PDF Downloads 461
4935 Square Wave Anodic Stripping Voltammetry of Copper (II) at the Tetracarbonylmolybdenum(0) MWCNT Paste Electrode
Authors: Illyas Isa, Mohamad Idris Saidin, Mustaffa Ahmad, Norhayati Hashim
Abstract:
A highly selective and sensitive electrode for the determination of trace amounts of Cu(II) using square wave anodic stripping voltammetry (SWASV) is proposed. The electrode was made of a paste of multiwall carbon nanotubes (MWCNT) and 2,6-diacetylpyridine-di-(1R)-(-)-fenchone diazine tetracarbonylmolybdenum(0) at 100:5 (w/w). Under optimal conditions the electrode showed a linear response over the concentration range 1.0 × 10–10 to 1.0 × 10–6 M Cu(II), with a limit of detection of 8.0 × 10–11 M Cu(II). The relative standard deviation (n = 5) of the response to 1.0 × 10–6 M Cu(II) was 0.036. Interference from cations such as Ni(II), Mg(II), Cd(II), Co(II), Hg(II), and Zn(II) (at 10- and 100-fold concentrations) was negligible, except for Pb(II). Electrochemical impedance spectroscopy (EIS) showed that charge transfer at the electrode-solution interface was favorable. Results of the analysis of Cu(II) in several water samples agreed well with those obtained by inductively coupled plasma-optical emission spectrometry (ICP-OES). The proposed electrode is therefore recommended as an alternative to spectroscopic techniques for analyzing Cu(II).
Keywords: chemically modified electrode, Cu(II), square wave anodic stripping voltammetry, tetracarbonylmolybdenum(0)
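The linear range and detection limit reported above come from a calibration fit. The sketch below fits invented peak-current data by ordinary least squares and estimates the limit of detection with the common 3 x (sd of blank) / slope convention; the currents and blank deviation are assumptions chosen only to illustrate the arithmetic.

```python
# Sketch: SWASV calibration line and 3-sigma limit of detection (toy data).
def linear_fit(xs, ys):
    """Ordinary least squares: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

conc = [1e-9, 1e-8, 1e-7, 1e-6]      # mol/L (toy calibration points)
current = [0.11, 1.02, 10.3, 99.8]   # peak current, arbitrary units
slope, intercept = linear_fit(conc, current)

sd_blank = 2.7e-3                    # assumed standard deviation of blank signal
lod = 3 * sd_blank / slope           # 3-sigma limit of detection
print(f"slope = {slope:.3g}, LOD = {lod:.2g} M")
```

With these made-up numbers the LOD lands near 8 × 10–11 M, the same order as the value reported above.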
Procedia PDF Downloads 262
4934 The Application of Fuzzy Set Theory to Mobile Internet Advertisement Fraud Detection
Authors: Jinming Ma, Tianbing Xia, Janusz Getta
Abstract:
This paper presents the application of fuzzy set theory to the implementation of mobile advertisement anti-fraud systems. Mobile anti-fraud aims to identify mobile advertisement fraudsters. One of the main problems in mobile anti-fraud is the lack of evidence to prove that a given user is a fraudster. In this paper, we implement an application using fuzzy set theory to demonstrate how to detect cheaters. The advantage of our method is that it avoids the hardship of detecting fraudsters from small data samples. We achieve this by giving each user a suspicious degree indicating how likely the user is to be cheating, and deciding whether a group of users (such as all users of a certain app) are collectively fraudulent according to their average suspicious degree. This makes the process more accurate, as the data of a single user is too sparse to be predictive.
Keywords: mobile internet, advertisement, anti-fraud, fuzzy set theory
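The suspicious-degree idea above can be sketched with a fuzzy membership function per user and a group-level decision on the average. The membership function, the click/conversion features, and the 0.6 cutoff are all invented for illustration; the paper does not publish its actual membership functions.

```python
# Sketch: fuzzy suspicious degrees aggregated over a group of users.
def suspicious_degree(clicks, conversions):
    """Fuzzy membership in [0, 1]: many clicks with no conversions looks fraudulent."""
    click_part = min(clicks / 100.0, 1.0)          # ramps up to 1 at 100 clicks
    conv_part = 1.0 - min(conversions / 5.0, 1.0)  # drops as real conversions appear
    return click_part * conv_part

def flag_group(users, cutoff=0.6):
    """Flag the group when the average suspicious degree crosses the cutoff."""
    avg = sum(suspicious_degree(c, v) for c, v in users) / len(users)
    return avg >= cutoff, avg

fraud_app = [(120, 0), (90, 0), (200, 1)]   # many clicks, almost no conversions
honest_app = [(40, 3), (10, 2), (25, 4)]
print(flag_group(fraud_app))
print(flag_group(honest_app))
```

Averaging over the group is what sidesteps the single-user small-sample problem the abstract describes.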
Procedia PDF Downloads 181
4933 Novel Algorithm for Restoration of Retina Images
Authors: P. Subbuthai, S. Muruganand
Abstract:
Diabetic retinopathy is a complicated disease caused by changes in the blood vessels of the retina. Retina images captured with a fundus camera sometimes suffer from poor contrast and noise. Because of this noise, detecting blood vessels in the retina is very complicated, so preprocessing is needed. In this paper, a novel algorithm is implemented to remove noisy pixels from the retina image. The proposed algorithm, an Extended Median Filter, is applied to the green channel of the retina image because vessels appear brighter than the background in the green channel. The proposed Extended Median Filter is compared with the existing standard median filter using performance metrics such as PSNR, MSE, and RMSE. Experimental results show that the proposed Extended Median Filter gives better results than the existing standard median filter in terms of noise suppression and detail preservation.
Keywords: fundus retina image, diabetic retinopathy, median filter, microaneurysms, exudates
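The comparison above rests on a median filter plus the MSE/PSNR metrics. The sketch below shows a plain 3-wide 1D median filter and those metrics on a toy signal; it is a stand-in, not the paper's extended variant, whose exact window logic is not given in the abstract.

```python
# Sketch: standard median filtering of an impulse-noisy signal, scored by MSE/PSNR.
from math import log10
from statistics import median

def median_filter(signal, width=3):
    """Replace each sample with the median of its neighborhood (edge-replicated)."""
    half = width // 2
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    return [median(padded[i:i + width]) for i in range(len(signal))]

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255):
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * log10(peak ** 2 / m)

clean = [10, 10, 10, 10, 10, 10]
noisy = [10, 10, 255, 10, 10, 10]   # one impulse ("salt") sample
denoised = median_filter(noisy)
print(denoised)
```

On real fundus images the same idea runs over 2D neighborhoods of the green channel.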
Procedia PDF Downloads 342
4932 MCERTL: Mutation-Based Correction Engine for Register-Transfer Level Designs
Authors: Khaled Salah
Abstract:
In this paper, we present MCERTL (a mutation-based correction engine for RTL designs), an automatic error correction technique based on mutation analysis. A mutation-based correction methodology is proposed to automatically fix erroneous RTL designs. The proposed strategy combines mutation with assertion-based localization. The erroneous statements are mutated to produce possible fixes for the failed RTL code. A concurrent mutation engine is proposed to mitigate the computational cost of applying mutation operators sequentially. The proposed methodology is evaluated against several benchmarks. The experimental results demonstrate that our method can automatically locate and correct multiple bugs in reasonable time.
Keywords: bug localization, error correction, mutation, mutants
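The core repair loop above - mutate the faulty statement, keep the mutant that passes the failing checks - can be sketched on a tiny scale. The real engine mutates RTL statements; here a Python operator substitution stands in, with an invented half-adder example where '+' was written instead of XOR.

```python
# Sketch: mutation-based repair by operator substitution against failing tests.
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "^": operator.xor}

def repair(op_symbol, tests):
    """Return the first mutated operator under which all (a, b, expected) pass."""
    for candidate, fn in OPS.items():
        if candidate == op_symbol:
            continue  # skip the original, known-faulty operator
        if all(fn(a, b) == expected for a, b, expected in tests):
            return candidate
    return None  # no single-operator mutation fixes the statement

# Faulty "design" used '+' where the spec demands XOR (half-adder sum bit).
failing_tests = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(repair("+", failing_tests))
```

MCERTL's concurrent engine would evaluate many such mutants in parallel instead of this sequential loop.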
Procedia PDF Downloads 280
4931 Time of Death Determination in Medicolegal Death Investigations
Authors: Michelle Rippy
Abstract:
Medicolegal death investigation has historically been a field that receives little research attention or advancement, as all of its subjects are deceased. Public health threats, drug epidemics, and contagious diseases are typically recognized in decedents first, and thorough, accurate death investigations can assist epidemiological research and prevention programs. One vital component of medicolegal death investigation is determining the decedent's time of death. An accurate time of death can assist in corroborating alibis, determining the sequence of death in multiple-casualty circumstances, and providing vital facts in civil cases. Popular television portrays an unrealistic forensic ability to provide the exact time of death to the minute for someone found deceased with no witnesses present. In reality, an unattended decedent's time of death can generally only be narrowed to a 4-6 hour window. In the mid- to late-20th century, death investigators took invasive liver temperatures to determine the decedent's core temperature, which was entered into an equation to estimate the time of death. Due to many inconsistencies with the placement of the thermometer and other variables, the accuracy of liver temperatures was dispelled, and this once commonplace practice lost scientific support. Currently, medicolegal death investigators rely on three major post-mortem changes at a death scene. Many factors enter into the subjective determination of the time of death, including the cooling of the decedent, stiffness of the muscles, internal settling of blood, clothing, ambient temperature, disease, and recent exercise. Current research is utilizing non-invasive, hospital-grade tympanic thermometers to measure the temperature in each of the decedent's ears. This tool can be used at the scene and, in conjunction with scene indicators, may provide a more accurate time of death.
This research is significant to investigations, bringing accuracy to a historically inexact area and considerably improving criminal and civil death investigations. The goal is to provide a scientific basis for estimating the time of unwitnessed deaths, instead of the art that the determination currently is. The research is in progress, with expected completion in December 2018. There are currently 15 completed case studies with vital information including the ambient temperature; the decedent's height, weight, sex, and age; layers of clothing; found position; whether medical intervention occurred; and whether the death was witnessed. These data will be analyzed across the multiple variables studied and will be available for presentation in January 2019.
Keywords: algor mortis, forensic pathology, investigations, medicolegal, time of death, tympanic
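A back-of-envelope version of the temperature-based equation mentioned above is the classic Glaister estimate: roughly 1.5 degF of cooling per hour from a normal 98.4 degF core temperature. This is only the crude starting point; the scene factors listed above (clothing, ambient temperature, disease) shift the real estimate, and the study's tympanic measurements are aimed at improving on exactly this.

```python
# Sketch: Glaister-style postmortem interval estimate from core temperature.
def glaister_hours_since_death(body_temp_f, normal_temp_f=98.4, cooling_rate=1.5):
    """Approximate hours since death from core temperature in degrees Fahrenheit."""
    return max(0.0, (normal_temp_f - body_temp_f) / cooling_rate)

hours = glaister_hours_since_death(89.4)   # a decedent found at 89.4 degF
print(f"estimated interval: {hours:.1f} h")
```

The spread of plausible cooling rates is one reason unattended estimates stay in a 4-6 hour window rather than a single number.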
Procedia PDF Downloads 118
4930 Binarization and Recognition of Characters from Historical Degraded Documents
Authors: Bency Jacob, S.B. Waykar
Abstract:
Degradations in historical document images appear due to the aging of the documents. It is very difficult to understand and retrieve text from badly degraded documents, as there is variation between the document foreground and background. Thresholding such document images results either in broken characters or in the detection of false text. Numerous algorithms exist that can separate text and background efficiently in the textual regions of a document, but portions of the background are mistaken for text in areas that hardly contain any text. This paper presents a way to overcome these problems with a robust binarization technique that recovers the text from severely degraded document images and thereby increases the accuracy of optical character recognition systems. The proposed document recovery algorithm efficiently removes degradations from document images. Here we use the Otsu method together with local and global thresholding; after binarization, the characters in the degraded documents are trained and recognized.
Keywords: binarization, denoising, global thresholding, local thresholding, thresholding
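The Otsu method named above chooses a global threshold by maximizing the between-class variance of the grayscale histogram. A minimal sketch on an invented bimodal "document" (dark text pixels around 40, light background around 200):

```python
# Sketch: Otsu's global threshold on a grayscale histogram, then binarization.
def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(levels))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(levels):
        w0 += hist[t]                 # background (dark) pixel count
        if w0 == 0:
            continue
        w1 = total - w0               # foreground (light) pixel count
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / w1
        between = w0 * w1 * (mu0 - mu1) ** 2   # between-class variance
        if between > best_var:
            best_var, best_t = between, t
    return best_t

pixels = [38, 40, 42, 45, 198, 200, 202, 205, 199, 201]
t = otsu_threshold(pixels)
binary = [1 if p > t else 0 for p in pixels]
print(t, binary)
```

In the paper's hybrid scheme, this global threshold would be combined with local thresholds computed per region of the degraded page.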
Procedia PDF Downloads 344
4929 Database Management System for Orphanages to Help Keep Track of Orphans
Authors: Srivatsav Sanjay Sridhar, Asvitha Raja, Prathit Kalra, Soni Gupta
Abstract:
A database management system keeps track of details about the people in an organisation. Few orphanages these days have shifted to a computer- and program-based system; unfortunately, most still keep only pen-and-paper records, which not only consume space but are also not eco-friendly. Viewing a person's record is a hassle, as one has to search through multiple paper records, which consumes time. This program organises all the data and can pull up any information about anyone whose data has been entered. Electronic storage is also safer, as physical records degrade over time or, worse, are destroyed by natural disasters. In this developing world, it makes sense to shift all data to an electronic storage system. The program comes with all features, including creating, inserting, searching, and deleting records, as well as printing them.
Keywords: database, orphans, programming, C++
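The create/insert/search/delete operations listed above can be sketched as a small in-memory record store (the original program is in C++; this Python sketch with invented field names only illustrates the interface).

```python
# Sketch: in-memory record store with insert, search, and delete operations.
class OrphanRegistry:
    def __init__(self):
        self._records = {}
        self._next_id = 1

    def insert(self, name, age, guardian=None):
        """Create a record and return its numeric ID."""
        record_id = self._next_id
        self._records[record_id] = {"name": name, "age": age, "guardian": guardian}
        self._next_id += 1
        return record_id

    def search(self, name):
        """Return all records whose name contains the query (case-insensitive)."""
        return [r for r in self._records.values()
                if name.lower() in r["name"].lower()]

    def delete(self, record_id):
        """Remove a record by ID; returns the record or None if absent."""
        return self._records.pop(record_id, None)

registry = OrphanRegistry()
rid = registry.insert("Asha", 9)
registry.insert("Ravi", 12)
print(registry.search("asha"))   # finds the record despite case
registry.delete(rid)
print(registry.search("asha"))   # now empty
```

A real deployment would back this with a persistent database file so records survive restarts, which is the point of moving off paper.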
Procedia PDF Downloads 157