Search results for: time series data mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38108

33158 Models of Innovation Processes and Their Evolution: A Literature Review

Authors: Maier Dorin, Maier Andreea

Abstract:

Today, any organization - regardless of its specific activity - must be prepared to face continuous radical changes, innovation thus becoming a condition of survival in a globalized market. Not all managers have an overall view of the real size of the necessary innovation potential, and unfortunately there is still no common (and correct) understanding of the term 'innovation' among managers. Moreover, not all managers are aware of the need for innovation. This article highlights and analyzes a series of models of innovation processes and their evolution. The models analyzed encompass both the strategic and the operational level within an organization, indicating innovation performance at each level. As the literature review shows, there are no easy answers to the innovation process, just as there are no shortcuts to great results. Successful companies do not have a silver innovation bullet - they do not get results by doing one or a few things better than others; they do everything better.

Keywords: innovation, innovation process, business success, models of innovation

Procedia PDF Downloads 382
33157 Time Dependent Biodistribution Modeling of 177Lu-DOTATOC Using Compartmental Analysis

Authors: M. Mousavi-Daramoroudi, H. Yousefnia, F. Abbasi-Davani, S. Zolghadri

Abstract:

In this study, 177Lu-DOTATOC was prepared under optimized conditions (radiochemical purity: > 99%, radionuclidic purity: > 99%). The percentage of injected dose per gram (%ID/g) was calculated for each organ up to 168 h post injection. A compartmental model was applied to describe the drug's behaviour in tissue mathematically at different times. The biodistribution data showed significant excretion of the radioactivity through the kidneys. The adrenals and pancreas, as major expression sites for the somatostatin receptor (SSTR), showed significant uptake. A pharmacokinetic model of 177Lu-DOTATOC obtained by compartmental analysis is presented, which describes the behavior of the complex.
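As a hedged illustration of the compartmental approach, a minimal two-compartment sketch (a central blood compartment feeding a tissue compartment, each with first-order clearance) can be written as follows; the rate constants and dose are illustrative assumptions, not the fitted 177Lu-DOTATOC parameters.

```python
# A minimal two-compartment biodistribution sketch: central (blood) compartment
# A1 feeds tissue compartment A2; both clear with first-order kinetics. All
# rate constants (1/h) and the dose are illustrative, not fitted values.

def simulate(dose=100.0, k12=0.05, k10=0.02, k20=0.01, t_end=168.0, dt=0.01):
    """Euler integration of dA1/dt = -(k12 + k10)*A1, dA2/dt = k12*A1 - k20*A2
    over the 168 h observation window used in the study."""
    a1, a2 = dose, 0.0
    for _ in range(int(t_end / dt)):
        da1 = -(k12 + k10) * a1
        da2 = k12 * a1 - k20 * a2
        a1 += da1 * dt
        a2 += da2 * dt
    return a1, a2

a1, a2 = simulate()
```

Fitting such a model to measured %ID/g values at several time points is what yields the compartmental description of the complex's behavior.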

Keywords: biodistribution, compartmental modeling, ¹⁷⁷Lu, Octreotide

Procedia PDF Downloads 201
33156 Analysis of Silicon Controlled Rectifier-Based Electrostatic Discharge Protection Circuits with Electrical Characteristics for the 5V Power Clamp

Authors: Jun-Geol Park, Kyoung-Il Do, Min-Ju Kwon, Kyung-Hyun Park, Yong-Seo Koo

Abstract:

This paper analyzes SCR (Silicon Controlled Rectifier)-based ESD (Electrostatic Discharge) protection circuits in terms of their turn-on time characteristics. The structures considered are the LVTSCR (Low Voltage Triggered SCR), the ZTSCR (Zener Triggered SCR), and the PTSCR (P-Substrate Triggered SCR), all intended for the 5 V power clamp. In general, structures with a low trigger voltage have faster turn-on characteristics than other structures. All three ESD protection circuits achieve a low trigger voltage: the LVTSCR by its N+ bridge region, the ZTSCR by its zener diode structure, and the PTSCR by an increased trigger current. The turn-on time comparison was simulated with the Synopsys TCAD simulator. According to the simulation results, the LVTSCR has a turn-on time of 2.8 ns, the ZTSCR of 2.1 ns, and the PTSCR of 2.4 ns. The HBM simulation results, however, show that the PTSCR (430 K under the HBM 8 kV standard) is more robust than the LVTSCR (450 K) and the ZTSCR (495 K). Therefore, the PTSCR is the most effective ESD protection circuit for the 5 V power clamp.

Keywords: ESD, SCR, turn-on time, trigger voltage, power clamp

Procedia PDF Downloads 330
33155 Enabling Participation of Deaf People in the Co-Production of Services: An Example in Service Design, Commissioning and Delivery in a London Borough

Authors: Stephen Bahooshy

Abstract:

Co-producing services with the people that access them is considered best practice in the United Kingdom, with the Care Act 2014 arguing that people who access services and their carers should be involved in the design, commissioning and delivery of services. Co-production is a way of working with the community, breaking down barriers of access and providing meaningful opportunity for people to engage. Unfortunately, owing to a number of reported factors such as time constraints, practitioner experience and departmental budget restraints, this process is not always followed. In 2019, in a south London borough, d/Deaf people who access services were engaged in the design, commissioning and delivery of an information and advice service that would support their community to access local government services. To do this, sensory impairment social workers and commissioners collaborated to host a series of engagement events with the d/Deaf community. Interpreters were used to enable communication between the commissioners and d/Deaf participants. Initially, the community’s opinions, ideas and requirements were noted. This was then summarized and fed back to the community to ensure accuracy. Subsequently, a service specification was developed which included performance metrics, inclusive of qualitative and quantitative indicators, such as ‘I statements’, whereby participants respond on an adapted Likert scale how much they agree or disagree with a particular statement in relation to their experience of the service. The service specification was reviewed by a smaller group of d/Deaf residents and social workers, to ensure that it met the community’s requirements. The service was then tendered using the local authority’s e-tender process. Bids were evaluated and scored in two parts; part one was by commissioners and social workers and part two was a presentation by prospective providers to an evaluation panel formed of four d/Deaf residents. 
The internal evaluation panel accounted for 75% of the overall score, whilst the d/Deaf resident evaluation panel accounted for 25% of the overall tender score. Co-producing the evaluation panel with social workers and the d/Deaf community meant that commissioners were able to meet the community's requirements by developing evaluation questions and tools that were easily understood and used. For example, the wording of the questions was reviewed, and the scoring mechanism consisted of three faces to reflect the d/Deaf residents' scores instead of traditional numbering: a happy face, a neutral face, and a sad face. By making simple changes to the commissioning and tender evaluation process, d/Deaf people were able to have meaningful involvement in the design and commissioning of a service that would benefit their community. Co-produced performance metrics make it incumbent on the successful provider to continue to engage with people accessing the service and to ensure that the feedback is utilized. d/Deaf residents were grateful to have been involved in this process, as this was not an opportunity that they had previously been afforded. In recognition of their time, each d/Deaf resident evaluator received a £40 gift voucher, bringing the total cost of this co-production to £160.

Keywords: co-production, community engagement, deaf and hearing impaired, service design

Procedia PDF Downloads 255
33154 Analysis of Business Intelligence Tools in Healthcare

Authors: Avishkar Gawade, Omkar Bansode, Ketan Bhambure, Bhargav Deore

Abstract:

In recent years, a wide range of business intelligence (BI) technologies have been applied in different areas to support the decision-making process, as BI enables the extraction of knowledge from data stores. BI tools are usually used in the public health field for financial and administrative purposes. BI uses a dashboard in the presentation stage to deliver information to end users. In this paper, we analyze some open-source BI tools on the market and their applicability in the clinical sphere, taking into consideration the general characteristics of the clinical environment. A pervasive BI platform was developed using a real case in order to prove the tools' viability. The BI tools are analyzed against several parameters, such as data security, data integration, data quality, reporting and analytics, performance, scalability, and cost-effectiveness.
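One simple way to combine evaluation parameters like these into a comparison is a weighted scoring matrix; the sketch below uses invented weights and per-tool ratings, not the paper's measured values.

```python
# Hedged sketch of a weighted-sum comparison over the evaluation criteria
# named in the abstract. Weights and 1-5 ratings are hypothetical.

WEIGHTS = {
    "data_security": 0.20, "data_integration": 0.15, "data_quality": 0.15,
    "reporting_analytics": 0.20, "performance": 0.10, "scalability": 0.10,
    "cost_effectiveness": 0.10,
}

def weighted_score(scores):
    """scores: criterion -> rating on a 1-5 scale."""
    return sum(WEIGHTS[c] * v for c, v in scores.items())

tools = {
    "ToolA": {"data_security": 4, "data_integration": 3, "data_quality": 4,
              "reporting_analytics": 5, "performance": 3, "scalability": 4,
              "cost_effectiveness": 5},
    "ToolB": {"data_security": 3, "data_integration": 3, "data_quality": 3,
              "reporting_analytics": 3, "performance": 3, "scalability": 3,
              "cost_effectiveness": 3},
}
ranking = sorted(tools, key=lambda t: weighted_score(tools[t]), reverse=True)
```

In a clinical setting the weights would be set with stakeholders, e.g. weighting data security more heavily for EHR-connected deployments.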

Keywords: CDSS, EHR, business intelligence, tools

Procedia PDF Downloads 124
33153 Comparative Analysis of Different Land Use Land Cover (LULC) Maps in WRF Modelling Over Indian Region

Authors: Sen Tanmoy, Jain Sarika, Panda Jagabandhu

Abstract:

Studies regarding the impact of urbanization using the WRF-ARW model rely heavily on the static geographical information selected, including domain configuration and land use land cover (LULC) data. Accurate representation of LULC data provides essential information for understanding urban growth and simulating meteorological parameters such as temperature and precipitation. Researchers use different LULC data sets depending on availability and their requirements. As far as India is concerned, resources and data availability are very limited, so it is important to understand how results can be optimized using limited LULC data. In this review article, we explore how a LULC map is generated from different sources in the Indian context and what its significance is in WRF-ARW modeling for studying urbanization, climate change, or other meteorological questions. Bibliometric analyses were also performed based on countries of study and indexed keywords. Finally, some key points are marked out for selecting the most suitable LULC map for any urbanization-related study.

Keywords: LULC, LULC mapping, LANDSAT, WRF-ARW, ISRO, bibliometric analysis

Procedia PDF Downloads 11
33152 Principles and Guidance for the Last Days of Life: Te Ara Whakapiri

Authors: Tania Chalton

Abstract:

In June 2013, an independent review of the Liverpool Care Pathway (LCP) identified a number of problems with the implementation of the LCP in the UK and recommended that it be replaced by individual care plans for each patient. As a result of the UK findings, in November 2013 the Ministry of Health (MOH) commissioned the Palliative Care Council to initiate a programme of work to investigate an appropriate approach for the care of people in their last days of life in New Zealand (NZ). The Last Days of Life Working Group commenced a process to develop national consensus on the care of people in their last days of life in April 2014. In order to develop its advice for the future provision of care to people in their last days of life, the Working Group (WG) established a comprehensive work programme and, as a result, has developed a series of working papers. Specific areas of focus included: An analysis of the UK Independent Review findings and an assessment of their applicability to the NZ context. A stocktake of services providing care to people in their last days of life, including aged residential care (ARC), hospices, hospitals, and primary care. International and NZ literature reviews of evidence and best practice. A survey of families to understand the consumer perspective on the care of people in their last days of life. Key aspects of care that required further consideration for NZ were: Terminology: clarifying terminology used in the last days of life and in relation to death and dying. Evidence base: including a specific review of evidence regarding spiritual and culturally appropriate care as well as dementia care. Diagnosis of dying: the need for guidance around both the diagnosis of dying and communication with family. Workforce issues: access to an appropriate workforce after hours. Nutrition and hydration: guidance around appropriate approaches to nutrition and hydration. Symptom and pain management: guidance around symptom and pain management.
Documentation: documentation of the person's care which is robust enough for data collection and auditing requirements, not a 'tick box' approach to care. Education and training: improved consistency and access to appropriate education and training. Leadership: a dedicated team or person to support and coordinate the introduction and implementation of any last-days-of-life model of care. Quality indicators and data collection: a model of care that enables auditing and regular reviews to ensure ongoing quality improvement. Cultural and spiritual: addressing and incorporating any cultural and spiritual aspects. A final document incorporating all the evidence was developed, which provides guidance to the health sector on best practice for people at the end of life: "Principles and Guidance for the Last Days of Life: Te Ara Whakapiri".

Keywords: end of life, guidelines, New Zealand, palliative care

Procedia PDF Downloads 418
33151 Activity-Based Costing of Medical Intensive Care Unit 240

Authors: Suppawan Lertpongpakpoom, Anongnat Boonrat, Kunya BoontummoSuppawan

Abstract:

This descriptive cost analysis aimed to analyze the unit cost of patients in a medical intensive care unit. Purposive sampling was used to select 20 nurses, 6 practical nurses, and 5 nurse aides, and a sample of 30 patients. Data were collected from both primary sources (activities and average time of nursing care) and secondary sources (bills of payment and patient records). Instruments were a cost recording form, an activity observation form, and a service recording form. The content validity of all instruments was evaluated by three experts (CVI = 0.87). Descriptive statistics were employed for data analysis. The results of the activity-based costing analysis showed that the total activity cost of the 4 service types for the patients was 14,776.92 Baht. The highest cost was the nursing record at 5,674.78 Baht, followed by direct nursing activity at 5,176.18 Baht and medical treatment at 1,976.60 Baht. The lowest cost was the management activity at 1,003.64 Baht per visit. The results suggest that activity-based costing analysis can be applied to give a better understanding of the cost structure, enabling better consideration of wasted expense and non-value-added activity, and improvement of effective utilization.
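The activity-based costing arithmetic can be sketched as follows; the activity names mirror the study's four service types, but all figures (in Baht) and the time-times-rate-plus-materials structure are hypothetical illustrations, not the reported results.

```python
# Illustrative ABC computation: each activity's cost = observed time x labour
# rate + materials. All numbers are invented for the sketch (Baht).

activities = {
    "direct_nursing":    {"minutes": 120, "rate_per_min": 5.0, "materials": 200.0},
    "nursing_record":    {"minutes": 60,  "rate_per_min": 5.0, "materials": 0.0},
    "medical_treatment": {"minutes": 30,  "rate_per_min": 8.0, "materials": 150.0},
    "management":        {"minutes": 20,  "rate_per_min": 4.0, "materials": 0.0},
}

def activity_cost(a):
    return a["minutes"] * a["rate_per_min"] + a["materials"]

# unit cost per patient visit = sum over the service types
unit_cost = sum(activity_cost(a) for a in activities.values())
```

Breaking the unit cost down by activity in this way is what exposes non-value-added activities for review.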

Keywords: activity-based costing, medical intensive care, nursing care, cost analysis

Procedia PDF Downloads 390
33150 Urban and Building Information Modeling’s Applications for Environmental Education: Case Study of Educational Campuses

Authors: Samar Alarif

Abstract:

Smart sustainable educational campuses are the latest paradigm of innovation in the education domain, as campuses become hubs for sustainable environmental innovations. Universities have a vital role in paving the road for digital transformation in the infrastructure domain by preparing skilled engineers and specialists. An open digital platform enables smart campuses to simulate a real education experience by managing their infrastructure within the curriculums. Moreover, it allows engagement between governments, businesses, and citizens to push for innovation and sustainable services. Urban and building information modeling (UIM/BIM) platforms have recently attained widespread attention in smart campuses due to their applications and benefits for creating the campus's digital twin in the form of an open digital platform. Qualitative and quantitative strategies were used in directing this research to develop and validate the UIM/BIM platform benefits for smart campus facilities management and its impact on the institution's sustainable vision. The research findings are based on literature reviews and a case study of the TU Berlin El Gouna campus. Textual data were collected using semi-structured interviews with actors, along with secondary data such as BIM course student projects, documents, and publications related to the campus actors. The study results indicated that UIM/BIM has several benefits for the smart campus. Universities can achieve better capacity-building by integrating all the actors in the UIM/BIM process, and would achieve their community outreach vision by launching an online UIM/BIM course open to the academic and professional community. The UIM/BIM training courses would integrate students from different disciplines and alumni, as well as engineers, planners, and technicians. Open platforms enable universities to build partnerships with the industry; companies should be involved in the development of BIM technology courses.
The collaboration between academia and industry would close the gap, adapt academic courses to professional requirements, and transfer academic innovations to the industry. In addition, the collaboration between academia, industry, government vocational and training centers, and civil society should be promoted by co-creation workshops and a series of seminars and conferences. These co-creation activities target capacity building and help build governmental strategies and policies to support the expansion of sustainable innovations and to agree on the expected role of all stakeholders in supporting the transformation.

Keywords: smart city, smart educational campus, UIM, urban platforms, sustainable campus

Procedia PDF Downloads 109
33149 Quality Standards for Emergency Response: A Methodological Framework

Authors: Jennifer E. Lynette

Abstract:

This study describes the development process of a methodological framework for quality standards used to measure the efficiency and quality of response efforts of trained personnel at emergency events. This paper describes the techniques used to develop the initial framework and its potential application to professions under the broader field of emergency management. The example described in detail in this paper applies the framework specifically to fire response activities by firefighters. Within the quality standards framework, the fire response process is chronologically mapped. Individual variables within the sequence of events are identified. Through in-person data collection, questionnaires, interviews, and the expansion of the incident reporting system, this study identifies and categorizes previously unrecorded variables involved in the response phase of a fire. Following a data analysis of each variable using a quantitative or qualitative assessment, the variables are ranked pertaining to the magnitude of their impact to the event outcome. Among others, key indicators of quality performance in the analysis involve decision communication, resource utilization, response techniques, and response time. Through the application of this framework and subsequent utilization of quality standards indicators, there is potential to increase efficiency in the response phase of an emergency event; thereby saving additional lives, property, and resources.

Keywords: emergency management, fire, quality standards, response

Procedia PDF Downloads 298
33148 Data Projects for “Social Good”: Challenges and Opportunities

Authors: Mikel Niño, Roberto V. Zicari, Todor Ivanov, Kim Hee, Naveed Mushtaq, Marten Rosselli, Concha Sánchez-Ocaña, Karsten Tolle, José Miguel Blanco, Arantza Illarramendi, Jörg Besier, Harry Underwood

Abstract:

One of the application fields for data analysis techniques and technologies gaining momentum is the area of social good or “common good”, covering cases related to humanitarian crises, global health care, or ecology and environmental issues, among others. The promotion of data-driven projects in this field aims at increasing the efficacy and efficiency of social initiatives, improving the way these actions help humanity in general and people in need in particular. This application field, however, poses its own barriers and challenges when developing data-driven projects, lagging behind in comparison with other scenarios. These challenges derive from aspects such as the scope and scale of the social issue to solve, cultural and political barriers, the skills of main stakeholders and the technological resources available, the motivation to be engaged in such projects, or the ethical and legal issues related to sensitive data. This paper analyzes the application of data projects in the field of social good, reviewing its current state and noteworthy initiatives, and presenting a framework covering the key aspects to analyze in such projects. The goal is to provide guidelines to understand the main challenges and opportunities for this type of data project, as well as identifying the main differential issues compared to “classical” data projects in general. A case study is presented on the initial steps and stakeholder analysis of a data project for the inclusion of refugees in the city of Frankfurt, Germany, in order to empirically confront the framework with a real example.

Keywords: data-driven projects, humanitarian operations, personal and sensitive data, social good, stakeholders analysis

Procedia PDF Downloads 316
33147 Solving Directional Overcurrent Relay Coordination Problem Using Artificial Bees Colony

Authors: M. H. Hussain, I. Musirin, A. F. Abidin, S. R. A. Rahim

Abstract:

This paper presents the implementation of the Artificial Bee Colony (ABC) algorithm in solving the Directional OverCurrent Relay (DOCR) coordination problem for near-end faults occurring in a fixed network topology. The coordination optimization of DOCRs is formulated as a linear programming (LP) problem. The objective function is introduced to minimize the operating time of the associated relays, which depends on the time multiplier setting. Existing techniques are taken for comparison purposes in order to highlight the superiority of the proposed approach. The proposed algorithm has been tested successfully on an 8-bus test system. The simulation results demonstrate that the ABC algorithm, which has been proved to have good search ability, is capable of dealing with constrained optimization problems.
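A simplified sketch of this optimization, assuming a two-relay network with illustrative plug-setting-multiplier (PSM) values and a 0.3 s coordination time interval rather than the paper's 8-bus data, and using ABC-style employed/onlooker moves without the scout phase:

```python
import random

# Minimize the sum of relay operating times over the time multiplier settings
# (TMS), using the IEC standard-inverse characteristic; the coordination
# constraint (backup lags primary by >= CTI) is handled by penalty. The
# two-relay setup, PSM values, and CTI below are illustrative assumptions.

def op_time(tms, psm):
    """IEC standard-inverse IDMT curve: t = TMS * 0.14 / (PSM**0.02 - 1)."""
    return tms * 0.14 / (psm ** 0.02 - 1.0)

PSM_PRIMARY, PSM_BACKUP, CTI = 5.0, 3.0, 0.3
LO, HI = 0.05, 1.0  # TMS bounds

def fitness(x):
    t_p, t_b = op_time(x[0], PSM_PRIMARY), op_time(x[1], PSM_BACKUP)
    penalty = 100.0 * max(0.0, CTI - (t_b - t_p))  # coordination violation
    return t_p + t_b + penalty

def abc_search(n_food=20, cycles=200, seed=1):
    rng = random.Random(seed)
    food = [[rng.uniform(LO, HI), rng.uniform(LO, HI)] for _ in range(n_food)]
    best = min(food, key=fitness)
    for _ in range(cycles):
        for i, x in enumerate(food):
            j = rng.randrange(2)                   # dimension to perturb
            partner = food[rng.randrange(n_food)]
            cand = list(x)
            cand[j] += rng.uniform(-1.0, 1.0) * (x[j] - partner[j])
            cand[j] = min(HI, max(LO, cand[j]))    # keep TMS in bounds
            if fitness(cand) < fitness(x):         # greedy selection
                food[i] = cand
        best = min([best] + food, key=fitness)
    return best

best_tms = abc_search()
```

A full implementation would add the scout phase (abandoning stagnant food sources) and one TMS variable per relay of the 8-bus system.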

Keywords: artificial bees colony, directional overcurrent relay coordination problem, relay settings, time multiplier setting

Procedia PDF Downloads 314
33146 Increase Productivity by Using Work Measurement Technique

Authors: Mohammed Al Awadh

Abstract:

In order for businesses to take advantage of the opportunities for expanded production and trade that have arisen as a result of globalization and increased levels of competition, productivity growth is required. The number of available resources decreases with each passing day, while demand keeps increasing. In response, firms face growing pressure to improve the efficiency with which they utilise their resources. As scientific methods, work and time study techniques have been employed in all manufacturing and service industries to raise the efficiency with which the factors of production are used. The goal of this research is to improve the productivity of a manufacturing industry's production system by applying work measurement. The work cycles were broken down into more manageable and quantifiable elements, which were noted down on the observation sheet. The operation was analysed in order to identify value-added and non-value-added components, and observations were recorded for each of the different trials.
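The underlying work-measurement arithmetic (observed element times levelled by a performance rating, then inflated by an allowance) can be sketched with made-up numbers:

```python
# Classic time-study arithmetic. The element times, rating, and allowance
# below are illustrative, not the study's measurements.

def normal_time(observed_min, rating):
    """Normal time = observed time x performance rating (1.0 = standard pace)."""
    return observed_min * rating

def standard_time(observed_min, rating, allowance):
    """Standard time = normal time x (1 + allowance fraction), where the
    allowance covers personal needs, fatigue, and unavoidable delay."""
    return normal_time(observed_min, rating) * (1.0 + allowance)

# one work cycle broken into measurable elements, as described above
elements = [0.80, 0.45, 1.20]      # observed minutes per element
rating, allowance = 1.10, 0.15     # operator rated at 110%, 15% allowance
cycle_standard = sum(standard_time(t, rating, allowance) for t in elements)
```

The standard time per cycle is then the basis for productivity targets and for quantifying non-value-added time.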

Keywords: time study, work measurement, work study, efficiency

Procedia PDF Downloads 61
33145 The Safety Profile of Vilazodone: A Study on Post-Marketing Surveillance

Authors: Humraaz Kaja, Kofi Mensah, Frasia Oosthuizen

Abstract:

Background and Aim: Vilazodone was approved in 2011 as an antidepressant to treat major depressive disorder. As a relatively new drug, it is not clear whether all adverse effects have been identified. The aim of this study was to review the adverse effects reported to the WHO Programme for International Drug Monitoring (PIDM) in order to add to the knowledge about the safety profile and adverse effects of vilazodone. Method: Data on adverse effects reported for vilazodone were obtained from VigiAccess, the database managed by the PIDM. Data were extracted from VigiAccess using Excel® and analyzed using descriptive statistics. The data collected were compared to the patient information leaflet (PIL) of Viibryd® and the FDA approval documents to determine adverse drug reactions reported post-marketing. Results: A total of 9,708 adverse events had been recorded on VigiAccess, of which 6,054 were not recorded on the PIL or in the FDA approval document. Most of the reports were received from the Americas and concerned adult women aged 45-64 years (24%, n=1059). The highest number of adverse events reported were psychiatric events (19%; n=1889), followed by gastro-intestinal effects (18%; n=1839). Specific psychiatric disorders recorded included anxiety (316), depression (208), hallucination (168) and agitation (142). The systematic review confirmed several psychiatric adverse effects associated with the use of vilazodone. The findings of this study suggest that these common psychiatric adverse effects were not known at the time of FDA approval of the drug and are not currently recorded in the patient information leaflet (PIL). Conclusions: In summary, this study found several adverse drug reactions not recorded in documents emanating from pre-marketing clinical trials.
This highlights the importance of continued post-marketing surveillance of a drug, as well as the need for further studies on the psychiatric adverse events associated with vilazodone in order to improve the safety profile.

Keywords: adverse drug reactions, pharmacovigilance, post-marketing surveillance, vilazodone

Procedia PDF Downloads 98
33144 Clinical Feature Analysis and Prediction on Recurrence in Cervical Cancer

Authors: Ravinder Bahl, Jamini Sharma

Abstract:

The paper demonstrates an analysis of cervical cancer based on a probabilistic model. It involves a technique for classification and prediction by recognizing the typical and diagnostically most important test features relating to cervical cancer. The main contributions of the research include predicting the probability of recurrence in no-recurrence (first-time detection) cases. A combination of conventional statistical and machine learning tools is applied for the analysis. An experimental study with real data demonstrates the feasibility and potential of the proposed approach for this purpose.
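A toy sketch of probabilistic classification for recurrence vs. no recurrence, here a Gaussian naive Bayes on two invented features standing in for the paper's clinical test features (the paper does not specify its exact model, so this is an illustrative stand-in):

```python
import math

# Minimal Gaussian naive Bayes: per-class feature means/variances plus a
# class prior give a posterior for "recurrence" vs "no_recurrence".
# The feature values below are invented, not the paper's data.

def fit(samples):
    """samples: {label: [[f1, f2], ...]} -> per-label (means, variances, count)."""
    model = {}
    for label, rows in samples.items():
        n, d = len(rows), len(rows[0])
        means = [sum(r[i] for r in rows) / n for i in range(d)]
        var = [sum((r[i] - means[i]) ** 2 for r in rows) / n + 1e-6
               for i in range(d)]  # small floor avoids zero variance
        model[label] = (means, var, n)
    return model

def predict(model, x):
    total = sum(n for _, _, n in model.values())
    best, best_lp = None, float("-inf")
    for label, (means, var, n) in model.items():
        lp = math.log(n / total)  # log prior
        for xi, m, v in zip(x, means, var):
            lp += -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

data = {"recurrence":    [[7, 3], [8, 4], [9, 3]],
        "no_recurrence": [[2, 1], [1, 2], [2, 2]]}
model = fit(data)
```

In practice the class posterior itself, not just the argmax label, is what supports "probability of recurrence" statements.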

Keywords: cervical cancer, recurrence, no recurrence, probabilistic, classification, prediction, machine learning

Procedia PDF Downloads 347
33143 Analyzing the Risk Based Approach in General Data Protection Regulation: Basic Challenges Connected with Adapting the Regulation

Authors: Natalia Kalinowska

Abstract:

The adoption of the General Data Protection Regulation (GDPR) concluded four years of work by the European Commission in this area in the European Union. Considering the far-reaching changes that GDPR will bring, the European legislator envisaged a two-year transitional period: member states and companies have to prepare for the new regulation by 25 May 2018. The idea behind the new approach to data protection in the European Union is the risk-based approach. So far, as a result of the implementation of Directive 95/46/EC, many European countries (including Poland) have adopted very particular regulations specifying technical and organisational security measures, e.g. Polish implementing rules indicate even how long a password should be. According to the new approach, from May 2018 controllers and processors will be obliged to apply security measures adequate to the level of risk associated with specific data processing. Risk in GDPR should be interpreted as the likelihood of a breach of the rights and freedoms of the data subject. According to Recital 76, the likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context and purposes of the processing. GDPR does not prescribe the security measures to be applied; the recitals give only examples such as anonymization or encryption. It is the controller's decision what type of security measures to consider sufficient, and the controller will be responsible if these measures are not sufficient or if the identification of the risk level is incorrect. The regulation indicates a few levels of risk. Recital 76 mentions risk and high risk, but some lawyers think that there is one more category: low risk/no risk. Low-risk/no-risk data processing is a situation where processing is unlikely to result in a risk to the rights and freedoms of natural persons.
GDPR also mentions types of data processing for which a controller does not have to evaluate the level of risk because they have been classified as 'high risk' processing, e.g. processing on a large scale of special categories of data, or processing using new technologies. The methodology includes an analysis of legal regulations, e.g. GDPR and the Polish Act on the Protection of Personal Data, as well as ICO guidelines and articles concerning the risk-based approach in GDPR. The main conclusion is that an appropriate risk assessment is the key to keeping data safe and avoiding financial penalties. On the one hand, this approach seems to be more equitable, not only for controllers and processors but also for data subjects; on the other hand, it increases controllers' uncertainty in the assessment, which could have a direct impact on incorrect data protection and potential responsibility for infringement of the regulation.

Keywords: general data protection regulation, personal data protection, privacy protection, risk based approach

Procedia PDF Downloads 240
33142 UAV’s Enhanced Data Collection for Heterogeneous Wireless Sensor Networks

Authors: Kamel Barka, Lyamine Guezouli, Assem Rezki

Abstract:

In this article, we propose a protocol called DataGA-DRF (a protocol for Data collection using a Genetic Algorithm through Dynamic Reference points) that collects data from heterogeneous wireless sensor networks. This protocol is based on DGA (Destination selection according to Genetic Algorithm) to control the movement of the UAV (unmanned aerial vehicle) between dynamic reference points that virtually represent the sensor node deployment. The dynamics of these points ensure an even distribution of energy consumption among the sensors and also improve network performance. To determine the best points, DataGA-DRF uses a clustering algorithm such as K-Means.
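As a hedged sketch of the reference-point step, plain Lloyd's K-Means over sensor-node coordinates; the node positions and the deterministic first-k initialization are illustrative, not the DataGA-DRF implementation.

```python
# Toy K-Means for placing reference points over sensor nodes. Deterministic
# initialization (first k points) is used for reproducibility; k-means++
# would be the usual choice. Node coordinates are invented.

def kmeans(points, k, iters=20):
    centers = [tuple(p) for p in points[:k]]
    for _ in range(iters):
        # assign each node to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c:
                          (p[0] - centers[c][0]) ** 2 + (p[1] - centers[c][1]) ** 2)
            clusters[nearest].append(p)
        # move each center to its cluster's centroid (keep it if cluster empty)
        centers = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers

nodes = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]  # two spatial groups
refs = kmeans(nodes, 2)  # candidate reference points for the UAV tour
```

The resulting centroids would then serve as the reference points between which the genetic algorithm plans the UAV's route.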

Keywords: heterogeneous wireless networks, unmanned aerial vehicles, reference point, data collection, genetic algorithm

Procedia PDF Downloads 66
33141 Time-Course Lipid Accumulation and Transcript Analyses of Lipid Biosynthesis Gene of Chlorella sp.3 under Nitrogen Limited Condition

Authors: Jyoti Singh, Swati Dubey, Mukta Singh, R. P. Singh

Abstract:

The freshwater microalga Chlorella sp. is attracting considerable interest as a source for biofuel production due to its fast growth rate and high lipid content. Under nitrogen-limited conditions, it can accumulate significant amounts of lipids, so it is important to gain insight into the molecular mechanism of its lipid metabolism. In this study, growth characteristics, lipid accumulation, and the expression of key regulatory genes of the lipid biosynthetic pathway were analyzed in the microalga Chlorella sp. 3 under nitrogen-limited conditions. Our results indicated that under nitrogen-limited conditions there is a significant increase in lipid content and lipid productivity, reaching 44.21±2.64 % and 39.34±0.66 mg/l/d at the end of the cultivation, respectively. Time-course transcript patterns of the lipid biosynthesis genes acetyl-CoA carboxylase (accD) and diacylglycerol acyltransferase (dgat) showed that during the late log phase of Chlorella sp. 3 both genes were significantly upregulated compared to the early log phase. Moreover, the transcript level of the dgat gene was two-fold higher than that of the accD gene. The results suggest that both genes responded sensitively to nitrogen limitation during the late log stage, indicating their close relevance to lipid biosynthesis. This transcriptome data will be useful for engineering microalgal species by targeting these genes for genetic modification to improve microalgal biofuel quality and production.
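If the reported fold changes come from RT-qPCR, the standard Livak 2^(-ΔΔCt) calculation would look like the sketch below; the abstract does not state the quantification method, and all Ct values here are hypothetical.

```python
# Livak 2^(-ddCt) relative expression: normalize the target gene's Ct to a
# reference gene in both conditions, then compare conditions. All Ct values
# in the example are invented.

def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# e.g. dgat's Ct drops by one cycle relative to the reference gene under
# nitrogen limitation, i.e. a two-fold up-regulation
fc = fold_change(24.0, 18.0, 25.0, 18.0)  # -> 2.0
```

The same calculation applied per time point gives the time-course transcript pattern described above.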

Keywords: biofuel, gene, lipid, microalgae

Procedia PDF Downloads 286
33140 Spectroscopic Autoradiography of Alpha Particles on Geologic Samples at the Thin Section Scale Using a Parallel Ionization Multiplier Gaseous Detector

Authors: Hugo Lefeuvre, Jerôme Donnard, Michael Descostes, Sophie Billon, Samuel Duval, Tugdual Oger, Herve Toubon, Paul Sardini

Abstract:

Spectroscopic autoradiography is a method of interest for geological sample analysis. Indeed, researchers may face different issues, such as radioelement identification and quantification, in the field of environmental studies. Imaging gaseous ionization detectors find their place in geosciences for conducting specific measurements of radioactivity to improve the monitoring of natural processes using naturally-occurring radioactive tracers, but also for the nuclear industry linked to the mining sector. In geological samples, the location and identification of radioactive-bearing minerals at the thin-section scale remains a major challenge, as the detection limit of the usual elementary microprobe techniques is far higher than the concentration of most of the natural radioactive decay products. The spatial distribution of each decay product, in the case of uranium in a geomaterial, is interesting for relating radionuclide concentrations to the mineralogy. The present study aims to provide a spectroscopic autoradiography analysis method for measuring the initial energy of alpha particles with a parallel ionization multiplier gaseous detector. The analysis method was developed using Geant4 modelling of the detector. The tracks of alpha particles recorded in the gas detector allow the simultaneous measurement of the initial point of emission and the reconstruction of the initial particle energy through a selection based on the linear energy distribution. This spectroscopic autoradiography method was successfully used to reproduce the alpha spectra from the 238U decay chain on a geological sample at the thin-section scale. The characteristics of this measurement are an energy spectrum resolution of 17.2% (FWHM) at 4647 keV and a spatial resolution of at least 50 µm. Even if the efficiency of energy spectrum reconstruction is low (4.4%) compared to the efficiency of a simple autoradiograph (50%), this novel measurement approach offers the opportunity to select areas on an autoradiograph and perform an energy spectrum analysis within them. This opens up possibilities for the detailed analysis of heterogeneous geological samples containing natural alpha emitters such as uranium-238 and radium-226. This measurement will allow the study of the spatial distribution of uranium and its descendants in geomaterials by coupling with scanning electron microscope characterizations. The direct application of this dual modality (energy-position) of analysis will be the subject of future developments. The measurement of the radioactive equilibrium state of heterogeneous geological structures and the quantitative mapping of 226Ra radioactivity are now being actively studied.
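For orientation, the quoted relative resolution converts to an absolute peak width as follows (a back-of-the-envelope check on the reported figures, not part of the authors' analysis):

```python
# 17.2% FWHM resolution at the 4647 keV alpha line (figures from the abstract)
peak_energy_keV = 4647.0
relative_fwhm = 0.172
fwhm_keV = relative_fwhm * peak_energy_keV  # absolute full width at half maximum
print(round(fwhm_keV, 1))  # roughly 800 keV wide peak
```

A width of this order explains why an energy-based selection can separate alpha lines of the 238U chain, which are spaced by hundreds of keV to a few MeV.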

Keywords: alpha spectroscopy, digital autoradiography, mining activities, natural decay products

Procedia PDF Downloads 133
33139 Implementation of Data Science in Field of Homologation

Authors: Shubham Bhonde, Nekzad Doctor, Shashwat Gawande

Abstract:

For the use and import of keys and ID transmitters, as well as body control modules with radio transmission, homologation is required in many countries. The final deliverables of product homologation are certificates. There are approximately 200 certificates per product, most of them in local languages. It is challenging to manually investigate each certificate and extract relevant data, such as the expiry date and approval date. Accuracy is critical, as incorrect data may lead to missed re-homologation of certificates, resulting in non-compliance. There is therefore scope for automating the reading of certificate data in the field of homologation, and we use deep learning as the tool for this automation. We first trained a model by providing each country's basic data, feeding in PDF and JPG files through an ETL process; the model is trained only once and will eventually give more accurate results. As an outcome, the expiry date and approval date of a certificate are obtained with a single click. This will eventually help implement automation features more broadly in the database where certificates are stored and reduce human error to almost negligible levels.
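Once OCR or a trained model has produced raw text from a certificate, the date fields can be pulled out and normalized. The routine below is a toy post-processing sketch, not the authors' pipeline; the field labels and date format are hypothetical, since real certificates vary by country and language:

```python
import re
from datetime import datetime

# Hypothetical label/format patterns; a production system would need
# per-country variants and language-specific labels.
PATTERNS = {
    "approval_date": re.compile(r"Approval Date[:\s]+(\d{2}\.\d{2}\.\d{4})"),
    "expiry_date":   re.compile(r"Expiry Date[:\s]+(\d{2}\.\d{2}\.\d{4})"),
}

def extract_dates(text):
    """Return any recognized certificate dates as datetime.date objects."""
    out = {}
    for field, pat in PATTERNS.items():
        m = pat.search(text)
        if m:
            out[field] = datetime.strptime(m.group(1), "%d.%m.%Y").date()
    return out

sample = "Certificate No. 123\nApproval Date: 01.03.2021\nExpiry Date: 01.03.2026"
print(extract_dates(sample))
```

Normalizing to real date objects (rather than strings) is what makes downstream checks such as "expires within 90 days" trivial to automate.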

Keywords: homologation, re-homologation, data science, deep learning, machine learning, ETL (extract transform loading)

Procedia PDF Downloads 146
33138 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows

Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid

Abstract:

Modeling dam-break flows over non-flat beds requires an accurate representation of the topography, which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing the topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography. In practice, experimental work in hydraulics may be very demanding in both time and cost, and computational hydraulics has served as an alternative to laboratory and field experiments. Unlike the forward problem, the inverse problem is used to identify the bed parameters from given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged so that the model parameters can be evaluated from measured data. However, this approach is not always possible, and it suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme. In the first stage, the forward problem is solved to determine the measurable parameters from known data. In the second stage, an adaptive-control Ensemble Kalman Filter is implemented to assimilate the observation data and obtain an accurate estimate of the topography. The main features of this method are, on one hand, its ability to handle different complex geometries with no need to rearrange the original model into an explicit form and, on the other hand, its strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry. Numerical results are presented for a dam-break flow problem over a non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples, and observation locations. The obtained results demonstrate the high reliability and accuracy of the proposed technique.
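The second-stage update can be sketched, in heavily simplified form, for a single scalar bed parameter. Everything below is an assumption for illustration: a linear surrogate stands in for the shallow-water solver, and the prior, noise levels, and iteration count are invented. It shows only the generic ensemble Kalman update (perturbed observations), not the authors' adaptive scheme:

```python
import random
import statistics

def forward(b):
    # Stand-in for the shallow-water solver: maps a bed elevation to a
    # free-surface observation (hypothetical linear surrogate).
    return 0.5 * b + 1.0

def enkf_update(ensemble, y_obs, obs_std, rng):
    """One ensemble Kalman update of a scalar bed parameter."""
    ys = [forward(b) for b in ensemble]
    b_mean, y_mean = statistics.mean(ensemble), statistics.mean(ys)
    cov_by = sum((b - b_mean) * (y - y_mean)
                 for b, y in zip(ensemble, ys)) / (len(ensemble) - 1)
    gain = cov_by / (statistics.variance(ys) + obs_std ** 2)  # Kalman gain
    # perturbed-observation update of every ensemble member
    return [b + gain * (y_obs + rng.gauss(0, obs_std) - y)
            for b, y in zip(ensemble, ys)]

rng = random.Random(1)
b_true = 2.0
y_obs = forward(b_true)                               # synthetic observation
ensemble = [rng.gauss(0.0, 1.0) for _ in range(50)]   # poor prior centered at 0
for _ in range(10):                                   # iterate, as in the two-stage scheme
    ensemble = enkf_update(ensemble, y_obs, obs_std=0.05, rng=rng)
print(round(statistics.mean(ensemble), 2))            # ensemble mean approaches b_true
```

The appeal mirrored here is the one claimed in the abstract: the filter never needs the forward model rewritten in explicit form, only evaluated.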

Keywords: erodible beds, finite element method, finite volume method, nonlinear elasticity, shallow water equations, stresses in soil

Procedia PDF Downloads 116
33137 A Case Study of Clinicians’ Perceptions of Enterprise Content Management at Tygerberg Hospital

Authors: Temitope O. Tokosi

Abstract:

Healthcare is a human right. The sensitivity of health issues has necessitated the introduction of Enterprise Content Management (ECM) at district hospitals in the Western Cape Province of South Africa. The objective is to understand clinicians' perception of ECM at their workplace. The study uses a descriptive case study design within a constructivist paradigm and employs a phenomenological data analysis method with a pattern-matching, deduction-based analytical procedure. Purposive and snowball sampling techniques were applied in selecting participants. Clinicians expressed concerns and frustrations with ECM, including non-integration with other hospital systems; inadequate access points to ECM; incorrect labelling of notes and bar-coding, which wastes time in finding information; missing system features and functions (such as search and edit); a lack of regular interaction and discussion between hospital management and clinicians; and unacceptably lengthy information turnaround times. Resolving these problems would require a positive working relationship between hospital management and clinicians. In addition, prioritising the problems faced by clinicians by relevance can ensure that problem-solving meets clinicians' expectations and the hospital's objectives. Clinicians' perceptions should draw the attention of hospital management with regard to technology use. The study's results can be generalised across clinician groupings exposed to ECM at various district hospitals because of professional and hospital homogeneity.

Keywords: clinician, enterprise content management, hospital, perception, technology

Procedia PDF Downloads 222
33136 Evaluation of Duncan-Chang Deformation Parameters of Granular Fill Materials Using Non-Invasive Seismic Wave Methods

Authors: Ehsan Pegah, Huabei Liu

Abstract:

Characterizing the deformation properties of fill materials over a wide stress range has always been an important issue in geotechnical engineering. The hyperbolic Duncan-Chang model is a very popular stress-strain relationship that captures the nonlinear deformation of granular geomaterials in a very tractable manner. It consists of a particular set of model parameters, which are generally measured through an extensive series of laboratory triaxial tests. This practice is both time-consuming and costly, especially in large projects, and undesired effects caused by soil disturbance during sampling may introduce a large degree of uncertainty into the results. Accordingly, non-invasive geophysical seismic approaches may be utilized as an alternative for measuring the model parameters based on seismic wave velocities. To this end, conventional seismic refraction profiles were carried out at test sites with granular fill materials to collect seismic wave information. The acquired shot gathers were processed to derive the P- and S-wave velocities: P-wave velocities were extracted using the Seismic Refraction Tomography (SRT) technique, while S-wave velocities were obtained by the Multichannel Analysis of Surface Waves (MASW) method. The velocity values were then used with equations derived from the rigorous theories of elasticity and soil mechanics to evaluate the Duncan-Chang model parameters. The derived parameters were finally compared with those from laboratory tests to validate the reliability of the results. The findings of this study may serve as useful references for determining the nonlinear deformation parameters of granular fill geomaterials; the surface seismic methods are environmentally friendly and economical, and can yield accurate results under actual in-situ conditions.
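The velocity-to-stiffness step rests on the standard isotropic-elasticity relations between seismic velocities and small-strain moduli; the abstract does not publish its equations, so the sketch below shows only those textbook relations, with invented example velocities for a compacted fill:

```python
def elastic_moduli(vp, vs, rho):
    """Small-strain elastic parameters from P- and S-wave velocities
    (standard isotropic elasticity). Units: m/s and kg/m^3 -> Pa."""
    g = rho * vs ** 2                                      # shear modulus
    nu = (vp ** 2 - 2 * vs ** 2) / (2 * (vp ** 2 - vs ** 2))  # Poisson's ratio
    e = 2 * g * (1 + nu)                                   # Young's modulus
    return g, nu, e

# Hypothetical compacted granular fill: Vp = 800 m/s, Vs = 400 m/s, rho = 1800 kg/m^3
g, nu, e = elastic_moduli(800.0, 400.0, 1800.0)
print(g / 1e6, round(nu, 3), e / 1e6)  # shear modulus (MPa), Poisson's ratio, Young's modulus (MPa)
```

Moduli of this kind feed the initial-stiffness side of a Duncan-Chang fit; relating them to the model's remaining parameters requires the additional soil-mechanics relations the authors allude to.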

Keywords: Duncan-Chang deformation parameters, granular fill materials, seismic waves velocity, multichannel analysis of surface waves, seismic refraction tomography

Procedia PDF Downloads 171
33135 The Influence of Chinese Philosophic-Religious Traditions on Chinese Consumption Behaviour: Findings from the Taoist Case Study

Authors: Haiping Zhu

Abstract:

The purpose of this work-in-progress paper is to explore how the Chinese philosophic-religious tradition of Taoism impacts the consumption behaviour of contemporary Chinese consumers. Although much cultural research has been conducted on Chinese consumption behaviour, most studies have approached the subject from Western perspectives. Examination of the limited literature indicates a gap in knowledge of the relationship between traditional Chinese Taoist philosophy and Chinese consumption behaviour. To bridge this gap, this study examines Chinese consumption behaviour at a Taoist-related Chinese religious festival (the DuanWu Festival) in order to understand how the Taoist philosophic-religious tradition influences Chinese consumption behaviour from the point of view of the individuals involved. It focuses attention on their expression of Taoist cultural values, purchasing experience, and subsequent consumption behaviours. The study employed multiple methods for Taoist case study data collection: accompanied shopping with Taoists before the DuanWu Festival, participant observation during the Festival, and in-depth interviews at the end of the Festival to explore Taoists' consumption behaviours. Specifically, the findings from the Taoist case study corroborate and detail the influence of Taoist doctrine (man-nature orientation, Feng Shui, ecological effect, and ecological knowledge) on attitudes toward green purchasing behaviour. Findings from this Taoist case study, one of a series of three Chinese philosophic-religious tradition case studies, contribute to a deeper understanding of contemporary Chinese consumers from a non-Western viewpoint and offer initial insights for global marketers seeking to differentiate consumer needs and develop effective marketing strategies.

Keywords: consumer behaviour, culture values, green purchase behaviour, Taoism

Procedia PDF Downloads 241
33134 Multishape Task Scheduling Algorithms for Real Time Micro-Controller Based Application

Authors: Ankur Jain, W. Wilfred Godfrey

Abstract:

Embedded systems are usually microcontroller-based systems that represent a class of reliable and dependable dedicated computer systems designed for specific purposes. Microcontrollers are used in most electronic devices in an endless variety of ways. Some microcontroller-based embedded systems are required to respond to external events in the shortest possible time; such systems are known as real-time embedded systems. Multitasking systems therefore need task scheduling, and various scheduling algorithms, such as Fixed Priority Scheduling (FPS), Earliest Deadline First (EDF), Rate Monotonic (RM), and Deadline Monotonic (DM), have been researched. In this paper, various conventional algorithms, which assume single-shape tasks, are reviewed and analyzed, and a new multishape scheduling algorithm is proposed, implemented, and analyzed.
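Of the conventional algorithms named above, EDF is the easiest to sketch: at every scheduling point the ready job with the earliest absolute deadline runs. The simulation below is a minimal illustration with an invented periodic task set (deadline taken equal to the next release), not the multishape algorithm the paper proposes:

```python
def edf_simulate(tasks, horizon):
    """Preemptive EDF on a discrete timeline. Each task is (name, period, wcet),
    with the deadline equal to the next release. Returns the task name executed
    at each tick (None = idle)."""
    timeline = []
    jobs = []  # each job: [name, remaining execution time, absolute deadline]
    for t in range(horizon):
        for name, period, wcet in tasks:
            if t % period == 0:                  # periodic job release
                jobs.append([name, wcet, t + period])
        ready = [j for j in jobs if j[1] > 0]
        if ready:
            job = min(ready, key=lambda j: j[2])  # earliest absolute deadline wins
            job[1] -= 1
            timeline.append(job[0])
        else:
            timeline.append(None)
    return timeline

# Hypothetical task set: A(period=4, wcet=1), B(period=5, wcet=2); utilization 0.65
print(edf_simulate([("A", 4, 1), ("B", 5, 2)], horizon=10))
```

Because total utilization is below 1, EDF meets every deadline here; FPS variants such as RM would fix the priority order per task instead of per job.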

Keywords: DM, EDF, embedded systems, fixed priority, microcontroller, RTOS, RM, scheduling algorithms

Procedia PDF Downloads 389
33133 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence

Authors: Gus Calderon, Richard McCreight, Tammy Schwartz

Abstract:

Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scale requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high spatial resolution (5” ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing combines object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community is divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually using repeat station imaging of fixed GPS locations to address changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe. 
To assist homeowners fighting increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners’ understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements that can be mitigated. Geospatial data from FireWatch’s defensible space maps were combined with Black Swan’s patented approach, which uses 39 other risk characteristics, into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education.
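The abstract does not name FireWatch's proprietary spectral vegetation index, but the canonical index computed from multispectral red and near-infrared bands is NDVI; a minimal sketch with invented pixel reflectances:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from band reflectances in [0, 1];
    higher values indicate denser, greener (i.e., more flammable when cured) vegetation."""
    return (nir - red) / (nir + red)

# Hypothetical pixel reflectances, not taken from the survey data
print(round(ndvi(0.45, 0.05), 2))  # vigorous vegetation
print(round(ndvi(0.20, 0.15), 2))  # sparse or dry cover
```

Per-pixel index maps like this are what object-based classification then groups into vegetation patches for the parcel-level defensible space reports.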

Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk

Procedia PDF Downloads 93
33132 Two Brazilian Medeas: The Cases of Mata Teu Pai and Medeia Negra

Authors: Jaqueline Bohn Donada

Abstract:

The significance of Euripides’ Medea for contemporary literature is noticeable. Even if the bulk of Classical Reception studies does not tend to look carefully and consistently at literature produced outside the Anglophone world, Brazilian literature offers abundant material for such studies. Indeed, a certain Classical background can be observed in Brazilian literature at least since 1975, when Gota d’Água [The Final Straw, in English], a play that recreates the story of Medea and sets it in a favela in Rio de Janeiro, appeared. Also worthy of notice is Ivo Bender’s Trilogia Perversa [Perverse Trilogy, in English], a series of three historical plays set in Southern Brazil, published in the 1980s and based on Aeschylus’ Oresteia and Euripides’ Iphigenia in Aulis. Since then, a number of works directly inspired by the plays of Aeschylus, Sophocles, and Euripides have been published, not to mention several adaptations of Homer’s two epic poems. This paper proposes a comparative analysis of two such works, Grace Passô’s 2017 play Mata teu Pai [Kill Your Father, in English] and Marcia Limma’s 2019 play Medeia Negra [Black Medea, in English], from the perspective of Classical Reception Studies in intersection with feminist literary criticism. The paper looks at the endurance of Euripides’ character in contemporary Brazilian literature, focusing on how the character seems to have acquired special relevance for the treatment of pressing issues of the twenty-first century. Whereas Grace Passô’s play sets Medea at the center of a group of immigrant women, Marcia Limma has the character enact the dilemmas of incarcerated women in Brazil. The hypothesis that this research aims to test is that both artists preserve the pathos of Euripides’ original character while recreating his Medea in concrete circumstances of contemporary Brazilian social reality. Ultimately, the research aims to establish the significance of the Medea theme for contemporary Brazilian literature.

Keywords: Euripides, Medea, Grace Passô, Marcia Limma, Brazilian literature

Procedia PDF Downloads 111
33131 Microbubbles Enhanced Synthetic Phorbol Ester Degradation by Ozonolysis

Authors: D. Kuvshinov, A. Siswanto, W. Zimmerman

Abstract:

Phorbol-12-myristate-13-acetate (TPA) is a synthetic analogue of phorbol ester (PE), a natural toxic compound of plants of the family Euphorbiaceae. The oil extracted from plants of this family is primarily used as a feedstock for biofuel, but it could also serve as a food source due to its significant nutritional content. The main limitation on utilizing the oil as food is the toxicity of PE. Nowadays, most PE detoxification processes are expensive, as they involve multi-step alcohol extraction sequences. Ozone is a strong oxidative agent: in reaction with PE, it attacks the carbon double bond of PE, and this modification of the PE molecular structure results in a nontoxic ester with high lipid content. This report presents data on the development of a simple and cheap PE detoxification process that uses water as a buffer and ozone as the reactive component. The core of this new technique is the simultaneous application of a new microscale plasma unit for ozone production and a patented gas oscillation technology. Combined with the reactor design, the technology permits injection of ozone into the water-TPA mixture in the form of microbubbles. The efficacy of a heterogeneous process depends on the diffusion coefficient, which can be controlled through contact time and interface area; the low rise velocity of microbubbles and their high surface-to-volume ratio allow fast mass transfer during the process. Direct injection of ozone is the most efficient approach for a highly reactive and short-lived chemical. Data on the behaviour of the plasma unit are presented, the influence of the gas oscillation technology on the microbubble production mechanism is discussed, and data on the overall efficacy of TPA degradation are shown.
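The surface-to-volume advantage of microbubbles can be quantified with simple sphere geometry (S/V = 6/d for a bubble of diameter d); the diameters below are illustrative choices, not values reported by the authors:

```python
def surface_to_volume(d):
    """Surface-to-volume ratio of a spherical bubble of diameter d (S/V = 6/d, units 1/m)."""
    return 6.0 / d

# Illustrative diameters: a 50 um microbubble versus a 2 mm bubble
# from a conventional sparger
micro, coarse = 50e-6, 2e-3
ratio = surface_to_volume(micro) / surface_to_volume(coarse)
print(round(ratio))  # interface area per unit gas volume, microbubble vs. coarse bubble
```

The gain in interfacial area, together with the longer residence time of slowly rising bubbles, is what drives the fast ozone mass transfer claimed in the abstract.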

Keywords: microbubble, ozonolysis, synthetic phorbol ester, chemical engineering

Procedia PDF Downloads 202
33130 Temperature-Dependent Barrier Characteristics of Inhomogeneous Pd/n-GaN Schottky Barrier Diodes Surface

Authors: K. Al-Heuseen, M. R. Hashim

Abstract:

The current-voltage (I-V) characteristics of Pd/n-GaN Schottky barriers were studied at temperatures above room temperature (300-470 K). The values of the ideality factor (n), zero-bias barrier height (φB0), flat barrier height (φBF), and series resistance (Rs) obtained from I-V-T measurements were found to be strongly temperature dependent: φB0 increases, while n, φBF, and Rs decrease with increasing temperature. The apparent Richardson constant was found to be 2.1×10⁻⁹ A cm⁻² K⁻² with a mean barrier height of 0.19 eV. After correcting for barrier height inhomogeneities by assuming a Gaussian distribution (GD) of barrier heights, the Richardson constant and the mean barrier height were obtained as 23 A cm⁻² K⁻² and 1.78 eV, respectively. The corrected Richardson constant is much closer to the theoretical value of 26 A cm⁻² K⁻².
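The extraction of a zero-bias barrier height from I-V data follows the standard thermionic-emission relation I0 = A·A*·T²·exp(-qφB/kT). The sketch below shows only that textbook inversion; the diode area and saturation current are invented, while the Richardson constant is the theoretical GaN value quoted in the abstract:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def barrier_height(i0, area, a_star, temp):
    """Zero-bias barrier height (eV) from the thermionic-emission saturation
    current i0 = area * A* * T^2 * exp(-phi / (k*T))."""
    return K_B * temp * math.log(area * a_star * temp ** 2 / i0)

# Hypothetical diode: area 1e-3 cm^2, A* = 26 A cm^-2 K^-2 (GaN theory), T = 300 K
area, a_star, temp = 1e-3, 26.0, 300.0
phi_assumed = 0.70  # assumed barrier height, eV
i0 = area * a_star * temp ** 2 * math.exp(-phi_assumed / (K_B * temp))
print(round(barrier_height(i0, area, a_star, temp), 2))  # recovers the assumed 0.7 eV
```

Repeating this at each temperature, and then fitting a Gaussian barrier distribution, is what corrects the anomalously low apparent Richardson constant the abstract reports.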

Keywords: electrical properties, Gaussian distribution, Pd-GaN Schottky diodes, thermionic emission

Procedia PDF Downloads 263
33129 Rapid Identification and Diagnosis of the Pathogenic Leptospiras through Comparison among Culture, PCR and Real Time PCR Techniques from Samples of Human and Mouse Feces

Authors: S. Rostampour Yasouri, M. Ghane, M. Doudi

Abstract:

Leptospirosis is one of the most significant infectious and zoonotic diseases, with global spread. The disease causes economic losses and human fatalities in various countries, including the northern provinces of Iran. The aim of this research is to identify pathogenic leptospiras and compare rapid diagnostic techniques for them, given the multifaceted clinical manifestations of the disease and the risk of premature death of patients. In the spring and summer of 2020-2022, 25 fecal samples were collected from suspected leptospirosis patients and 25 fecal samples from mice residing in the rice fields and factories of Tonekabon city. Samples were prepared by centrifugation and passage through membrane filters. The culture technique used liquid and solid EMJH media incubated for one month at 30°C, after which the media were examined microscopically. DNA was extracted with an extraction kit, and leptospiras were identified by PCR and real-time PCR (SYBR Green) techniques using lipL32-specific primers. Among the patient samples, 11 (44%) and 8 (32%) were determined to be pathogenic Leptospira by real-time PCR and PCR, respectively; among the mouse samples, the corresponding figures were 9 (36%) and 3 (12%). Although the culture technique is considered the gold standard, it is not rapid, owing to the slow growth of pathogenic Leptospira and the lack of colony formation by some species. Real-time PCR allowed rapid diagnosis with much higher accuracy than PCR, because PCR could not reliably identify samples with lower microbial loads.

Keywords: culture, pathogenic leptospiras, PCR, real time PCR

Procedia PDF Downloads 69