Search results for: time domain reflectometry (TDR)
15919 Ionic Liquid and Chemical Denaturants Effects on the Fluorescence Properties of the Laccase
Authors: Othman Saoudi
Abstract:
In this work, we investigate the effects of chemical denaturants and synthesized ionic liquids on the fluorescence properties of the laccase from Trametes versicolor. The fluorescence of the laccase arises from the presence of tryptophan, whose aromatic core is responsible for absorption in the ultraviolet domain and for the emission of fluorescence photons. The effects of the Pyrrolidinium Formate ([pyrr][F]) and Morpholinium Formate ([morph][F]) ionic liquids on the laccase behavior are studied for various volumetric fractions. We show that the fluorescence spectrum in the presence of [pyrr][F] presents a single band with a maximum around 340 nm and a secondary peak at 361 nm for a volumetric fraction of 20% v/v. For concentrations above 40%, the fluorescence intensity decreases and the peaks shift toward higher wavelengths. For [morph][F], the fluorescence spectrum shows a single band around 340 nm, and the intensity of the principal peak decreases for concentrations above 20% v/v. From the plot of λₘₐₓ versus volumetric concentration, we determined the half-transition concentrations C1/2, equal to 42.62% and 40.91% v/v in the presence of [pyrr][F] and [morph][F], respectively. For chemical denaturation, we show that the fluorescence intensity decreases with increasing denaturant concentration while the emission maximum shifts toward higher wavelengths. From the spectra obtained with urea and GdmCl, we also determined the unfolding energy, ∆GD. The results show that the unfolding energy varies with denaturant concentration according to a linear regression model. The half-transitions C1/2 occur at urea and GdmCl concentrations of about 3.06 and 3.17 M, respectively.
Keywords: laccase, fluorescence, ionic liquids, chemical denaturants
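The linear dependence of ∆GD on denaturant concentration described above is the standard linear extrapolation model, ∆GD = ∆GD(H2O) − m·[D], whose zero crossing gives the half-transition C1/2 = ∆GD(H2O)/m. A minimal sketch of this fit, using hypothetical unfolding energies rather than the study's measured data:

```python
import numpy as np

# Hypothetical (concentration, unfolding energy) pairs, for illustration only;
# not the values measured in the study.
conc = np.array([0.0, 1.0, 2.0, 3.0, 4.0])    # denaturant concentration [M]
dG = np.array([12.5, 8.4, 4.3, 0.2, -3.9])    # unfolding energy [kJ/mol]

# Linear extrapolation model: dG = dG_water - m * conc (the fitted slope is -m).
slope, dG_water = np.polyfit(conc, dG, 1)
m = -slope
c_half = dG_water / m                          # concentration where dG crosses zero
print(f"dG(H2O) = {dG_water:.2f} kJ/mol, m = {m:.2f} kJ/mol/M, C1/2 = {c_half:.2f} M")
```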
Procedia PDF Downloads 507
15918 Effect of Catalyst on Castor Oil Based Polyurethane with Different Hard/Soft Segment Ratio
Authors: Swarnalata Sahoo, Smita Mohanty, S. K. Nayak
Abstract:
Environmentally friendly polyurethane (PU) synthesis from castor oil (CO) has been studied extensively. Probably due to its high proportion of fatty hydroxy acids and unsaturated bonds, CO performs better than other oils and can be easily utilized in commercial applications. In this work, cured PU polymers having different –NCO/OH ratios, with and without catalyst, were synthesized using a partially bio-based isocyanate with castor oil. Curing time was studied by monitoring the reaction, which was confirmed by ATR-FTIR. DSC was used to monitor the reaction between CO and isocyanates in a non-isothermal process. Curing kinetics were also studied to investigate the catalytic effect at each NCO/OH ratio of the polyurethane. Adhesion properties were evaluated from the lap shear test. The Tg of the PU polymer was evaluated by DSC and compared with DMA results. Surface properties were studied by contact angle measurement. Improvement of the interfacial adhesion between the nonpolar surface of the aluminum substrate and the polar adhesive was achieved by modifying the surface.
Keywords: polyurethane, partially bio-based isocyanate, castor oil, catalyst
Procedia PDF Downloads 450
15917 Optimization of Electrical Discharge Machining Parameters in Machining AISI D3 Tool Steel by Grey Relational Analysis
Authors: Othman Mohamed Altheni, Abdurrahman Abusaada
Abstract:
This study presents optimization of multiple performance characteristics [material removal rate (MRR), surface roughness (Ra), and overcut (OC)] of hardened AISI D3 tool steel in electrical discharge machining (EDM) using the Taguchi method and Grey relational analysis. The machining process parameters selected were pulsed current Ip, pulse-on time Ton, pulse-off time Toff, and gap voltage Vg. Based on ANOVA, pulse current is found to be the most significant factor affecting the EDM process. Optimized process parameters simultaneously leading to a higher MRR, lower Ra, and lower OC were then verified through a confirmation experiment. The validation experiment shows improved MRR, Ra, and OC when the Taguchi method and Grey relational analysis are used.
Keywords: EDM parameters, grey relational analysis, Taguchi method, ANOVA
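Grey relational analysis reduces the three responses to a single grade: each response is normalized (larger-the-better for MRR, smaller-the-better for Ra and OC), deviations from the ideal sequence are converted to grey relational coefficients, and the coefficients are averaged. A minimal sketch with hypothetical response values; the distinguishing coefficient ζ = 0.5 is the conventional choice, not a value from the paper:

```python
import numpy as np

# Hypothetical responses for 4 EDM runs (illustrative values only).
mrr = np.array([12.1, 18.4, 9.7, 15.2])   # larger-the-better
ra  = np.array([3.2, 4.1, 2.8, 3.6])      # smaller-the-better
oc  = np.array([0.11, 0.15, 0.09, 0.13])  # smaller-the-better

def normalize(x, larger_is_better):
    if larger_is_better:
        return (x - x.min()) / (x.max() - x.min())
    return (x.max() - x) / (x.max() - x.min())

Z = np.column_stack([normalize(mrr, True), normalize(ra, False), normalize(oc, False)])
delta = 1.0 - Z                 # deviation from the ideal (fully normalized) sequence
zeta = 0.5                      # distinguishing coefficient
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grade = grc.mean(axis=1)        # grey relational grade per run
print("best run:", grade.argmax() + 1, grade.round(3))
```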
Procedia PDF Downloads 294
15916 Assessing Functional Structure in European Marine Ecosystems Using a Vector-Autoregressive Spatio-Temporal Model
Authors: Katyana A. Vert-Pre, James T. Thorson, Thomas Trancart, Eric Feunteun
Abstract:
In marine ecosystems, spatial and temporal species structure is an important component of ecosystems’ response to anthropogenic and environmental factors. Although spatial distribution patterns and temporal series of fish abundance have been studied in the past, little research has been devoted to the joint dynamic spatio-temporal functional patterns in marine ecosystems and their use in multispecies management and conservation. Each species represents a function in the ecosystem, and the distribution of these species might not be random. A heterogeneous functional distribution will lead to an ecosystem more resilient to external factors. Applying a Vector-Autoregressive Spatio-Temporal (VAST) model for count data, we estimate the spatio-temporal distribution, shift in time, and abundance of 140 species of the Eastern English Channel, Bay of Biscay, and Mediterranean Sea. From the model outputs, we determined spatio-temporal clusters, calculating p-values for hierarchical clustering via multiscale bootstrap resampling. Then, we designed a functional map given the defined clusters. We found that the species distribution within the ecosystem was not random. Indeed, species evolved in space and time in clusters. Moreover, these clusters remained similar over time, owing to the fact that species of the same cluster often shifted in sync, keeping the overall structure of the ecosystem similar over time. Knowing the co-existing species within these clusters could help with predicting the distribution and abundance of data-poor species. Further analysis is being performed to assess the ecological functions represented in each cluster.
Keywords: cluster distribution shift, European marine ecosystems, functional distribution, spatio-temporal model
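The clustering step pairs a dendrogram with bootstrap support for each cluster (the study uses multiscale bootstrap p-values, as in pvclust). A simplified sketch using an ordinary bootstrap, resampling features and counting how often each reference cluster reappears, on hypothetical species-by-feature data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Hypothetical matrix: rows = species, columns = spatio-temporal abundance features.
X = rng.poisson(5, size=(20, 12)).astype(float)

def clusters(data, k=4):
    """Cut a Ward dendrogram into k groups, returned as frozensets of row indices."""
    labels = fcluster(linkage(data, method="ward"), t=k, criterion="maxclust")
    return {frozenset(np.where(labels == c)[0]) for c in np.unique(labels)}

reference = clusters(X)
support = {c: 0 for c in reference}
n_boot = 200
for _ in range(n_boot):
    resampled = X[:, rng.integers(0, X.shape[1], X.shape[1])]  # resample features
    boot = clusters(resampled)
    for c in reference:
        support[c] += c in boot        # cluster recurs with identical membership
for c, s in support.items():
    print(sorted(c), f"bootstrap support = {s / n_boot:.2f}")
```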
Procedia PDF Downloads 194
15915 Techniques of Construction Management in Civil Engineering
Authors: Mamoon M. Atout
Abstract:
The Middle East Gulf region has witnessed rapid growth and development in many areas over the last two decades. The development of the real-estate sector, construction industry, and infrastructure projects accounts for a major share of the development that has contributed to the advancement of the Gulf countries. Construction industry projects were planned and managed by different types of experts, who came from all over the world with different types of experience in construction management and industry. Some of these projects were completed on time, while many were not, due to many accumulating factors. These accumulated factors are considered the principal reason for the problems experienced at the project construction stage, which reflected negatively on project success. Specific causes of delay have been identified by construction managers to avoid any unexpected delays through proper analysis and consideration of implications such as risk assessment and analysis of potential problems, to ensure that projects are delivered on time. Construction management implications were adopted and considered by project managers who have experience and knowledge in applying the techniques of engineering construction management. The aim of this research is to determine the benefits of the implications of construction management applied by the construction team, and the level of consideration of these techniques and processes during the project development and construction phases, to avoid any delay in the projects. It also aims to determine the factors that contribute to project completion delays when project managers are not well committed to their roles and responsibilities. The results of the analysis will determine the applications required by the project team to avoid the causes of delays and help them deliver projects on time, e.g., verifying tender documents and quantities and preparing the construction method of the project.
Keywords: construction management, control process, cost control, planning and scheduling
Procedia PDF Downloads 247
15914 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy
Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu
Abstract:
The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine the reliability and, in turn, to understand the improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Additionally, determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, since either can significantly increase the total failure rate. To address this, the well-established statistical Laplace test is applied to infer the behavior of sensors and to accurately ascertain the duration of the different phases in the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and wear-out phase are accurately estimated, by validating the data in individual phases with Weibull distribution curve-fitting analysis. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the commencement of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications.
Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis
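The two statistical tools named above are both standard. The Laplace trend test compares the mean failure time against the midpoint of the observation window (U < 0 suggests a thinning failure rate, i.e., the tail of infant mortality; U > 0 suggests wear-out), and a Weibull shape parameter beta < 1 points the same way. A sketch with hypothetical failure times, not the study's sensor data:

```python
import numpy as np
from scipy import stats

# Hypothetical failure times (days) within an observation window of length T.
t = np.array([5., 9., 20., 41., 70., 110., 160., 230.])
T = 300.0

# Laplace trend test: U < 0 -> improving (failures thinning out, e.g. the end of
# infant mortality); U > 0 -> deteriorating (wear-out); |U| small -> stable.
n = len(t)
U = (t.mean() - T / 2) / (T * np.sqrt(1 / (12 * n)))
p = 2 * stats.norm.sf(abs(U))
print(f"Laplace U = {U:.2f}, p = {p:.3f}")

# Weibull fit on inter-failure times; shape beta < 1 also indicates infant mortality.
beta, loc, eta = stats.weibull_min.fit(np.diff(np.r_[0.0, t]), floc=0)
print(f"Weibull shape beta = {beta:.2f}, scale eta = {eta:.1f}")
```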
Procedia PDF Downloads 65
15913 FESA: Fuzzy-Controlled Energy-Efficient Selective Allocation and Reallocation of Tasks Among Mobile Robots
Authors: Anuradha Banerjee
Abstract:
Energy-aware operation is one of the visionary goals in the area of robotics because the operability of robots is greatly dependent upon their residual energy. In practice, the tasks allocated to robots carry different priorities, and often an upper time limit is imposed within which the task needs to be completed. If a robot is unable to complete one particular task given to it, the task is reallocated to some other robot. The collection of robots is controlled by a Central Monitoring Unit (CMU). Selection of the new robot is performed by a fuzzy controller called the Task Reallocator (TRAC). It accepts parameters such as the residual energy of the robots, the possibility that the task will be successfully completed by the new robot within the stipulated time, and the distance of the new robot (to which the task is reallocated) from the old one (where the task was running). The proposed methodology increases the probability of completing globally assigned tasks and saves a huge amount of energy across the collection of robots.
Keywords: energy-efficiency, fuzzy-controller, priority, reallocation, task
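A zero-order Sugeno-style sketch of how such a controller can rate candidate robots; the membership functions, rule base, and output levels below are hypothetical stand-ins for TRAC's actual design:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def trac_score(energy, success_prob, distance):
    """Rate a candidate robot; inputs normalized to [0, 1]. Hypothetical rule base."""
    rules = [
        # (rule firing strength, crisp output level in [0, 1])
        (min(tri(energy, 0.4, 1.0, 1.6), tri(success_prob, 0.4, 1.0, 1.6)), 1.0),
        (min(tri(energy, 0.4, 1.0, 1.6), tri(distance, -0.6, 0.0, 0.6)), 0.8),
        (tri(energy, -0.6, 0.0, 0.6), 0.1),   # low residual energy -> poor choice
    ]
    num = sum(w * out for w, out in rules)    # weighted-average defuzzification
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

# Pick the best candidate among three robots (energy, P(success), distance).
candidates = {"R1": (0.9, 0.8, 0.2), "R2": (0.5, 0.9, 0.7), "R3": (0.2, 0.95, 0.1)}
best = max(candidates, key=lambda r: trac_score(*candidates[r]))
print("reallocate to:", best)
```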
Procedia PDF Downloads 316
15912 Evaluation of Model-Based Code Generation for Embedded Systems–Mature Approach for Development in Evolution
Authors: Nikolay P. Brayanov, Anna V. Stoynova
Abstract:
The model-based development approach is gaining support and acceptance. Its higher abstraction level simplifies system description, allowing domain experts to do their best without particular knowledge of programming. The different levels of simulation support rapid prototyping, verifying, and validating the product even before it exists physically. Nowadays, the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, which brings extra automation to the expensive device certification process and especially to software qualification. Using it, some companies report cost savings and quality improvements, but others claim no major changes or even cost increases. This publication demonstrates the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using The MathWorks, Inc. tools. The model, created with Simulink, Stateflow, and Matlab, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of automatically generated embedded code and manually developed code. The measurements show that, generally, the code generated by the automatic approach is not worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.
Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development
Procedia PDF Downloads 244
15911 Linguistic Insights Improve Semantic Technology in Medical Research and Patient Self-Management Contexts
Authors: William Michael Short
Abstract:
‘Semantic Web’ technologies such as the Unified Medical Language System Metathesaurus, SNOMED-CT, and MeSH have been touted as transformational for the way users access online medical and health information, enabling both the automated analysis of natural-language data and the integration of heterogeneous health-related resources distributed across the Internet through the use of standardized terminologies that capture concepts and relationships between concepts that are expressed differently across datasets. However, the approaches that have so far characterized ‘semantic bioinformatics’ have not yet fulfilled the promise of the Semantic Web for medical and health information retrieval applications. This paper argues within the perspective of cognitive linguistics and cognitive anthropology that four features of human meaning-making must be taken into account before the potential of semantic technologies can be realized for this domain. First, many semantic technologies operate exclusively at the level of the word. However, texts convey meanings in ways beyond lexical semantics. For example, transitivity patterns (distributions of active or passive voice) and modality patterns (configurations of modal constituents like may, might, could, would, should) convey experiential and epistemic meanings that are not captured by single words. Language users also naturally associate stretches of text with discrete meanings, so that whole sentences can be ascribed senses similar to the senses of words (so-called ‘discourse topics’). Second, natural language processing systems tend to operate according to the principle of ‘one token, one tag’. For instance, occurrences of the word sound must be disambiguated for part of speech: in context, is sound a noun or a verb or an adjective? In syntactic analysis, deterministic annotation methods may be acceptable. But because natural language utterances are typically characterized by polyvalency and ambiguities of all kinds (including intentional ambiguities), such methods leave the meanings of texts highly impoverished. Third, ontologies tend to be disconnected from everyday language use and so struggle in cases where single concepts are captured through complex lexicalizations that involve profile shifts or other embodied representations. More problematically, concept graphs tend to capture ‘expert’ technical models rather than ‘folk’ models of knowledge and so may not match users’ common-sense intuitions about the organization of concepts in prototypical structures rather than Aristotelian categories. Fourth, and finally, most ontologies do not recognize the pervasively figurative character of human language. However, since the time of Galen the widespread use of metaphor in the linguistic usage of both medical professionals and lay persons has been recognized. In particular, metaphor is a well-documented linguistic tool for communicating experiences of pain. Because semantic medical knowledge-bases are designed to help capture variations within technical vocabularies – rather than the kinds of conventionalized figurative semantics that practitioners as well as patients actually utilize in clinical description and diagnosis – they fail to capture this dimension of linguistic usage.
The failure of semantic technologies in these respects degrades the efficiency and efficacy not only of medical research, where information retrieval inefficiencies can lead to direct financial costs to organizations, but also of care provision, especially in contexts of patients’ self-management of complex medical conditions.
Keywords: ambiguity, bioinformatics, language, meaning, metaphor, ontology, semantic web, semantics
Procedia PDF Downloads 132
15910 Towards Positive Identity Construction for Japanese Non-Native English Language Teachers
Authors: Yumi Okano
Abstract:
The low level of English proficiency among Japanese people has been a problem for a long time. Japanese non-native English language teachers, under social or ideological constraints, feel a gap between government policy and their language proficiency and cannot maintain high self-esteem. This paper focuses on current Japanese policies and the social context in which teachers are placed, and examines the measures necessary for their positive identity formation from a macro-meso-micro perspective. Some suggestions for achieving this are: 1) teachers should free themselves from the idea of native speakers and embrace local needs and accents; 2) teachers should be involved in student discussions as facilitators and individuals so that they can be good role models for their students; 3) teachers should invest in their classrooms; 4) guidelines and training should be provided to help teachers gain confidence, in addition to reducing the workload to make more time available; and 5) opportunities for investment outside the classroom, in the real world, should be expanded.
Keywords: language teacher identity, native speakers, government policy, critical pedagogy, investment
Procedia PDF Downloads 103
15909 Core Number Optimization Based Scheduler to Order/Mapp Simulink Application
Authors: Asma Rebaya, Imen Amari, Kaouther Gasmi, Salem Hasnaoui
Abstract:
Over the last few years, the number of cores has witnessed a spectacular increase in digital signal and general-purpose processors. Concurrently, significant research has been done to benefit from this high degree of parallelism. Indeed, this research is focused on providing efficient scheduling of hardware/software systems onto multicore architectures. The scheduling process consists of statically choosing one core to execute each task and specifying an execution order for the application tasks. In this paper, we describe an efficient scheduler that calculates the optimal number of cores required to schedule an application, gives a heuristic scheduling solution, and evaluates its cost. Our proposal's results are evaluated and compared with the Preesm scheduler's results, and we show that ours allows better scheduling in terms of latency, computation time, and number of cores.
Keywords: computation time, hardware/software system, latency, optimization, multi-cores platform, scheduling
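A minimal sketch of the underlying idea: list-schedule a task graph on k cores and find the smallest k beyond which the makespan stops improving. The task graph, execution times, and longest-task-first tie-breaking below are hypothetical, not the scheduler's actual heuristic:

```python
# Hypothetical task graph: task -> (execution time, set of predecessors).
tasks = {
    "A": (2, set()), "B": (3, {"A"}), "C": (4, {"A"}),
    "D": (2, {"B", "C"}), "E": (3, {"C"}), "F": (1, {"D", "E"}),
}

def list_schedule(n_cores):
    """Greedy list scheduling; returns the makespan for n_cores cores."""
    finish, cores, done = {}, [0.0] * n_cores, set()
    while len(done) < len(tasks):
        ready = [t for t, (_, pred) in tasks.items() if t not in done and pred <= done]
        t = max(ready, key=lambda t: tasks[t][0])        # longest-task-first
        c = min(range(n_cores), key=lambda i: cores[i])  # earliest-free core
        start = max([cores[c]] + [finish[p] for p in tasks[t][1]])
        finish[t] = cores[c] = start + tasks[t][0]
        done.add(t)
    return max(finish.values())

# Smallest core count beyond which latency no longer improves.
spans = {k: list_schedule(k) for k in range(1, len(tasks) + 1)}
optimal = min(k for k in spans if spans[k] == min(spans.values()))
print(spans, "-> optimal core count:", optimal)
```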
Procedia PDF Downloads 283
15908 Effectiveness of Multi-Business Core Development Policy in Tokyo Metropolitan Area
Authors: Takashi Nakamura
Abstract:
In the Tokyo metropolitan area, traffic congestion and long commute times are caused by overconcentration in the central area. To resolve these problems, a core business city development policy was adopted in 1988. The core business cities, which include Yokohama, Chiba, Saitama, Tachikawa, and others, have designated business facilities accumulation districts where assistance measures are applied. Focusing on Yokohama city, this study investigates the trends in the number of offices, employees, and commuters in the 2001 and 2012 Economic Censuses, as well as the average commute time in the Tokyo metropolitan area from the 2005 to 2015 Metropolitan Transportation Censuses. The 2001 and 2012 Economic Census surveys covered participants who worked in Yokohama, according to their distribution in the city's 1,757 subregions. Four main findings emerged: (1) the number of offices in Yokohama increased while the number of offices in the Tokyo metropolitan area overall decreased, and the number of employees in Yokohama also increased; (2) the number of commuters to Tokyo's central area increased from Saitama prefecture, the Tokyo Tama area, and the Tokyo central area, but decreased from other areas; (3) the average commute time in the Tokyo metropolitan area was 67.7 minutes in 2015, a slight decrease from 2005 and 2010; (4) the number of employees at business facilities accumulation districts in Yokohama city increased greatly.
Keywords: core business city development policy, commute time, number of employees, Yokohama city, distribution of employees
Procedia PDF Downloads 143
15907 Comparison of Different DNA Extraction Platforms with FFPE tissue
Authors: Wang Yanping Karen, Mohd Rafeah Siti, Park MI Kyoung
Abstract:
Formalin-fixed paraffin-embedded (FFPE) tissue is important in the area of oncological diagnostics. This method of preserving tissues enables them to be stored easily at ambient temperature for a long time. This decreases the risk of losing DNA quantity and quality after extraction, reduces sample wastage, and makes FFPE more cost-effective. However, extracting DNA from FFPE tissue is a challenge, as the purified DNA is often highly cross-linked, fragmented, and degraded, which in turn causes problems for many downstream processes. In this study, the DNA extraction efficiency of One BioMed's Xceler8 automated platform is compared with commercially available extraction kits (Qiagen and Roche). The FFPE tissue slices were subjected to deparaffinization, pretreatment, and then DNA extraction using the three platforms. The DNA quantity was determined with real-time PCR (Bio-Rad CFX) and gel electrophoresis. The amount of DNA extracted with One BioMed's X8 platform was found to be comparable with that of the two manual extraction kits.
Keywords: DNA extraction, FFPE tissue, Qiagen, Roche, One BioMed X8
Procedia PDF Downloads 107
15906 Efficiency of Treatment in Patients with Newly Diagnosed Destructive Pulmonary Tuberculosis Using Intravenous Chemotherapy
Authors: M. Kuzhko, M. Gumeniuk, D. Butov, T. Tlustova, O. Denysov, T. Sprynsian
Abstract:
Background: The aim of the research was to determine the effectiveness of chemotherapy using intravenous anti-tuberculosis drugs compared with their oral administration during the intensive phase of treatment. Methods: 152 tuberculosis patients were randomized into 2 groups: main (n=65), who received isoniazid, ethambutol, and sodium rifamycin intravenously plus pyrazinamide per os, and control (n=87), who received all the drugs (isoniazid, rifampicin, ethambutol, pyrazinamide) orally. Results: After 2 weeks of treatment, symptoms of intoxication disappeared in 59 (90.7±3.59%) patients of the main group and 60 (68.9±4.9%) patients of the control group, p < 0.05. The mean duration of symptoms of intoxication was 9.6±0.7 days in the main group and 13.7±0.9 days in the control group. After completing the intensive phase, sputum conversion was found in all patients of the main group and in 71 (81.6±4.1%) patients of the control group, p < 0.05. The average time to sputum conversion was 1.6±0.1 months in the main group and 1.9±0.1 months in the control group, p > 0.05. In patients with destructive pulmonary tuberculosis, time to sputum conversion was 1.7±0.1 months in the main group and 2.2±0.2 months in the control group, p < 0.05. The average time of cavity healing was 2.9±0.2 months in the main group and 3.9±0.2 months in the control group, p < 0.05. Conclusions: In patients with newly diagnosed destructive pulmonary tuberculosis, the use of intravenous isoniazid, ethambutol, and sodium rifamycin in the intensive phase of chemotherapy resulted in a significant reduction in the time to disappearance of symptoms of intoxication and to sputum conversion.
Keywords: intravenous chemotherapy, tuberculosis, treatment efficiency, tuberculosis drugs
Procedia PDF Downloads 202
15905 Accurate Calculation of the Penetration Depth of a Bullet Using ANSYS
Authors: Eunsu Jang, Kang Park
Abstract:
In developing an armored ground combat vehicle (AGCV), analyzing the vulnerability (or survivability) of the AGCV against an enemy's attack is a very important step. In vulnerability analysis, penetration equations are usually used to obtain the penetration depth and check whether a bullet can penetrate the armor of the AGCV, which would damage internal components or crew. The penetration equations are derived from penetration experiments, which require a long time and great effort. However, they usually hold only for the specific target material and the specific type of bullet used in the experiments. Thus, penetration simulation using ANSYS can be another option for calculating penetration depth. However, it is very important to model the targets and select the input parameters properly in order to get an accurate penetration depth. This paper performs a sensitivity analysis of ANSYS input parameters with respect to the accuracy of the calculated penetration depth. Two conflicting objectives need to be achieved in adopting ANSYS for penetration analysis: maximizing the accuracy of calculation and minimizing the calculation time. To maximize the calculation accuracy, a sensitivity analysis of the input parameters was performed and the RMS error with respect to the experimental data was calculated. The input parameters, including mesh size, boundary condition, material properties, and target diameter, were tested and selected to minimize the error between the calculated results from simulation and the experimental data from papers on the penetration equation. To minimize the calculation time, the parameter values obtained from the accuracy analysis were adjusted to optimize overall performance. The analysis found the following: 1) as the mesh size gradually decreases from 0.9 mm to 0.5 mm, both the penetration depth and calculation time increase; 2) as the diameter of the target decreases from 250 mm to 60 mm, both the penetration depth and calculation time decrease; 3) as the yield stress of the target material decreases, the penetration depth increases; 4) the boundary condition with only the side surface of the target fixed gives more penetration depth than that with both the side and rear surfaces fixed. Using these findings, the input parameters can be tuned to minimize the error between simulation and experiments. With delicately tuned input parameters, penetration analysis can be done on a computer using the simulation tool ANSYS without actual experiments. Penetration experiment data are usually hard to get for security reasons, and published papers provide them only for a limited set of target materials. The next step of this research is to generalize this approach to anticipate the penetration depth by interpolating the known penetration experiments. The result may not be accurate enough to replace penetration experiments, but such simulations can be used in the early modelling and simulation stage of the AGCV design process.
Keywords: ANSYS, input parameters, penetration depth, sensitivity analysis
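The parameter selection described above amounts to a grid search that minimizes the RMS error between simulated and experimental depths. A schematic sketch of that loop; the simulate() function is a hypothetical stand-in (in practice each evaluation is a batch ANSYS run), and all numbers are illustrative:

```python
import numpy as np
from itertools import product

# Hypothetical experimental penetration depths (mm) at three impact conditions.
exp_depth = np.array([18.0, 26.5, 34.0])

def simulate(mesh_mm, diameter_mm):
    """Stand-in for an ANSYS batch run; returns depths for the three conditions."""
    base = np.array([17.0, 25.0, 33.0])
    return base * (1 + 0.05 * (0.9 - mesh_mm)) * (1 + 0.02 * (250 - diameter_mm) / 190)

def rms_error(params):
    return np.sqrt(np.mean((simulate(*params) - exp_depth) ** 2))

# Grid search over mesh size (mm) x target diameter (mm).
grid = list(product([0.5, 0.7, 0.9], [60, 150, 250]))
best = min(grid, key=rms_error)
print("lowest-RMS parameters (mesh, diameter):", best, f"RMS = {rms_error(best):.2f} mm")
```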
Procedia PDF Downloads 401
15904 Thermoelectric Properties of Doped Polycrystalline Silicon Film
Authors: Li Long, Thomas Ortlepp
Abstract:
The transport properties of carriers in polycrystalline silicon films affect the performance of polycrystalline-silicon-based devices. They depend strongly on the grain structure, grain boundary trap properties, and doping concentration, which in turn are determined by the film deposition and processing conditions. Based on the properties of charge carriers, phonons, grain boundaries, and their interactions, the thermoelectric properties of polycrystalline silicon are analyzed with the relaxation time approximation of the Boltzmann transport equation. With this approach, the thermal conductivity, electrical conductivity, and Seebeck coefficient can be determined as functions of grain size, trap properties, and doping concentration. Experiments on heavily doped polycrystalline silicon are carried out, and the measurement results are compared with the model.
Keywords: conductivity, polycrystalline silicon, relaxation time approximation, Seebeck coefficient, thermoelectric property
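In the relaxation time approximation, the Seebeck coefficient follows from energy-weighted transport integrals, S = -(1/qT) ∫ σ(E)(E-μ)(-∂f/∂E)dE / ∫ σ(E)(-∂f/∂E)dE. A numerical sketch for a single parabolic band with a power-law relaxation time τ ∝ E^r; it omits the grain-boundary trapping and grain-size dependence modeled in the paper, and all parameters are illustrative:

```python
import numpy as np

kT = 0.0259                          # thermal energy at T = 300 K (eV)
T = 300.0
E = np.linspace(1e-4, 1.0, 4001)     # carrier energies above the band edge (eV)

def neg_df_dE(E, mu):
    """-df/dE for the Fermi-Dirac distribution, in 1/eV."""
    x = np.clip((E - mu) / kT, -50, 50)
    return np.exp(x) / (kT * (1.0 + np.exp(x)) ** 2)

def seebeck(mu, r=0.0):
    """Seebeck coefficient (V/K) for electrons; constant prefactors cancel."""
    sigma_E = E ** (r + 1.0)         # transport distribution for a parabolic band
    w = sigma_E * neg_df_dE(E, mu)
    mean_E = np.trapz(w * (E - mu), E) / np.trapz(w, E)   # <E - mu> in eV
    return -mean_E / T               # eV/(e*K) reduces to V/K

for mu in [-0.10, -0.05, 0.0, 0.05]:  # chemical potential vs. band edge (eV)
    print(f"mu = {mu:+.2f} eV  ->  S = {seebeck(mu) * 1e6:7.1f} uV/K")
```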
Procedia PDF Downloads 124
15903 A Portable Device for Pulse Wave Velocity Measurements
Authors: Chien-Lin Wang, Cha-Ling Ko, Tainsong Chen
Abstract:
The pulse wave velocity (PWV) of blood flow provides important information on vessel properties and blood pressure, which can be used to assess cardiovascular disease. However, such measurements normally need expensive equipment, such as Doppler ultrasound, MRI, angiography, etc. Photoplethysmograph (PPG) signals are commonly utilized to detect blood volume changes. In this study, two infrared (IR) probes are designed and placed a fixed distance apart, at the finger base and the fingertip. An analog circuit with automatic gain adjustment is implemented to obtain stable original PPG signals from the two IR probes. In order to obtain the time delay between the two PPG signals precisely, the pulse transit time is derived from the second derivative of the original PPG signals. To make the PWV measurement device portable, wireless, and low-power, Bluetooth Low Energy 4.0 (BLE) and a microcontroller (Cortex™-M3) are used in this study. The PWV is highly correlated with blood pressure, so this portable device has the potential to be used for continuous blood pressure monitoring.
Keywords: pulse wave velocity, photoplethysmography, portable device, biomedical engineering
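The processing chain reduces to finding matching fiducial points on the second derivative of the two PPG channels and dividing the probe separation by the mean time delay. A sketch on synthetic waveforms; the sampling rate, probe separation, and filter settings below are assumptions, not the device's specifications:

```python
import numpy as np
from scipy.signal import find_peaks, savgol_filter

fs = 1000.0            # sampling rate (Hz), assumed
probe_gap_m = 0.07     # probe separation (m), hypothetical

def fiducials(ppg):
    """Beat fiducial times (s) from peaks of the second derivative of a PPG trace."""
    sdppg = savgol_filter(ppg, window_length=31, polyorder=3, deriv=2, delta=1/fs)
    peaks, _ = find_peaks(sdppg, distance=int(0.3 * fs))   # >= 0.3 s between beats
    return peaks / fs

# Synthetic two-channel recording: the fingertip signal lags by 10 ms.
t = np.arange(0, 5, 1 / fs)
proximal = np.sin(2 * np.pi * 1.2 * t) ** 2          # finger base
distal = np.sin(2 * np.pi * 1.2 * (t - 0.01)) ** 2   # fingertip

prox, dist = fiducials(proximal), fiducials(distal)
# Pair each proximal fiducial with the nearest distal one, then average the delays.
ptt = np.mean([dist[np.argmin(np.abs(dist - p))] - p for p in prox])
print(f"PTT = {ptt * 1e3:.1f} ms, PWV = {probe_gap_m / ptt:.2f} m/s")
```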
Procedia PDF Downloads 527
15902 [Keynote Talk]: Caught in the Tractorbeam of Larger Influences: The Filtration of Innovation in Education Technology Design
Authors: Justin D. Olmanson, Fitsum Abebe, Valerie Jones, Eric Kyle, Xianquan Liu, Katherine Robbins, Guieswende Rouamba
Abstract:
The history of education technology, and of designing, adapting, and adopting technologies for use in educational spaces, is nuanced, complex, and dynamic. Yet, despite a range of continually emerging technologies, the design and development process often yields results that appear quite similar in terms of affordances and interactions. Through this study we (1) verify the extent to which designs have been constrained, (2) consider what might account for it, and (3) offer a way forward in terms of how we might identify and strategically sidestep these influences, thereby increasing the diversity of our designs with a given technology or within a particular learning domain. We begin our inquiry from the perspective that a host of co-influencing elements, fields, and metanarratives converge on the education technology design process to exert a tangible, often homogenizing effect on the resultant designs. We identify several elements that influence design in often implicit or unquestioned ways (e.g., curriculum, learning theory, economics, learning context, pedagogy), describe our methodology for identifying the elemental positionality embedded in a design, direct our analysis to a particular subset of technologies in the field of literacy, and unpack our findings. Our early analysis suggests that the majority of education technologies designed for use in US public schools are heavily influenced by a handful of mainstream theories and metanarratives. These findings have implications for how we approach the education technology design process, which we use to suggest alternative methods for designing and developing with emerging technologies. Our analytical process and reconceptualized design process hold the potential to diversify the ways emerging and established technologies get incorporated into our designs.
Keywords: curriculum, design, innovation, metanarratives
Procedia PDF Downloads 509
15901 Analysing Environmental Licensing of Infrastructure Projects in Brazil
Authors: Ronaldo Seroa Da Motta, Gabriela Santiago
Abstract:
The main contribution of this study is the identification of the factors influencing the environmental licensing process of infrastructure projects in Brazil. These factors reflect the technical characteristics of the project, the corporate governance of the entrepreneur, and the institutional and regulatory governance of the environmental agency, including the number of interventions by non-licensing agencies. The model relates these variables to the licensing processing time of 34 infrastructure projects. Our results indicate that processing time is most sensitive to the type of enterprise, its complexity (as in gas pipelines and hydroelectric plants in the most vulnerable biomes), the value of the enterprise or the entrepreneur's assets, and the number of employees of the licensing agency. The number of external interventions by other non-licensing institutions does not affect the licensing time. These results challenge the common criticism that environmental licensing is a barrier to speeding up investment in infrastructure projects in Brazil because of the participation of civil society and other non-licensing institutions.
Keywords: environmental licensing, conditionants, Brazil, timing process
Procedia PDF Downloads 135
15900 A Mixed Methods Research Design for the Development of the Xenia Higher Education Institutions' Inclusiveness Index
Authors: Achilles Kameas, Eleni Georgakakou, Anna Lisa Amodeo, Aideen Quilty, Aisling Malone, Roberta Albertazzi, Moises Carmona, Concetta Esposito, Ruben David Fernandez Carrasco, Carmela Ferrara, Francesco Garzillo, Mojca Pusnik, Maria Cristina Scarano
Abstract:
While researchers, especially in academia, study the phenomena of inclusion of sexual minority and gender-marginalized groups, European Higher Education Institutions (HEIs) seldom act on lowering the cultural and educational barriers to their proactive inclusion. The challenge in European HEIs is that gender and sexual orientation discrimination remains an issue not adequately addressed. The XENIA HEI Inclusiveness Index is an instrument that will allow universities to gauge and assess their inclusiveness in the domain of discrimination and exclusion based on gender identity and sexual orientation. It follows a mixed methods research design of quantitative and qualitative techniques and tools, applied in five (5) European countries (Italy, Greece, Ireland, Slovenia, and Spain), which combines desk research, evaluation, and weighting processes for a matrix based on objective indicators with a survey for students and staff of the HEI to gauge the perception of inclusiveness in the HEI context. The index will allow capturing the depth and reach of policies, programmes, and initiatives of HEIs in tackling the phenomena and dynamics of exclusion of LGBT+ people (lesbian, gay, bisexual, trans, and other groups marginalized on the basis of gender and sexual identity) and cisgender women exposed to the risk of discrimination.
Keywords: gender identity, higher education, LGBT+ rights, XENIA inclusiveness index
Procedia PDF Downloads 163
15899 Networked Radar System to Increase Safety of Urban Railroad Crossing
Authors: Sergio Saponara, Luca Fanucci, Riccardo Cassettari, Ruggero Piernicola, Marco Righetto
Abstract:
The paper presents an innovative networked radar system for the detection of obstacles in a railway level crossing scenario. This Monitoring System (MS) is able to detect moving or still obstacles within the railway level crossing area automatically, avoiding the need for human surveillance. The MS is also connected to the National Railway Information and Signaling System to communicate the level crossing status in real time. The architecture is compliant with the highest Safety Integrity Level (SIL4) of the CENELEC standard. The number of radar sensors used is configurable at set-up time and depends on how large the level crossing area can be: at least two sensors are expected, and up to four can be used for larger areas. The whole processing chain that elaborates the output sensor signals, as well as the communication interface, is fully digital; it was designed in VHDL code and implemented on a Xilinx Virtex 6 FPGA.
Keywords: radar for safe mobility, railroad crossing, railway, transport safety
Procedia PDF Downloads 480
15898 Parking Space Detection and Trajectory Tracking Control for Vehicle Auto-Parking
Authors: Shiuh-Jer Huang, Yu-Sheng Hsu
Abstract:
An on-board parking space detection system, parking trajectory planning, and a tracking control mechanism are the key components of a vehicle backward auto-parking system. First, a pair of ultrasonic sensors is installed on each side of the vehicle body surface to detect the relative distance between the ego car and surrounding obstacles. The dimension of a found empty space can be calculated from the vehicle speed and the time history of the ultrasonic sensor readings. This result is used to construct a 2D map of the vehicle's environment and to judge the available parking type. Finally, the auto-parking controller executes on-line optimal parking trajectory planning based on this 2D environmental map and monitors the real-time parking trajectory tracking control. This low-cost auto-parking system was tested on a model car.
Keywords: vehicle auto-parking, parking space detection, parking path tracking control, intelligent fuzzy controller
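The slot-detection step is essentially an integration of vehicle speed over the interval during which the lateral ultrasonic range exceeds a clearance threshold. A sketch of that calculation; the sampling rate, thresholds, and required slot length are hypothetical, not the system's calibrated values:

```python
import numpy as np

fs = 20.0                    # ultrasonic sampling rate (Hz), assumed
gap_threshold_m = 1.5        # lateral range beyond which the space counts as empty
min_slot_m = 5.0             # required slot length for this vehicle, hypothetical

def find_slot(lateral_dist, speed_mps):
    """Scan a drive-by distance trace for a long-enough parking gap."""
    empty = lateral_dist > gap_threshold_m
    length, start = 0.0, None
    for i, (e, v) in enumerate(zip(empty, speed_mps)):
        if e:
            start = i if start is None else start
            length += v / fs                   # integrate speed over the gap
        else:
            if length >= min_slot_m:
                return start / fs, length      # gap entry time (s) and length (m)
            length, start = 0.0, None
    return None

# Hypothetical drive-by at 2 m/s: 2 s along a parked car, 3 s of gap, 2 s of car.
dist = np.r_[np.full(40, 0.8), np.full(60, 2.5), np.full(40, 0.8)]
speed = np.full_like(dist, 2.0)
print(find_slot(dist, speed))   # -> (2.0, 6.0): a 6 m slot starting at t = 2 s
```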
Procedia PDF Downloads 244
15897 A High Performance Piano Note Recognition Scheme via Precise Onset Detection and Segmented Short-Time Fourier Transform
Authors: Sonali Banrjee, Swarup Kumar Mitra, Aritra Acharyya
Abstract:
A piano note recognition method is proposed by the authors in this paper. The authors use a comprehensive method for onset detection of each note present in a piano piece, followed by a segmented short-time Fourier transform (STFT) for the identification of the piano notes. The performance of the proposed method is evaluated in harsh noisy environments by adding additive white Gaussian noise (AWGN) at different signal-to-noise ratios (SNR) to the original signal and measuring the note detection error rate (NDER) for piano pieces with different numbers of notes at each SNR level. The NDER is found to remain within 15% for all piano pieces under consideration when the SNR is kept above 8 dB.
Keywords: AWGN, onset detection, piano note, STFT
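A minimal sketch of the segmented-STFT idea: take one windowed segment per detected onset, read the strongest FFT bin, and map it to the nearest equal-tempered note. Picking the single strongest bin suffices for the pure test tones below; real piano spectra, where a harmonic can outweigh the fundamental, need a harmonic-aware pitch estimator, and the onset times here are given rather than detected:

```python
import numpy as np

fs = 44100

def note_name(freq):
    """Map a frequency to the nearest equal-tempered piano note."""
    midi = int(round(69 + 12 * np.log2(freq / 440.0)))
    names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    return names[midi % 12] + str(midi // 12 - 1)

def identify(signal, onsets, win=8192):
    """For each onset, window one segment and read the strongest FFT bin."""
    notes = []
    for s in onsets:
        seg = signal[s:s + win] * np.hanning(win)
        spec = np.abs(np.fft.rfft(seg))
        freq = np.argmax(spec) * fs / win
        notes.append(note_name(freq))
    return notes

# Synthetic two-note test tone: A4 (440 Hz) then C5 (523.25 Hz).
t = np.arange(fs) / fs
sig = np.r_[np.sin(2 * np.pi * 440 * t), np.sin(2 * np.pi * 523.25 * t)]
print(identify(sig, onsets=[0, fs]))   # -> ['A4', 'C5']
```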
Procedia PDF Downloads 160
15896 Portable Cardiac Monitoring System Based on Real-Time Microcontroller and Multiple Communication Interfaces
Authors: Ionel Zagan, Vasile Gheorghita Gaitan, Adrian Brezulianu
Abstract:
This paper presents the contributions in designing a mobile system named Tele-ECG, implemented for remote monitoring of cardiac patients. For better flexibility of this application, the authors chose to implement a local memory and multiple communication interfaces. The project described in this presentation is based on the ARM Cortex M0+ microcontroller and the ADAS1000 dedicated chip, necessary for the collection and transmission of electrocardiogram (ECG) signals from the patient to the microcontroller, without altering the performance and stability of the system. The novelty brought by this paper is the implementation of a remote monitoring system for cardiac patients with real-time behavior and multiple interfaces. The microcontroller is responsible for processing the digital signals corresponding to the ECG and also for implementing the communication interface with the main server, using a GSM/Bluetooth SIMCOM SIM800C module. This paper presents all the characteristics of the Tele-ECG project, representing a feasible implementation in the biomedical field. Acknowledgment: This paper was supported by the project 'Development and integration of a mobile tele-electrocardiograph in the GreenCARDIO© system for patients monitoring and diagnosis - m-GreenCARDIO', Contract no. BG58/30.09.2016, PNCDI III, Bridge Grant 2016, using the infrastructure from the project 'Integrated Center for research, development and innovation in Advanced Materials, Nanotechnologies, and Distributed Systems for fabrication and control', Contract No. 671/09.04.2015, Sectoral Operational Program for Increase of the Economic Competitiveness co-funded from the European Regional Development Fund.
Keywords: Tele-ECG, real-time cardiac monitoring, electrocardiogram, microcontroller
Procedia PDF Downloads 272
15895 Disentangling the Sources and Context of Daily Work Stress: Study Protocol of a Comprehensive Real-Time Modelling Study Using Portable Devices
Authors: Larissa Bolliger, Junoš Lukan, Mitja Lustrek, Dirk De Bacquer, Els Clays
Abstract:
Introduction and Aim: Chronic workplace stress and its health-related consequences like mental and cardiovascular diseases have been widely investigated. This project focuses on the sources and context of psychosocial daily workplace stress in a real-world setting. The main objective is to analyze and model real-time relationships between (1) psychosocial stress experiences within the natural work environment, (2) micro-level work activities and events, and (3) physiological signals and behaviors in office workers. Methods: An Ecological Momentary Assessment (EMA) protocol has been developed, partly building on machine learning techniques. Empatica® wristbands will be used for real-life detection of stress from physiological signals; micro-level activities and events at work will be based on smartphone registrations, further processed according to an automated computer algorithm. A field study including 100 office-based workers with high-level problem-solving tasks like managers and researchers will be implemented in Slovenia and Belgium (50 in each country). Data mining and state-of-the-art statistical methods – mainly multilevel statistical modelling for repeated data – will be used. Expected Results and Impact: The project findings will provide novel contributions to the field of occupational health research. While traditional assessments provide information about global perceived state of chronic stress exposure, the EMA approach is expected to bring new insights about daily fluctuating work stress experiences, especially micro-level events and activities at work that induce acute physiological stress responses. The project is therefore likely to generate further evidence on relevant stressors in a real-time working environment and hence make it possible to advise on workplace procedures and policies for reducing stress.
Keywords: ecological momentary assessment, real-time, stress, work
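For the analysis stage, multilevel modelling of repeated data means momentary observations (level 1) nested within workers (level 2). A sketch of a random-intercept model on simulated EMA records; the variable names and effect sizes are invented for illustration, not the project's measures:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
# Simulated EMA records: repeated momentary stress ratings nested in workers.
n_workers, n_obs = 30, 40
df = pd.DataFrame({
    "worker": np.repeat(np.arange(n_workers), n_obs),
    "meetings": rng.integers(0, 4, n_workers * n_obs),   # micro-level work events
    "hr": rng.normal(70, 8, n_workers * n_obs),          # physiological signal
})
worker_effect = rng.normal(0, 1.0, n_workers)[df["worker"]]
df["stress"] = (2 + 0.6 * df["meetings"] + 0.03 * df["hr"]
                + worker_effect + rng.normal(0, 1.0, len(df)))

# Random-intercept multilevel model: observations grouped by worker.
model = smf.mixedlm("stress ~ meetings + hr", df, groups=df["worker"]).fit()
print(model.summary())
```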
Procedia PDF Downloads 161
15894 A Physiological Approach for Early Detection of Hemorrhage
Authors: Rabie Fadil, Parshuram Aarotale, Shubha Majumder, Bijay Guargain
Abstract:
Hemorrhage is the loss of blood from the circulatory system and a leading cause of battlefield and postpartum-related deaths. Early detection of hemorrhage remains the most effective strategy to reduce the mortality caused by traumatic injuries. In this study, we investigated the physiological changes via non-invasive cardiac signals at rest and under different hemorrhage conditions simulated through graded lower-body negative pressure (LBNP). Simultaneous electrocardiogram (ECG), photoplethysmogram (PPG), blood pressure (BP), impedance cardiogram (ICG), and phonocardiogram (PCG) were acquired from 10 participants (age: 28 ± 6 years, weight: 73 ± 11 kg, height: 172 ± 8 cm). The LBNP protocol consisted of applying -20, -30, -40, -50, and -60 mmHg pressure to the lower half of the body. Beat-to-beat heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) were extracted from the ECG and blood pressure. Systolic amplitude (SA), systolic time (ST), diastolic time (DT), and left ventricular ejection time (LVET) were extracted from the PPG during each stage. Preliminary results showed that the application of -40 mmHg, i.e., the moderate simulated hemorrhage stage, resulted in significant changes in HR (85±4 bpm vs 68±5 bpm, p < 0.01), ST (191±10 ms vs 253±31 ms, p < 0.05), LVET (350±14 ms vs 479±47 ms, p < 0.05), and DT (551±22 ms vs 683±59 ms, p < 0.05) compared to rest, while no change was observed in SA (p > 0.05) as a consequence of LBNP application. These findings demonstrate the potential of cardiac signals in detecting moderate hemorrhage. In the future, we will analyze all the LBNP stages and investigate the feasibility of other physiological signals to develop a predictive machine learning model for early detection of hemorrhage.
Keywords: blood pressure, hemorrhage, lower-body negative pressure, LBNP, machine learning
Procedia PDF Downloads 167
15893 A Parallel Algorithm for Solving the PFSP on the Grid
Authors: Samia Kouki
Abstract:
Solving NP-hard combinatorial optimization problems by exact search methods, such as Branch-and-Bound, may degenerate into complete enumeration. For that reason, exact approaches limit us to solving only small or moderate-size problem instances, due to the exponential increase in CPU time as problem size increases. One of the most promising ways to significantly reduce the computational burden of sequential versions of Branch-and-Bound is to design parallel versions of these algorithms that employ several processors. This paper describes a parallel Branch-and-Bound algorithm called GALB for solving the classical permutation flowshop scheduling problem, as well as its implementation on a Grid computing infrastructure. The experimental study of our distributed parallel algorithm gives promising results and clearly shows the benefit of the parallel paradigm for solving large-scale instances in moderate CPU time.
Keywords: grid computing, permutation flow shop problem, branch and bound, load balancing
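For reference, the serial core that such a parallel algorithm distributes is a depth-first Branch-and-Bound over job permutations, pruning subtrees whose lower bound cannot beat the incumbent makespan. A minimal sketch on a hypothetical 4-job, 3-machine instance; GALB's actual bounding and load-balancing scheme is more elaborate, with subtrees farmed out to Grid workers:

```python
from math import inf

# Hypothetical processing times: P[job][machine].
P = [[5, 9, 8], [9, 3, 10], [9, 4, 5], [4, 8, 8]]
M = len(P[0])

def makespan(seq):
    """Completion time of the last job on the last machine for a (partial) sequence."""
    finish = [0] * M
    for j in seq:
        for k in range(M):
            finish[k] = max(finish[k], finish[k - 1] if k else 0) + P[j][k]
    return finish[-1]

def bnb(partial, remaining, best):
    """Depth-first Branch-and-Bound; best is the incumbent (makespan, sequence)."""
    if not remaining:
        return min(best, (makespan(partial), tuple(partial)))
    # Lower bound: all remaining jobs must still pass through the last machine.
    lb = makespan(partial) + sum(P[j][-1] for j in remaining)
    if lb >= best[0]:
        return best                      # prune this subtree
    for j in sorted(remaining):
        best = bnb(partial + [j], remaining - {j}, best)
    return best

best = bnb([], frozenset(range(len(P))), (inf, ()))
print("optimal makespan:", best[0], "sequence:", best[1])
```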
Procedia PDF Downloads 283
15892 Adsorption of Xylene Cyanol FF onto Activated Carbon from Brachystegia Eurycoma Seed Hulls: Determination of the Optimal Conditions by Statistical Design of Experiments
Authors: F. G. Okibe, C. E. Gimba, V. O. Ajibola, I. G. Ndukwe, E. D. Paul
Abstract:
A full factorial experimental design technique at two levels and four factors (2⁴) was used to optimize the adsorption at 615 nm of Xylene Cyanol FF in aqueous solutions onto activated carbon prepared from Brachystegia eurycoma seed hulls by a chemical carbonization method. The effects of pH (3 and 5), initial dye concentration (20 and 60 mg/l), adsorbent dosage (0.01 and 0.05 g), and contact time (30 and 60 min) on the removal efficiency of the adsorbent for the dye were investigated at 298 K. From the analysis of variance, response surface, and cube plot, adsorbent dosage was observed to be the most significant factor affecting the adsorption process. From the interactions between the variables studied, the optimum removal efficiency of 96.80% was achieved with an adsorbent dosage of 0.05 g, contact time of 45 minutes, pH 3, and initial dye concentration of 60 mg/l.
Keywords: factorial experimental design, adsorption, optimization, Brachystegia eurycoma, Xylene Cyanol FF
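A 2⁴ full factorial design estimates each main effect as the difference between mean responses at the factor's high and low levels. A sketch constructing the coded design matrix and computing the effects; the removal efficiencies are simulated with a deliberately dominant dosage effect to mimic the reported finding, not taken from the experiment:

```python
import numpy as np
from itertools import product

factors = ["pH", "dye_conc", "dosage", "time"]
# Full 2^4 design matrix in coded units (-1 = low level, +1 = high level).
X = np.array(list(product([-1, 1], repeat=4)))

rng = np.random.default_rng(7)
# Simulated removal efficiencies (%) for the 16 runs, illustrative only.
y = (70 + 1.5 * X[:, 0] + 2.0 * X[:, 1] + 9.0 * X[:, 2] + 1.0 * X[:, 3]
     + rng.normal(0, 1, 16))

# Main effect of each factor: mean(y at +1) - mean(y at -1).
for name, col in zip(factors, X.T):
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"{name:9s} effect = {effect:+.2f} %")
```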
Procedia PDF Downloads 400
15891 Evaluation of Sequential Polymer Flooding in Multi-Layered Heterogeneous Reservoir
Authors: Panupong Lohrattanarungrot, Falan Srisuriyachai
Abstract:
Polymer flooding is a well-known technique used for controlling the mobility ratio in heterogeneous reservoirs, leading to improvement of sweep efficiency as well as the wellbore profile. However, the low injectivity of viscous polymer solution attenuates the oil recovery rate and consequently adds extra operating cost. This study attempts to improve the injectivity of the polymer solution while maintaining the recovery factor, enhancing the effectiveness of the polymer flooding method. The study is performed with a reservoir simulation program by modifying a conventional single polymer slug into sequential polymer flooding, with emphasis on increasing injectivity and reducing the polymer amount. Selection of operating conditions for the single polymer slug, including pre-injected water, polymer concentration, and polymer slug size, is first performed for a layered heterogeneous reservoir with a Lorenz coefficient (Lk) of 0.32. The selected single-slug polymer flooding scheme is then modified into sequential polymer flooding with reduced polymer concentration in two different modes: constant polymer mass and reduced polymer mass. The effects of the Residual Resistance Factor (RRF) are also evaluated. From the simulation results, it is observed that the first polymer slug, with the highest concentration, mainly functions as a buffer between the displacing phase and the reservoir oil; moreover, part of the polymer from this slug is sacrificed to adsorption. Reduction of polymer concentration in the following slugs prevents bypassing due to an unfavorable mobility ratio. At the same time, the following slugs, with lower viscosity, can be injected easily through the formation, improving the injectivity of the whole process. Sequential polymer flooding with reduced polymer mass shows a great benefit, reducing total production time and the amount of polymer consumed by up to 10% without any downside effect. The only advantage of using constant polymer mass is a slight increment in recovery factor (up to 1.4%), while the total production time is almost the same. Increasing the residual resistance factor of the polymer solution benefits mobility control by reducing the effective permeability to water; nevertheless, higher adsorption results in low injectivity, extending the total production time. Modifying a single polymer slug into a sequence of slugs with decreasing polymer concentration yields major benefits in reducing production time as well as polymer mass. With a certain design of the polymer flooding scheme, the recovery factor can even be further increased. This study shows that sequential polymer flooding can certainly be applied to reservoirs with a high degree of heterogeneity, since it requires nothing complex for real implementation, just a proper design of polymer slug size and concentration.
Keywords: polymer flooding, sequential, heterogeneous reservoir, residual resistance factor
Procedia PDF Downloads 476
15890 Challenge Response-Based Authentication for a Mobile Voting System
Authors: Tohari Ahmad, Hudan Studiawan, Iwang Aryadinata, Royyana M. Ijtihadie, Waskitho Wibisono
Abstract:
Manual voting systems have been implemented worldwide. They have some weaknesses which may decrease the legitimacy of the voting result. An electronic voting system is introduced to minimize these weaknesses. It has been able to provide better results in terms of the total time taken in the voting process and accuracy. Nevertheless, people may be reluctant to go to the polling location for reasons such as distance and time. In order to solve this problem, mobile voting is implemented by utilizing mobile devices. There are many mobile voting architectures available. Overall, authenticity of the users is the common problem of all voting systems. There must be a mechanism which can verify the users' authenticity, such that only verified users can cast their vote, and only once; others cannot vote. In this paper, a challenge response-based authentication is proposed, utilizing properties of the users: for example, something they have and something they know. In terms of speed, the proposed system provides good results, in addition to the other capabilities it offers.
Keywords: authentication, data protection, mobile voting, security
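A minimal sketch of a challenge-response exchange built on the two factors named above (something the user has: a device key; something the user knows: a PIN). The key-derivation choice, nonce size, and class layout are assumptions for illustration, not the paper's protocol:

```python
import hmac, hashlib, secrets

def derive_key(device_key: bytes, pin: str) -> bytes:
    """Combine the possession factor (device key) and knowledge factor (PIN)."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_key, 100_000)

class VotingServer:
    def __init__(self, voter_key: bytes):
        self.voter_key = voter_key
        self.voted = False

    def issue_challenge(self) -> bytes:
        self.challenge = secrets.token_bytes(32)   # fresh nonce, prevents replay
        return self.challenge

    def verify(self, response: bytes) -> bool:
        expected = hmac.new(self.voter_key, self.challenge, hashlib.sha256).digest()
        if self.voted or not hmac.compare_digest(expected, response):
            return False
        self.voted = True                          # one verified vote per user
        return True

device_key, pin = secrets.token_bytes(16), "4821"
server = VotingServer(derive_key(device_key, pin))
challenge = server.issue_challenge()
response = hmac.new(derive_key(device_key, pin), challenge, hashlib.sha256).digest()
print(server.verify(response), server.verify(response))   # True False (single vote)
```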
Procedia PDF Downloads 419