Search results for: continuous measurement
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4811

4091 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement

Authors: Hadi Ardiny, Amir Mohammad Beigzadeh

Abstract:

The detection of potential nuclear threats can be significantly enhanced by combining multiple sensors and fusing their data, thereby extracting more information than any single source provides. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). Three mobile robots were used in the experiment, one of which carried a radioactive source. An algorithm was developed to identify the contaminated robot through the correlation between the camera images and the detector data. A computer vision method extracts the movements of all robots in the XY plane, while the detector system records the gamma-ray count. The positions of the robots and the corresponding counts from the moving source were modeled with the MCNPX simulation code, taking the experimental geometry into account. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. Such modeling proves valuable for designing different scenarios and intelligent systems before initiating any experiments.
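The correlation step described above can be sketched in a few lines: for each tracked robot, an expected count profile is built from its distance to the detector (here an assumed inverse-square falloff), and the robot whose profile best correlates with the measured gamma-ray counts is flagged. This is an illustrative sketch, not the authors' algorithm; the detector position, the 1/r² model, and the use of Pearson correlation are all assumptions.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def identify_source(tracks, counts, detector=(0.0, 0.0)):
    """tracks: {robot_id: [(x, y), ...]} per frame; counts: gamma counts per frame.
    Returns the robot whose expected 1/r^2 signal best matches the counts."""
    best_id, best_r = None, -2.0
    for rid, path in tracks.items():
        # Expected signal falls off as 1/r^2 from the detector position.
        expected = [1.0 / ((x - detector[0]) ** 2 + (y - detector[1]) ** 2)
                    for x, y in path]
        r = pearson(expected, counts)
        if r > best_r:
            best_id, best_r = rid, r
    return best_id, best_r
```

For example, with two robots where only robot "A" approaches the detector while the counts rise, `identify_source` flags "A".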

Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems

Procedia PDF Downloads 123
4090 Implementation of a Web-Based Clinical Outcomes Monitoring and Reporting Platform across the Fortis Network

Authors: Narottam Puri, Bishnu Panigrahi, Narayan Pendse

Abstract:

Background: Clinical outcomes are the globally agreed, evidence-based, measurable changes in health or quality of life that result from patient care. Reporting outcomes and monitoring them continuously provides an opportunity both to assess and to improve the quality of patient care. In 2012, the International Consortium for Health Outcomes Measurement (ICHOM) was founded; it has defined global Standard Sets for measuring the outcomes of various treatments. Method: Monitoring of clinical outcomes was identified as a pillar of Fortis' core value of Patient Centricity. The project began as a Clinical Outcomes Reporting Portal developed in-house by the Fortis Medical IT team, using the Standard Sets of outcome measurement developed by ICHOM. A pilot was run at Fortis Escorts Heart Institute from August to December 2013. Starting January 2014, it was implemented across 11 hospitals of the group. The scope was hospital-wide, covering the major clinical specialties: Cardiac Sciences and Orthopedics & Joint Replacement. The internally developed portal had limitations in report generation, and its capture of patient-reported outcomes was restricted. A year later, the company provisioned an ICHOM-certified software product that could provide a platform for data capture and reporting compliant with all ICHOM requirements. A year after the launch of the software, Fortis Healthcare became the first healthcare provider in Asia to publish clinical outcomes data for the Coronary Artery Disease Standard Set (comprising Coronary Artery Bypass Graft and Percutaneous Coronary Interventions) in the public domain (January 2016). Results: This project has helped firmly establish a culture of monitoring and reporting clinical outcomes across Fortis hospitals.
Given the diverse nature of the healthcare delivery model at the Fortis network, which comprises hospitals of varying size and specialty mix spread across practically the entire country, standardizing the data collection and reporting methodology is a significant achievement in itself. 95% case reporting was achieved, with more than 90% data completion, at the end of Phase 1 (March 2016). Post implementation, the group now has one year of data from its own hospitals. This has helped identify gaps, plan ways to bridge them, and establish internal benchmarks for continual improvement. The value created for the group also includes: 1. The entire Fortis community has been sensitized to the importance of clinical outcomes monitoring for patient-centric care; initial skepticism and cynicism were countered by effective stakeholder engagement and automation of processes. 2. Measuring quality is the first step in improving quality; data analysis has helped compare clinical results with best-in-class hospitals and identify improvement opportunities. 3. The clinical fraternity is pleased to be part of this initiative and has taken ownership of the project. Conclusion: Fortis Healthcare is a pioneer in the monitoring of clinical outcomes. Implementation of ICHOM standards has helped the Fortis Clinical Excellence Program improve patient engagement and strengthen its commitment to its core value of Patient Centricity. Validation and certification of the clinical outcomes data by an ICHOM-certified supplier adds confidence to its claim of leadership in this space.

Keywords: clinical outcomes, healthcare delivery, patient centricity, ICHOM

Procedia PDF Downloads 236
4089 How to Improve the Environmental Performance in a HEI in Mexico, an EEA Adaptation

Authors: Stephanie Aguirre Moreno, Jesús Everardo Olguín Tiznado, Claudia Camargo Wilson, Juan Andrés López Barreras

Abstract:

This research work presents a proposal to evaluate the environmental performance of a Higher Education Institution (HEI) in Mexico in order to minimize its environmental impact. Given that public education has limited financial resources, studies are needed that support priorities in decision-making and thus obtain the best cost-benefit ratio from the continuous improvement programs that form part of the implemented environmental management system. The methodology employed, adapted from Environmental Effect Analysis (EEA), weighs the environmental aspects identified in the environmental diagnosis by two characteristics: first, environmental priority, based on stakeholder perception, compliance with legal requirements, and the environmental impact of operations; second, the possibility of improvement, which depends on factors such as the type of change to be made, the level of investment, and its payback time. The highest environmental priorities, or hot spots, identified in this evaluation were electricity consumption, water consumption and recycling, and disposal of municipal solid waste. The possibility of improvement is highest for the disposal of municipal solid waste, followed by water consumption and recycling; although the latter has a possibility of improvement equal to that of energy consumption, its payback time and cost-benefit ratio are much more favorable.
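The two-factor weighting can be illustrated with a toy scoring sketch; the aspect names follow the abstract, but the 1-5 ratings and the multiplicative combination are hypothetical, not the study's actual EEA scores:

```python
# Hypothetical ratings on a 1-5 scale for each environmental aspect,
# illustrating the two-factor weighting (priority x possibility of improvement).
aspects = {
    "electricity consumption":      {"priority": 5, "improvement": 3},
    "water consumption/recycling":  {"priority": 4, "improvement": 4},
    "municipal solid waste":        {"priority": 4, "improvement": 5},
}

def rank_aspects(aspects):
    """Rank aspects by the product of priority and improvement possibility."""
    scored = [(name, a["priority"] * a["improvement"])
              for name, a in aspects.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)
```

With these made-up ratings, municipal solid waste ranks first, mirroring the abstract's conclusion that its possibility of improvement is highest.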

Keywords: environmental performance, environmental priority, possibility of improvement, continuous improvement programs

Procedia PDF Downloads 495
4088 The Effect of Training Program by Using Especial Strength on the Performance Skills of Hockey Players

Authors: Wesam El Bana

Abstract:

The current research aimed at designing a training program to improve specific muscular strength through especial strength exercises and identifying its effects on the skill performance level of hockey players. The researcher used the quasi-experimental approach (two-group design) with pre- and post-measurements. Sample: (n = 35) was purposefully chosen from Sharkia Sports Club. Five hockey players were excluded due to their non-punctuality; the rest were divided into two equal groups (experimental and control). The researcher concluded the following: The traditional training program had a positive effect on the physical variables under investigation, as it increased the improvement percentages of the physical variables and the skill performance level of the control group between the pre- and post-measurements. The recommended training program had a positive effect on the physical variables under investigation, as it increased the improvement percentages of the physical variables and the skill performance level of the experimental group between the pre- and post-measurements. Exercises using especial strength training had a positive effect on the post-measurement of the experimental group.

Keywords: hockey, especial strength, performance skills

Procedia PDF Downloads 241
4087 Organizational Resilience in the Perspective of Supply Chain Risk Management: A Scholarly Network Analysis

Authors: William Ho, Agus Wicaksana

Abstract:

Anecdotal evidence from the last decade shows that the occurrence of disruptive events and uncertainties in the supply chain is increasing. The coupling of these events with an increasingly complex and interdependent business environment leads to devastating impacts that quickly propagate within and across organizations. For example, the recent COVID-19 pandemic increased the global supply chain disruption frequency by at least 20% in 2020 and is projected to have an accumulated cost of $13.8 trillion by 2024. This crisis has drawn attention to organizational resilience as a means of weathering business uncertainty. However, the concept has been criticized for being vague and lacking a consistent definition, which reduces its significance for practice and research. This study addresses that issue by providing a comprehensive review of the conceptualization, measurement, and antecedents of organizational resilience as discussed in the supply chain risk management (SCRM) literature. We performed a hybrid Scholarly Network Analysis, combining citation-based and text-based approaches, on 252 articles published from 2000 to 2021 in top-tier journals selected on three parameters: AJG and ABS rankings, the UT Dallas and FT50 lists, and editorial board review. Specifically, we employed Bibliographic Coupling Analysis in the research cluster formation stage and Co-word Analysis in the research cluster interpretation and analysis stage. Our analysis reveals three major clusters of resilience research in the SCRM literature, namely (1) supply chain network design and optimization, (2) organizational capabilities, and (3) digital technologies.
We portray the research process of the last two decades in terms of the exemplar studies, problems studied, commonly used approaches and theories, and solutions provided in each cluster. We then provide a conceptual framework on the conceptualization and antecedents of resilience based on the studies in these clusters and highlight potential areas that need further study. Finally, we leverage the concept of abnormal operating performance to propose a new measurement strategy for resilience. This measurement overcomes the limitation of most current measurements, which are event-dependent and focus on the resistance or recovery stage without capturing the growth stage. In conclusion, this study provides a robust literature review through a scholarly network analysis that increases the completeness and accuracy of research cluster identification and analysis for understanding the conceptualization, antecedents, and measurement of resilience. It also enables a comprehensive review of resilience research in the SCRM literature by including articles published during the pandemic and connecting this development with the plethora of articles published over the last two decades. From the managerial perspective, this study provides practitioners with clarity on the conceptualization and critical success factors of firm resilience from the SCRM perspective.
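The cluster-formation step rests on bibliographic coupling: two papers are coupled in proportion to the references they share. A minimal sketch of that computation (the paper IDs and reference lists below are made up for illustration):

```python
def coupling_strength(refs_a, refs_b):
    """Bibliographic coupling strength: number of references two papers share."""
    return len(set(refs_a) & set(refs_b))

def coupling_matrix(papers):
    """papers: {paper_id: [reference_id, ...]}.
    Returns coupling strength for every unordered pair of papers."""
    ids = sorted(papers)
    return {(i, j): coupling_strength(papers[i], papers[j])
            for i in ids for j in ids if i < j}
```

In a full analysis, this pairwise matrix would feed a clustering algorithm; strongly coupled papers end up in the same research cluster.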

Keywords: supply chain risk management, organizational resilience, scholarly network analysis, systematic literature review

Procedia PDF Downloads 74
4086 Estimations of Spectral Dependence of Tropospheric Aerosol Single Scattering Albedo in Sukhothai, Thailand

Authors: Siriluk Ruangrungrote

Abstract:

Analyses of available data from MFR-7 measurements were performed and discussed in a study of tropospheric aerosol and its consequences in Thailand. The aerosol single scattering albedo (ω) is one of the most important parameters for determining the aerosol effect on radiative forcing. Here, ω was determined directly as the ratio of aerosol scattering optical depth to aerosol extinction optical depth (τscat/τext), without any use of aerosol computer-code models. This has the benefit of eliminating the uncertainty caused by modeling assumptions and the estimation of actual aerosol input data. Diurnal ω for five cloudless days in winter and early summer was investigated at five distinct wavelengths (415, 500, 615, 673, and 870 nm), with Rayleigh scattering and atmospheric-column NO2 and ozone contents taken into account. In addition, the tendency of the spectral dependence of ω in the two seasons was observed. The spectral results reveal that during wintertime the atmosphere of this inland rural vicinity was, over the measurement period, probably loaded with a smaller amount of soil dust aerosol than in early summer. Hence, the major aerosol loading, particularly in summer, was a mixture of soil dust and biomass burning aerosols.
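The core calculation is just the ratio of optical depths at each wavelength; a sketch with made-up optical depths (the study's actual MFR-7 values are not reproduced here):

```python
# Hypothetical aerosol optical depths per wavelength (nm) -- placeholders,
# not the measured Sukhothai values.
tau_scat = {415: 0.28, 500: 0.24, 615: 0.20, 673: 0.18, 870: 0.14}
tau_ext  = {415: 0.32, 500: 0.28, 615: 0.25, 673: 0.23, 870: 0.20}

def ssa_spectrum(tau_scat, tau_ext):
    """Single scattering albedo omega = tau_scat / tau_ext at each wavelength."""
    return {wl: tau_scat[wl] / tau_ext[wl] for wl in tau_ext}
```

Since scattering can never exceed extinction, every ω value lies between 0 and 1; a spectrum that decreases toward longer wavelengths is one signature of absorbing (e.g., biomass burning) aerosol.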

Keywords: aerosol scattering optical depth, aerosol extinction optical depth, biomass burning aerosol, soil dust aerosol

Procedia PDF Downloads 405
4085 A Methodology Based on Image Processing and Deep Learning for Automatic Characterization of Graphene Oxide

Authors: Rafael do Amaral Teodoro, Leandro Augusto da Silva

Abstract:

Originating from graphite, graphene is a two-dimensional (2D) material that promises to revolutionize technology in many areas, such as energy, telecommunications, civil construction, aviation, textiles, and medicine. This is possible because its structure, formed by carbon bonds, provides desirable optical, thermal, and mechanical characteristics of interest to multiple areas of the market. Thus, several research and development centers are studying different manufacturing methods and material applications of graphene, efforts that are often hampered by the scarcity of agile and accurate methodologies to characterize the material, that is, to determine its composition, shape, size, and the number of layers and crystals. To contribute to this effort, this study proposes a computational methodology that applies deep learning to identify graphene oxide crystals in order to characterize samples by crystal size. To achieve this, a fully convolutional neural network called U-Net was trained to segment SEM images of graphene oxide. The segmentation generated by the U-Net is refined with a per-class standard deviation technique, which allows crystals to be distinguished with different labels through an object delimitation algorithm. Next, the position, area, perimeter, and lateral measures of each detected crystal are extracted from the images. This information populates a database of the dimensions of the crystals that compose the samples. Finally, graphs are automatically created showing the frequency distributions of crystal area and perimeter. This methodological process resulted in a high capacity for segmenting graphene oxide crystals, with accuracy and F-score of 95% and 94%, respectively, on the test set.
Such performance demonstrates a high generalization capacity of the method in crystal segmentation, since it holds up under significant changes in image acquisition quality. The measurement of non-overlapping crystals presented an average error of 6% across the different measurement metrics, suggesting that the model provides high-accuracy measurements for non-overlapping segmentations. For overlapping crystals, however, a limitation of the model was identified. To overcome this limitation, it is important to ensure that the samples to be analyzed are properly prepared; this will minimize crystal overlap during SEM image acquisition and guarantee a lower measurement error without greater data-handling effort. All in all, the method developed is a substantial time saver with high measurement value, considering that it can measure hundreds of graphene oxide crystals in seconds, saving weeks of manual work.
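The crystal-measurement stage, labeling the segmented mask and then extracting area and perimeter per crystal, can be sketched with a plain flood-fill labeling on a binary mask. This is only the post-segmentation step; the U-Net itself is not reproduced here, and the pixel-edge perimeter below is one of several possible perimeter definitions:

```python
def label_crystals(mask):
    """4-connected component labeling of a binary mask (list of lists of 0/1).
    Returns a list of {'area': ..., 'perimeter': ...} dicts, one per crystal."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    crystals = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not labels[sy][sx]:
                lab = len(crystals) + 1
                stack, pixels = [(sy, sx)], []
                labels[sy][sx] = lab
                while stack:  # iterative flood fill
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = lab
                            stack.append((ny, nx))
                # Perimeter: count pixel edges that border background or the image edge.
                per = sum(1 for y, x in pixels
                          for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                          if not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx])
                crystals.append({"area": len(pixels), "perimeter": per})
    return crystals
```

Feeding each labeled crystal's area and perimeter into a histogram then yields the frequency distributions the abstract describes.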

Keywords: characterization, graphene oxide, nanomaterials, U-net, deep learning

Procedia PDF Downloads 160
4084 Managing the Magnetic Protection of Workers in Magnetic Resonance Imaging

Authors: Safoin Aktaou, Aya Al Masri, Kamel Guerchouche, Malorie Martin, Fouad Maaloul

Abstract:

Introduction: In the Magnetic Resonance Imaging (MRI) department, all workers involved in preparing patients, positioning them, cleaning the tunnel, etc., are likely to be exposed to the electromagnetic fields (EMF) emitted by the MRI device. Exposure to EMF can cause adverse biological effects in workers. The purpose of this study is to propose an organizational process to manage and control EMF risks. Materials and methods: The study was conducted at seven MRI departments using machines with 1.5 and 3 Tesla magnetic fields. We assessed the exposure in each by measuring the two electromagnetic fields (static and dynamic) at different distances from the MRI machine, both inside and around the examination room. Measured values were compared with British and American references (those of the UK's Medicines and Healthcare products Regulatory Agency (MHRA) and the American College of Radiology (ACR)). Results: Following the EMF measurements and their comparison with the recommendations of the learned societies, a zoning system that adapts to the needs of different MRI services across the country was proposed. In effect, three risk areas were identified within the MRI services. This led to the development of a good-practice guide on the magnetic protection of MRI workers. Conclusion: The guide established by our study is a standard that allows MRI workers to protect themselves against the risks of electromagnetic fields.

Keywords: comparison with international references, measurement of electromagnetic fields, magnetic protection of workers, magnetic resonance imaging

Procedia PDF Downloads 164
4083 The Effect of Socialization Tactics on Job Satisfaction of Employees, Regarding to Personality Types in Tehran University of Medical Science’s Employees

Authors: Maryam Hoorzad, Narges Shokry, Mandan Momeni

Abstract:

Given the importance of socialization to organizational effectiveness, and the impact of individual differences on socialization tactics as measured by employee satisfaction, it is possible to assess, for each personality type, which socialization tactics are most effective. The aim of this paper is to investigate how organizational socialization tactics affect the job satisfaction of employees according to personality type. A survey was conducted using a measurement tool based on Van Maanen and Schein's theory of organizational socialization tactics and the Myers-Briggs measurement of personality types. The respondents were employees with more than 3 years of tenure at Tehran University of Medical Science. Data collection combined library and field methods; the instrument was a questionnaire, and the data were analysed using SPSS and LISREL. It was found that investiture and serial tactics have a significant effect on employee satisfaction: any increase in investiture and serial tactics led to an increase in job satisfaction, while any increase in divestiture and disjunctive tactics led to a reduction in job satisfaction. The investiture tactic has the strongest effect on employee satisfaction. The results also show that personality type moderates the relationship between socialization tactics and job satisfaction; for the ESFJ personality type, the effect of the investiture tactic on employee satisfaction is greatest.

Keywords: organizational socialization, organizational socialization tactics, personality types, job satisfaction

Procedia PDF Downloads 441
4082 Investigation of Efficient Production of ¹³⁵La for the Auger Therapy Using Medical Cyclotron in Poland

Authors: N. Zandi, M. Sitarz, J. Jastrzebski, M. Vagheian, J. Choinski, A. Stolarz, A. Trzcinska

Abstract:

¹³⁵La, with a half-life of 19.5 h, can be considered a good candidate for Auger therapy. ¹³⁵La decays almost 100% by electron capture to stable ¹³⁵Ba. In this study, all important reactions leading to ¹³⁵La production are investigated in detail, and the corresponding theoretical yield for each reaction, obtained using the Monte Carlo method (MCNPX code), is presented. Among them, the best reaction in terms of cost-effectiveness and production yield, given the facilities in Poland equipped with a medical cyclotron, has been selected. ¹³⁵La is produced using the 16.5 MeV proton beam of a General Electric PETtrace cyclotron through the ¹³⁵Ba(p,n)¹³⁵La reaction. Moreover, to facilitate a consistent comparison between the theoretical calculations and the experimental measurements, the beam current and the proton beam energy were measured experimentally, and the measured proton energy was used as the entrance energy for the theoretical calculations. The production yield was finally measured and compared with the results obtained using the MCNPX code. The results show that the experimental measurements and the theoretical calculations are in good agreement.
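Because ¹³⁵La decays during irradiation, the activity at end of bombardment saturates with irradiation time according to A = R·(1 − e^(−λt)). A minimal sketch of that saturation factor, using the 19.5 h half-life from the abstract; the production rate R is a placeholder, not a measured value:

```python
import math

HALF_LIFE_H = 19.5  # half-life of 135La in hours (from the abstract)

def activity_at_eob(rate_per_sec, t_irr_h):
    """Activity at end of bombardment for a constant production rate R (atoms/s):
    A = R * (1 - exp(-lambda * t_irr)), in decays/s (Bq)."""
    lam = math.log(2) / (HALF_LIFE_H * 3600.0)  # decay constant, 1/s
    return rate_per_sec * (1.0 - math.exp(-lam * t_irr_h * 3600.0))
```

After one half-life of irradiation the activity reaches exactly half the saturation value; irradiating much longer than a few half-lives gains little additional activity.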

Keywords: efficient ¹³⁵La production, proton cyclotron energy measurement, MCNPX code, theoretical and experimental production yield

Procedia PDF Downloads 142
4081 Alternative Biocides to Reduce Algal Fouling in Seawater Industrial Cooling Towers

Authors: Mohammed Al-Bloushi, Sanghyun Jeong, Torove Leiknes

Abstract:

Biofouling in open recirculating cooling water systems may cause biological corrosion, which can reduce performance, increase energy consumption, and lower the heat exchange efficiency of the cooling tower. Seawater cooling towers are prone to biofouling due to the presence of organic and inorganic compounds in the seawater. The availability of organic and inorganic nutrients, along with sunlight and the continuous aeration of the cooling tower, creates an environment that is ideal for microbial growth. Various microorganisms (algae, fungi, and bacteria) can grow in a cooling tower system under certain environmental conditions. The most commonly used method to control biofouling in cooling towers is the addition of biocides, such as chlorination. In this study, algae comprising diatoms and green algae were added to the cooling tower basin, and their viability was monitored in the recirculating cooling seawater loop as well as in the basin itself. Continuous addition of biocides was employed in pilot-scale seawater cooling towers operated continuously for 2 months. Three different oxidizing biocides, namely chlorine, chlorine dioxide, and ozone, were tested. The results showed that all biocides were effective in keeping biological growth to a minimum regardless of algal addition. Among the biocides, ozone reduced total live cells of bacteria and algae by 99%, followed by chlorine dioxide at 97%, while conventional chlorine showed only an 89% reduction in bioactivity.

Keywords: algae, biocide, biofouling, seawater cooling tower

Procedia PDF Downloads 239
4080 Glaucoma with Normal IOP: Is It True Normal Tension Glaucoma or Something Else?

Authors: Sushma Tejwani, Shoruba Dinakaran, Kushal Kacha, K. Bhujang Shetty

Abstract:

Introduction and aim: It is not unusual to find patients with glaucomatous damage and normal intraocular pressure. To label a patient as having normal tension glaucoma (NTG), the majority of clinicians depend on office intraocular pressure (IOP) recordings; the concern is therefore whether we are missing late-night or early-morning spikes in this group of patients. In addition, ischemia of the optic nerve is one of the presumed causes of damage in these patients, yet demonstrating it has been a challenge. The aim of this study was to evaluate IOP variations and patterns in a series of patients with open angles and glaucomatous discs or fields but normal office IOP, and in addition to identify ischemic factors in true NTG patients. Materials and methods: This was an observational cross-sectional study from a tertiary care centre. Patients who underwent full-day diurnal variation testing (DVT) from January 2012 to April 2014 were studied. All patients underwent IOP measurement by Goldmann applanation tonometry every 3 hours for 24 hours, along with blood pressure (BP) recording. Patients with normal IOP throughout the 24-hour period were further evaluated by a cardiologist with echocardiography and carotid Doppler. Results: There were 47 patients, with the largest number in the age group of 50-70 years. A biphasic IOP peak was noted in almost all patients. Of the 47 patients, 2 were excluded from analysis as they were on treatment. 20 patients (42%) were found on DVT to have an IOP spike and were diagnosed with open angle glaucoma; another 25 (55%) were diagnosed with normal tension glaucoma and were subsequently advised a carotid Doppler and a cardiologist's consult. Another interesting finding was that 9 patients had a nocturnal dip in BP and 3 were found to have carotid artery stenosis.
Conclusion: Continuous 24-hour monitoring of IOP and BP is a very useful, albeit mildly cumbersome, tool that provides a wealth of information in cases of glaucoma presenting with normal office pressures. It is of great value in differentiating between normal tension glaucoma and open angle glaucoma, and it enables timely diagnosis and possible intervention through referral to a cardiologist in cases of carotid artery stenosis.
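Screening a 24-hour profile of 3-hourly readings for IOP spikes and a nocturnal BP dip can be sketched as below. The 21 mmHg spike threshold and the 10% "dipper" criterion are common conventions assumed for illustration, not values stated in the study:

```python
def iop_spikes(readings, threshold=21.0):
    """readings: [(hour, iop_mmHg), ...] taken every 3 hours over 24 h.
    Returns the readings that exceed the spike threshold."""
    return [(h, v) for h, v in readings if v > threshold]

def nocturnal_bp_dip(bp, night=(0, 6)):
    """bp: [(hour, pressure)]; a 'dipper' drops more than 10% at night
    relative to the daytime mean."""
    day = [v for h, v in bp if not (night[0] <= h < night[1])]
    nig = [v for h, v in bp if night[0] <= h < night[1]]
    day_mean = sum(day) / len(day)
    night_mean = sum(nig) / len(nig)
    return (day_mean - night_mean) / day_mean > 0.10
```

A patient whose only spike occurs at 3 a.m. would be missed entirely by office-hours tonometry, which is the motivation for full-day DVT.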

Keywords: carotid artery disease in NTG, diurnal variation of IOP, ischemia in glaucoma, normal tension glaucoma

Procedia PDF Downloads 285
4079 Development and Validation of the Circular Economy Scale

Authors: Yu Fang Chen, Jeng Fung Hung

Abstract:

This study aimed to develop a circular economy scale to assess the level of recognition of the circular economy among high-level executives in businesses. The circular economy is crucial for global ESG-oriented sustainable development and poses a challenge for corporate social responsibility. The aims of promoting the circular economy are to reduce resource consumption, move towards sustainable development, reduce environmental impact, maintain ecological balance, increase economic value, and promote employment. This study developed a 23-item Circular Economy Scale comprising three subscales: "Understanding of the Circular Economy by Enterprises" (8 items), "Attitudes" (9 items), and "Behaviors" (6 items). A 5-point Likert scale was used to measure responses, with higher scores indicating stronger agreement with the circular economy among senior executives. The study surveyed 105 senior executives and used a structural equation model (SEM) to determine how well the latent variables were measured. The standardized factor loading of each measurement indicator needs to exceed 0.7, and the average variance extracted (AVE), an index of convergent validity, should be greater than 0.5, or at least 0.45 to be acceptable. Of the 23 items, 12 did not meet these standards and were removed, leaving 5, 3, and 3 items in the three subscales, respectively, all with factor loadings greater than 0.7. The AVE for all three subscales was greater than 0.45, indicating good construct validity. Cronbach's α for the three subscales was 0.887, 0.787, and 0.734, respectively, and 0.860 for the total scale, all above 0.7, indicating good reliability. The Circular Economy Scale developed in this study measures three conceptual components that align with the theoretical framework of the literature review and demonstrates good reliability and validity.
It can serve as a measurement tool for evaluating the degree of acceptance of the circular economy among senior executives in enterprises. In the future, senior executives can use this scale as an evaluation tool to further explore its impact on sustainable development and to promote the circular economy and sustainable development.
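The two psychometric checks used above, AVE from standardized loadings and Cronbach's α from item scores, are simple to compute. A sketch with made-up data (not the study's 105 responses):

```python
def ave(loadings):
    """Average variance extracted from standardized factor loadings:
    mean of squared loadings; convergent validity wants AVE > 0.5 (>= 0.45 acceptable)."""
    return sum(l * l for l in loadings) / len(loadings)

def cronbach_alpha(items):
    """items: list of per-item score lists, same respondents in the same order.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(items)
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col) for col in zip(*items)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

For example, a subscale whose loadings are all 0.8 has AVE = 0.64, which clears the 0.45 threshold; perfectly consistent items yield α = 1.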

Keywords: circular economy, corporate social responsibility, scale development, structural equation model

Procedia PDF Downloads 83
4078 Purity Monitor Studies in Medium Liquid Argon TPC

Authors: I. Badhrees

Abstract:

This paper describes some of the results found in the course of a study in the field of particle physics. The study consists of two parts: one on the measurement of the cross section of the decay of the Z boson into two electrons, and the other on the measurement of the cross section of the multi-photon absorption process using a laser beam in a Liquid Argon Time Projection Chamber. The first part concerns the results based on the analysis of a data sample containing 8120 ee candidates used to reconstruct the mass of the Z boson, where each event has an ee pair with PT(e) > 20 GeV and |η(e)| < 2.5. Monte Carlo templates of the reconstructed Z boson were produced as a function of the Z mass scale. The distribution of the reconstructed Z mass in the data was compared to the Monte Carlo templates, and the total cross section was calculated to be 1432 pb. The second part concerns the Liquid Argon Time Projection Chamber (LAr TPC) and the results of the interaction of a UV laser (Nd:YAG, λ = 266 nm) with LAr, studied through the multi-photon ionization process as part of the R&D at the University of Bern. The main result of this part was the cross section of the multi-photon ionization process in LAr: σe = (1.24 ± 0.10stat ± 0.30sys) × 10⁻⁵⁶ cm⁴.
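The counting-experiment arithmetic behind a cross-section measurement is σ = (N_obs − N_bkg)/(ε·L), where ε is the selection efficiency and L the integrated luminosity. A sketch with placeholder numbers (the efficiency, background, and luminosity below are illustrative, not the analysis values):

```python
def cross_section_pb(n_candidates, n_background, efficiency, luminosity_pb_inv):
    """sigma = (N_obs - N_bkg) / (efficiency * integrated luminosity),
    with luminosity in pb^-1, giving sigma in pb."""
    return (n_candidates - n_background) / (efficiency * luminosity_pb_inv)
```

In the paper's analysis the 8120 ee candidates, after efficiency and background corrections not quoted in the abstract, yield the reported 1432 pb.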

Keywords: ATLAS, CERN, KACST, LArTPC, particle physics

Procedia PDF Downloads 346
4077 A Greener Approach towards the Synthesis of an Antimalarial Drug Lumefantrine

Authors: Luphumlo Ncanywa, Paul Watts

Abstract:

Malaria is a disease that kills approximately one million people annually. Children and pregnant women in sub-Saharan Africa lost their lives due to malaria. Malaria continues to be one of the major causes of death, especially in poor countries in Africa. Decrease the burden of malaria and save lives is very essential. There is a major concern about malaria parasites being able to develop resistance towards antimalarial drugs. People are still dying due to lack of medicine affordability in less well-off countries in the world. If more people could receive treatment by reducing the cost of drugs, the number of deaths in Africa could be massively reduced. There is a shortage of pharmaceutical manufacturing capability within many of the countries in Africa. However one has to question how Africa would actually manufacture drugs, active pharmaceutical ingredients or medicines developed within these research programs. It is quite likely that such manufacturing would be outsourced overseas, hence increasing the cost of production and potentially limiting the full benefit of the original research. As a result the last few years has seen major interest in developing more effective and cheaper technology for manufacturing generic pharmaceutical products. Micro-reactor technology (MRT) is an emerging technique that enables those working in research and development to rapidly screen reactions utilizing continuous flow, leading to the identification of reaction conditions that are suitable for usage at a production level. This emerging technique will be used to develop antimalarial drugs. It is this system flexibility that has the potential to reduce both the time was taken and risk associated with transferring reaction methodology from research to production. 
Using an approach referred to as scale-out or numbering-up, a reaction is first optimized in the laboratory using a single micro-reactor; to increase production volume, the number of reactors employed is simply increased. The overall aim of this research project is to develop and optimize the synthesis of antimalarial drugs by continuous processing. This would provide a step change in pharmaceutical manufacturing technology, increasing the availability and affordability of antimalarial drugs on a worldwide scale, with a particular emphasis on Africa in the first instance. The research will determine the best chemistry and technology to define the lowest-cost manufacturing route to pharmaceutical products. We are currently developing a method to synthesize lumefantrine in continuous flow, using the batch process as a benchmark. Lumefantrine is a dichlorobenzylidene derivative effective for the treatment of various types of malaria; it is used with artemether for the treatment of uncomplicated malaria. The results obtained when synthesizing lumefantrine in a batch process are transferred into a continuous flow process in order to develop a better and more reproducible process. Development of an appropriate synthetic route for lumefantrine is therefore significant for the pharmaceutical industry. If better (and cheaper) manufacturing routes to antimalarial drugs can be developed and implemented where needed, antimalarial drugs are far more likely to reach those in need.
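The numbering-up idea can be illustrated with a back-of-envelope throughput estimate: once a single micro-reactor is optimized, output scales linearly with the number of parallel reactors. The flow rate, outlet concentration, and reactor count below are hypothetical placeholders, not figures from this project.

```python
# Illustrative scale-out calculation: production volume grows by running
# N identical optimized micro-reactors in parallel. All numbers invented.
flow_rate_ml_min = 0.5     # optimized single-reactor flow rate (mL/min)
product_conc_g_ml = 0.02   # assumed product concentration at the outlet (g/mL)
reactors = 40              # number of parallel micro-reactors

# grams per day = (mL/min) * (g/mL) * minutes per day * reactor count
g_per_day = flow_rate_ml_min * product_conc_g_ml * 60 * 24 * reactors
print(f"estimated throughput: {g_per_day / 1000:.3f} kg/day")
```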

Keywords: antimalarial, flow, lumefantrine, synthesis

Procedia PDF Downloads 202
4076 Enhancing Institutional Roles and Managerial Instruments for Irrigation Modernization in Sudan: The Case of Gezira Scheme

Authors: Mohamed Ahmed Abdelmawla

Abstract:

To achieve the Millennium Development Goals (MDGs) engaged with agriculture, i.e. the poverty alleviation targets, human resources involved in agricultural sectors, with special emphasis on irrigation, must receive a wealth of practical experience and training. Increased food production, including staple food, is needed to overcome present and future threats to food security. This should happen within a framework of sustainable management of natural resources, elimination of unsustainable methods of production, and poverty reduction (i.e. the axes of modernization). Sound management and accurate measurement are major requisites for the modernization process. As such, this paper addresses discharge management and measurement at Field Outlet Pipes (FOPs) within the Gezira Scheme, where nine FOPs were randomly selected as representative locations. These FOPs extend along the Gezira Main Canal from the Kilo 57 area in the south up to Kilo 194 in the north. The following steps were followed during field data collection and measurement: for each selected FOP, a 90° V-notch thin-plate weir was placed in such a way that the water was directed to pass only through the notch. An optical survey level was used to measure the water head over the notch and at the FOP. Both the discharge rates calculated from the V-notch measurements, denoted [Qc], and the adopted discharges given by the Ministry of Irrigation and Water Resources (MOIWR), denoted [Qa], were compared using the average of three replicated readings at each location. The study revealed that the FOP overestimates, and sometimes underestimates, the discharges.
This is attributed to the fact that the original design specifications are not fulfilled under present conditions, where water is allowed to flow day and night with high head fluctuation; the FOP is a non-modular structure, i.e. the flow depends on both upstream and downstream levels, as confirmed by the results of this study. It is therefore advisable to quantify the discharge at FOPs with weirs or Parshall flumes. The cropping calendar should be clearly determined and agreed upon before the beginning of the season, in coordination between the Sudan Gezira Board (SGB) and the Ministry of Irrigation and Water Resources. Water indenting should be based on actual Crop Water Requirements (CWRs), not on rules of thumb (420 m³/feddan, irrespective of crop or time of season).
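The V-notch calculation behind Qc follows the standard thin-plate weir relation Q = (8/15)·Cd·√(2g)·tan(θ/2)·H^2.5. The sketch below applies it for a 90° notch; the discharge coefficient Cd and the example head and adopted discharge Qa are assumed values for illustration, not the paper's field data.

```python
import math

# Standard 90-degree thin-plate V-notch weir formula. Cd is an assumed
# discharge coefficient (typically about 0.58-0.62), not a measured value.
def v_notch_discharge(head_m, cd=0.60, notch_angle_deg=90.0, g=9.81):
    """Return discharge in m^3/s for a measured head (m) over the notch."""
    theta = math.radians(notch_angle_deg)
    return (8.0 / 15.0) * cd * math.sqrt(2.0 * g) \
        * math.tan(theta / 2.0) * head_m ** 2.5

# Example comparison of a weir-based discharge Qc with an adopted value Qa
Qc = v_notch_discharge(0.20)   # hypothetical 20 cm head over the notch
Qa = 0.024                     # hypothetical adopted discharge (m^3/s)
deviation_pct = 100.0 * (Qc - Qa) / Qa
print(f"Qc = {Qc:.4f} m^3/s, deviation from Qa = {deviation_pct:+.1f}%")
```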

Keywords: management, measurement, MDGs, modernization

Procedia PDF Downloads 251
4075 Control of Biofilm Formation and Inorganic Particle Accumulation on Reverse Osmosis Membrane by Hypochlorite Washing

Authors: Masaki Ohno, Cervinia Manalo, Tetsuji Okuda, Satoshi Nakai, Wataru Nishijima

Abstract:

Reverse osmosis (RO) membranes have been widely used in desalination to purify water for drinking and other purposes. Although most RO membranes at present have no resistance to chlorine, chlorine-resistant membranes are being developed. Direct chlorine treatment or chlorine washing will therefore be an option for preventing biofouling on chlorine-resistant membranes. Furthermore, if particle accumulation can be controlled by chlorine washing, expensive pretreatment for particle removal can be eliminated or simplified. The objective of this study was to determine the hypochlorite washing conditions required to control biofilm formation and inorganic particle accumulation on an RO membrane in a continuous flow channel with RO membrane and spacer. In this study, direct chlorine washing was done by soaking fouled RO membranes in hypochlorite solution, and fluorescence intensity was used to quantify biofilm on the membrane surface. After 48 h of soaking the membranes in high-fouling-potential waters, the fluorescence intensity decreased from 470 to 0 under the following washing conditions: 10 mg/L chlorine concentration, washing twice per day, and 30 min washing time. Within the ranges tested (0.5–10 mg/L chlorine, 1–4 washings/d, 1–30 min washing time), the chlorine concentration required to control biofilm formation decreased as the washing frequency or washing time increased. For the sample solutions used in the study, a 10 mg/L chlorine concentration with washing twice per day and a 5 min washing time was required for biofilm control. The optimum chlorine washing conditions obtained from the soaking experiments proved applicable to controlling biofilm formation in continuous flow experiments as well.
Moreover, chlorine washing employed to control biofilm in the presence of suspended particles resulted in lower amounts of organic (0.03 mg/cm²) and inorganic (0.14 mg/cm²) deposits on the membrane than for sample water without chlorine washing (0.14 mg/cm² and 0.33 mg/cm², respectively). The amount of biofilm formed was reduced by 79% by continuous washing with 10 mg/L free chlorine, and the inorganic accumulation decreased by 58%, to a level similar to that obtained with pure water plus kaolin (0.17 mg/cm²) as feed water. These results confirmed the acceleration of particle accumulation by biofilm formation, and that inhibiting biofilm growth can almost completely prevent further particle accumulation. In addition, a hypochlorite washing condition that controls both biofilm formation and particle accumulation could be achieved.
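The reduction percentages quoted above follow directly from the reported deposit masses, as this quick check shows (only the mg/cm² figures stated in the abstract are used):

```python
# Percent reduction of deposits with vs. without chlorine washing,
# using the deposit masses (mg/cm^2) reported in the abstract.
def pct_reduction(without, with_wash):
    return 100.0 * (without - with_wash) / without

organic = pct_reduction(without=0.14, with_wash=0.03)    # biofilm
inorganic = pct_reduction(without=0.33, with_wash=0.14)  # inorganic particles
print(f"organic reduction ~{organic:.0f}%, inorganic reduction ~{inorganic:.0f}%")
```

Rounding gives the 79% and 58% figures reported in the text.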

Keywords: reverse osmosis, washing condition optimization, hypochlorous acid, biofouling control

Procedia PDF Downloads 352
4074 Experimental Characterization of Composite Material with Non Contacting Methods

Authors: Nikolaos Papadakis, Constantinos Condaxakis, Konstantinos Savvakis

Abstract:

The aim of this paper is to determine the elastic properties (elastic modulus and Poisson ratio) of a composite material using noncontacting imaging methods. The significantly reduced cost of digital cameras has made highly reliable, low-cost strain measurement possible. The open-source platform Ncorr, which implements digital image correlation (DIC), is used in this paper. Measuring strain by digital image correlation involves random speckle preparation on the surface of the gauge area, image acquisition, and postprocessing of the image correlation to obtain the displacement and strain fields on the surface under study. Technical issues relating to the quality of the results are discussed. [0]8 glass-fabric/epoxy composite specimens were prepared and tested at orientations of 0°, 30°, 45°, 60°, and 90°. Each test was recorded with the camera at a constant frame rate and under constant lighting conditions. The recorded images were processed using the image processing software, and the parameters of each test are reported. The strain map obtained with Ncorr is validated by (a) comparing the derived elastic properties with values expected from classical laminate theory, and (b) finite element analysis.
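A minimal sketch of the classical-laminate-theory check mentioned above is the off-axis Young's modulus of a unidirectional lamina, evaluated at the tested orientations. The elastic constants below are generic glass/epoxy values assumed for illustration, not the paper's measured properties.

```python
import math

# Generic glass/epoxy lamina constants (assumed, for illustration only)
E1, E2 = 40.0e9, 10.0e9    # Pa, longitudinal / transverse moduli
G12, nu12 = 4.0e9, 0.30    # Pa, in-plane shear modulus / major Poisson ratio

def off_axis_modulus(theta_deg):
    """E_x for a lamina loaded at angle theta to the fibre direction
    (standard transformed-compliance expression)."""
    c = math.cos(math.radians(theta_deg))
    s = math.sin(math.radians(theta_deg))
    compliance = (c**4 / E1 + s**4 / E2
                  + (1.0 / G12 - 2.0 * nu12 / E1) * s**2 * c**2)
    return 1.0 / compliance

for theta in (0, 30, 45, 60, 90):
    print(f"theta = {theta:2d} deg -> E_x = {off_axis_modulus(theta)/1e9:5.2f} GPa")
```

The DIC-derived moduli at each orientation can then be compared against this curve.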

Keywords: composites, Ncorr, strain map, videoextensometry

Procedia PDF Downloads 144
4073 Autogenous Diabetic Retinopathy Censor for Ophthalmologists - AKSHI

Authors: Asiri Wijesinghe, N. D. Kodikara, Damitha Sandaruwan

Abstract:

Diabetic Retinopathy (DR) is a rapidly growing concern around the world; it is caused by impaired glucose metabolism, which leads to long-term damage of the human retina, and is one of the primary causes of visual impairment and blindness in adults. Retinal pathological changes can be recognized in ocular fundus images. In this research, we focus on developing an automated diagnosis system to detect DR anomalies: severity-level classification of DR patients (the non-proliferative diabetic retinopathy approach) and vessel tortuosity measurement to assess vessel anomalies (the proliferative diabetic retinopathy approach). The severity classification method obtained good results for precision, recall, F-measure, and accuracy (exceeding 94%) in all forms of cross-validation. ROC (Receiver Operating Characteristic) curves also showed a high AUC (Area Under Curve) percentage (exceeding 95%). User-level evaluation of severity capture achieved good accuracy (85%) and fairly good values for each evaluation measure. Untwisted-vessel detection for tortuosity measurement also produced good results with respect to sensitivity (85%), specificity (89%), and accuracy (87%).
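The evaluation measures quoted above all derive from a confusion matrix; the sketch below computes them for an invented set of counts (chosen so the results mirror the reported 85% sensitivity, 89% specificity, 87% accuracy, but not the study's actual data).

```python
# Standard binary-classification metrics from confusion-matrix counts.
# The counts below are invented for illustration.
def classification_metrics(tp, fp, fn, tn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)              # a.k.a. sensitivity
    specificity = tn / (tn + fp)
    f_measure = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, specificity, f_measure, accuracy

p, r, spec, f1, acc = classification_metrics(tp=85, fp=11, fn=15, tn=89)
print(f"precision={p:.2f} recall={r:.2f} specificity={spec:.2f} "
      f"F={f1:.2f} accuracy={acc:.2f}")
```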

Keywords: fundus image, exudates, microaneurisms, hemorrhages, tortuosity, diabetic retinopathy, optic disc, fovea

Procedia PDF Downloads 341
4072 Residual Dipolar Couplings in NMR Spectroscopy Using Lanthanide Tags

Authors: Elias Akoury

Abstract:

Nuclear Magnetic Resonance (NMR) spectroscopy is an indispensable technique used in structure determination of small molecules and macromolecules, to study their physical properties, and to elucidate characteristic interactions, dynamics, and thermodynamic processes. Quantum mechanics provides the theoretical description of NMR spectroscopy and the treatment of the dynamics of nuclear spin systems. The phenomenon of residual dipolar couplings (RDCs) has become a routine tool for accurate structure determination, providing global orientation information on magnetic dipole-dipole interaction vectors within a common reference frame. This offers access to distance-independent angular information and insights into local relaxation. The measurement of RDCs requires an anisotropic orientation medium in which the molecules partially align along the magnetic field. This can be achieved by introducing liquid crystals or by attaching a paramagnetic center. Although anisotropic paramagnetic tags continue to mark achievements in the biomolecular NMR of large proteins, their application to small organic molecules remains limited. Here, we propose a strategy for the synthesis of a lanthanide tag and the measurement of RDCs in organic molecules using paramagnetic lanthanide complexes.
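For context, the angular information carried by an RDC is commonly written in the following standard form (a textbook expression, not specific to this work), where θ and φ are the polar angles of the internuclear vector in the alignment-tensor frame, D_a is the axial component of the alignment tensor, and R its rhombicity:

```latex
\[
  D_{ij}(\theta,\varphi) \;=\; D_a\!\left[(3\cos^2\theta - 1)
    \;+\; \tfrac{3}{2}\,R\,\sin^2\theta\,\cos 2\varphi\right]
\]
```

Measuring several D_ij values against a known alignment tensor thus constrains the orientations of the corresponding bond vectors in a common frame.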

Keywords: lanthanide tags, NMR spectroscopy, residual dipolar coupling, quantum mechanics of spin dynamics

Procedia PDF Downloads 188
4071 Modified Evaluation of the Hydro-Mechanical Dependency of the Water Coefficient of Permeability of a Clayey Sand with a Novel Permeameter for Unsaturated Soils

Authors: G. Adelian, A. Mirzaii, S. S. Yasrobi

Abstract:

This paper presents data from an extensive experimental laboratory testing program for the measurement of the water coefficient of permeability of a clayey sand under different hydraulic and mechanical boundary conditions. A novel permeameter, suitable for the study of flow in unsaturated soils under different hydraulic and mechanical loading conditions, was designed and constructed for the testing program. In this work, the effects of hydraulic hysteresis, net isotropic confining stress, water flow condition, and sample dimensions on the water coefficient of permeability of the soil under study are evaluated. The experimental results showed a hysteretic variation of the water coefficient of permeability versus matrix suction and degree of saturation, with higher values on the drying portions of the SWCC. Measurements under different applied net isotropic stresses also showed that the water coefficient of permeability increased with increasing net isotropic consolidation stress. The water coefficient of permeability appeared to be independent of the applied flow heads, water flow condition, and sample dimensions.
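For orientation, the basic constant-head relation behind a water-permeability measurement is Darcy's k = Q·L/(A·h); in unsaturated soils k additionally varies with matrix suction, which is the hysteresis effect reported above. The numbers below are illustrative, not the paper's permeameter data.

```python
# Constant-head Darcy relation: k = Q * L / (A * h).
# All input values are invented for illustration.
def coeff_of_permeability(flow_m3_s, length_m, area_m2, head_m):
    """Darcy coefficient of permeability k (m/s)."""
    return flow_m3_s * length_m / (area_m2 * head_m)

k = coeff_of_permeability(flow_m3_s=2.0e-9,  # measured seepage rate
                          length_m=0.10,     # sample length
                          area_m2=8.0e-3,    # sample cross-section
                          head_m=0.50)       # applied head
print(f"k = {k:.2e} m/s")
```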

Keywords: water permeability, unsaturated soils, hydraulic hysteresis, void ratio, matrix suction, degree of saturation

Procedia PDF Downloads 527
4070 Design Challenges for Severely Skewed Steel Bridges

Authors: Muna Mitchell, Akshay Parchure, Krishna Singaraju

Abstract:

There is an increasing need for medium- to long-span steel bridges with complex geometry due to site restrictions in developed areas. One solution for grade separations in congested areas is to use longer spans on skewed supports, avoiding at-grade obstructions and limiting impacts to the foundation. Where vertical clearances are also a constraint, continuous steel girders can be used to reduce superstructure depths. Combining continuous long steel spans with severe skews can resolve these constraints, at a cost: the behavior of skewed girders is challenging to analyze and design, with subsequent complexity during fabrication and construction. As part of a corridor improvement project, Walter P Moore designed two 1700-foot side-by-side bridges carrying four lanes of traffic in each direction over a railroad track. The bridges consist of prestressed concrete girder approach spans and three-span continuous steel plate girder units. The roadway design added complex geometry to the bridge, with horizontal and vertical curves combined with superelevation transitions within the plate girder units. The substructure at the steel units was skewed approximately 56 degrees to satisfy the existing railroad right-of-way requirements. A horizontal point of curvature (PC) near the end of the steel units required the use of flared girders and chorded slab edges. Due to the flared girder geometry, the cross-frame spacing in each bay is unique. Staggered cross frames were provided based on AASHTO LRFD and NCHRP guidelines for high-skew steel bridges. Skewed steel bridges develop significant forces in the cross frames and rotation in the girder webs due to differential displacements along the girders under dead and live loads. In addition, under thermal loads, skewed steel bridges expand and contract not along the alignment parallel to the girders but along the diagonal connecting the acute corners, resulting in horizontal displacement both along and perpendicular to the girders.
AASHTO LRFD recommends a 95°F temperature differential for the design of joints and bearings. The live and thermal loads resulted in significant horizontal forces and rotations in the bearings, which necessitated the use of high-load multi-rotational (HLMR) bearings. A unique bearing layout was selected to minimize the effect of thermal forces. The span length, width, skew, and roadway geometry of the bridges also required modular bridge joint systems (MBJS) with inverted-T bent caps to accommodate movement in the steel units. 2D and 3D finite element analysis models were developed to accurately determine the forces and rotations in the girders, cross frames, and bearings and to estimate thermal displacements at the joints. This paper covers the decision-making process for developing the framing plan, bearing configurations, joint type, and analysis models involved in the design of these high-skew, three-span continuous steel plate girder bridges.
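A back-of-envelope sketch of the thermal movement driving the bearing and joint design is δ = α·ΔT·L, resolved along the diagonal between acute corners of a skewed unit. The 95°F range and 56° skew come from the text; the expansion length, thermal coefficient, and the simple cos/sin resolution are assumptions for illustration, not the bridges' design calculations.

```python
import math

alpha_steel = 6.5e-6    # 1/degF, typical thermal coefficient for steel (assumed)
delta_T = 95.0          # degF design temperature range (AASHTO LRFD, per text)
unit_length_ft = 600.0  # assumed expansion length of one steel unit
skew_deg = 56.0         # support skew, per text

total_movement_ft = alpha_steel * delta_T * unit_length_ft
# Simplified resolution of movement along the acute-corner diagonal into
# components along and perpendicular to the girder lines:
longitudinal = total_movement_ft * math.cos(math.radians(skew_deg))
transverse = total_movement_ft * math.sin(math.radians(skew_deg))
print(f"total = {total_movement_ft*12:.2f} in, "
      f"longitudinal = {longitudinal*12:.2f} in, "
      f"transverse = {transverse*12:.2f} in")
```

Even this crude estimate shows a multi-inch movement with a large component perpendicular to the girders, which is why conventional bearings and strip seals are insufficient at this skew.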

Keywords: complex geometry, continuous steel plate girders, finite element structural analysis, high skew, HLMR bearings, modular joint

Procedia PDF Downloads 193
4069 Mathematical Modeling for Continuous Reactive Extrusion of Poly Lactic Acid Formation by Ring Opening Polymerization Considering Metal/Organic Catalyst and Alternative Energies

Authors: Satya P. Dubey, Hrushikesh A Abhyankar, Veronica Marchante, James L. Brighton, Björn Bergmann

Abstract:

Aims: To develop a mathematical model that simulates the ROP of PLA, taking into account the effect of alternative energies, to be implemented in a continuous reactive extrusion production process for PLA. Introduction: The production of large amounts of waste is one of the major challenges of the present time, and polymers represent 70% of global waste. PLA has emerged as a promising polymer, as it is a compostable, biodegradable thermoplastic made from renewable sources. However, the main limitation to the application of PLA is the traces of toxic metal catalyst in the final product. Thus, a safe and efficient production process needs to be developed to avoid potential hazards and toxicity. It has been found that alternative energy sources (laser, ultrasound, microwaves) could be a prominent option to facilitate the ROP of PLA via continuous reactive extrusion. This process may allow complete removal of the metal catalysts and facilitate the use of less active organic catalysts. Methodology: Initial investigations were performed using data available in the literature for the reaction mechanism of ROP of PLA based on the conventional metal catalyst stannous octoate. A mathematical model has been developed considering significant parameters such as different initial concentration ratios of catalyst, co-catalyst, and impurity. The effects of temperature variation and alternative energies have been implemented in the model. Results: The mathematical model has been validated using data from the literature as well as actual experiments. Validation of the model including alternative energies is in progress, based on experimental data from partners of the InnoREX project consortium. Conclusion: The model accurately reproduces the polymerisation reaction when alternative energy is applied. Alternative energies have a strong positive effect, increasing the conversion and molecular weight of the PLA.
This model could be a very useful tool to complement Ludovic® software in predicting the large-scale production process when using reactive extrusion.
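The core of such a kinetic model can be sketched as a single rate equation: monomer is consumed at rate kp·[I]·[M], where [I] is the concentration of active chains set by the catalyst/co-catalyst ratio. The rate constant, concentrations, and simulation time below are illustrative placeholders, not the paper's fitted values.

```python
# Minimal kinetic sketch of ROP: d[M]/dt = -kp * [I] * [M], integrated
# with explicit Euler. All parameter values are invented for illustration.
def simulate_rop(kp, active_chains, m0, dt=0.1, t_end=600.0):
    """Return monomer conversion after integrating to t_end seconds."""
    m, t = m0, 0.0
    while t < t_end:
        m += -kp * active_chains * m * dt   # Euler step on d[M]/dt
        t += dt
    return 1.0 - m / m0

conv = simulate_rop(kp=1.0,             # L/(mol*s), assumed
                    active_chains=5e-3, # mol/L active chain ends, assumed
                    m0=1.0)             # mol/L initial monomer
print(f"monomer conversion after 600 s: {conv:.1%}")
```

The analytical solution of this first-order scheme is conversion = 1 − exp(−kp·[I]·t), so the Euler result can be sanity-checked directly; a full model would add equilibrium monomer concentration, transesterification, and temperature dependence.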

Keywords: polymer, poly-lactic acid (PLA), ring opening polymerization (ROP), metal-catalyst, bio-degradable, renewable source, alternative energy (AE)

Procedia PDF Downloads 362
4068 Possible Reasons for and Consequences of Generalizing Subgroup-Based Measurement Results to Populations: Based on Research Studies Conducted by Elementary Teachers in South Korea

Authors: Jaejun Jong

Abstract:

Many teachers in South Korea conduct research to improve the quality of their instruction. Unfortunately, many generalize the results of measurements based on one subgroup to other students or to the entire population, which can cause problems. This study aims to identify examples of the problems that can result from generalizing measurements based on one subgroup to an entire population or to another group. Such a study is needed because teachers' instruction and class quality significantly affect the overall quality of education, yet the quality of research conducted by teachers can become questionable due to overgeneralization; identifying potential problems of overgeneralization can thus improve the overall quality of education. The data in this study were gathered from 145 sixth-grade elementary school students in South Korea. The results showed that students in different classes could differ significantly in various ways; generalizing the results of subgroups to an entire population can therefore engender erroneous student predictions and evaluations, which can lead to inappropriate instruction plans. This result shows that finding the reasons for such overgeneralization can significantly improve the quality of education.
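The overgeneralization problem can be illustrated with a toy example: the mean of one class (subgroup) can differ substantially from the mean across all classes, so using it to predict or evaluate other students is misleading. The scores below are invented, not the study's data.

```python
import statistics

# Two hypothetical classes with different score levels
class_a = [62, 65, 70, 68, 64, 66]
class_b = [78, 82, 85, 80, 79, 84]
population = class_a + class_b

# Generalizing class A's mean to the whole population understates it badly
print(f"class A mean:     {statistics.mean(class_a):.1f}")
print(f"population mean:  {statistics.mean(population):.1f}")
```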

Keywords: generalization, measurement, research methodology, teacher education

Procedia PDF Downloads 93
4067 Impact of Integrated Watershed Management Programme Based on Four Waters Concept: A Case Study of Sali Village, Rajasthan State of India

Authors: Garima Sharma, R. N. Sharma

Abstract:

An integrated watershed management programme based on the 'Four Waters Concept' was implemented in Sali village, Jaipur District, Rajasthan State, India (latitude 26.7234486° N, longitude 75.023876° E). The 'Four Waters Concept' integrates the four waters, viz. rainwater, soil moisture, groundwater, and surface water. The methodology involves various water harvesting techniques to prevent the runoff of water by treating the catchment, properly utilizing the available water harvesting structures, renovating non-functional structures, and creating new ones. The case study included a questionnaire survey of farmers and continuous study of the village for two years. The total project area is 6153 ha, and the project cost is Rs. 92.25 million; the sanctioned area of the Sali micro-watershed is 2228 ha, with an outlay of Rs. 10.52 million. Watershed treatment activities such as water absorption trenches, continuous contour trenches, field bunding, and check dams were undertaken on agricultural lands for soil and water conservation. These measures have helped prevent runoff and increased the perennial availability of water in wells. According to the survey, the water level in open wells in the area has risen by approximately 5 metres since the introduction of the water harvesting structures. The continuous availability of water in wells has increased the area under irrigation and helped in crop diversification. Watershed management activities have changed cropping patterns and crop productivity, and helped transform 567 ha of culturable wasteland in the village into arable land. The farmers of the village have earned additional income from the increased crop production, and the programme has also assured the availability of water during peak summers for the day-to-day activities of villagers.
The outcomes indicate that watershed management practices have a positive impact on the water resource potential as well as the crop production of the area, suggesting that persistent efforts in this direction may lead to sustainability of the watershed.

Keywords: four water concept, groundwater potential, irrigation potential, watershed management

Procedia PDF Downloads 357
4066 EcoLife and Greed Index Measurement: An Alternative Tool to Promote Sustainable Communities and Eco-Justice

Authors: Louk Aourelien Andrianos, Edward Dommen, Athena Peralta

Abstract:

Greed, as epitomized by the overconsumption of natural resources, is at the root of ecological destruction and the unsustainability of modern societies. Present economies rely on unrestricted structural greed, which fuels unlimited economic growth, overconsumption, and individualistic competitive behavior. Structural greed undermines the life support system on earth and threatens ecological integrity, social justice, and peace. The World Council of Churches (WCC) has developed a programme on ecological and economic justice (EEJ) with the aim of promoting an economy of life, where the economy is embedded in society and society in ecology. This paper analyzes and assesses the economy of life (EcoLife) by offering an empirical tool to measure and monitor the root causes and effects of unsustainability resulting from human greed at the global, national, institutional, and individual levels. This holistic approach is based on the integrity of ecology and economy in a society founded on justice. The paper discusses critical questions such as 'what is an economy of life' and 'how can it be measured and protected from the effects of greed'. A model called GLIMS, which stands for Greed Lines and Indices Measurement System, is used to clarify the concept of greed and to help measure the economy of life index by fuzzy logic reasoning. The inputs of the model are statistical indicators of natural resource consumption, financial realities, economic performance, social welfare, and ethical and political facts. The outputs are concrete measures of three primary indices of ecological, economic, and socio-political greed (ECOL-GI, ECON-GI, SOCI-GI) and one overall multidimensional economy of life index (EcoLife-I). EcoLife measurement aims to build awareness of an economy of life and to address the effects of greed in their systemic and structural aspects. It is a tool for ethical diagnosis and policy making.

Keywords: greed line, sustainability indicators, fuzzy logic, eco-justice, World Council of Churches (WCC)

Procedia PDF Downloads 320
4065 On the Possibility of Real Time Characterisation of Ambient Toxicity Using Multi-Wavelength Photoacoustic Instrument

Authors: Tibor Ajtai, Máté Pintér, Noémi Utry, Gergely Kiss-Albert, Andrea Palágyi, László Manczinger, Csaba Vágvölgyi, Gábor Szabó, Zoltán Bozóki

Abstract:

To the best of the authors' knowledge, we here experimentally demonstrate for the first time a quantified correlation between the real-time measured optical features of ambient aerosol and off-line measured toxicity data. Using these correlations, we present a novel methodology for real-time characterisation of ambient toxicity based on multi-wavelength aerosol-phase photoacoustic measurement. Ambient carbonaceous particulate matter is one of the most intensively studied atmospheric constituents in climate science today. Beyond its climatic impact, atmospheric soot also plays an important role as an air pollutant that harms human health. According to the latest scientific assessments, ambient soot is the second most important anthropogenic emission source, while from a health perspective it is one of the most harmful atmospheric constituents. Despite its importance, a generally accepted standard methodology for the quantitative determination of ambient toxicity is not yet available. Ambient toxicity measurement is predominantly based on posterior analysis of filter-accumulated aerosol, with limited time resolution. Most toxicological studies are based on operational definitions using different measurement protocols; comprehensive analysis of the existing data sets is therefore limited in many cases. The situation is further complicated by the fact that, even during its relatively short residence time, the physicochemical features of the aerosol can be masked significantly by the actual ambient factors. Improving the time resolution of the existing methodology and developing real-time methodology for air quality monitoring are therefore pressing issues in air pollution research.
During the last decades, many experimental studies have verified that there is a relation between the chemical composition of carbonaceous particulate matter and its absorption features, quantified by the Absorption Angström Exponent (AAE). Although the scientific community agrees that photoacoustic spectroscopy (PAS) is so far the only methodology that can measure light absorption by aerosol accurately and reliably, multi-wavelength PAS instruments able to selectively characterise the wavelength dependency of absorption have become available only in the last decade. In this study, the first results of an intensive measurement campaign focusing on the physicochemical and toxicological characterisation of ambient particulate matter are presented. We demonstrate the complete microphysical characterisation of wintertime urban ambient aerosol, including optical absorption and scattering as well as size distribution, using our recently developed state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS), an integrating nephelometer (Aurora 3000), and a scanning mobility particle sizer with optical particle counter (SMPS+C). Beyond this on-line characterisation of the ambient aerosol, we also present the results of eco-, cyto- and genotoxicity measurements based on posterior analysis of filter-accumulated aerosol with 6 h time resolution. We demonstrate the diurnal variation of toxicities and of AAE data deduced directly from the multi-wavelength absorption measurements.
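The AAE itself is deduced from absorption coefficients measured at (at least) two wavelengths; a multi-wavelength instrument simply allows several such pairs. The sketch below uses the standard two-wavelength definition with invented absorption values (the wavelengths are illustrative, not the 4λ-PAS channel list).

```python
import math

# AAE from absorption coefficients b_abs at two wavelengths:
# AAE = -ln(b1/b2) / ln(lambda1/lambda2). Input values are invented.
def absorption_angstrom_exponent(babs1, babs2, lambda1_nm, lambda2_nm):
    return -math.log(babs1 / babs2) / math.log(lambda1_nm / lambda2_nm)

# Pure black carbon typically gives AAE close to 1 (b_abs ~ 1/lambda):
aae = absorption_angstrom_exponent(babs1=20.0, babs2=10.0,
                                   lambda1_nm=266.0, lambda2_nm=532.0)
print(f"AAE = {aae:.2f}")
```

Higher AAE values (roughly 1.5 and above) generally indicate a larger brown-carbon or organic contribution, which is what makes the AAE a useful proxy for chemical composition.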

Keywords: photoacoustic spectroscopy, absorption Angström exponent, toxicity, Ames-test

Procedia PDF Downloads 302
4064 Validity and Reliability of Lifestyle Measurement of the LSAS among Recurrent Stroke Patients in Selected Hospital, Central Java, Indonesia

Authors: Meida Laely Ramdani, Earmporn Thongkrajai, Dedy Purwito

Abstract:

Lifestyle is one of the most important factors affecting health. Measurement of lifestyle behaviors is necessary for identifying causal associations between unhealthy lifestyles and health outcomes. Many instruments exist to measure lifestyle, but none is specific to stroke recurrence. This study aimed to develop a new questionnaire, the Lifestyle Adjustment Scale (LSAS), for recurrent stroke patients in Indonesia, and to measure its reliability and validity. The 33-item instrument was developed from the responses of 30 recurrent stroke patients aged up to 60 years. Data were collected during October and November 2015. The properties of the instrument were evaluated by validity assessment and reliability measures. Content validity was judged adequate by a panel of five experts, with an I-CVI of 0.97. Cronbach's alpha analysis was carried out to measure the reliability of the LSAS; the overall Cronbach's alpha coefficient was 0.819. The LSAS items are classified under the domains of dietary habit, smoking habit, physical activity, and stress management, with subscale Cronbach's alpha coefficients of 0.60, 0.39, 0.67, 0.65, and 0.76, respectively. The LSAS is valid and reliable and can therefore be used as a research tool among recurrent stroke patients. The development of this questionnaire has been adapted to the socio-cultural context of Indonesia.
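The Cronbach's alpha computation used to assess reliability can be sketched as follows; the 4 items × 5 respondents of invented scores stand in for the LSAS data, which are not reproduced here.

```python
# Cronbach's alpha: (k/(k-1)) * (1 - sum(item variances)/variance(totals)),
# using sample variances. Scores below are invented for illustration.
def cronbach_alpha(items):
    """items: one inner list of respondent scores per questionnaire item."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1.0 - item_vars / var(totals))

scores = [[3, 4, 2, 5, 4],   # item 1, respondents 1..5
          [2, 4, 3, 5, 3],   # item 2
          [3, 5, 2, 4, 4],   # item 3
          [2, 4, 2, 5, 3]]   # item 4
alpha = cronbach_alpha(scores)
print(f"Cronbach's alpha = {alpha:.3f}")
```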

Keywords: LSAS, recurrent stroke patients, lifestyle, Indonesia

Procedia PDF Downloads 249
4063 Experiences of Timing Analysis of Parallel Embedded Software

Authors: Muhammad Waqar Aziz, Syed Abdul Baqi Shah

Abstract:

Execution time analysis is fundamental to the successful design and execution of real-time embedded software. In such analysis, the Worst-Case Execution Time (WCET) of a program is a key measure, on the basis of which system tasks are scheduled. WCET analysis of embedded software is also needed for system understanding and to guarantee its behavior. WCET analysis can be performed statically (without executing the program) or dynamically (through measurement). Traditionally, research on WCET analysis assumes sequential code running on single-core platforms. However, as computation steadily moves towards a combination of parallel programs and multi-core hardware, new challenges in WCET analysis need to be addressed. In this article, we report our experiences of performing WCET analysis of Parallel Embedded Software (PES) running on a multi-core platform. The primary purpose was to investigate how WCET estimates of PES can be computed statically and how they can be derived dynamically. Our experiences, as reported in this article, include the challenges we faced, possible responses to these challenges, and the workarounds that were developed. The article also offers observations on the benefits and drawbacks of deriving WCET estimates using these methods and provides useful recommendations for further research in this area.
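The dynamic (measurement-based) side of WCET analysis can be sketched in its simplest form: run the task many times, record end-to-end execution times, and take the maximum observed value plus a safety margin as the WCET estimate. The timing model below is simulated; on real hardware one would read cycle counters or hardware traces, and the 20% margin is an assumed engineering choice, not a value from the article.

```python
import random

random.seed(42)

def run_task_once():
    """Simulated execution time (ms): fixed base cost plus jitter that
    stands in for cache misses and multi-core interference."""
    base = 10.0
    jitter = random.expovariate(2.0)   # occasional slow runs
    return base + jitter

observations = [run_task_once() for _ in range(1000)]
hwm = max(observations)          # high-water mark of observed times
wcet_estimate = hwm * 1.2        # assumed 20% safety margin
print(f"max observed = {hwm:.2f} ms, WCET estimate = {wcet_estimate:.2f} ms")
```

The well-known weakness of this approach, and a central difficulty on multi-core platforms, is that the worst path or worst interference pattern may never be triggered by testing, which is why static analysis remains the complementary technique.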

Keywords: embedded software, worst-case execution-time analysis, static flow analysis, measurement-based analysis, parallel computing

Procedia PDF Downloads 324
4062 Offshore Wind Assessment and Analysis for South Western Mediterranean Sea

Authors: Abdallah Touaibia, Nachida Kasbadji Merzouk, Mustapha Merzouk, Ryma Belarbi

Abstract:

Accurate assessment and a better understanding of the wind resource distribution are the most important tasks for decision-making before installing wind energy systems in a given region; hence our interest in the Algerian coastline and its Mediterranean Sea area. Despite its long coastline on the Mediterranean Sea, Algeria still has no strategy encouraging the development of offshore wind farms in its waters. The present work aims to estimate the offshore wind fields for the Algerian Mediterranean Sea based on 24 years of wind measurements (1995 to 2018) provided by seven observation stations, with recording time steps of 30 min, 60 min, or 180 min depending on the station: two stations in Spain, two in Italy, and three on the Algerian coast, at Annaba in the east, Algiers in the center, and Oran in the west. The idea is to use these multiple measurement points to characterize the area in terms of wind potential, interpolating their average wind speed values to approximate values at locations where no measurements are available, given the difficulty of installing masts in deep water. The study is organized as follows: first, a brief description of the studied area and its climatic characteristics is given. After that, the statistical properties of the recorded data were checked by evaluating wind histograms, direction roses, and average speeds using MatLab programs. Finally, ArcGIS and MapInfo software were used to establish offshore wind maps for better understanding the wind resource distribution, as well as to identify windy sites for wind farm installation and power management.
The study pointed out that Cap Carbonara is the windiest site, with an average wind speed of 7.26 m/s at 10 m, inducing a power density of 902 W/m², followed by the site of Cap Caccia with 4.88 m/s, inducing a power density of 282 W/m². The site of Oran has an average wind speed of 4.83 m/s, inducing a power density of 230 W/m². The results also indicated that the dominant wind direction at the site of Cap Carbonara, where the frequencies are highest, is the west, with 34%, an average wind speed of 9.49 m/s, and a power density of 1722 W/m². Then comes the site of Cap Caccia, where the prevailing wind direction is the north-west, at about 20% and 5.82 m/s, yielding a power density of 452 W/m². The site of Oran comes in third place, with a dominant north direction at 32%, inducing an average wind speed of 4.59 m/s and a power density of 189 W/m². The study also showed that the proposed method is both crucial for understanding the wind resource distribution, revealing windy sites over a large area, and effective for wind turbine micro-siting.
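The power densities reported above are derived from wind-speed time series rather than from the average speed alone. A minimal sketch of that computation, with an assumed sea-level air density of 1.225 kg/m³ and synthetic Weibull-distributed speeds standing in for the station records, might look like:

```python
import numpy as np

RHO = 1.225  # assumed air density at sea level, kg/m^3

def power_density(speeds):
    """Mean wind power density (W/m^2) from a wind-speed time series.
    Averages the cube of each sample; because of the cubic term this
    exceeds 0.5 * rho * mean(v)**3 whenever the wind is variable."""
    v = np.asarray(speeds, dtype=float)
    return 0.5 * RHO * np.mean(v ** 3)

# Synthetic 30-min wind samples: Weibull shape 2 (Rayleigh-like), scale 8 m/s
rng = np.random.default_rng(0)
v = rng.weibull(2.0, size=1000) * 8.0
print(f"mean speed: {v.mean():.2f} m/s, power density: {power_density(v):.0f} W/m^2")
```

This cubic dependence explains why a site's power density can be several times larger than the value naively computed from its average speed, consistent with the large densities the abstract reports for the dominant-direction subsets.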

Keywords: wind resources, Mediterranean Sea, offshore, ArcGIS, MapInfo, wind maps, wind farms

Procedia PDF Downloads 146