Search results for: easily identification
1265 Ensuring Quality in DevOps Culture
Authors: Sagar Jitendra Mahendrakar
Abstract:
Integrating quality assurance (QA) practices into DevOps culture has become increasingly important in modern software development environments. Collaboration, automation and continuous feedback characterize the seamless integration of DevOps development and operations teams to achieve rapid and reliable software delivery. In this context, quality assurance plays a key role in ensuring that software products meet the highest quality, performance and reliability standards throughout the development life cycle. This brief explores key principles, challenges, and best practices related to quality assurance in a DevOps culture. It emphasizes the importance of shifting quality into the development process itself, with quality control integrated into every step of the DevOps pipeline. Automation is the cornerstone of DevOps quality assurance, enabling continuous testing, integration and deployment and providing rapid feedback for early problem identification and resolution. In addition, the abstract addresses the cultural and organizational challenges of implementing QA within DevOps, emphasizing the need to foster collaboration, break down silos, and nurture a culture of continuous improvement. It also addresses the importance of toolchain integration and skills development to support effective QA practices within DevOps environments. Overall, this collection works at the intersection of QA and DevOps culture, providing insights into how organizations can use DevOps principles to improve software quality, accelerate delivery, and meet the changing demands of today's dynamic software landscape.
Keywords: quality engineer, devops, automation, tool
Procedia PDF Downloads 58
1264 Investigation of Azole Resistance in Aspergillosis Caused by Gradient Test and Agar Plate Methods
Authors: Zeynep Yazgan, Gökhan Aygün, Reyhan Çalışkan
Abstract:
Objective: Invasive fungal infections are a serious threat in terms of morbidity and mortality, especially in immunocompromised patients. The most frequently isolated agents are fungi of the genus Aspergillus, and susceptibility to azoles, the first-choice agents in treatment, is decreasing. In our study, we aimed to investigate the use of the agar plate screening method as a fast, easy, and practical method for determining azole resistance in Aspergillus spp. Methods: Our study was conducted with 125 Aspergillus spp. isolates grown from various clinical samples. Aspergillus spp. isolates were identified by conventional methods, and azole resistance was determined by the gradient test and the agar plate screening method. The broth microdilution method was applied to resistant isolates, and CypA-L98H and CypA-M220 mutations in the cyp51A gene were investigated. Results: In our study, 55 A. fumigatus complex (44%), 42 A. flavus (33.6%), 6 A. terreus (5%), 4 A. niger (3%) and 18 Aspergillus spp. (14%) were identified. With the gradient test method, resistance to VOR and POS was detected in 1 (1.8%) of the A. fumigatus isolates, and resistance to ITR in 3 (5.45%). With the agar plate method, 1 of the A. fumigatus isolates (1.8%) was resistant to VOR, ITR and POS, 1 of the A. terreus isolates (16.7%) to VOR, and 1 of the A. niger isolates (25%) to ITR. Resistance to VOR and POS was detected in 2 Aspergillus spp. isolates (11%), and resistance to ITR in 1 (5.6%). Sensitivity and specificity were determined as 100% for VOR and POS in A. fumigatus, 33.3% and 100% for ITR, respectively, 100% for ITR in A. flavus, and 100% for ITR and POS in A. terreus. For the 7 isolates in which resistance was detected by the gradient test and/or the agar plate screening method, broth microdilution ITR, VOR and POS MICs were determined as 2 µg/ml and 8 µg/ml, 8 µg/ml and >32 µg/ml, and 0.5 µg/ml and 4 µg/ml for 1 A. fumigatus resistant to ITR, VOR and POS, 2 A. fumigatus resistant to ITR, and 2 Aspergillus spp., respectively. CypA-L98H mutations were detected in 5 of these isolates, CypA-M220 mutations in 6, and no mutation in 1. CypA-L98H and CypA-M220 mutations were also detected in 1 isolate in which resistance had not been detected. Conclusion: The need for rapid antifungal susceptibility screening tests is increasing in the treatment of aspergillosis. Although the sensitivity of the agar plate method was determined to be 33.3% for ITR in A. fumigatus in our study, its sensitivity and specificity were 100% for ITR, VOR, and POS in the other species. The low sensitivity value detected for A. fumigatus showed that agar plate drug concentrations should be updated in accordance with the latest EUCAST guidelines. The CypA-L98H and CypA-M220 mutations detected in our study suggest that the distribution of azole resistance-related mutations in different regions of our country should be investigated. In conclusion, the agar plate method, which can be easily applied to detect azole resistance, is considered a fast and practical method for routine use and can contribute both to the determination of effective treatment strategies and to the generation of epidemiological data.
Keywords: Aspergillus, agar plate, azole resistance, cyp51A, cypA-L98H, cypA-M220
Procedia PDF Downloads 71
1263 Sample Preparation and Coring of Highly Friable and Heterogeneous Bonded Geomaterials
Authors: Mohammad Khoshini, Arman Khoshghalb, Meghdad Payan, Nasser Khalili
Abstract:
Most of the Earth’s crust surface rocks are technically categorized as weak rocks or weakly bonded geomaterials. Deeply weathered, weakly cemented, friable and easily erodible, they demonstrate complex material behaviour, and understanding the often overlooked mechanical behaviour of such materials is of particular importance in geotechnical engineering practice. Weakly bonded geomaterials are so susceptible to surface shear and moisture that conventional methods of core drilling fail to extract high-quality undisturbed samples from them. Moreover, most of these geomaterials are highly heterogeneous, rendering material characterization less reliable and feasible. In order to compensate for the unpredictability of the material response, either numerous experiments need to be conducted or large factors of safety must be implemented in the design process. However, neither of these approaches is sustainable. In this study, a method for dry core drilling of such materials is introduced to take high-quality undisturbed core samples. By freezing the material at a certain moisture content, a secondary structure is developed throughout the material which helps the whole structure to remain intact during the core drilling process. Moreover, to address the heterogeneity issue, the natural material was reconstructed artificially to obtain a homogeneous material with very high similarity to the natural one from both micro- and macro-mechanical perspectives. The method is verified at both the micro and macro scales. At the micro scale, pore spaces and inter-particle bonds were investigated using Scanning Electron Microscopy (SEM) and compared between the natural and artificial materials. X-ray diffraction (XRD) analyses were also performed to control the chemical composition. At the macro scale, several uniaxial compressive strength tests, as well as triaxial tests, were performed to verify the similar mechanical response of the materials. A high level of agreement is observed between the micro and macro results of the natural and artificially bonded geomaterials. The proposed methods can play an important role in cutting down the costs of experimental programs for material characterization and in improving the accuracy of numerical modelling based on experimental results.
Keywords: artificial geomaterial, core drilling, macro-mechanical behavior, micro-scale, sample preparation, SEM photography, weakly bonded geomaterials
Procedia PDF Downloads 216
1262 Numerical Simulation of Air Pollutant Using Coupled AERMOD-WRF Modeling System over Visakhapatnam: A Case Study
Authors: Amit Kumar
Abstract:
Accurate identification of deteriorated air quality regions is very helpful in devising better environmental practices and mitigation efforts. In the present study, an attempt has been made to identify the dispersion patterns of air pollutants, especially NOX, due to vehicular and industrial sources over a rapidly developing urban city, Visakhapatnam (17°42’ N, 83°20’ E), India, during April 2009. Using the emission factors of different vehicles as well as industry, a high-resolution 1 km x 1 km gridded emission inventory has been developed for Visakhapatnam city. The dispersion model AERMOD, with explicit representation of planetary boundary layer (PBL) dynamics, is coupled offline through a developed coupler mechanism with the high-resolution mesoscale model WRF-ARW and used to simulate the dispersion patterns of NOX. The meteorological and PBL parameters obtained by employing two PBL schemes of the WRF-ARW model, viz. the non-local Yonsei University (YSU) and the local Mellor-Yamada-Janjic (MYJ) schemes, which reasonably represent the boundary layer parameters, are considered for driving AERMOD. Significantly different dispersion patterns of NOX have been noticed between summer and winter months. The simulated NOX concentrations are validated against the six available monitoring stations of the Central Pollution Control Board, India. Statistical analysis of the model-evaluated concentrations against the observations reveals that WRF-ARW with the YSU scheme coupled with AERMOD shows better performance. The deteriorated air quality locations over Visakhapatnam are identified based on the validated model simulations of NOX concentrations. The present study advocates the utility of the developed gridded NOX emission inventory with the coupled WRF-AERMOD modeling system for air quality assessment over the study region.
Keywords: WRF-ARW, AERMOD, planetary boundary layer, air quality
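As a simple illustration of the gridding step described above, the following Python sketch (not the study's inventory code) aggregates hypothetical point-source emission rates onto a 1 km x 1 km grid; the coordinates, rates and domain size are placeholders.

```python
# Illustrative sketch: binning point-source NOx emissions onto a 1 km grid.
import numpy as np

# Hypothetical source locations (km, local projected system) and NOx rates (g/s).
x = np.array([2.3, 5.1, 5.4, 9.8, 14.2])
y = np.array([1.1, 7.6, 7.9, 3.3, 12.5])
nox_rate = np.array([0.8, 2.4, 1.1, 3.6, 0.5])

# 1 km bins over a 20 km x 20 km domain; weights sum the emission rate per cell.
grid, x_edges, y_edges = np.histogram2d(
    x, y, bins=(20, 20), range=[[0, 20], [0, 20]], weights=nox_rate
)
print(grid.sum(), nox_rate.sum())  # totals should match
```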
Procedia PDF Downloads 280
1261 Experimental and Modal Determination of the State-Space Model Parameters of a Uni-Axial Shaker System for Virtual Vibration Testing
Authors: Jonathan Martino, Kristof Harri
Abstract:
In some cases, the increase in computing resources makes simulation methods more affordable. The increase in processing speed also allows real-time analysis, or even faster test analysis, offering a real tool for test prediction and design process optimization. Vibration tests are no exception to this trend. So-called ‘Virtual Vibration Testing’ offers solutions, among others, to study the influence of specific loads, to better anticipate the boundary conditions between the exciter and the structure under test, to study the influence of small changes in the structure under test, etc. This article will first present a virtual vibration test model, with a main focus on the shaker model, and will afterwards present the experimental determination of its parameters. The classical way of modeling a shaker is to consider the shaker as a simple mechanical structure augmented by an electrical circuit that makes the shaker move. The shaker is modeled as a two- or three-degree-of-freedom lumped-parameter model, while the electrical circuit takes the coil impedance and the dynamic back-electromotive force into account. The establishment of the equations of this model, describing the dynamics of the shaker, is presented in this article and is strongly related to the internal physical quantities of the shaker. Those quantities are reduced into global parameters which are estimated through experiments. Different experiments are carried out in order to design an easy and practical method for the identification of the shaker parameters, leading to a fully functional shaker model. An experimental modal analysis is also carried out to extract the modal parameters of the shaker and to combine them with the electrical measurements. Finally, this article concludes with an experimental validation of the model.
Keywords: lumped parameters model, shaker modeling, shaker parameters, state-space, virtual vibration
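For orientation, a minimal single-degree-of-freedom electrodynamic shaker sketch (simpler than the two- or three-DOF model identified in the paper) couples the armature mechanics to the coil circuit; the symbols m, c, k, Bl, R, L and the state vector below are illustrative, not the paper's identified parameters.

```latex
% Mechanical and electrical equations of a simplified electrodynamic shaker
m\ddot{x} + c\dot{x} + kx = Bl\, i(t), \qquad
L\frac{di}{dt} + Ri + Bl\,\dot{x} = v(t)

% State-space form with state [x, \dot{x}, i]^T and input voltage v(t)
\frac{d}{dt}\begin{bmatrix} x\\ \dot{x}\\ i \end{bmatrix} =
\begin{bmatrix} 0 & 1 & 0\\ -k/m & -c/m & Bl/m\\ 0 & -Bl/L & -R/L \end{bmatrix}
\begin{bmatrix} x\\ \dot{x}\\ i \end{bmatrix} +
\begin{bmatrix} 0\\ 0\\ 1/L \end{bmatrix} v(t)
```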
Procedia PDF Downloads 270
1260 Business Feasibility of Online Marketing of Food and Beverages Products in India
Authors: Dimpy Shah
Abstract:
The global economy has changed substantially in the last three decades. Now almost all markets are transparent and visible to global customers. Corporates are no longer reliant on local markets for trade. The information technology revolution has changed the business dynamics and marketing practices of corporates. Markets are divided into two different formats: traditional and virtual. In a very short span of time, many e-commerce portals have captured the global market. This strategy is well supported by the global delivery systems of multinational logistics companies. Markets now deal with global supply chain networks, which are more demand-driven and customer-oriented. Corporates have realized the importance of supply chain integration and marketing in this competitive environment. Indian markets have also been significantly affected by all these changes. In terms of population, India is in second place after China. In terms of demography, almost half of the population is youth. It has been observed that Indian youth are more inclined towards e-commerce and prefer to buy goods from web portals. Initially, this trend was observed in the Indian service sector, textiles and electronic goods, and it has now extended to other product categories. FMCG companies have also recognized this change and started integrating their supply chains with e-commerce platforms. This paper attempts to understand the contemporary marketing practices of corporates in the e-commerce business in the Indian food and beverages segment and also tries to identify innovative marketing practices for the proper execution of their strategies. The findings are mainly focused on supply chain re-integration and brand building strategies with proper utilization of social media.
Keywords: FMCG (Fast Moving Consumer Goods), ISCM (Integrated supply chain management), RFID (Radio Frequency Identification), traditional and virtual formats
Procedia PDF Downloads 275
1259 The Approach of New Urbanism Model to Identify the Sustainability of 'Kampung Kota'
Authors: Nadhia Maharany Siara, Muammal, Ilham Nurhakim, Rofifah Yusadi, M. Adie Putra Tanggara, I. Nyoman Suluh Wijaya
Abstract:
Urbanization in urban areas has increased the demand for housing land, and development has begun to occur in high-density areas called Kampung Kota. Kampung Kota grows and develops without planning, or organically. The existence of Kampung Kota has become part of the identity of city development in Indonesia, but Kampung Kota is often considered a source of environmental, health, and social problems. This causes negative perceptions about the sustainability of Kampung Kota. This research aims to identify the morphology and sustainability level of the Kampung Kota in Polehan Sub-District, Blimbing District, Malang City. So far, there have not been many studies that define the sustainability of Kampung Kota, especially from the perspective of Kampung Kota morphology as part of urban housing areas. This research took place in Polehan Sub-District, Blimbing District, Malang City, which is one of the oldest Kampung Kota in Malang City. Identification of the sustainability level in this research is done by defining the morphology of the Kampung Kota in Polehan Sub-District, Blimbing District, Malang City with a descriptive approach to the observed case (Kampung Kota, Polehan Sub-District). The sustainability level is then defined by quantifying the spatial structure using the criteria from the new urbanism model, which consist of building and population density, compactness, diversity and mixed land uses, and sustainable transportation. In this case, the use of the new urbanism model approach is very appropriate. New Urbanism is a design-driven strategy that is based on traditional forms to minimize urban sprawl in the suburbs. The result obtained from this study is a sustainability level of 3.2 for the Kampung Kota in Polehan Sub-District, Blimbing District, Malang City, which can be considered good sustainability.
Keywords: Kampung Kota, new urbanism model, sustainability, urban morphology
Procedia PDF Downloads 290
1258 Vocational Rehabilitation for People with Disabilities: Employment Rates, Job Persistence and Wages
Authors: Hester Fass, Ofir Pinto
Abstract:
Research indicates gaps in education, employment rates and wages between people with disabilities and those without disabilities. One of the main tools available to reduce these gaps is vocational rehabilitation. In order to examine the effects of vocational rehabilitation, a follow-up study, based on comprehensive administrative data, was conducted. The study included 88,286 people with disabilities who participated in vocational rehabilitation of the National Insurance Institute of Israel (NII) and completed the process between 1999 and 2012. Research variables included employment rates, job persistence and wage levels. This research, the first of its kind in Israel, has several unique aspects: a) a long-range follow-up study of people who completed vocational rehabilitation; b) examination of a broad population spectrum, including people who are not eligible for disability pensions; c) a comparison among those with work-related injuries, those injured in hostile acts and those injured in other circumstances; and finally d) the identification of the characteristics of those who are entitled to vocational rehabilitation but do not participate in any vocational rehabilitation plan. The most notable results include: 1. Vocational rehabilitation contributed to employment, job persistence and wage levels. Participation in vocational rehabilitation resulted in an employment rate of 65% within two years after completing the program, and 73% eventually. Participation in a vocational rehabilitation plan also contributed to job persistence and wage levels. 2. Vocational rehabilitation plans aimed at integration in universal frameworks increased the chances of being employed, persisting at the job and receiving a higher wage more than did vocational rehabilitation aimed at selective frameworks (such as sheltered workshops). 3. The type of disability affected the chances of integration in a vocational rehabilitation plan and in the labor market. People with a disability from birth had greater chances of integration in a vocational rehabilitation plan, while the type of disability and its severity affected the chances of a person with disabilities finding employment.
Keywords: vocational rehabilitation, employment, job persistence, wages
Procedia PDF Downloads 453
1257 Exploration and Evaluation of the Effect of Multiple Countermeasures on Road Safety
Authors: Atheer Al-Nuaimi, Harry Evdorides
Abstract:
Every day, many people die or are disabled or injured on roads around the world, which necessitates more specific treatments for transportation safety issues. The International Road Assessment Programme (iRAP) model is one of the comprehensive road safety models, accounting in a cost-effective way for many factors that affect road safety in low- and middle-income countries. In the iRAP model, road safety is divided into five star ratings, from 1 star (the lowest level) to 5 stars (the highest level). These star ratings are based on a star rating score which is calculated by the iRAP methodology from road attributes, traffic volumes and operating speeds. The outcomes of the iRAP methodology are the treatments that can be used to improve road safety and reduce the numbers of fatalities and serious injuries (FSI). These countermeasures can be used separately, as a single countermeasure, or combined as multiple countermeasures for a location. There is general agreement that the effectiveness of a countermeasure is subject to consistent losses when it is utilized in combination with other countermeasures; that is, crash reduction estimates of individual countermeasures cannot simply be added together. The iRAP methodology therefore uses multiple-countermeasure adjustment factors to predict reductions in the effectiveness of road safety countermeasures when more than one countermeasure is chosen. A multiple-countermeasure correction factor is calculated for every 100-meter segment and for every crash type. However, limitations of this methodology include a probable over-estimation of the predicted crash reduction. This study aims to adjust this correction factor by developing new models to calculate the effect of using multiple countermeasures on the number of fatalities for a location or an entire road. Regression models have been used to establish relationships between crash frequencies and the factors that affect their rates. Multiple linear regression, negative binomial regression, and Poisson regression techniques were used to develop models that can address the effectiveness of using multiple countermeasures. Analyses conducted using the R Project for Statistical Computing showed that a model developed with the negative binomial regression technique gives more reliable predictions of the number of fatalities after the implementation of multiple road safety countermeasures than the iRAP model. The results also showed that the negative binomial regression approach gives more precise results than the multiple linear and Poisson regression techniques because of overdispersion and standard error issues.
Keywords: international road assessment program, negative binomial, road multiple countermeasures, road safety
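The study fitted its count models in R; purely as an illustration of the negative binomial approach described above, the hedged Python sketch below fits fatality counts against hypothetical countermeasure indicators and an exposure term. The variable names (cm_1, cm_2, aadt) and the data are placeholders, not the study's dataset.

```python
# Illustrative sketch: negative binomial model of segment fatality counts,
# with an interaction term standing in for the multiple-countermeasure effect.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

segments = pd.DataFrame({
    "fatalities": [3, 0, 1, 5, 2, 0, 4, 1],           # FSI counts per 100 m segment
    "cm_1": [1, 0, 1, 0, 1, 1, 0, 0],                  # countermeasure 1 present?
    "cm_2": [0, 0, 1, 1, 1, 0, 0, 1],                  # countermeasure 2 present?
    "aadt": [12000, 8000, 15000, 20000, 9000, 7000, 18000, 11000],  # traffic exposure
})

# Negative binomial accommodates over-dispersed crash counts better than Poisson.
model = smf.glm(
    "fatalities ~ cm_1 + cm_2 + cm_1:cm_2",
    data=segments,
    exposure=segments["aadt"],
    family=sm.families.NegativeBinomial(alpha=1.0),
).fit()
print(model.summary())
```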
Procedia PDF Downloads 240
1256 The Human Rights Code: Fundamental Rights as the Basis of Human-Robot Coexistence
Authors: Gergely G. Karacsony
Abstract:
Fundamental rights are the result of a thousand years’ progress of legislation, adjudication and legal practice. They serve as the framework for the peaceful cohabitation of people, protecting the individual from any abuse by the government or violation by other people. Artificial intelligence, however, is a development of the very recent past, and one of the most important prospects for the future. Artificial intelligence is now capable of communicating and performing actions in the same way as humans; such acts are sometimes impossible to tell from actions performed by flesh-and-blood people. In a world where human-robot interactions are more and more common, a new framework of peaceful cohabitation is to be found. Artificial intelligence, being able to take part in almost any kind of interaction where personal presence is not necessary without being recognized as a non-human actor, is now able to break the law, violate people’s rights, and disturb social peace in many other ways. Therefore, a code of peaceful coexistence is to be found or created. We should consider whether human rights can serve as the code of ethical and rightful conduct in the new era of artificial intelligence and human coexistence. In this paper, we examine the applicability of fundamental rights to human-robot interactions as well as to the actions of artificial intelligence performed without any human interaction whatsoever. Robot ethics had been a topic of discussion and debate in philosophy, ethics, computing, legal sciences and science fiction writing long before the first functional artificial intelligence was introduced. Legal science and legislation have approached artificial intelligence from different angles, regulating different areas (e.g. data protection, telecommunications, copyright issues), but they are only chipping away at the mountain of legal issues concerning robotics. For a widely acceptable and permanent solution, a more general set of rules would be preferable to the detailed regulation of specific issues. We argue that human rights as recognized worldwide can be adapted to serve as a guideline and a common basis for the coexistence of robots and humans. This solution has many virtues: people do not need to adjust to a completely unknown set of standards, the system has proved itself able to withstand the trials of time, legislation is easier, and the actions of non-human entities are more easily adjudicated within their own framework. In this paper, we examine the system of fundamental rights (as defined in the most widely accepted source, the 1966 UN Convention on Human Rights) and try to adapt each individual right to the actions of artificial intelligence actors; in each case we examine the possible effects of such an approach on the legal system and society, and finally we also examine its effect on the IT industry.
Keywords: human rights, robot ethics, artificial intelligence and law, human-robot interaction
Procedia PDF Downloads 244
1255 The Effective Use of the Network in the Distributed Storage
Authors: Mamouni Mohammed Dhiya Eddine
Abstract:
This work aims at studying the exploitation of the high-speed networks of clusters for distributed storage. Parallel applications running on clusters require both high-performance communications between nodes and efficient access to the storage system. Many studies on network technologies have led to the design of dedicated architectures for clusters with very fast communications between computing nodes. Efficient distributed storage in clusters has essentially been developed by adding parallelization mechanisms so that the server(s) may sustain an increased workload. In this work, we propose to improve the performance of distributed storage systems in clusters by efficiently using the underlying high-performance network to access distant storage systems. The main question we are addressing is: do high-speed networks of clusters fit the requirements of transparent, efficient and high-performance access to remote storage? We show that storage requirements are very different from those of parallel computation. High-speed networks of clusters were designed to optimize communications between the different nodes of a parallel application. We study their utilization in a very different context, storage in clusters, where client-server models are generally used to access remote storage (for instance NFS, PVFS or LUSTRE). Our experimental study, based on the GM programming interface of MYRINET high-speed networks for distributed storage, raised several interesting problems. Firstly, the specific memory utilization in the storage access system layers does not easily fit the traditional memory model of high-speed networks. Secondly, the client-server models that are used for distributed storage have specific requirements on message control and event processing, which are not handled by existing interfaces. We propose different solutions to solve communication control problems at the filesystem level. We show that a modification of the network programming interface is required. Data transfer issues need an adaptation of the operating system. We detail several proposals for network programming interfaces which make their utilization easier in the context of distributed storage. The integration of flexible processing of data transfer in the new programming interface MYRINET/MX is finally presented. Performance evaluations show that its usage in the context of both storage and other types of applications is easy and efficient.
Keywords: distributed storage, remote file access, cluster, high-speed network, MYRINET, zero-copy, memory registration, communication control, event notification, application programming interface
Procedia PDF Downloads 219
1254 Prevalence of Different Poultry Parasitoses in Modern Farms in the North of Ivory Coast
Authors: Coulibaly Fatoumata, Gragnon Biego, Aka N. David, Mbari K. Benjamin, Soro Y. René, Ndiaye Jean-louis
Abstract:
Poultry is nowadays one of the most widely consumed sources of protein, and poultry farming represents one of the few opportunities for savings, investment and protection against risk. It provides income for the most vulnerable sections of society, in particular women (70%) and children, who mainly practice this type of farming. A study was conducted in the commune of Korhogo on 52 poultry farms, the objective of which was to assess the epidemiological situation of external and internal parasitism in poultry in order to contribute to improving the health status of modern poultry farms in the commune. The method described by the OIE (2005), using the standard formula n = δ² × p × (1 − p) × c / i², made it possible to calculate the sample size. Then, samples of droppings and ectoparasites were taken from the affected farms. After analysis and identification, two (2) species of mallophagous lice, Menopon gallinae (50%) and Menacanthus stramineus (33%), and one species of bug, Cimex lectularius (17%), were identified. The laying hens were more infested than broilers. Regarding gastrointestinal parasites, six species were identified: Trichostrongylus tenuis (17%), Syngamus trachea (19%), Heterakis sp (10%), Ascaridia sp (17%), Raillietina sp (8%) and Eimeria sp (29%). In addition, coccidiosis (Eimeria sp) proved to be the dominant pathology, representing 67% of pathologies in broiler farms and 33% in poultry farms. The presence of these parasitoses in these modern farms constitutes a major constraint on productivity and on their development. In view of all these difficulties, proposals have been made in order to contribute to the establishment of a good prophylaxis program (sanitary and medical). In addition, the Ivorian government, with the support of veterinarians, must become more involved in organizing the health monitoring of traditional chickens and poultry in general, through supervision and training, in order to preserve public health (animal, human and environmental health).
Keywords: gastrointestinal parasites, ectoparasites, pathologies, poultry, Korhogo
Procedia PDF Downloads 85
1253 The Relationship between Environmental Factors and Purchasing Decisions in the Residential Market in Sweden
Authors: Agnieszka Zalejska-Jonsson
Abstract:
The Swedish Green Building Council (SGBC) was established in 2009. Since then, over 1000 buildings have been certified, of which approximately 600 are newly produced and 340 are residential buildings. During that time, approximately 2000 apartment buildings have been built in Sweden. This means that over a five-year period, 17% of residential buildings have been certified according to the environmental building scheme. Certification of a building is not a guarantee of environmental progress, but it gives us an indication of the extent of the progress. The overarching aim of this study is to investigate the factors behind the relatively slow evolution of the green residential housing market in Sweden. The intention is to examine stated willingness to pay (WTP) for green and low-energy apartments, and to explore which factors have a significant effect on stated WTP among apartment owners. A green building was defined as a building certified according to the environmental scheme, and a low-energy building as a building designed and constructed with high energy efficiency goals. Data for this study were collected through a survey conducted among occupants of comparable apartment buildings: two green and one conventional. The total number of received responses was 429: green A (N=160), response rate 42%; green B (N=138), response rate 35%; and conventional (N=131), response rate 43%. The study applied a quasi-experimental method. Survey responses regarding factors affecting the purchase of an apartment, stated WTP and environmental literacy have been analysed using descriptive statistics, the Mann–Whitney (rank sum) test and logistic models. Comments received from respondents have been used for further interpretation of the results. Results indicate that environmental education has a significant effect on stated WTP. Occupants who declared higher WTP showed a higher level of environmental literacy and indicated that energy efficiency was one of the important factors that affected their decision to buy an apartment. Generally, the respondents were more likely to pay more for low-energy buildings than for green buildings. This is to a great extent a consequence of rational customer behaviour and the difficulty of apprehending the meaning of green building certification. The analysis shows that people living in green buildings indicate higher WTP for both green and low-energy buildings, the difference being statistically significant. It is concluded that growth in the green housing market in Sweden might be achieved if policymakers and developers engage in active education about the environmental labelling system. The demand for green buildings is more likely to increase when the difference between green and conventional buildings is easily understood and the information is not only delivered by the estate agent but is part of an environmental education programme.
Keywords: consumer, environmental education, housing market, stated WTP, Sweden
Procedia PDF Downloads 241
1252 Comparison of Bactec plus Blood Culture Media to BacT/Alert FAN plus Blood Culture Media for Identification of Bacterial Pathogens in Clinical Samples Containing Antibiotics
Authors: Recep Kesli, Huseyin Bilgin, Ela Tasdogan, Ercan Kurtipek
Abstract:
Aim: The aim of this study was to compare resin-based Bactec Plus aerobic/anaerobic blood culture bottles (Becton Dickinson, MD, USA) and polymeric-bead-based BacT/Alert FA/FN Plus blood culture bottles (bioMerieux, NC, USA) in terms of microorganism recovery rates and time to detection (TTD) in patients receiving antibiotic treatment. Method: Blood culture samples were taken from patients who were admitted to the intensive care unit and received antibiotic treatment. Forty milliliters of blood from each patient were equally distributed into four types of bottles: Bactec Plus aerobic, Bactec Plus anaerobic, BacT/Alert FA Plus, and BacT/Alert FN Plus. The Bactec Plus and BacT/Alert Plus media were compared for culture recovery rates and TTD. Results: Blood culture samples were collected from 382 patients hospitalized in the intensive care unit, and 245 patients who were diagnosed as having bloodstream infections were included in the study. A total of 1528 Bactec Plus aerobic, Bactec Plus anaerobic, BacT/Alert FA Plus and BacT/Alert FN Plus blood culture bottles were analyzed, and 176, 144, 154 and 126 bacteria or fungi were isolated, respectively. Gram-negative and gram-positive bacteria were isolated significantly more frequently in the resin-based Bactec Plus bottles than in the polymeric-bead-based BacT/Alert Plus bottles. The Bactec Plus and BacT/Alert Plus media recovery rates were similar for fungi and anaerobic bacteria. The mean TTDs in the Bactec Plus bottles were shorter than those in the BacT/Alert Plus bottles regardless of the microorganism. Conclusion: The results of this study show that resin-containing media are a reliable and time-saving tool for patients who are receiving antibiotic treatment due to sepsis in the intensive care unit.
Keywords: Bactec Plus, BacT/Alert Plus, blood culture, antibiotic
Procedia PDF Downloads 146
1251 Leisure Time Physical Activity during Pregnancy and the Associated Factors Based on Health Belief Model: A Cross Sectional Study
Authors: Xin Chen, Xiao Yang, Rongrong Han, Lu Chen, Lingling Gao
Abstract:
Background: Leisure time physical activity (LTPA) benefits both pregnant women and their fetuses. Guidelines recommend that pregnant women should do at least 150 minutes of moderate-intensity aerobic physical activity throughout the week. The aim of this study was to investigate the rate of LTPA participation among Chinese pregnant women and to identify its predictors based on the health belief model. Methods: A cross-sectional study was conducted from June 2019 to September 2019 in Changchun, China. A total of 225 pregnant women aged 18 years or older with no severe physical or mental disease were recruited in the obstetric clinic. Self-administered questionnaires were used to collect data. LTPA was assessed by the pregnancy physical activity questionnaire (PPAQ). A revised pregnancy physical activity health belief scale, together with sociodemographic and perinatal characteristics, was collected and used to predict LTPA participation. Data were analyzed using descriptive statistics and multivariate logistic regression. Results: The participants had a high level of perceived susceptibility, perceived severity, perceived benefits, and cues to action, with mean item scores above 3.5. The predictors of LTPA in Chinese pregnant women were pre-pregnancy exercise habits [OR 3.236 (95% CI: 1.632, 6.416)], perceived susceptibility score [OR 2.083 (95% CI: 1.002, 4.331)], and perceived barriers score [OR 3.113 (95% CI: 1.462, 6.626)]. Conclusions: The results of this study will lead to better identification of pregnant women who may not participate in LTPA. Healthcare professionals should be cognizant of issues that may affect LTPA participation among pregnant women, including pre-pregnancy exercise habits, perceived susceptibility, and perceived barriers.
Keywords: pregnancy, health belief model, leisure time physical activity, factors
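As an illustration of the multivariate logistic regression used above, the hedged Python sketch below reports odds ratios with 95% confidence intervals; the data are simulated and the variable names are hypothetical placeholders, not the study's dataset.

```python
# Illustrative sketch: logistic model of LTPA participation reported as odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 225  # same order of magnitude as the study sample
df = pd.DataFrame({
    "prepregnancy_exercise": rng.integers(0, 2, n),
    "perceived_susceptibility": rng.normal(3.5, 0.6, n),
    "perceived_barriers": rng.normal(2.8, 0.7, n),
})
linpred = (-1.0 + 1.2 * df["prepregnancy_exercise"]
           + 0.7 * (df["perceived_susceptibility"] - 3.5)
           - 1.1 * (df["perceived_barriers"] - 2.8))
df["meets_ltpa_guideline"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

fit = smf.logit(
    "meets_ltpa_guideline ~ prepregnancy_exercise"
    " + perceived_susceptibility + perceived_barriers",
    data=df,
).fit(disp=False)

# Exponentiate coefficients and confidence limits to report odds ratios.
ci = fit.conf_int()
print(pd.DataFrame({"OR": np.exp(fit.params),
                    "2.5%": np.exp(ci[0]),
                    "97.5%": np.exp(ci[1])}))
```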
Procedia PDF Downloads 79
1250 Differential Expression of Biomarkers in Cancer Stem Cells and Side Populations in Breast Cancer Cell Lines
Authors: Dipali Dhawan
Abstract:
Cancerous epithelial cells are confined to a primary site by the continued expression of adhesion molecules and the intact basal lamina. However, as the cancer progresses, some cells are believed to undergo an epithelial-mesenchymal transition (EMT) event, leading to increased motility, invasion and, ultimately, metastasis of the cells from the primary tumour to secondary sites within the body. These disseminated cancer cells need the ability to self-renew, as stem cells do, in order to establish and maintain a heterogeneous metastatic tumour mass. Identification of the specific subpopulation of cancer stem cells amenable to the process of metastasis is highly desirable. In this study, we have isolated and characterized cancer stem cells from luminal and basal breast cancer cell lines (MDA-MB-231, MDA-MB-453, MDA-MB-468, MCF7 and T47D) on the basis of the cell surface markers CD44 and CD24, as well as Side Populations (SP) using Hoechst 33342 dye efflux. The isolated populations were analysed for epithelial and mesenchymal markers such as E-cadherin, N-cadherin, Sfrp1 and Vimentin by Western blotting and immunocytochemistry. The MDA-MB-231 cell line contains a major population of CD44+CD24- cells, whereas the MCF7, T47D and MDA-MB-231 cell lines show a side population. We observed higher expression of N-cadherin in MCF7 SP cells as compared to MCF7 NSP (non-side population) cells, suggesting that the SP cells are mesenchymal-like cells and hence express increased N-cadherin with stem cell-like properties. There was expression of Sfrp1 in the MCF7 NSP cells as compared to no expression in MCF7 SP cells, which suggests that the Wnt pathway is expressed in the MCF7 SP cells. The mesenchymal marker Vimentin was expressed only in MDA-MB-231 cells. Hence, understanding breast cancer heterogeneity would enable a better understanding of disease progression and therapeutic targeting.
Keywords: cancer stem cells, epithelial to mesenchymal transition, biomarkers, breast cancer
Procedia PDF Downloads 526
1249 Breeding Cotton for Annual Growth Habit: Remobilizing End-of-season Perennial Reserves for Increased Yield
Authors: Salman Naveed, Nitant Gandhi, Grant Billings, Zachary Jones, B. Todd Campbell, Michael Jones, Sachin Rustgi
Abstract:
Cotton (Gossypium spp.) is the primary source of natural fiber in the U.S. and a major crop in the Southeastern U.S. Despite constant efforts to increase cotton fiber yield, the yield gain has stagnated. Therefore, we undertook a novel approach to improve cotton fiber yield by altering the growth habit from perennial to annual. In this effort, we identified genotypes with high-expression alleles of five floral induction and meristem identity genes (FT, SOC1, FUL, LFY, and AP1) from an upland cotton mini-core collection and crossed them in various combinations to develop cotton lines with an annual growth habit, optimal flowering time and enhanced productivity. To facilitate the characterization of genotypes with the desired combinations of stacked alleles, we identified markers associated with the gene expression traits via genome-wide association analysis using a 63K SNP array (Hulse-Kemp et al. 2015 G3 5:1187). Over 14,500 SNPs showed polymorphism and were used for association analysis. A total of 396 markers showed association with expression traits. Of these 396 markers, 159 mapped to genes, 50 to untranslated regions, and 187 to random genomic regions. A biased genomic distribution of associated markers was observed, with more trait-associated markers mapping to the cotton D sub-genome. Many quantitative trait loci coincided at specific genomic regions. This observation has implications, as these traits could be bred together. The analysis also allowed the identification of candidate regulators of the expression patterns of these floral induction and meristem identity genes, whose functions will be validated via virus-induced gene silencing.
Keywords: cotton, GWAS, QTL, expression traits
Procedia PDF Downloads 151
1248 Geometric Imperfections in Lattice Structures: A Simulation Strategy to Predict Strength Variability
Authors: Xavier Lorang, Ahmadali Tahmasebimoradi, Chetra Mang, Sylvain Girard
Abstract:
Additive manufacturing processes (e.g. selective laser melting) allow us to produce lattice structures which have lower weight, higher impact absorption capacity, and better thermal exchange properties compared to classical structures. Unfortunately, geometric imperfections (defects) in the lattice structures are by-products of the manufacturing process. These imperfections decrease the lifetime and the strength of the lattice structures and alter their mechanical responses. The objective of this paper is to present a simulation strategy which allows us to take into account the effect of the geometric imperfections on the mechanical response of the lattice structure. In the first part, an identification method for the geometric imperfection parameters of the lattice structure, based on point clouds, is presented. These point clouds are based on tomography measurements. The point clouds are fed into the platform LATANA (LATtice ANAlysis), developed by IRT-SystemX, to characterize the geometric imperfections. This is done by projecting the point cloud of each microbeam along the beam axis onto a 2D surface. Then, by fitting an ellipse to the 2D projections of the points, the geometric imperfections are characterized by three ellipse parameters: the semi-major and semi-minor axes and the angle of rotation. From the calculated parameters of the microbeam geometric imperfections, a statistical analysis is carried out to determine a probability density law based on a statistical hypothesis. Microbeam samples are randomly drawn from the density law and are used to generate lattice structures. In the second part, a finite element model of the lattice structure with the simplified geometric imperfections (ellipse parameters) is presented. This numerical model is used to simulate the generated lattice structures. The propagation of the uncertainties of the geometric imperfections is shown through the distribution of the computed mechanical responses of the lattice structures.
Keywords: additive manufacturing, finite element model, geometric imperfections, lattice structures, propagation of uncertainty
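To illustrate the ellipse characterization step described above, the hedged Python sketch below (not the authors' LATANA implementation) projects strut points onto a plane and estimates a moment-equivalent ellipse; the cross-section dimensions and noise level are hypothetical.

```python
# Illustrative sketch: moment-equivalent ellipse of a projected 2D point cloud.
import numpy as np

def equivalent_ellipse(points_2d: np.ndarray):
    """Return (semi_major, semi_minor, angle_rad) of the moment-equivalent ellipse."""
    centred = points_2d - points_2d.mean(axis=0)
    cov = np.cov(centred, rowvar=False)              # 2x2 second-moment matrix
    eigvals, eigvecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    semi_minor, semi_major = 2.0 * np.sqrt(eigvals)  # filled ellipse: variance = a^2 / 4
    angle = np.arctan2(eigvecs[1, 1], eigvecs[0, 1])  # orientation of the major axis
    return semi_major, semi_minor, angle

# Hypothetical usage: simulated points of a slightly noisy 0.6 x 0.4 cross-section.
rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, 5000)
radius = np.sqrt(rng.uniform(0.0, 1.0, 5000))        # uniform sampling of a filled ellipse
xy = np.column_stack([0.6 * radius * np.cos(theta), 0.4 * radius * np.sin(theta)])
xy += rng.normal(scale=0.01, size=xy.shape)          # measurement noise / surface roughness
print(equivalent_ellipse(xy))
```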
Procedia PDF Downloads 187
1247 Diversification of Rice-Based Cropping Systems under Irrigated Condition
Authors: A. H. Nanher, N. P. Singh
Abstract:
In India, agriculture is largely based on rice-based cropping systems. These systems have shown a decline in factor productivity along with the emergence of multi-nutrient deficiencies and the buildup of soil pathogens and weed flora, because the same crop operates at, and removes nutrients from, the same rooting depth. In designing alternative cropping systems, the common approaches are crop intensification, crop diversification and cultivar options. Intensification leads to the diversification of the cropping system. Intensification is achieved by introducing an additional component crop in a pre-dominant sequential system, with desirable adjustments in the cultivars of one or all of the component crops. Invariably, this results in higher land use efficiency and productivity per unit time. Crop diversification through such crops, and the inclusion of fodder crops, helps to improve the economic situation of small and marginal farmers because of higher income. Inclusion of crops in sequential and intercropping systems reduces some obnoxious weeds through the formation of canopies due to competitive planting patterns and thus provides an opportunity to utilize cropping systems as a tool of weed management with non-chemical means. Use of organic sources not only acts as a supplement for fertilizer (nitrogen) but also improves the physico-chemical properties of soils. Production and use of nitrogen-rich biomass offer a better prospect for supplementing chemical fertilizers on a regular basis. Such biological diversity brings yield and economic stability because of its potential for compensation among components of the system. In a particular agro-climatic and resource condition, the identification of the most suitable crop sequence is based on its productivity, stability, land use efficiency and production efficiency, and its performance is chiefly judged in terms of productivity and net return.
Keywords: integrated farming systems, sustainable intensification, system of crop intensification, wheat
Procedia PDF Downloads 424
1246 A Novel Epitope Prediction for Vaccine Designing against Ebola Viral Envelope Proteins
Authors: Manju Kanu, Subrata Sinha, Surabhi Johari
Abstract:
Ebola viruses are among the best-studied viruses; however, no effective prevention against EBOV has been developed. Epitope-based vaccines provide a new strategy for the prophylactic and therapeutic application of pathogen-specific immunity. A critical requirement of this strategy is the identification and selection of T-cell epitopes that act as vaccine targets. This study describes current methodologies for the selection process, with Ebola virus as a model system. A great challenge in the field of Ebola virus research is therefore to design a universal vaccine. A combination of publicly available bioinformatics algorithms and computational tools was used to screen and select antigen sequences as potential T-cell epitopes for supertype Human Leukocyte Antigen (HLA) alleles. MUSCLE and MOTIF tools were used to find the most conserved peptide sequences of the viral proteins. Immunoinformatics tools were used for the prediction of immunogenic peptides of viral proteins in Zaire strains of Ebola virus. Putative epitopes for the viral proteins (VP) were predicted from their conserved peptide sequences. Three tools, NetCTL 1.2, BIMAS and SYFPEITHI, were used to predict the Class I putative epitopes, while three tools, ProPred, IEDB-SMM-align and NetMHCII 2.2, were used to predict the Class II putative epitopes. B-cell epitopes were predicted by BCPREDS 1.0. Immunogenic peptides were identified and selected manually from the putative epitopes predicted by the online tools, individually for both MHC classes. Finally, the sequences of the predicted peptides for both MHC classes were searched for a common region, which was selected as the common immunogenic peptide. The immunogenic peptides found for the viral proteins of Ebola virus were the epitopes FLESGAVKY and SSLAKHGEY. These predicted peptides could be promising candidates to be used as targets for vaccine design.
Keywords: epitope, b cell, immunogenicity, ebola
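As a minimal illustration of the consensus step described above (selecting peptides common to all prediction tools), the hedged Python sketch below uses set intersection; the per-tool peptide lists other than the two reported epitopes are hypothetical placeholders, not the study's actual predictions.

```python
# Illustrative sketch: consensus epitopes predicted by every Class I tool.
predictions_class_i = {
    "NetCTL": {"FLESGAVKY", "SSLAKHGEY", "LTDNQGIRY"},
    "BIMAS": {"FLESGAVKY", "SSLAKHGEY", "YQVNNLEEI"},
    "SYFPEITHI": {"FLESGAVKY", "SSLAKHGEY"},
}

def consensus(tool_predictions: dict[str, set[str]]) -> set[str]:
    """Peptides reported by all prediction tools."""
    sets = iter(tool_predictions.values())
    common = set(next(sets))
    for s in sets:
        common &= s
    return common

print(consensus(predictions_class_i))  # -> {'FLESGAVKY', 'SSLAKHGEY'}
```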
Procedia PDF Downloads 314
1245 Quantification of Pollution Loads for the Rehabilitation of Pusu River
Authors: Abdullah Al-Mamun, Md. Nuruzzaman, Md. Noor Salleh, Muhammad Abu Eusuf, Ahmad Jalal Khan Chowdhury, Mohd. Zaki M. Amin, Norlida Mohd. Dom
Abstract:
Identification of pollution sources and determination of pollution loads from all areas are very important for the sustainable rehabilitation of any contaminated river. Pusu is a small river which flows through the main campus of the International Islamic University Malaysia (IIUM) at Gombak. The poor aesthetics of the river, which flows through the entrance of the campus, gives a negative impression to local and international visitors. As such, this study is being conducted to find ways to rehabilitate the river in a sustainable manner. The point and non-point pollution sources of the river basin were identified. The upper part of the 12.6 km2 river basin is covered with secondary forest. However, it is the lower-middle reaches of the river basin which are being cleared for residential development and are the source of high sediment load. Flows and concentrations of the common pollutants important for a healthy river, such as Biochemical Oxygen Demand (BOD), Chemical Oxygen Demand (COD), Suspended Solids (SS), turbidity, pH, Ammoniacal Nitrogen (AN), Total Nitrogen (TN) and Total Phosphorus (TP), were determined. The annual pollution loading to the river was calculated based on the primary and secondary data. Concentrations of SS were high during rainy days due to the contribution from non-point sources. There are 7 ponds along the river system within the campus, which are severely affected by high sediment loads from the land-clearing activities. On the other hand, concentrations of other pollutants were high during non-rainy days. The main point pollution sources are the hostels, cafeterias and sewage treatment plants located on the campus. Therefore, both types of pollution sources need to be controlled in order to rehabilitate the river in a sustainable manner.
Keywords: river pollution, rehabilitation, point pollution source, non-point pollution sources, pollution loading
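For context, annual pollution loads of this kind are commonly estimated as a flow-weighted sum of sampled concentrations; the relation below is a generic first-order estimate assumed here for illustration, not a formula quoted from the abstract.

```latex
% W: annual load, Q_i: flow during sampling interval i, C_i: pollutant concentration,
% \Delta t_i: interval length, k: unit-conversion factor (e.g. to kg/year)
W_{\mathrm{annual}} \;=\; k \sum_{i=1}^{n} Q_i \, C_i \, \Delta t_i
```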
Procedia PDF Downloads 354
1244 STD-NMR Based Protein Engineering of the Unique Arylpropionate-Racemase AMDase G74C
Authors: Sarah Gaßmeyer, Nadine Hülsemann, Raphael Stoll, Kenji Miyamoto, Robert Kourist
Abstract:
Enzymatic racemization allows the smooth interconversion of stereocenters under very mild reaction conditions. Racemases find frequent application in deracemization and dynamic kinetic resolutions. Arylmalonate decarboxylase (AMDase) from Bordetella bronchiseptica has high structural similarity to amino acid racemases. These cofactor-free racemases are able to break chemically strong C-H bonds under mild conditions. The racemase-like catalytic machinery of mutant G74C confers on it a unique activity in the racemisation of pharmacologically relevant derivatives of 2-phenylpropionic acid (profens), which makes AMDase G74C an interesting object for the mechanistic investigation of cofactor-independent racemases. Structure-guided protein engineering achieved a variant of this unique racemase with 40-fold increased activity in the racemisation of several arylaliphatic carboxylic acids. By saturation–transfer–difference NMR spectroscopy (STD-NMR), substrate binding during catalysis was investigated. All atoms of the substrate showed interactions with the enzyme. STD-NMR measurements revealed distinct nuclear Overhauser effects in experiments with and without molecular conversion. The spectroscopic analysis led to the identification of several amino acid residues whose variation increased the activity of G74C. While single amino acid exchanges increased the activity moderately, structure-guided saturation mutagenesis yielded a quadruple mutant with a 40 times higher reaction rate. This study presents STD-NMR as a versatile tool for the analysis of enzyme-substrate interactions in catalytically competent systems and for the guidance of protein engineering.
Keywords: racemase, rational protein design, STD-NMR, structure guided saturation mutagenesis
Procedia PDF Downloads 305
1243 Cable De-Commissioning of Legacy Accelerators at CERN
Authors: Adya Uluwita, Fernando Pedrosa, Georgi Georgiev, Christian Bernard, Raoul Masterson
Abstract:
CERN is an international organisation, funded by 23 countries, that provides the particle physics community with excellence in particle accelerators and other related facilities. Founded in 1954, CERN has a wide range of accelerators that allow groundbreaking science to be conducted. Accelerators bring particles to high levels of energy and make them collide with each other or with fixed targets, creating specific conditions that are of high interest to physicists. A chain of accelerators is used to ramp up the energy of the particles and eventually inject them into the largest and most recent one: the Large Hadron Collider (LHC). Among this chain of machines is, for instance, the Proton Synchrotron, which was started in 1959 and is still in operation. These machines, called "injectors", keep evolving over time, as does the related infrastructure. Massive decommissioning of obsolete cables started at CERN in 2015 in the frame of the so-called "injectors de-cabling project phase 1". Its goal was to replace aging cables and remove unused ones, freeing space for new cables necessary for upgrades and consolidation campaigns. To proceed with the de-cabling, a project coordination team was assembled. The start of this project led to the investigation of legacy cables throughout the organisation. The identification of cables stacked over half a century proved to be arduous. Phase 1 of the injectors de-cabling was implemented over 3 years with success after overcoming some difficulties. Phase 2, started 3 years later, focused on improving safety and structure with the introduction of a quality assurance procedure. This paper discusses the implementation of this quality assurance procedure throughout phase 2 of the project and the transition between the two phases. Hundreds of kilometres of cable were removed in the injectors complex at CERN from 2015 to 2023.
Keywords: CERN, de-cabling, injectors, quality assurance procedure
Procedia PDF Downloads 93
1242 Vibrational Spectra and Nonlinear Optical Investigations of a Chalcone Derivative (2e)-3-[4-(Methylsulfanyl) Phenyl]-1-(3-Bromophenyl) Prop-2-En-1-One
Authors: Amit Kumar, Archana Gupta, Poonam Tandon, E. D. D’Silva
Abstract:
Nonlinear optical (NLO) materials are the key materials for the fast processing of information and optical data storage applications. In the last decade, materials showing nonlinear optical properties have been the object of increasing attention from both experimental and computational points of view. Chalcones are one of the most important classes of cross-conjugated NLO chromophores that are reported to exhibit good SHG efficiency and ultrafast optical nonlinearities and are easily crystallizable. The basic structure of chalcones is based on a π-conjugated system in which two aromatic rings are connected by a three-carbon α,β-unsaturated carbonyl system. Due to the overlap of π orbitals, delocalization of the electronic charge distribution leads to a high mobility of the electron density. On a molecular scale, the extent of charge transfer across the NLO chromophore determines the level of SHG output. Hence, the functionalization of both ends of the π-bond system with appropriate electron donor and acceptor groups can enhance the asymmetric electronic distribution in either or both the ground and excited states, leading to an increased optical nonlinearity. In this research, an experimental and theoretical study of the structure and vibrations of (2E)-3-[4-(methylsulfanyl)phenyl]-1-(3-bromophenyl)prop-2-en-1-one (3Br4MSP) is presented. The FT-IR and FT-Raman spectra of the NLO material in the solid phase have been recorded. Density functional theory (DFT) calculations at the B3LYP level with the 6-311++G(d,p) basis set were carried out to study the equilibrium geometry, vibrational wavenumbers, infrared absorbance and Raman scattering activities. The interpretation of vibrational features (normal mode assignments, for instance) receives invaluable aid from DFT calculations, which provide a quantum-mechanical description of the electronic energies and forces involved. Perturbation theory allows one to obtain the vibrational normal modes by estimating the derivatives of the Kohn−Sham energy with respect to atomic displacements. The molecular hyperpolarizability β plays a chief role in the NLO properties, and a systematic study of β has been carried out. Furthermore, the first-order hyperpolarizability (β) and the related properties, such as the dipole moment (μ) and polarizability (α), of the title molecule are evaluated by the Finite Field (FF) approach. The electronic α and β of the studied molecule are 41.907×10⁻²⁴ and 79.035×10⁻²⁴ e.s.u., respectively, indicating that 3Br4MSP can be used as a good nonlinear optical material.
Keywords: DFT, MEP, NLO, vibrational spectra
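For context, a common finite-field (Kurtz-type) central-difference scheme — assumed here for illustration, not quoted from the paper — expresses μ, α and β as numerical derivatives of the total energy E with respect to a small applied field F:

```latex
\mu_i \approx -\frac{E(F_i)-E(-F_i)}{2F_i}, \qquad
\alpha_{ii} \approx -\frac{E(F_i)-2E(0)+E(-F_i)}{F_i^{2}}, \qquad
\beta_{iii} \approx -\frac{E(2F_i)-2E(F_i)+2E(-F_i)-E(-2F_i)}{2F_i^{3}}
```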
Procedia PDF Downloads 2211241 SAFECARE: Integrated Cyber-Physical Security Solution for Healthcare Critical Infrastructure
Authors: Francesco Lubrano, Fabrizio Bertone, Federico Stirano
Abstract:
Modern societies strongly depend on Critical Infrastructures (CI). Hospitals, power supplies, water supplies, and telecommunications are just a few examples of CIs that provide vital functions to societies. CIs like hospitals are very complex environments, characterized by a huge number of cyber and physical systems that are becoming increasingly integrated. Ensuring a high level of security within such critical infrastructure requires a deep knowledge of vulnerabilities, threats, and potential attacks that may occur, as well as defence, prevention, and mitigation strategies. The possibility to remotely monitor and control almost everything is pushing the adoption of network-connected devices. This implicitly introduces new threats and potential vulnerabilities, posing a risk, especially to those devices connected to the Internet. Modern medical devices used in hospitals are no exception and are increasingly being connected to enhance their functionality and ease their management. Moreover, hospitals are environments with high flows of people, who are difficult to monitor and can easily gain access to the same areas used by staff, potentially causing damage. It is therefore clear that physical and cyber threats should be considered, analysed, and treated together as cyber-physical threats. This means that an integrated approach is required. SAFECARE, an integrated cyber-physical security solution, tries to respond to the presented issues within healthcare infrastructures. The challenge is to bring together the most advanced technologies from the physical and cyber security spheres to achieve a global optimum for systemic security and for the management of combined cyber and physical threats and incidents and their interconnections. Moreover, potential impacts and cascading effects are evaluated through impact propagation models that rely on modular ontologies and a rule-based engine. Indeed, the SAFECARE architecture foresees i) a macroblock related to the cyber security field, where innovative tools are deployed to monitor network traffic, systems and medical devices; ii) a physical security macroblock, where video management systems are coupled with access control management, building management systems and innovative AI algorithms to detect behavior anomalies; iii) an integration system that collects all incoming incidents, simulates their potential cascading effects, and provides alerts and updated information regarding asset availability.Keywords: cyber security, defence strategies, impact propagation, integrated security, physical security
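The impact propagation idea described above (cascading effects evaluated by a rule-based engine over modelled assets) can be pictured as rules fired over an asset-dependency graph. The following minimal sketch is an assumption-laden illustration, not the SAFECARE engine: the asset names, the dependency graph, and the attenuation rule are all hypothetical.

```python
# Minimal sketch of rule-based impact propagation over an asset-dependency graph.
# Illustrative toy only, not the SAFECARE implementation: assets, dependencies and
# the attenuation rule below are hypothetical.
from collections import deque

# "depends_on[a] = [b, ...]" means asset a is impacted when asset b is impacted.
depends_on = {
    "patient_monitoring": ["hospital_lan", "power_supply"],
    "hospital_lan": ["power_supply"],
    "access_control": ["power_supply", "hospital_lan"],
}

def propagate(initial_asset, severity, attenuation=0.7, threshold=0.1):
    """Spread an incident's impact to dependent assets, attenuating at each hop."""
    # Invert the dependency map: which assets are hit when a given asset goes down?
    impacted_by = {}
    for asset, deps in depends_on.items():
        for dep in deps:
            impacted_by.setdefault(dep, []).append(asset)

    impacts = {initial_asset: severity}
    queue = deque([initial_asset])
    while queue:
        current = queue.popleft()
        cascaded = impacts[current] * attenuation
        if cascaded < threshold:
            continue
        for downstream in impacted_by.get(current, []):
            if cascaded > impacts.get(downstream, 0.0):
                impacts[downstream] = cascaded
                queue.append(downstream)
    return impacts

# Example: a physical incident takes out the power supply with severity 1.0.
print(propagate("power_supply", 1.0))
# {'power_supply': 1.0, 'patient_monitoring': 0.7, 'hospital_lan': 0.7, 'access_control': 0.7}
```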
Procedia PDF Downloads 1651240 Building on Previous Microvalving Approaches for Highly Reliable Actuation in Centrifugal Microfluidic Platforms
Authors: Ivan Maguire, Ciprian Briciu, Alan Barrett, Dara Kervick, Jens Ducrèe, Fiona Regan
Abstract:
With the ever-increasing range of applications of which microfluidic devices are capable, the development of reliable fluidic actuation has remained fundamental to the success of these microfluidic platforms. A number of approaches can be taken to integrate liquid actuation on microfluidic platforms, and these can usually be split into two primary categories: active microvalves and passive microvalves. Active microvalves are microfluidic valves that require an externally induced change in a physical parameter for actuation to occur. Passive microvalves are microfluidic valves that do not require external interaction for actuation; their intrinsic physical parameters form a barrier that is overcome through interaction with the sample itself. The purpose of this paper is to illustrate how further improvements to past microvalve solutions can greatly enhance system reliability and performance, with both novel active and passive microvalves demonstrated. Covered within this scope are two alternative and novel microvalve solutions for centrifugal microfluidic platforms: a revamped pneumatic-dissolvable film active microvalve (PAM) strategy and a spray-on sol-gel-based hydrophobic passive microvalve (HPM) approach. Both the PAM and the HPM mechanisms were demonstrated on a centrifugal microfluidic platform consisting of alternating layers of 1.5 mm poly(methyl methacrylate) (PMMA) sheets (for reagent storage) and ~150 μm pressure-sensitive adhesive (PSA) sheets (for microchannel fabrication). The PAM approach differs from previous SOLUBON™ dissolvable film methods by introducing a more reliable and predictable liquid delivery mechanism to the microvalve site, thus significantly reducing premature activation. This approach has also shown excellent synchronicity when performed in a multiplexed form. The HPM method utilises a new spray-on, low-curing-temperature (70°C) sol-gel material. The resultant double-layer coating comprises a PMMA-adherent sol-gel as the bottom layer and an ultra-hydrophobic silica nanoparticle (SNP) film as the top layer. The optimal coating was integrated into microfluidic channels with varying cross-sectional areas to assess the consistency of microvalve burst frequencies. It is hoped that these microvalving solutions, which can be easily added to centrifugal microfluidic platforms, will significantly improve automation reliability.Keywords: centrifugal microfluidics, hydrophobic microvalves, lab-on-a-disc, pneumatic microvalves
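On a spinning disc, a hydrophobic passive valve bursts when the centrifugally induced pressure exceeds the capillary pressure barrier of the hydrophobic constriction, which is why burst frequency depends on channel cross-section and radial position. The sketch below uses the commonly cited balance ρω²r̄Δr ≈ ΔP_burst together with a rectangular-channel capillary pressure estimate; all geometry, contact angle and material values are hypothetical placeholders, not measurements from this work.

```python
# Illustrative estimate of the burst frequency of a hydrophobic passive valve on a
# centrifugal platform: the valve bursts when the centrifugal pressure
# rho * omega^2 * r_mean * delta_r exceeds the capillary barrier of the hydrophobic
# constriction. All numbers below are hypothetical placeholders.
import math

def capillary_barrier(surface_tension, contact_angle_deg, width, height):
    """Capillary pressure barrier (Pa) of a rectangular hydrophobic constriction."""
    theta = math.radians(contact_angle_deg)
    # For theta > 90 deg, cos(theta) < 0, so the expression below is a positive barrier.
    return -2.0 * surface_tension * math.cos(theta) * (1.0 / width + 1.0 / height)

def burst_frequency_hz(dp_burst, density, r_mean, delta_r):
    """Spin frequency (Hz) at which the centrifugal pressure equals the barrier."""
    omega = math.sqrt(dp_burst / (density * r_mean * delta_r))
    return omega / (2.0 * math.pi)

# Hypothetical water plug in a 200 um x 150 um channel, contact angle 120 deg,
# liquid column spanning 20 mm to 25 mm from the disc centre.
dp = capillary_barrier(0.072, 120.0, 200e-6, 150e-6)        # ~840 Pa
f = burst_frequency_hz(dp, 1000.0, r_mean=22.5e-3, delta_r=5e-3)
print(f"Barrier ~{dp:.0f} Pa, burst frequency ~{f:.1f} Hz (~{60 * f:.0f} rpm)")
```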
Procedia PDF Downloads 1881239 A User-Directed Approach to Optimization via Metaprogramming
Authors: Eashan Hatti
Abstract:
In software development, programmers often must choose between high-level programming and high-performance programs. High-level programming encourages the use of complex, pervasive abstractions. However, the use of these abstractions degrades performance; high performance demands that programs be low-level. In a compiler, the optimizer attempts to let the user have both. The optimizer takes high-level, abstract code as an input and produces low-level, performant code as an output. However, there is a problem with having the optimizer be a built-in part of the compiler. Domain-specific abstractions implemented as libraries are common in high-level languages. As a language's library ecosystem grows, so does the number of abstractions that programmers will use. If these abstractions are to be performant, the optimizer must be extended with new optimizations to target them, or these abstractions must rely on existing general-purpose optimizations. The latter is often not as effective as needed. The former presents too significant an effort for the compiler developers, as they are the only ones who can extend the language with new optimizations. Thus, the language becomes more high-level, yet the optimizer, and in turn program performance, falls behind. Programmers are again confronted with a choice between high-level programming and high-performance programs. To investigate a potential solution to this problem, we developed Peridot, a prototype programming language. Peridot's main contribution is that it enables library developers to easily extend the language with new optimizations themselves. This allows the optimization workload to be taken off the compiler developers' hands and given to a much larger set of people who can specialize in each problem domain. Because of this, optimizations can be much more effective while also being much more numerous. To enable this, Peridot supports metaprogramming designed for implementing program transformations. The language is split into two fragments, or "levels": one for metaprogramming, the other for high-level general-purpose programming. The metaprogramming level supports logic programming. Peridot's key idea is that optimizations are simply implemented as metaprograms. The meta level supports several specific features which make it particularly suited to implementing optimizers. For instance, metaprograms can automatically deduce equalities between the programs they are optimizing via unification, deal with variable binding declaratively via higher-order abstract syntax, and avoid the phase-ordering problem via non-determinism. We have found that this design centered around logic programming makes optimizers concise and easy to write compared to their equivalents in functional or imperative languages. Overall, implementing Peridot has shown that its design is a viable solution to the problem of writing code which is both high-level and performant.Keywords: optimization, metaprogramming, logic programming, abstraction
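To make the "optimizations as metaprograms" idea concrete, the sketch below shows, in plain Python rather than Peridot (whose syntax is not given here), how a rewrite rule such as map f (map g xs) → map (compose f g) xs can be expressed declaratively: the rule is a pair of term patterns, and first-order unification deduces the bindings needed to apply it. The term representation and helper names are hypothetical and far simpler than a real logic-programming meta level.

```python
# Toy illustration of an optimization expressed as a declarative rewrite rule applied
# via unification, in the spirit of "optimizations as (logic) metaprograms".
# Terms are tuples ("call", head, *args); variables are ("var", name).

def unify(pattern, term, subst):
    """One-way first-order unification of `pattern` against `term`; returns bindings or None."""
    if pattern[0] == "var":
        bound = subst.get(pattern[1])
        return subst | {pattern[1]: term} if bound is None else (subst if bound == term else None)
    if term[0] == "var" or pattern[0] != term[0] or len(pattern) != len(term):
        return None
    for p, t in zip(pattern[1:], term[1:]):
        if isinstance(p, tuple):
            subst = unify(p, t, subst)
            if subst is None:
                return None
        elif p != t:
            return None
    return subst

def substitute(template, subst):
    """Instantiate a rule's right-hand side with the bindings found by unify."""
    if template[0] == "var":
        return subst[template[1]]
    return (template[0],) + tuple(substitute(t, subst) if isinstance(t, tuple) else t
                                  for t in template[1:])

# Rule: map(f, map(g, xs))  ==>  map(compose(f, g), xs)   (fusion removes one traversal)
lhs = ("call", "map", ("var", "f"), ("call", "map", ("var", "g"), ("var", "xs")))
rhs = ("call", "map", ("call", "compose", ("var", "f"), ("var", "g")), ("var", "xs"))

program = ("call", "map", ("call", "inc"), ("call", "map", ("call", "double"), ("call", "input")))
bindings = unify(lhs, program, {})
print(substitute(rhs, bindings) if bindings is not None else program)
# ('call', 'map', ('call', 'compose', ('call', 'inc'), ('call', 'double')), ('call', 'input'))
```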
Procedia PDF Downloads 881238 In vitro Study of Inflammatory Gene Expression Suppression of Strawberry and Blackberry Extracts
Authors: Franco Van De Velde, Debora Esposito, Maria E. Pirovani, Mary A. Lila
Abstract:
The physiology of various inflammatory diseases is a complex process mediated by inflammatory and immune cells such as macrophages and monocytes. Chronic inflammation, as observed in many cardiovascular and autoimmune disorders, occurs when the low-grade inflammatory response fails to resolve with time. Because of the complexity of chronic inflammatory disease, major efforts have focused on identifying novel anti-inflammatory agents and dietary regimes that prevent the pro-inflammatory process at the early stage of gene expression of key pro-inflammatory mediators and cytokines. The ability of the extracts of three blackberry cultivars ('Jumbo', 'Black Satin' and 'Dirksen') and one strawberry cultivar ('Camarosa') to inhibit four well-known genetic biomarkers of inflammation, inducible nitric oxide synthase (iNOS), cyclooxygenase-2 (Cox-2), interleukin-1β (IL-1β) and interleukin-6 (IL-6), in an in vitro lipopolysaccharide-stimulated murine RAW 264.7 macrophage model was investigated. Moreover, the effect of the latter extracts on intracellular reactive oxygen species (ROS) and nitric oxide (NO) production was assessed. The assay was conducted at a crude extract concentration of 50 µg/mL, an amount that is easily achievable in the gastrointestinal tract after berry consumption. The mRNA expression levels of Cox-2 and IL-6 were reduced consistently (by more than 30%) by extracts of 'Jumbo' and 'Black Satin' blackberries. Strawberry extracts showed a high reduction in the mRNA expression level of IL-6 (more than 65%) and a moderate reduction in the mRNA expression of Cox-2 (more than 35%). The latter behavior mirrors the intracellular ROS production of the LPS-stimulated RAW 264.7 macrophages after treatment with blackberry 'Black Satin' and 'Jumbo' and strawberry 'Camarosa' extracts, suggesting that phytochemicals from these fruits may play a role in health maintenance by reducing oxidative stress. On the other hand, effective inhibition of the gene expression of IL-1β and iNOS was not observed for any of the blackberry or strawberry extracts. However, suppression of NO production in the activated macrophages of 5–25% was observed with 'Jumbo' and 'Black Satin' blackberry extracts and 'Camarosa' strawberry extracts, suggesting an NO-suppressing property of the phytochemicals of these fruits. All these results suggest the potential beneficial effects of the studied berries as functional foods with antioxidant and anti-inflammatory roles. Moreover, the underlying role of phytochemicals from these fruits in protecting against inflammatory processes deserves further exploration.Keywords: cyclooxygenase-2, functional foods, interleukin-6, reactive oxygen species
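For readers unfamiliar with how percentage reductions in mRNA expression such as those above are typically derived, the sketch below shows the standard 2^-ΔΔCt relative-quantification calculation. The abstract does not state the quantification method used, so this is purely an assumed illustration with invented Ct values, not data from this study.

```python
# Hypothetical illustration of how a ">30% reduction in mRNA expression" can be
# derived with the standard 2^-ddCt relative quantification method. The Ct values
# below are invented for illustration and are not data from this study.

def relative_expression(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Fold change of the target gene (treated vs. control), normalized to a reference gene."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# Example: Cox-2 vs. a housekeeping gene, LPS-stimulated macrophages with/without extract.
fold_change = relative_expression(ct_target_treated=26.3, ct_ref_treated=18.0,
                                  ct_target_control=25.6, ct_ref_control=18.0)
print(f"Fold change {fold_change:.2f} -> {100 * (1 - fold_change):.0f}% reduction")
# Fold change 0.62 -> 38% reduction
```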
Procedia PDF Downloads 2381237 TARF: Web Toolkit for Annotating RNA-Related Genomic Features
Abstract:
Genomic features, i.e., genome-based coordinates, are commonly used for the representation of biological features such as genes, RNA transcripts and transcription factor binding sites. For the analysis of RNA-related genomic features, such as RNA modification sites, a common task is to correlate these features with transcript components (5'UTR, CDS, 3'UTR) to explore their distribution characteristics in terms of transcriptomic coordinates, e.g., to examine whether a specific type of biological feature is enriched near transcription start sites. Existing approaches for performing these tasks involve the manipulation of a gene database, conversion from genome-based coordinates to transcript-based coordinates, and visualization methods that are capable of showing RNA transcript components and the distribution of the features. These steps are complicated and time-consuming, especially for researchers who are not familiar with the relevant tools. To overcome this obstacle, we developed a dedicated web app, TARF, a web toolkit for annotating RNA-related genomic features. The TARF web tool intends to provide a web-based way to easily annotate and visualize RNA-related genomic features. Once a user has uploaded the features in BED format and either specified a built-in transcript database or uploaded a customized gene database in GTF format, the tool fulfills its three main functions. First, it adds annotations for genes and RNA transcript components. For every feature provided by the user, overlaps with RNA transcript components are identified, and the information is combined in one table, which is available to copy and download. Summary statistics on ambiguous assignments are also provided. Second, the tool provides a convenient visualization of the features at the single gene/transcript level. For the selected gene, the tool shows the features together with the gene model in a genome-based view, and also maps the features to transcript-based coordinates and shows their distribution along a single spliced RNA transcript. Third, a global transcriptomic view of the genomic features is generated utilizing the Guitar R/Bioconductor package. The distribution of features on RNA transcripts is normalized with respect to RNA transcript landmarks, and the enrichment of the features on different RNA transcript components is demonstrated. We tested the newly developed TARF toolkit with 3 different types of genomic features related to chromatin H3K4me3, RNA N6-methyladenosine (m6A) and RNA 5-methylcytosine (m5C), which were obtained from ChIP-Seq, MeRIP-Seq and RNA BS-Seq data, respectively. TARF successfully revealed their respective distribution characteristics, i.e., H3K4me3, m6A and m5C are enriched near transcription start sites, stop codons and 5'UTRs, respectively. Overall, TARF is a useful web toolkit for annotation and visualization of RNA-related genomic features, and should help simplify the analysis of various RNA-related genomic features, especially those related to RNA modifications.Keywords: RNA-related genomic features, annotation, visualization, web server
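The core step the tool automates, converting a genome-based coordinate into a transcript-based coordinate and labelling the transcript component (5'UTR/CDS/3'UTR) it falls in, can be sketched as follows. This is a simplified illustration assuming a plus-strand transcript whose exon and CDS limits are already parsed from a GTF file; it is not TARF code (which relies on the Guitar R/Bioconductor package), and all coordinates are hypothetical.

```python
# Simplified sketch: map a genomic position onto transcript coordinates and label the
# transcript component (5'UTR / CDS / 3'UTR) it falls in. Assumes a plus-strand
# transcript with exons and CDS limits already parsed from a GTF; values are hypothetical.

def genome_to_transcript(pos, exons):
    """Convert a genomic position to a 0-based transcript coordinate (None if intronic)."""
    offset = 0
    for start, end in exons:                 # exons as (start, end), 0-based half-open
        if start <= pos < end:
            return offset + (pos - start)
        offset += end - start
    return None

def component_of(pos, exons, cds_start, cds_end):
    """Label a genomic position as 5'UTR, CDS, 3'UTR or intronic for a plus-strand transcript."""
    t = genome_to_transcript(pos, exons)
    if t is None:
        return "intronic"
    cds_t_start = genome_to_transcript(cds_start, exons)
    cds_t_end = genome_to_transcript(cds_end - 1, exons) + 1
    if t < cds_t_start:
        return "5'UTR"
    return "CDS" if t < cds_t_end else "3'UTR"

# Hypothetical two-exon transcript: exons at 100-200 and 300-450, CDS from 150 to 400.
exons = [(100, 200), (300, 450)]
for feature_pos in (120, 180, 350, 420, 250):   # e.g. m6A sites read from a BED file
    print(feature_pos, component_of(feature_pos, exons, cds_start=150, cds_end=400))
# 120 5'UTR | 180 CDS | 350 CDS | 420 3'UTR | 250 intronic
```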
Procedia PDF Downloads 2081236 Neurological Complications of HIV/AIDS: Case of Meningitis Caused by Cryptococcus neoformans and Tuberculous Meningitis
Authors: Ndarusanze Berchmans
Abstract:
This research work focused on the analysis of observations of tuberculous meningitis in HIV-positive patients treated at the Prince Regent Charles Hospital in Bujumbura. A total of 246 seropositive patients were examined by the laboratory of Prince Regent Charles Hospital in the period between 2010 and 2015. We conducted a retrospective study using data from the registers of the laboratory mentioned above; the objective was to describe the epidemiological, biological, clinical, and therapeutic characteristics of tuberculous meningitis infection: 124 women (50.40% of the AIDS patients) and 122 men (49.59% of the AIDS patients) underwent diagnosis by examination of cerebrospinal fluid (CSF). The average age of the patients was 30 years for this period. The population at risk had an average age of between 34 and 42 years for the years 2010-2015. From 2010 to 2012, cases of opportunistic diseases (e.g., tuberculous meningitis and Cryptococcus neoformans meningitis), often found in immunocompromised patients, were observed at a high rate; in this period, the supply of antiretroviral drugs to people with AIDS was disrupted. The rate of the two forms of meningitis (tuberculous meningitis and Cryptococcus neoformans meningitis) remained above 10% and then gradually decreased until 2015 with the gradual return of antiretrovirals. This period recorded an overall average of 25 cases of tuberculous meningitis, or a percentage of 10.16%. For the year 2015, there were 4 cases of tuberculous meningitis out of a total of 35 seropositive patients examined (11.42%). This year's figure shows that the number of tuberculous meningitis cases has fallen compared with previous years. This is the result of the care given to HIV-positive people by associations working against HIV/AIDS. This decrease in cases of tuberculous meningitis is due to the acquisition of antiretrovirals by all HIV-positive people treated by hospitals. At present, these hospitals are caring for many AIDS patients by providing them permanently with antiretrovirals; besides that, many patients are supported by associations whose activities are directed against HIV/AIDS.Keywords: Cryptococcus neoformans meningitis, tuberculous meningitis, neurological complications, epidemiology of meningitis
Procedia PDF Downloads 224