Search results for: automated drift detection and adaptation
833 Preparation of Frozen Bivalent Babesial (Babesia Bovis and Babesia Bigemina) Vaccine from Field Isolates and Evaluation of Its Efficacy in Calves
Authors: Muhammad Fiaz Qamar, Ahmad Faraz, Muhammad Arfan Zaman, Kazim Ali, Waleed Akram
Abstract:
Babesiosis is regarded as the most important arthropod-transmitted disease of cattle. In Pakistan, its prevalence is up to 29% in the cattle and buffalo populations of different regions. Cattle develop long-lasting, durable immunity after infection with B. bovis, B. bigemina, or B. divergens, and this is exploited in a few countries to immunize cattle with anti-babesiosis vaccines. Development of a frozen vaccine allows complete testing of each batch after production; however, once thawed, its shelf life is reduced, and frozen vaccines are more difficult to transport as well as more expensive to produce than chilled vaccines. The potential for contamination of blood-derived vaccines makes pre-production and post-production quality control necessary. For trial master seed production of a whole-blood frozen bivalent Babesia (Babesia bovis and Babesia bigemina) vaccine, 100 blood samples from cattle suspected positive for Babesia were taken and processed for separation, microscopic detection, and confirmation by PCR. Vaccine passages were done to reduce parasitemia in live calves; after 8 passages, Babesia parasitemia was reduced from 80% to 15%. Blood from an infected donor calf was collected by jugular cannulation, using preservative-free lithium heparin as an anticoagulant (5 International Units (IU) heparin/ml blood). In the laboratory, the parasite-containing blood was mixed in equal volumes with 3 M glycerol in PBS supplemented with 5 mM glucose (final glycerol concentration 1.5 M) at 37°C. The mixture was then equilibrated at 37°C for 30 minutes and dispensed into the required containers (e.g., 5 ml cryovials).
Keywords: distribution, babesia, primer sequences, PCV
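The 1:1 dilution arithmetic behind the stated final glycerol concentration can be checked with a short sketch (concentrations from the abstract; the function name is ours):

```python
def mixed_concentration(conc_a, vol_a, conc_b, vol_b):
    """Solute concentration after mixing two liquid volumes:
    (C1*V1 + C2*V2) / (V1 + V2)."""
    return (conc_a * vol_a + conc_b * vol_b) / (vol_a + vol_b)

# Equal volumes of the 3 M glycerol cryoprotectant and blood (0 M glycerol)
final_glycerol = mixed_concentration(3.0, 1.0, 0.0, 1.0)
print(final_glycerol)  # 1.5 (M), matching the final concentration stated above
```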
Procedia PDF Downloads 100
832 Radio-Frequency Technologies for Sensing and Imaging
Authors: Cam Nguyen
Abstract:
Rapid, accurate, and safe sensing and imaging of physical quantities or structures finds many applications and is of significant interest to society. Sensing and imaging using radio-frequency (RF) techniques, in particular, has gone through significant development and subsequently established itself as a unique territory in the sensing world. RF sensing and imaging has played a critical role in providing us with many sensing and imaging abilities beyond our human capabilities, benefiting both civilian and military applications - for example, from sensing abnormal conditions underneath structures' surfaces to the detection and classification of concealed items, hidden activities, and buried objects. We present the development of several sensing and imaging systems implementing RF technologies such as ultra-wide band (UWB), synthetic-pulse, and interferometry. These systems are fabricated completely using RF integrated circuits. The UWB impulse system operates over multiple pulse durations from 450 to 1170 ps with 5.5-GHz RF bandwidth. It performs well in tests on various samples, demonstrating its usefulness for subsurface sensing. The synthetic-pulse system operating from 0.6 to 5.6 GHz can accurately assess subsurface structures. The synthetic-pulse system operating from 29.72 to 37.7 GHz demonstrates abilities for various surface and near-surface sensing tasks such as profile mapping, liquid-level monitoring, and anti-personnel mine locating. The interferometric system operating at 35.6 GHz demonstrates its multi-functional capability for measurement of displacements and slow velocities. These RF sensors are attractive and useful for various surface and subsurface sensing applications. This paper was made possible by NPRP grant # 6-241-2-102 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the authors.
Keywords: RF sensors, radars, surface sensing, subsurface sensing
Procedia PDF Downloads 314
831 Prediction of Damage to Cutting Tools in an Earth Pressure Balance Tunnel Boring Machine EPB TBM: A Case Study L3 Guadalajara Metro Line (Mexico)
Authors: Silvia Arrate, Waldo Salud, Eloy París
Abstract:
The wear of cutting tools is one of the most decisive elements when planning tunneling works, programming maintenance stops, and maintaining the optimum stock of spare parts during the evolution of the excavation. Being able to predict the behavior of cutting tools can give a very competitive advantage in terms of costs and excavation performance, optimized to the needs of the TBM itself. The rapid evolution of data science in recent years makes it possible to apply it when analyzing the key and most critical machinery parameters, with the purpose of knowing how the cutting head is performing against the excavated ground. Taking Metro Line 3 of Guadalajara in Mexico as a case study, this work assesses the feasibility of using Specific Energy versus data science applied to parameters of Torque, Penetration, and Contact Force, among others, to predict the behavior and status of cutting tools. The results obtained through both techniques are analyzed and verified against the wear and the field situations observed during the excavation in order to determine their effectiveness in terms of predictive capacity. In conclusion, the possibilities and improvements offered by the application of digital tools and the programming of calculation algorithms for the analysis of wear of cutting head elements, compared to purely empirical methods, allow early detection of possible damage to cutting tools, which is reflected in optimized excavation performance and a significant improvement in costs and deadlines.
Keywords: cutting tools, data science, prediction, TBM, wear
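The abstract does not state its Specific Energy formula; one common formulation (Teale's specific energy for rotary excavation) can be sketched as follows, with all machine figures illustrative rather than project data:

```python
import math

def specific_energy(thrust_n, torque_nm, rpm, advance_m_per_min, face_area_m2):
    """Teale-style specific energy (J/m^3) for rotary excavation:
    SE = F/A + 2*pi*N*T / (A*u), with N in rev/s and u in m/s."""
    n = rpm / 60.0                 # cutterhead speed, rev/s
    u = advance_m_per_min / 60.0   # advance rate, m/s
    return thrust_n / face_area_m2 + (2.0 * math.pi * n * torque_nm) / (face_area_m2 * u)

# Illustrative EPB TBM figures (not project data): 20 MN thrust, 4 MN*m torque,
# 1.8 rpm, 45 mm/min advance, ~70 m^2 face area
se = specific_energy(2.0e7, 4.0e6, 1.8, 0.045, 70.0)  # roughly 1.46e7 J/m^3
```

Monitoring how this quantity drifts at constant ground conditions is one way to flag progressive cutter wear.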
Procedia PDF Downloads 46
830 The Economic Burden of Breast Cancer on Women in Nigeria: Implication for Socio-Economic Development
Authors: Tolulope Allo, Mofoluwake P. Ajayi, Adenike E. Idowu, Emmanuel O. Amoo, Fadeke Esther Olu-Owolabi
Abstract:
Breast cancer, which was more prevalent in Europe and America in the past, is gradually being mirrored across the world today, with a greater economic burden on low- and middle-income countries (LMCs). Breast cancer is the most common cancer among women globally, and current studies have shown that a woman dies with a diagnosis of breast cancer every thirteen minutes. The economic cost of breast cancer is overwhelming, particularly for developing economies. While it causes billions of dollars in losses of national income, it pushes millions of people below the poverty line. This study examined the economic burden of breast cancer on Nigerian women, its impact on their standard of living, and its effects on Nigeria's socio-economic development. The study adopts a qualitative research approach, using the in-depth interview technique to elicit valuable information from respondents with cancer experience from the southern part of Nigeria. Respondents were women of reproductive age (15-49 years) who have experienced and survived cancer, as well as those currently receiving treatment. Excerpts from the interviews revealed that the cost of treatment is one of the major factors contributing to the late presentation of breast cancer among women, as many of them could not afford to pay for their own treatment. The study also revealed that many women prefer to explore other options, such as herbal treatments and spiritual consultations, which are less expensive and more affordable. The study therefore concludes that breast cancer diagnosis and treatment should be subsidized by the government in order to facilitate easy access and affordability, thereby promoting early detection and reducing the economic burden of treatment on women.
Keywords: breast cancer, development, economic burden, women
Procedia PDF Downloads 357
829 Online Monitoring and Control of Continuous Mechanosynthesis by UV-Vis Spectrophotometry
Authors: Darren A. Whitaker, Dan Palmer, Jens Wesholowski, James Flaherty, John Mack, Ahmad B. Albadarin, Gavin Walker
Abstract:
Traditional mechanosynthesis has been performed by either ball milling or manual grinding. However, neither of these techniques allows the easy application of process control: the temperature may change unpredictably due to friction in the process, hence the amount of energy transferred to the reactants is intrinsically non-uniform. Recently, it has been shown that the use of Twin-Screw extrusion (TSE) can overcome these limitations. Additionally, TSE provides a platform for continuous synthesis or manufacturing, as it is an open-ended process, with feedstocks at one end and product at the other. Several materials, including metal-organic frameworks (MOFs), co-crystals, and small organic molecules, have been produced mechanochemically using TSE. These advantages of TSE are offset by drawbacks such as increased process complexity (a large number of process parameters) and variation in feedstock flow impacting product quality. To handle the above-mentioned drawbacks, this study utilizes UV-Vis spectrophotometry (InSpectroX, ColVisTec) as an online tool to gain real-time information about the quality of the product. Additionally, this is combined with real-time process information in an Advanced Process Control system (PharmaMV, Perceptive Engineering), allowing full supervision and control of the TSE process. Further, by characterizing the dynamic behavior of the TSE, a model predictive controller (MPC) can be employed to ensure the process remains under control when perturbed by external disturbances. Two reactions were studied: a Knoevenagel condensation of barbituric acid and vanillin, and the direct amidation of hydroquinone by ammonium acetate to form N-Acetyl-para-aminophenol (APAP), commonly known as paracetamol. Both reactions could be carried out continuously using TSE; nuclear magnetic resonance (NMR) spectroscopy was used to confirm the percentage conversion of starting materials to product.
This information was used to construct partial least squares (PLS) calibration models within the PharmaMV development system, which relates the percent conversion to product to the acquired UV-Vis spectrum. Once this was complete, the model was deployed within the PharmaMV Real-Time System to carry out automated optimization experiments to maximize the percentage conversion based on a set of process parameters in a design of experiments (DoE) style methodology. With the optimum set of process parameters established, a series of PRBS process response tests (i.e. Pseudo-Random Binary Sequences) around the optimum were conducted. The resultant dataset was used to build a statistical model and associated MPC. The controller maximizes product quality whilst ensuring the process remains at the optimum even as disturbances such as raw material variability are introduced into the system. To summarize, a combination of online spectral monitoring and advanced process control was used to develop a robust system for optimization and control of two TSE based mechanosynthetic processes.
Keywords: continuous synthesis, pharmaceutical, spectroscopy, advanced process control
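As a minimal sketch of the calibration idea, here is a single-wavelength least-squares stand-in for the multivariate PLS model described above; the absorbance/conversion pairs are hypothetical, not the study's data:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x. A single-wavelength stand-in for
    a full PLS calibration across the UV-Vis spectrum (illustrative data only)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical pairs: absorbance at one wavelength vs. NMR-confirmed conversion (%)
absorbance = [0.10, 0.25, 0.40, 0.55, 0.70]
conversion = [12.0, 28.0, 45.0, 61.0, 78.0]
a, b = fit_line(absorbance, conversion)

def predict_conversion(abs_u):
    """Estimate % conversion from a new online absorbance reading."""
    return a + b * abs_u
```

A real PLS model does the same thing across all wavelengths at once, projecting the spectrum onto a few latent variables before regressing.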
Procedia PDF Downloads 175
828 A Bayesian Parameter Identification Method for Thermorheological Complex Materials
Authors: Michael Anton Kraus, Miriam Schuster, Geralt Siebert, Jens Schneider
Abstract:
Polymers have increasingly gained interest as construction materials in civil engineering applications over the last years. Polymeric materials typically show time- and temperature-dependent material behavior, which is accounted for in the context of the theory of linear viscoelasticity. Within this paper, the authors show that some polymeric interlayers for laminated glass cannot be considered thermorheologically simple, as they do not follow a simple TTSP; thus, a methodology for identifying thermorheologically complex constitutive behavior is needed. 'Dynamical-Mechanical-Thermal-Analysis' (DMTA) in tensile and shear mode as well as 'Differential Scanning Calorimetry' (DSC) tests are carried out on the interlayer material 'Ethylene-vinyl acetate' (EVA). A novel Bayesian framework for the master curving process as well as the detection and parameter identification of the TTSPs along with their associated Prony series is derived and applied to the EVA material data. To the best of our knowledge, this is the first time an uncertainty quantification of the Prony series in a Bayesian context is shown. Within this paper, we successfully apply the derived Bayesian methodology to the EVA material data to obtain meaningful master curves and TTSPs. Uncertainties occurring in this process can be well quantified. We found that EVA needs two TTSPs with two associated Generalized Maxwell Models. As the methodology is kept general, the derived framework could also be applied to other thermorheologically complex polymers for parameter identification purposes.
Keywords: bayesian parameter identification, generalized Maxwell model, linear viscoelasticity, thermorheological complex
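The building blocks named above, a Prony series (Generalized Maxwell model) and a TTSP shift, can be sketched as follows; the branch moduli, relaxation times, and WLF constants are illustrative, not the identified EVA parameters:

```python
import math

def relaxation_modulus(t, e_inf, branches):
    """Generalized Maxwell (Prony series) relaxation modulus:
    E(t) = E_inf + sum_i E_i * exp(-t / tau_i)."""
    return e_inf + sum(e_i * math.exp(-t / tau_i) for e_i, tau_i in branches)

def wlf_log_shift(temp_c, ref_c, c1, c2):
    """WLF log10 shift factor used in time-temperature superposition:
    log10(aT) = -C1*(T - Tref) / (C2 + T - Tref)."""
    return -c1 * (temp_c - ref_c) / (c2 + temp_c - ref_c)

# Illustrative values: two Prony branches and the 'universal' WLF constants
branches = [(8.0e6, 1.0e-2), (3.0e6, 1.0)]      # (E_i in Pa, tau_i in s)
e0 = relaxation_modulus(0.0, 1.0e6, branches)    # instantaneous modulus, 1.2e7 Pa
shift = wlf_log_shift(40.0, 20.0, 17.44, 51.6)   # negative above Tref
```

A Bayesian identification would place priors on the E_i, tau_i, and shift constants and sample their posterior, which is where the uncertainty quantification in the abstract comes from.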
Procedia PDF Downloads 262
827 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach
Authors: Jean Berger, Nassirou Lo, Martin Noel
Abstract:
Perfectly suited for natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems, while giving a robust upper bound obtained from Lagrangean integrality constraint relaxation. Should a target eventually be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization
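The objective being maximized, cumulative probability of successful detection, can be sketched for a single agent under an independent-glimpse assumption (the full MIP additionally models anticipated observation feedback, which this toy omits):

```python
def cumulative_detection_prob(presence, glimpse, path):
    """Probability of at least one successful detection along a search path,
    assuming independent glimpses: 1 - prod_k (1 - p_k * g), where p_k is the
    prior probability the target is in cell k and g is the glimpse probability."""
    miss = 1.0
    for cell in path:
        miss *= 1.0 - presence[cell] * glimpse
    return 1.0 - miss

# Toy instance: prior target-presence probabilities per cell, 80% glimpse probability
presence = {0: 0.5, 1: 0.3, 2: 0.2}
p_detect = cumulative_detection_prob(presence, 0.8, [0, 1])  # 1 - 0.6*0.76 = 0.544
```

The MIP searches over all feasible agent paths for the one maximizing this quantity, subject to movement constraints.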
Procedia PDF Downloads 370
826 Pre-Industrial Local Architecture According to Natural Properties
Authors: Selin Küçük
Abstract:
Pre-industrial architecture is an integration of natural and subsequent properties achieved through intelligence and experience. Since various settlements were relatively industrialized or non-industrialized at any given time, the term 'pre-industrial' does not refer to a definite period. Natural properties, which are the existent conditions and materials in the natural local environment, are climate, geomorphology, and local materials. Subsequent properties, which are all anthropological comparatives, are the culture of societies, the requirements of people, and the construction techniques that people use. Yet, after industrialization, technology took technique's place, cultural effects were manipulated, requirements changed, and local/natural properties almost disappeared from architecture. Technology is universal and global and spreads easily; conversely, technique is time- and experience-dependent and should have a considerable cultural background. This research is about construction techniques according to the natural properties of a region and the classification of these techniques. Understanding local architecture is only possible by searching its background, which is hard to reach. There are always changes, positive and negative, in architectural techniques through time. The archaeological layers of a region sometimes give more accurate information about the transformation of architecture. However, the natural properties of any region are the most helpful elements for perceiving construction techniques. Many international sources from different cultures are interested in local architecture and mention natural properties separately. Unfortunately, no literature deals with this subject systematically. This research aims to develop a clear perspective on the existence of local architecture by categorizing archetypes according to natural properties.
The ultimate goal of this research is to generate a clear classification of local architecture, independent of subsequent (anthropological) properties, across the world, such as a handbook. Since local architecture is the most sustainable architecture with reference to its economic, ecological, and sociological properties, there should be extensive information about construction techniques to learn from. Constructing the same buildings all over the world is one of the main criticisms of the modern architectural system. While this criticism goes on, identical buildings without identity increase incrementally. In the post-industrial era, technology has widely taken technique's place, yet cultural effects are manipulated, requirements have changed, and natural local properties have almost disappeared from architecture. This study does not oblige architects to use local techniques, but it traces the progress of pre-industrial architectural evolution, which is healthier, cheaper, and natural. Immigration from rural areas to developing/developed cities should be prohibited, so that culture and construction techniques can be preserved. Since big cities have a psychological, sensational, and sociological impact on people, rural settlers can be convinced not to emigrate by providing new buildings designed according to natural properties and by maintaining their settlements. Improving rural conditions would remove the economic and sociological gulf between cities and rural areas. The desired result is that, if there is no deformation (the adaptation of other traditional buildings because of immigration) or assimilation in a climatic region, there should be very similar solutions in the same climatic regions of the world, even if there is no relationship (trade, communication, etc.) among them.
Keywords: climate zones, geomorphology, local architecture, local materials
Procedia PDF Downloads 428
825 Design of a Portable Shielding System for a Newly Installed NaI(Tl) Detector
Authors: Mayesha Tahsin, A.S. Mollah
Abstract:
Recently, a 1.5 x 1.5 inch NaI(Tl) detector based gamma-ray spectroscopy system has been installed in the laboratory of the Nuclear Science and Engineering Department of the Military Institute of Science and Technology for radioactivity detection purposes. The newly installed NaI(Tl) detector has a circular lead shield of 22 mm width. An important consideration in any gamma-ray spectroscopy is the minimization of natural background radiation not originating from the radioactive sample being measured. Natural background gamma-ray radiation comes from naturally occurring or man-made radionuclides in the environment or from cosmic sources. Moreover, the main problem with this system is that it is not suitable for measurements of radioactivity with a large sample container, such as a Petri dish or Marinelli beaker geometry. When any laboratory installs a new detector or/and new shield, it "must" first carry out quality and performance tests for the detector and shield. This paper describes a new portable lead shielding system that can reduce the background radiation. The intensity of gamma radiation after passing through the shielding will be calculated using the shielding equation I = I0·e^(-µx), where I0 is the initial intensity of the gamma source, I is the intensity after passing through the shield, µ is the linear attenuation coefficient of the shielding material, and x is the thickness of the shielding material. The height and width of the shielding will be selected in order to accommodate the large sample container. The detector will be surrounded by a 4π-geometry low-activity lead shield. An additional 1.5 mm thick shield of tin and a 1 mm thick shield of copper covering the inner part of the lead shielding will be added in order to remove the presence of characteristic X-rays from the lead shield.
Keywords: shield, NaI(Tl) detector, gamma radiation, intensity, linear attenuation coefficient
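The quoted attenuation equation extends multiplicatively across the lead, tin, and copper layers; a minimal sketch, with attenuation coefficients that are rough energy-dependent figures rather than measured values:

```python
import math

def transmitted_intensity(i0, layers):
    """Transmitted gamma intensity through a stack of shield layers, applying
    the quoted equation I = I0*e^(-mu*x) once per layer (mu in 1/cm, x in cm)."""
    return i0 * math.exp(-sum(mu * x for mu, x in layers))

# Illustrative coefficients (mu depends strongly on gamma energy; these are
# rough ~662 keV figures, NOT measured data): 22 mm Pb, 1.5 mm Sn, 1 mm Cu
shield = [(1.2, 2.2), (0.6, 0.15), (0.6, 0.1)]
fraction = transmitted_intensity(1.0, shield)  # remaining fraction of I0
```

Attenuation coefficients for real shield design should be taken from tabulated data at the gamma energies of interest.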
Procedia PDF Downloads 156
824 Code Embedding for Software Vulnerability Discovery Based on Semantic Information
Authors: Joseph Gear, Yue Xu, Ernest Foo, Praveen Gauravaran, Zahra Jadidi, Leonie Simpson
Abstract:
Deep learning methods have been seeing increasing application to the long-standing security research goal of automatic vulnerability detection for source code. Attention, however, must still be paid to the task of producing vector representations for source code (code embeddings) as input for these deep learning models. Graphical representations of code, most predominantly Abstract Syntax Trees and Code Property Graphs, have received some use in this task of late; however, for very large graphs representing very large code snippets, learning becomes prohibitively computationally expensive. This expense may be reduced by intelligently pruning this input to only vulnerability-relevant information; however, little research in this area has been performed. Additionally, most existing work comprehends code based solely on the structure of the graph, at the expense of the information contained by the nodes in the graph. This paper proposes Semantic-enhanced Code Embedding for Vulnerability Discovery (SCEVD), a deep learning model which uses semantic-based feature selection for its vulnerability classification model. It uses information from the nodes as well as the structure of the code graph in order to select features which are most indicative of the presence or absence of vulnerabilities. This model is implemented and experimentally tested using the SARD Juliet vulnerability test suite to determine its efficacy. It is able to improve on existing code graph feature selection methods, as demonstrated by its improved ability to discover vulnerabilities.
Keywords: code representation, deep learning, source code semantics, vulnerability discovery
Procedia PDF Downloads 155
823 SEM Detection of Folate Receptor in a Murine Breast Cancer Model Using Secondary Antibody-Conjugated, Gold-Coated Magnetite Nanoparticles
Authors: Yasser A. Ahmed, Juleen M Dickson, Evan S. Krystofiak, Julie A. Oliver
Abstract:
Cancer cells urgently need folate to support their rapid division. Folate receptors (FR) are over-expressed on a wide range of tumor cells, including breast cancer cells. FR are distributed over the entire surface of cancer cells, but are polarized to the apical surface of normal cells. Targeting cancer cells using specific surface molecules such as folate receptors may be one strategy for killing cancer cells without hurting the neighboring normal cells. The aim of the current study was to test a method of SEM detection of FR in a murine breast cancer cell model (4T1 cells) using secondary antibody conjugated to gold or gold-coated magnetite nanoparticles. 4T1 cells were suspended in RPMI medium with FR antibody and incubated with secondary antibody for fluorescence microscopy. The cells were cultured on 30 mm Thermanox coverslips for 18 hours, labeled with FR antibody, then incubated with secondary antibody conjugated to gold or gold-coated magnetite nanoparticles and processed for scanning electron microscopy (SEM) analysis. The fluorescence microscopy study showed strong punctate FR expression on the 4T1 cell membrane. With SEM, the labeling with gold or gold-coated magnetite conjugates showed a similar pattern. Specific labeling occurred in nanoparticle clusters, which are clearly visualized in backscattered electron images. The 4T1 tumor cell model may be useful for the development of FR-targeted tumor therapy using gold-coated magnetite nanoparticles.
Keywords: cancer cell, nanoparticles, cell culture, SEM
Procedia PDF Downloads 731
822 Evaluation of Firearm Injury Syndromic Surveillance in Utah
Authors: E. Bennion, A. Acharya, S. Barnes, D. Ferrell, S. Luckett-Cole, G. Mower, J. Nelson, Y. Nguyen
Abstract:
Objective: This study aimed to evaluate the validity of a firearm injury query in the Early Notification of Community-based Epidemics syndromic surveillance system. Syndromic surveillance data are used at the Utah Department of Health for early detection of and rapid response to unusually high rates of violence and injury, among other health outcomes. The query of interest was defined by the Centers for Disease Control and Prevention and used chief complaint and discharge diagnosis codes to capture initial emergency department encounters for firearm injury of all intents. Design: Two epidemiologists manually reviewed electronic health records of emergency department visits captured by the query from April-May 2020, compared results, and sent conflicting determinations to two arbiters. Results: Of the 85 unique records captured, 67 were deemed probable, 19 were ruled out, and two were undetermined, resulting in a positive predictive value of 75.3%. Common reasons for false positives included non-initial encounters and misleading keywords. Conclusion: Improving the validity of syndromic surveillance data would better inform outbreak response decisions made by state and local health departments. The firearm injury definition could be refined to exclude non-initial encounters by negating words such as "last month," "last week," and "aftercare"; and to exclude non-firearm injury by negating words such as "pellet gun," "air gun," "nail gun," "bullet bike," and "exit wound" when a firearm is not mentioned.
Keywords: evaluation, health information system, firearm injury, syndromic surveillance
Procedia PDF Downloads 165
821 Preparedness is Overrated: Community Responses to Floods in a Context of (Perceived) Low Probability
Authors: Kim Anema, Matthias Max, Chris Zevenbergen
Abstract:
For any flood risk manager, the 'safety paradox' is a familiar concept: low probability leads to a sense of safety, which leads to more investments in the area, which leads to higher potential consequences, keeping the aggregated risk (probability * consequences) at the same level. Therefore, it is important to mitigate potential consequences apart from probability. However, when the (perceived) probability is so low that there is no recognizable trend for society to adapt to, addressing the potential consequences will always be the lagging point on the agenda. Preparedness programs fail because of a lack of interest and urgency: policy makers are distracted by their day-to-day business, and there is always a more urgent issue to spend the taxpayer's money on. The leading question in this study was how to address the social consequences of flooding in a context of (perceived) low probability. Disruptions of everyday urban life, large or small, can be caused by a variety of (un)expected events, of which flooding is only one possibility. Variability like this is typically addressed with resilience, and we used the concept of Community Resilience as the framework for this study. Drawing on face-to-face interviews, an extensive questionnaire, and publicly available statistical data, we explored the 'whole society response' to two recent urban flood events: the Brisbane floods (AUS) in 2011 and the Dresden floods (GE) in 2013. In Brisbane, we studied how the societal impacts of the floods were counteracted by both authorities and the public, and in Dresden we were able to validate our findings. A large part of the reactions, both public and institutional, to these two urban flood events were not fuelled by preparedness or proper planning.
Instead, the more important success factors in counteracting social impacts, such as demographic changes in neighborhoods and (non-)economic losses, were dynamics like community action, flexibility and creativity from authorities, leadership, informal connections, and a shared narrative. These proved to be the determining factors for the quality and speed of recovery in both cities. The resilience of the community in Brisbane was good, due to (i) the approachability of (local) authorities, (ii) a big group of 'secondary victims', and (iii) clear leadership. All three of these elements were amplified by the use of social media and/or web 2.0 by both the communities and the authorities involved. The numerous contacts and social connections made through the web were fast, need-driven and, in their own way, orderly. Similarly, in Dresden, large groups of 'unprepared', ad hoc organized citizens managed to work together with authorities in a way that was effective and sped up recovery. The concept of community resilience is better fitted than 'social adaptation' to deal with the potential consequences of an (im)probable flood. Community resilience is built on capacities and dynamics that are part of everyday life, and these can be invested in pre-event to minimize the social impact of urban flooding. Investing in them might even have beneficial trade-offs in other policy fields.
Keywords: community resilience, disaster response, social consequences, preparedness
Procedia PDF Downloads 352
820 Fatigue Crack Growth Rate Measurement by Means of Classic Method and Acoustic Emission
Authors: V. Mentl, V. Koula, P. Mazal, J. Volák
Abstract:
Nowadays, the acoustic emission is a widely recognized method of material damage investigation, mainly in cases of cracks initiation and growth observation and evaluation. This is highly important in structures, e.g. pressure vessels, large steam turbine rotors etc., applied both in classic and nuclear power plants. Nevertheless, the acoustic emission signals must be correlated with the real crack progress to be able to evaluate the cracks and their growth by this non-destructive technique alone in real situations and to reach reliable results when the assessment of the structures' safety and reliability is performed and also when the remaining lifetime should be evaluated. The main aim of this study was to propose a methodology for evaluation of the early manifestations of the fatigue cracks and their growth and thus to quantify the material damage by acoustic emission parameters. Specimens made of several steels used in the power producing industry were subjected to fatigue loading in the low- and high-cycle regimes. This study presents results of the crack growth rate measurement obtained by the classic compliance change method and the acoustic emission signal analysis. The experiments were realized in cooperation between laboratories of Brno University of Technology and West Bohemia University in Pilsen within the solution of the project of the Czech Ministry of Industry and Commerce: "A diagnostic complex for the detection of pressure media and material defects in pressure components of nuclear and classic power plants" and the project "New Technologies for Mechanical Engineering".
Keywords: fatigue, crack growth rate, acoustic emission, material damage
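Fatigue crack growth rates in the stable regime are commonly described by the Paris law, da/dN = C(ΔK)^m; the abstract does not state the model it uses, so the following is a generic sketch with illustrative constants:

```python
import math

def paris_growth_rate(delta_k, c, m):
    """Paris law da/dN = C * (delta_K)^m for the stable crack growth regime
    (da/dN in m/cycle when delta_K is in MPa*sqrt(m) and C is fitted in those units)."""
    return c * delta_k ** m

def cycles_to_grow(a0, a1, delta_sigma, c, m, y=1.0, steps=10000):
    """Numerically integrate N = int da / (C * dK^m) with dK = Y*d_sigma*sqrt(pi*a)
    (a in m, d_sigma in MPa; all constants here are illustrative)."""
    da = (a1 - a0) / steps
    cycles, a = 0.0, a0
    for _ in range(steps):
        dk = y * delta_sigma * math.sqrt(math.pi * (a + da / 2.0))
        cycles += da / paris_growth_rate(dk, c, m)
        a += da
    return cycles
```

Correlating acoustic emission counts against da/dN curves like this is one way the non-destructive signal can be calibrated to real crack progress.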
Procedia PDF Downloads 370
819 Urban Land Use Type Analysis Based on Land Subsidence Areas Using X-Band Satellite Image of Jakarta Metropolitan City, Indonesia
Authors: Ratih Fitria Putri, Josaphat Tetuko Sri Sumantyo, Hiroaki Kuze
Abstract:
Jakarta Metropolitan City is located on the northwest coast of West Java province, at geographical coordinates between 106°33'00"-107°00'00" E longitude and 5°48'30"-6°24'00" S latitude. The Jakarta urban area has suffered land subsidence in several land use types, including trading, industrial, and settlement areas. The land subsidence hazard is one of the consequences of urban development in Jakarta. This hazard is caused by intensive human activity in groundwater extraction and land use mismanagement. Geologically, the Jakarta urban area is mostly dominated by alluvial fan sediment. The objective of this research is to analyze Jakarta urban land use types within the land subsidence zones. Producing safer land use and settlements in the land subsidence areas is very important, and spatial distributions of detected land subsidence are a necessary tool for land use management planning. For this purpose, the Differential Synthetic Aperture Radar Interferometry (DInSAR) method is used. DInSAR is complementary to ground-based methods such as leveling and global positioning system (GPS) measurements, yielding information over a wide coverage area even when the area is inaccessible. The data were fine-tuned using X-band satellite image data from 2010 to 2013 and land use mapping data. Our analysis shows that land subsidence occurred in the northern part of Jakarta Metropolitan City, at rates varying from 7.5 to 17.5 cm/year, in industrial and settlement land use areas.
Keywords: land use analysis, land subsidence mapping, urban area, X-band satellite image
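The core DInSAR conversion from differential phase to line-of-sight displacement can be sketched as follows; the X-band wavelength used is an assumed typical value, not taken from the paper:

```python
import math

def los_displacement(delta_phase_rad, wavelength_m):
    """Line-of-sight displacement from a differential interferometric phase:
    d = -(lambda / (4*pi)) * delta_phi (two-way path, hence 4*pi)."""
    return -wavelength_m / (4.0 * math.pi) * delta_phase_rad

# For an X-band sensor (~3.1 cm wavelength, an assumed value), one full fringe
# (2*pi of differential phase) corresponds to half a wavelength of LOS motion
fringe_m = los_displacement(2.0 * math.pi, 0.031)  # -0.0155 m
```

Subsidence maps like the one described above accumulate such per-interferogram displacements over the 2010-2013 stack to obtain cm/year rates.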
Procedia PDF Downloads 273818 Assessment of Airtightness Through a Standardized Procedure in a Nearly-Zero Energy Demand House
Authors: Mar Cañada Soriano, Rafael Royo-Pastor, Carolina Aparicio-Fernández, Jose-Luis Vivancos
Abstract:
The lack of insulation, along with the existence of air leakages, has a meaningful impact on the energy performance of buildings. Both lead to increases in the energy demand through additional heating and/or cooling loads, and they also cause thermal discomfort. In order to quantify these uncontrolled air currents, pressurization and depressurization tests can be performed. Among them, the Blower Door test is a standardized procedure to determine the airtightness of a space, characterizing the rate of air leakage through the envelope surface by calculating an air flow rate indicator. In this sense, low-energy buildings complying with the Passive House design criteria are required to achieve high levels of airtightness. Due to the invisible nature of air leakages, additional tools are often used to identify where the infiltrations take place. Among them, infrared thermography is a valuable technique for this purpose since it enables their detection. The aim of this study is to assess the airtightness of a typical Mediterranean dwelling house located in the Valencian orchard region (Spain), restored under the Passive House standard, using the Blower Door test. Moreover, the building energy performance modelling tools TRNSYS (TRaNsient System Simulation program) and TRNFlow (TRaNsient Flow) have been used to determine its energy performance, and the identification of infiltrations was carried out by means of infrared thermography. The low levels of infiltration obtained suggest that this house may comply with the Passive House standard.Keywords: airtightness, blower door, trnflow, infrared thermography
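The airflow indicator from a Blower Door test is commonly obtained by fitting the leakage power law q = C·Δp^n (as in ISO 9972 / EN 13829) to the pressurization data and evaluating it at 50 Pa. A minimal sketch with illustrative numbers, not measurements from this house:

```python
import numpy as np

def fit_leakage_law(dp, q):
    """Fit the leakage power law q = C * dp**n by linear
    regression in log-log space; returns (C, n)."""
    n, log_c = np.polyfit(np.log(dp), np.log(q), 1)
    return float(np.exp(log_c)), float(n)

def ach50(c, n, volume_m3):
    """Air changes per hour at 50 Pa: q(50 Pa) / interior volume."""
    return c * 50.0 ** n / volume_m3

# Illustrative pressurization readings: pressure differences (Pa)
# and the corresponding leakage flows (m3/h)
dp = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
q = 12.0 * dp ** 0.65
c, n = fit_leakage_law(dp, q)
print(round(n, 3), round(ach50(c, n, 300.0), 2))
```

The Passive House criterion is usually stated as n50 ≤ 0.6 air changes per hour at 50 Pa, so the fitted flow coefficient and exponent translate directly into the pass/fail indicator.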
Procedia PDF Downloads 122817 16S rRNA Based Metagenomic Analysis of Palm Sap Samples From Bangladesh
Authors: Ágota Ábrahám, Md Nurul Islam, Karimane Zeghbib, Gábor Kemenesi, Sazeda Akter
Abstract:
Collecting palm sap as a food source is an everyday practice in some parts of the world. However, the consumption of palm juice has been associated with regular infections and epidemics in parts of Bangladesh. This is attributed to fruit-eating bats and other vertebrates or invertebrates native to the area contaminating the food with their body secretions during the collection process. The frequent intake of palm juice, whether as a processed food product or in its unprocessed form, is common over large areas. The range of pathogens capable of infecting humans through this practice is not yet fully understood. Additionally, the high sugar content of the liquid makes it an ideal culture medium for certain bacteria, which can easily propagate and potentially harm consumers. Rapid diagnostics, especially in remote locations, could mitigate the health risks associated with palm juice consumption. The primary objective of this research is the rapid genomic detection and risk assessment of bacteria that may cause infections in humans through the consumption of palm juice. Utilizing state-of-the-art third-generation Nanopore metagenomic sequencing technology based on 16S rRNA, we identified bacteria primarily involved in fermentation processes. The swift metagenomic analysis, coupled with the widespread availability and portability of Nanopore products (including real-time analysis options), proves advantageous for detecting harmful pathogens in food sources without relying on extensive industry resources and testing.Keywords: raw date palm sap, NGS, metabarcoding, food safety
Procedia PDF Downloads 53816 Hybrid Data-Driven Drilling Rate of Penetration Optimization Scheme Guided by Geological Formation and Historical Data
Authors: Ammar Alali, Mahmoud Abughaban, William Contreras Otalvora
Abstract:
Optimizing the drilling process for cost and efficiency requires optimization of the rate of penetration (ROP). ROP is a measurement of the speed at which the wellbore is created, in units of feet per hour, and is the primary indicator of drilling efficiency. Maximization of the ROP can indicate fast and cost-efficient drilling operations; however, high ROPs may induce unintended events, which may lead to nonproductive time (NPT) and higher net costs. The proposed ROP optimization solution is a hybrid, data-driven system that aims to improve the drilling process, maximize the ROP, and minimize NPT. The system consists of two phases: (1) utilizing existing geological and drilling data to train the model prior to drilling, and (2) real-time adjustment of the controllable dynamic drilling parameters [weight on bit (WOB), rotary speed (RPM), and pump flow rate (GPM)] that directly influence the ROP. During the first phase of the system, geological and historical drilling data are aggregated. Afterwards, the top-rated wells, as a function of high-instance ROP, are distinguished. Those wells are filtered based on NPT incidents, and a cross-plot is generated for the controllable dynamic drilling parameters per ROP value. Subsequently, the parameter values (WOB, GPM, RPM) are calculated as a conditioned mean based on physical distance, following the Inverse Distance Weighting (IDW) interpolation methodology. The first phase is concluded by producing a model of drilling best practices from the offset wells, prioritizing the optimum ROP value. This phase is performed before the commencement of drilling. Starting with the model produced in phase one, the second phase runs an automated drill-off test, delivering live adjustments in real-time. Those adjustments are made by directing the driller to deviate two of the controllable parameters (WOB and RPM) by a small percentage (0-5%), following the Constrained Random Search (CRS) methodology.
These minor incremental variations reveal new drilling conditions not explored before through offset wells. The data is then consolidated into a heat-map as a function of ROP. A more optimal ROP performance is identified through the heat-map and amended in the model. The validation process involved the selection of a planned well in an onshore oil field with hundreds of offset wells. The first-phase model was built by utilizing the data points from the top-performing historical wells (20 wells). The model allows drillers to enhance decision-making by leveraging existing data and blending it with live data in real-time. An empirical relationship between the controllable dynamic parameters and ROP was derived using Artificial Neural Networks (ANN). The adjustments resulted in improved ROP efficiency by over 20%, translating to at least a 10% saving in drilling costs. The novelty of the proposed system lies in its ability to integrate historical data, calibrate based on geological formations, and run real-time global optimization through CRS. Those factors position the system to work for any newly drilled well in a developing field.Keywords: drilling optimization, geological formations, machine learning, rate of penetration
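The conditioned-mean step of phase one can be sketched with a simple Inverse Distance Weighting interpolation. This is a minimal illustration; the well coordinates and WOB values below are hypothetical, not data from the validation field:

```python
import numpy as np

def idw(query, points, values, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: estimate a drilling parameter at a
    query location as the distance-weighted mean of offset-well values."""
    d = np.linalg.norm(points - query, axis=1)
    w = 1.0 / (d ** power + eps)
    return float(np.sum(w * values) / np.sum(w))

# Hypothetical offset-well locations (x, y in km) and their WOB (klbf)
points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
wob = np.array([20.0, 25.0, 22.0, 28.0])
print(round(idw(np.array([0.5, 0.5]), points, wob), 2))
```

A query close to one offset well reproduces that well's value, while a query equidistant from all wells returns their plain mean; the same weighting would be applied per parameter (WOB, RPM, GPM) for each target ROP value.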
Procedia PDF Downloads 131815 An Improved Total Variation Regularization Method for Denoising Magnetocardiography
Authors: Yanping Liao, Congcong He, Ruigang Zhao
Abstract:
The application of magnetocardiography signals to detect cardiac electrical function is a technology developed in recent years. The magnetocardiography signal is detected with Superconducting Quantum Interference Devices (SQUID) and has considerable advantages over electrocardiography (ECG). Extracting the Magnetocardiography (MCG) signal, which is buried in noise, is difficult, and this is a critical issue to be resolved in cardiac monitoring systems and MCG applications. In order to remove the severe background noise, the Total Variation (TV) regularization method is proposed to denoise the MCG signal. The approach transforms the denoising problem into a minimization optimization problem, and the Majorization-Minimization algorithm is applied to solve it iteratively. However, the traditional TV regularization method tends to cause a staircase effect and lacks constraint adaptability. In this paper, an improved TV regularization method for denoising the MCG signal is proposed to improve the denoising precision. The improvement of this method is mainly divided into three parts. First, higher-order TV is applied to reduce the staircase effect, and the corresponding second-derivative matrix is used to substitute the first-order one. Then, the positions of the non-zero elements in the second-order derivative matrix are determined based on the peak positions detected by the detection window. Finally, adaptive constraint parameters are defined to eliminate noise and preserve signal peak characteristics. Theoretical analysis and experimental results show that this algorithm can effectively improve the output signal-to-noise ratio and has superior performance.Keywords: constraint parameters, derivative matrix, magnetocardiography, regular term, total variation
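The baseline that the paper improves on, first-order TV denoising solved by majorization-minimization, can be sketched in a few lines. This is a minimal 1D illustration with a dense linear solve; the authors' higher-order, adaptively constrained variant is not reproduced here:

```python
import numpy as np

def tv_denoise_mm(y, lam, n_iter=50, eps=1e-8):
    """1D total-variation denoising, min 0.5||y-x||^2 + lam*||Dx||_1,
    via majorization-minimization (dense solve, small signals only)."""
    m = len(y)
    D = np.diff(np.eye(m), axis=0)          # first-difference matrix
    x = y.copy()
    for _ in range(n_iter):
        W = np.diag(np.abs(D @ x) + eps)    # majorizer of the |.| term
        # Matrix inversion lemma: x = y - D^T (W/lam + D D^T)^-1 D y
        z = np.linalg.solve(W / lam + D @ D.T, D @ y)
        x = y - D.T @ z
    return x

rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(25), np.ones(25)])   # step-like edge
noisy = clean + 0.1 * rng.standard_normal(50)
denoised = tv_denoise_mm(noisy, lam=1.0)
print(np.abs(denoised - clean).mean() < np.abs(noisy - clean).mean())
```

This first-order form recovers the edge but flattens smooth ramps into plateaus, which is exactly the staircase (step) effect that the paper's second-derivative regularization is designed to reduce.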
Procedia PDF Downloads 152814 The Expression of the Social Experience in Film Narration: Cinematic ‘Free Indirect Discourse’ in the Dancing Hawk (1977) by Grzegorz Krolikiewicz
Authors: Robert Birkholc
Abstract:
One of the basic issues related to the creation of characters in media such as literature and film is the representation of the characters' thoughts, emotions, and perceptions. This paper is devoted to the social perspective (or focalization) expressed in film narration. The aim of the paper is to show how the social point of view of the hero, conditioned by his origin and the environment from which he comes, can be created using non-verbal, purely audiovisual means of expression. The issue is considered on the example of the little-known Polish movie The Dancing Hawk (1977) by Grzegorz Królikiewicz, based on the novel by Julian Kawalec. The thesis of the paper is that the Polish director uses a narrative figure somewhat analogous to the literary form of free indirect discourse. In literature, free indirect discourse is formally ‘spoken’ by the external narrator, but the narration is clearly filtered through the language and thoughts of the character. According to some scholars (such as Roy Pascal), the narrator in this form of speech does not cite the character's words but uses his way of thinking and imitates his perspective, sometimes with deep irony. Free indirect discourse is frequently used in Julian Kawalec’s novel. Through linguistic stylization, the author tries to convey the socially determined perspective of a peasant who migrates to the big city after the Second World War. Grzegorz Królikiewicz expresses the same social experience through pure cinematic form in the adaptation of the book. Both Kawalec and Królikiewicz show the consequences of so-called ‘social advancement’ in Poland after 1945, when the communist party took over political power. Through the fate of the main character, Michał Toporny, the director presents the experience of peasants who left their villages and had to adapt to a new, urban space. However, the paper is not focused on the historical topic itself, but on the audiovisual form of the movie.
Although Królikiewicz does not frequently use POV shots, the narration of The Dancing Hawk is filtered through the sensations of the main character, who feels uprooted and alienated in the new social space. The director captures the hero's feelings through very complex audiovisual procedures: high or low points of view (representing the ‘social position’), a grotesque soundtrack, expressionist scenery, and associative editing. In this way, he manages to create the world from the perspective of a socially maladjusted and internally split subject. The Dancing Hawk is a successful attempt to adapt the subjective narration of the book to the ‘language’ of the cinema. Mieke Bal’s notion of focalization helps to describe ‘free indirect discourse’ as a transmedial figure for representing characters’ perceptions. However, the polysemiotic medium of film also significantly transforms this figure of representation. The paper shows both the similarities and the differences between literary and cinematic ‘free indirect discourse.’Keywords: film and literature, free indirect discourse, social experience, subjective narration
Procedia PDF Downloads 130813 Detection of Trends and Break Points in Climatic Indices: The Case of Umbria Region in Italy
Authors: A. Flammini, R. Morbidelli, C. Saltalippi
Abstract:
The increase of the air surface temperature at the global scale is a fact, with values around 0.85 ºC since the late nineteenth century, as is a significant change in the main features of the rainfall regime. Nevertheless, the detected climatic changes are not equally distributed all over the world but exhibit specific characteristics in different regions. Therefore, studying the evolution of climatic indices in different geographical areas with a prefixed standard approach becomes very useful in order to analyze the existence of climatic trends and compare results. In this work, a methodology to investigate climatic change and its effects on a wide set of climatic indices is proposed and applied at the regional scale in the case study of a Mediterranean area, the Umbria region in Italy. From the data of the available temperature stations, nine temperature indices have been obtained, and the existence of trends has been checked by applying the non-parametric Mann-Kendall test, while the non-parametric Pettitt test and the parametric Standard Normal Homogeneity Test (SNHT) have been applied to detect the presence of break points. In addition, in order to characterize the rainfall regime, data from 11 rainfall stations have been used, and a trend analysis has been performed on cumulative annual rainfall depth, daily rainfall, rainy days, and dry period length. The results show a general increase in all temperature indices, even if with a trend pattern dependent on the indices and stations, and a general decrease of cumulative annual rainfall and average daily rainfall, with a distribution of rainfall over the year that differs from the past.Keywords: climatic change, temperature, rainfall regime, trend analysis
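The trend check described above can be sketched with a minimal Mann-Kendall implementation (no tie correction; the temperature series below is synthetic, not the Umbria data):

```python
import math
import numpy as np

def mann_kendall(x):
    """Non-parametric Mann-Kendall trend test (no tie correction).
    Returns the S statistic and the two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance of S under H0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)          # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return float(s), math.erfc(abs(z) / math.sqrt(2))

# Synthetic 30-year annual mean temperature series with a warming trend
years = np.arange(30)
temps = 14.0 + 0.02 * years + 0.05 * np.sin(years)
s, p = mann_kendall(temps)
print(s > 0, p < 0.05)
```

A positive S with p below the chosen significance level indicates an increasing trend; the Pettitt and SNHT tests answer the complementary question of when the series changes level.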
Procedia PDF Downloads 118812 Invasion of Pectinatella magnifica in Freshwater Resources of the Czech Republic
Authors: J. Pazourek, K. Šmejkal, P. Kollár, J. Rajchard, J. Šinko, Z. Balounová, E. Vlková, H. Salmonová
Abstract:
Pectinatella magnifica (Leidy, 1851) is an invasive freshwater animal that lives in colonies. A colony of Pectinatella magnifica (a gelatinous blob) can be up to several feet in diameter, and under favorable conditions it exhibits an extreme growth rate. Recently, European countries along the rivers Elbe, Oder, Danube, Rhine and Vltava have confirmed the invasion of Pectinatella magnifica, including in freshwater reservoirs in South Bohemia (Czech Republic). Our project (Czech Science Foundation, GAČR P503/12/0337) is focused on the biology and chemistry of Pectinatella magnifica. We have monitored the organism's occurrence in selected South Bohemian ponds and sandpits over recent years, collecting information about the physical properties of the surrounding water and sampling the colonies for various analyses (classification, maps of secondary metabolites, toxicity tests). Because the gelatinous matrix is, during the colony's lifetime, also a host for algae, bacteria and cyanobacteria (co-habitants), in this contribution we also applied a high performance liquid chromatography (HPLC) method for the determination of potentially present cyanobacterial toxins (microcystin-LR, microcystin-RR, nodularin). Results from the last 3 years of monitoring show that these toxins are under the limit of detection (LOD), so they do not yet represent a danger. The final goal of our study is to assess the toxicity risks related to freshwater resources invaded by Pectinatella magnifica and to understand the process of invasion, which may enable its control.Keywords: cyanobacteria, fresh water resources, Pectinatella magnifica invasion, toxicity monitoring
Procedia PDF Downloads 238811 Speciation, Preconcentration, and Determination of Iron(II) and (III) Using 1,10-Phenanthroline Immobilized on Alumina-Coated Magnetite Nanoparticles as a Solid Phase Extraction Sorbent in Pharmaceutical Products
Authors: Hossein Tavallali, Mohammad Ali Karimi, Gohar Deilamy-Rad
Abstract:
The proposed method for the speciation, preconcentration and determination of Fe(II) and Fe(III) in pharmaceutical products was developed using alumina-coated magnetite nanoparticles (Fe3O4/Al2O3 NPs) as a solid phase extraction (SPE) sorbent in the magnetic mixed hemimicelle solid phase extraction (MMHSPE) technique, followed by flame atomic absorption spectrometry analysis. The procedure is based on the complexation of Fe(II) with 1,10-phenanthroline (OP), a complexing reagent for Fe(II), immobilized on the modified Fe3O4/Al2O3 NPs. The extraction and concentration process for the pharmaceutical sample was carried out in a single step by mixing the extraction solvent and the magnetic adsorbents under ultrasonic action. Then, the adsorbents were easily isolated from the complicated matrix with an external magnetic field. Fe(III) ions were determined after being reduced to Fe(II) by adding a proper reducing agent to the sample solutions. Compared with traditional methods, the MMHSPE method simplifies the operation procedure and reduces the analysis time. Various parameters influencing the speciation and preconcentration of trace iron, such as pH, sample volume, amount of sorbent, and type and concentration of eluent, were studied. Under the optimized operating conditions, a preconcentration factor of 167 was obtained for Fe(II). The detection limit and linear range of this method for iron were 1.0 and 9.0-175 ng.mL−1, respectively. The relative standard deviation for five replicate determinations of 30.00 ng.mL-1 Fe2+ was 2.3%.Keywords: Alumina-Coated magnetite nanoparticles, Magnetic Mixed Hemimicell Solid-Phase Extraction, Fe(ΙΙ) and Fe(ΙΙΙ), pharmaceutical sample
Procedia PDF Downloads 291810 Implementation of Project-Based Learning with Peer Assessment in Large Classes under Consideration of Faculty’s Scare Resources
Authors: Margit Kastner
Abstract:
To overcome the negative consequences associated with large class sizes and to support students in developing the necessary competences (e.g., critical thinking, problem-solving, or teamwork skills), a marketing course has been redesigned by implementing project-based learning with peer assessment (PBL&PA). This means that students can voluntarily take advantage of this supplementary offer and explore, in addition to attending the lecture where clicker questions are asked, a real-world problem, find a solution, and assess the results of peers while working in small collaborative groups. In order to handle this with little further effort, the process is technically supported by the university’s e-learning system in such a way that students upload their solution in the form of an assignment, which is then automatically distributed to peer groups who have to assess the work of three other groups. Finally, students’ work is graded automatically, considering both students’ contribution to the project and the conformity of their peer assessment. The purpose of this study is to evaluate students’ perception of PBL&PA using an online questionnaire to collect the data. More specifically, it aims to discover students’ motivations for (not) working on a project and the benefits and problems students encounter. In addition to the survey, students’ performance was analyzed by comparing the final grades of those who participated in PBL&PA with those who did not. Among the 260 students who filled out the questionnaire, 47% participated in PBL&PA. Besides extrinsic motivations (bonus credits), students’ participation was often motivated by learning and social benefits. Reasons for not working on a project were connected to students’ organization and management of their studies (e.g., time constraints, no/wrong information) and teamwork concerns (e.g., missing engagement of peers, prior negative experiences).
In addition, a high workload and insufficient extrinsic motivation (bonus credits) were mentioned. With regard to the benefits and problems students encountered during the project, students provided more positive than negative comments. The positive aspects most often stated were learning and social benefits, while the negative ones were mainly attached to the technical implementation. Interestingly, bonus credits were hardly ever named as a positive aspect, meaning that intrinsic motivations became more important while working on the project. Team aspects generated mixed feelings. In addition, students who voluntarily participated in PBL&PA were in general more active and utilized further course offerings such as clicker questions. Examining students’ performance in the final exam revealed that students who did not participate in any of the offered active learning tasks performed worst, while students who used all activities performed best. In conclusion, the goals of the implementation were met in terms of students’ perceived benefits and the positive impact on students’ exam performance. Since the comparison of automatic grading with faculty grading showed valid results, it is possible to rely only on automatic grading in the future. That way, the additional workload for faculty will remain within limits. Thus, the implementation of project-based learning with peer assessment can be recommended for large classes.Keywords: automated grading, large classes, peer assessment, project-based learning
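An automated grading rule that combines project contribution with peer-assessment conformity, as described above, might look like the following sketch; the weighting scheme and all numbers are hypothetical, not the course's actual formula:

```python
def grade(received, given, consensus, w_proj=0.8, w_conf=0.2):
    """received: peer scores the group's own work got (0-100);
    given: scores this group awarded to the three works it assessed;
    consensus: mean score each of those works received overall.
    Conformity rewards assessors whose ratings track the consensus."""
    project = sum(received) / len(received)
    deviation = sum(abs(g - c) for g, c in zip(given, consensus)) / len(given)
    conformity = max(0.0, 100.0 - deviation)
    return w_proj * project + w_conf * conformity

print(round(grade(received=[80, 90, 85], given=[70, 75, 88],
                  consensus=[72, 80, 85]), 2))
```

Validating such a rule against faculty grading, as the study did, is what justifies relying on the automatic component alone.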
Procedia PDF Downloads 164809 The Role of Law in Promoting Democratic Governance
Authors: Mozamil Mohamed Ali
Abstract:
To clarify the relationship between law and democratic governance, this research, titled “The Role of Law in Enhancing Democratic Governance: A Comparative Study of Political Systems in Developing Countries,” examines the impact of legal frameworks on strengthening democratic practices within developing nations. Democratic governance requires transparency and institutional accountability to meet citizens’ needs, which necessitates legal frameworks that ensure compliance with governance standards. These frameworks hold greater significance in developing countries, where challenges such as corruption, weak public institutions, and socio-political conflicts affect their ability to achieve sustainable democratic governance. In this context, the research explores how laws influence these aspects. The study compares various developing countries that have experienced different levels of success and difficulty in enhancing democratic governance, focusing on the legal frameworks and public policies each country has implemented to improve transparency and accountability and to strengthen the role of public institutions. This comparative analysis aims to reveal the effectiveness of legal systems in supporting democratic governance and to identify the factors that lead to the success or failure of these legal frameworks in different contexts. For example, the study includes cases from countries in Asia, Africa, and Latin America, analyzing legal and institutional policies and their roles in achieving justice and reducing corruption. It examines the impact of legislation that promotes freedom of the press, human rights, and judicial independence as fundamental elements of transparent and democratic governance. Additionally, the research discusses how anti-corruption policies and laws governing electoral competition contribute to improving government responsiveness to public demands.
The hypothesis of the research centers on the idea that developing transparent and fair laws contributes to achieving sustainable democratic governance. The analyses show that applying laws equally and impartially strengthens citizens’ trust in public institutions and encourages political participation. At the same time, the research highlights the importance of local adaptation to global legal frameworks, as it may be necessary to consider local socio-political and economic contexts to ensure the success of these frameworks. In conclusion, this research underscores the importance of legal frameworks as a pivotal factor in the success of democratic governance. It provides recommendations related to enhancing judicial independence, enforcing anti-corruption laws, and improving access to information as essential steps for strengthening democratic governance in developing countries. The findings suggest that laws respected and carefully implemented can form a solid foundation for building more transparent and effective government institutions, contributing to sustainable development and social justice in these nations.Keywords: impact of legislation, role of institutions in controlling power, community participation, role of the judiciary
Procedia PDF Downloads 17808 Analysis of Seismic Waves Generated by Blasting Operations and their Response on Buildings
Authors: S. Ziaran, M. Musil, M. Cekan, O. Chlebo
Abstract:
The paper analyzes the response of buildings and industrial structures to seismic waves (low frequency mechanical vibration) generated by blasting operations. The principles of seismic analysis can be applied to different kinds of excitation, such as earthquakes, wind, explosions, random excitation from local transportation, periodic excitation from large rotating machines and/or machines with reciprocating motion, metal forming processes such as forging, shearing and stamping, chemical reactions, construction and earth moving work, and other strong deterministic and random energy sources caused by human activities. The article deals with the response of a residential home to seismic, low frequency, mechanical vibrations generated by nearby blasting operations. The goal was to determine the fundamental natural frequencies of the measured structure; it is important to determine the resonant frequencies in order to design suitable modal damping. The article also analyzes the package of seismic waves generated by blasting (primary waves, P-waves, and secondary waves, S-waves) and investigates the transfer regions. For the detection of seismic waves resulting from an explosion, the Fast Fourier Transform (FFT) and modal analysis in the frequency domain are used, and the signal was also acquired and analyzed in the time domain. In the conclusions, the measured results of seismic waves caused by blasting in a nearby quarry and their effect on a nearby structure (a house) are analyzed. The response of the house, including the fundamental natural frequency and possible fatigue damage, is also assessed.Keywords: building structure, seismic waves, spectral analysis, structural response
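Extracting a fundamental natural frequency from a measured vibration record via the FFT can be sketched as follows; the signal is synthetic, and the 8 Hz mode is illustrative, not the measured house response:

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the frequency of the largest peak in the one-sided
    amplitude spectrum, ignoring the DC component."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[0] = 0.0
    return float(freqs[np.argmax(spectrum)])

# Synthetic record: an 8 Hz structural mode plus a weaker 25 Hz component
fs = 200.0                              # sampling rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)
x = np.sin(2 * np.pi * 8.0 * t) + 0.3 * np.sin(2 * np.pi * 25.0 * t)
print(dominant_frequency(x, fs))  # -> 8.0
```

Once the resonant frequencies are identified this way, modal damping can be designed so that the dominant frequencies of the blast-induced P- and S-wave packages fall outside the structure's sensitive transfer regions.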
Procedia PDF Downloads 399807 The Effect of Rheological Properties and Spun/Meltblown Fiber Characteristics on “Hotmelt Bleed through” Behavior in High Speed Textile Backsheet Lamination Process
Authors: Kinyas Aydin, Fatih Erguney, Tolga Ceper, Serap Ozay, Ipar N. Uzun, Sebnem Kemaloglu Dogan, Deniz Tunc
Abstract:
In order to meet high growth rates in the baby diaper industry worldwide, high-speed textile backsheet (TBS) lamination lines have recently been introduced to the market for nonwoven/film lamination applications. It is a process in which two substrates are bonded to each other via a hotmelt adhesive (HMA). The nonwoven (NW) lamination system basically consists of four components: polypropylene (PP) nonwoven, polyethylene (PE) film, HMA, and the applicator system. Each component has a substantial effect on the efficiency of the continuous line and on the final product properties. However, for a precise coverage of the subject, we will address only the main challenges and possible solutions in this paper. The NW is often produced by the spunbond method (SSS or SMS configuration) and has a basis weight of 10-12 gsm (g/m²). The NW rolls can have a width and length of up to 2,060 mm and 30,000 linear meters, respectively. The PE film is the second component in TBS lamination, usually a 12-14 gsm blown or cast breathable film. HMA is a thermoplastic glue (mostly rubber based) that can be applied over a wide range of viscosities. The main HMA application technology in TBS lamination is the slot die, through which HMA is spread on top of the NW along the whole width at high temperatures in the melt form. Then, the NW is passed over chiller rolls with a certain open time depending on the line speed. HMAs are applied at levels set to provide proper delamination strength in the cross and machine directions for the entire structure. Current TBS lamination line speed and width can be as high as 800 m/min and 2,100 mm, respectively. The lines also feature an automated web tension control system for winders and unwinders. When running a continuous, trouble-free mass production campaign on fast industrial TBS lines, the rheological properties of HMAs and the micro-properties of NWs can adversely affect line efficiency and continuity.
NW fiber orientation and fineness, as well as the micro-level properties of the spunbond/meltblown fabric composition, are significant factors affecting the degree of HMA bleed-through. As a result of this problem, frequent line stops are needed to clean the glue that accumulates on the chiller rolls, which significantly reduces line efficiency. HMA rheology is also important, and to eliminate the bleed-through problem one should have a good understanding of rheology-driven potential complications. The applied viscosity and temperature should therefore be optimized in accordance with the line speed, line width, NW characteristics and the required open time for a given HMA formulation. In this study, we will show practical aspects of potential preventive actions to minimize the HMA bleed-through problem, which may stem from both the HMA rheological properties and the NW spunbond/meltblown fiber characteristics.Keywords: breathable, hotmelt, nonwoven, textile backsheet lamination, spun/melt blown
Procedia PDF Downloads 357806 Coherent All-Fiber and Polarization Maintaining Source for CO2 Range-Resolved Differential Absorption Lidar
Authors: Erwan Negre, Ewan J. O'Connor, Juha Toivonen
Abstract:
The need for CO2 monitoring technologies grows along with worldwide concerns regarding environmental challenges. To that purpose, we developed a compact, coherent, all-fiber, range-resolved Differential Absorption Lidar (RR-DIAL). It has been designed around a tunable 2×1 fiber optic switch, set to a switching frequency of 1 Hz, between two Distributed FeedBack (DFB) lasers emitting in continuous-wave mode at 1571.41 nm (an absorption line of CO2) and 1571.25 nm (a CO2 absorption-free line), with a linewidth and tuning range of 1 MHz and 3 nm, respectively, over the operating wavelength. Three-stage amplification through erbium and erbium-ytterbium doped fibers, coupled to a Radio Frequency (RF) driven Acousto-Optic Modulator (AOM), generates 100 ns pulses at a repetition rate from 10 to 30 kHz with a peak power of up to 2.5 kW and a spatial resolution of 15 m, allowing fast and highly resolved CO2 profiles. The same afocal collection system is used for the output of the laser source and for the backscattered light, which is then directed to a circulator before being mixed with the local oscillator for heterodyne detection. Packaged in an easily transportable box that also includes a server and a Field Programmable Gate Array (FPGA) card for on-line data processing and storage, our setup allows effective and quick deployment for versatile in-situ analysis, whether vertical atmospheric monitoring, large field mapping or continuous oversight of a sequestration site. Setup operation and results from initial field measurements will be discussed.Keywords: CO2 profiles, coherent DIAL, in-situ atmospheric sensing, near infrared fiber source
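The range-resolved retrieval behind a DIAL instrument follows the standard two-wavelength equation, sketched below on synthetic profiles; the cross-section and density values are illustrative assumptions, not instrument specifications:

```python
import numpy as np

def dial_number_density(p_on, p_off, delta_sigma, delta_r):
    """Standard DIAL retrieval: absorber number density per range bin,
    N = ln[(P_off(R2) P_on(R1)) / (P_on(R2) P_off(R1))] / (2 dSigma dR)."""
    ratio = (p_off[1:] * p_on[:-1]) / (p_on[1:] * p_off[:-1])
    return np.log(ratio) / (2.0 * delta_sigma * delta_r)

# Synthetic profiles: uniform CO2 density, with differential absorption
# on the on-line wavelength only (Beer-Lambert, two-way path).
delta_r = 15.0                  # range resolution, m (matches the pulse)
n0 = 1.0e22                     # molecules / m^3 (roughly ambient CO2)
sigma = 1.0e-27                 # differential cross-section, m^2 (assumed)
r = np.arange(1, 41) * delta_r
p_off = 1.0 / r ** 2                             # geometric falloff only
p_on = p_off * np.exp(-2.0 * sigma * n0 * r)
n_retrieved = dial_number_density(p_on, p_off, sigma, delta_r)
print(bool(np.allclose(n_retrieved, n0)))
```

Because the on/off ratio cancels the geometric and aerosol terms common to both wavelengths, the retrieval depends only on the differential absorption accumulated over each 15 m bin.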
Procedia PDF Downloads 127
805 Cytotoxicity and Androgenic Potential of Antifungal Drug Substances on MDA-Kb2 Cells
Authors: Benchouala Amira, Bojic Clement, Poupin Pascal, Cossu Leguille-carole
Abstract:
The objective of this study is to evaluate in vitro the cytotoxic and androgenic potential of several antifungal molecules (amphotericin B, econazole, ketoconazole and miconazole) on the MDA-Kb2 cell line. This biological model is an effective tool for the detection of endocrine disruptors because it responds well both to the main agonist of the androgen receptor (testosterone) and to an antagonist, flutamide. The cytotoxicity of each compound tested was measured using an MTT assay (the tetrazolium salt 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide), which measures the reductase activity of the mitochondrial succinate dehydrogenase enzymes of cultured cells. This complementary cytotoxicity test is essential to ensure that any reduction in luminescence intensity observed during the androgenic tests is attributable solely to the anti-androgenic action of the compounds tested and not to possible cytotoxic properties. Tests of the androgenic activity of the antifungals show that these compounds do not induce transcription of the luciferase gene: at the environmental concentrations tested, they exert no androgenic effect on MDA-Kb2 cells in culture. Adding flutamide at the same tested concentrations of antifungal molecules reduces the luminescence induced by amphotericin B, econazole and miconazole, which is explained by a strong interaction of these molecules with flutamide that may produce a greater toxic effect than either compound tested alone. The cytotoxicity test shows that econazole and ketoconazole can cause cell death at certain of the concentrations tested. This cell mortality may be induced by a direct or indirect action on deoxyribonucleic acid (DNA), ribonucleic acid (RNA) or proteins necessary for cell division.
Keywords: cytotoxicity, androgenic potential, antifungals, MDA-Kb2
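For illustration, MTT absorbance readings are typically converted to percent viability relative to the untreated control before a concentration is flagged as cytotoxic; the plate readings and the 80% viability cut-off below are hypothetical assumptions, not data from this study.

```python
def percent_viability(a_sample, a_blank, a_control):
    """Blank-corrected viability relative to the untreated control:
    100 * (A_sample - A_blank) / (A_control - A_blank)."""
    return 100.0 * (a_sample - a_blank) / (a_control - a_blank)

# Hypothetical MTT plate readings: blank well, untreated control,
# and two antifungal concentrations (micromolar).
a_blank, a_control = 0.05, 1.25
for conc_um, a_sample in [(1.0, 1.10), (10.0, 0.45)]:
    v = percent_viability(a_sample, a_blank, a_control)
    flag = "cytotoxic" if v < 80.0 else "non-cytotoxic"
    print(f"{conc_um:5.1f} uM: {v:5.1f}% viability ({flag})")
```

Only concentrations that pass such a viability screen can have their androgenic-assay luminescence interpreted as receptor activity rather than cell loss, which is the role this complementary test plays in the study.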
Procedia PDF Downloads 45
804 African Swine Fever Situation and Diagnostic Methods in Lithuania
Authors: Simona Pileviciene
Abstract:
On 24 January 2014, Lithuania notified two primary cases of African swine fever (ASF) in wild boars. The animals tested positive for the ASF virus (ASFV) genome by real-time PCR at the National Reference Laboratory for ASF in Lithuania (NRL); the results were confirmed by the European Union Reference Laboratory for African swine fever (CISA-INIA). An intensive monitoring programme for wild and domestic animals was started. During 2014-2017, ASF was confirmed in two large commercial pig holdings with the highest biosecurity; the pigs were killed and destroyed. Since 2014 the ASF outbreak territory has expanded from the east and south towards the middle of Lithuania. PCR is one of the diagnostic methods highly recommended by the World Organisation for Animal Health (OIE) for the diagnosis of ASF. The aim of the present study was to compare singleplex real-time PCR assays with a duplex assay allowing identification of ASF and an internal control in a single PCR tube, and to compare the effectiveness of primers targeting the p72 gene (ASF 250 bp and ASF 75 bp amplicons). Multiplex real-time PCR assays prove to be less time-consuming and more cost-efficient and therefore have high potential for routine analysis. It is important to have an effective and fast method that allows virus detection at the onset of disease in the wild boar population and in outbreaks among domestic pigs. For the experiments, we used reference samples (INIA, Spain) and positive samples from infected animals in Lithuania. The results show 100% sensitivity and specificity.
Keywords: African swine fever, real-time PCR, wild boar, domestic pig
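Diagnostic sensitivity and specificity figures of the kind reported above are computed from a confusion matrix against a characterized reference panel; the panel counts in this sketch are hypothetical, chosen only to illustrate the calculation, not the laboratory's actual sample set.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Diagnostic sensitivity = TP / (TP + FN),
    diagnostic specificity = TN / (TN + FP), returned as fractions."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical validation panel: 40 ASFV-positive and 60 negative
# reference samples, all classified correctly by the duplex real-time PCR.
sens, spec = sensitivity_specificity(tp=40, fn=0, tn=60, fp=0)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
```

A 100%/100% result simply means zero false negatives and zero false positives on the panel tested; the confidence attached to that figure still depends on the panel size.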
Procedia PDF Downloads 164