Search results for: evaluation accuracy
1669 The Effect of a Weed-Killer Sulfonylurea on Durum Wheat (Triticum Durum Desf)
Authors: L. Meksem Amara, M. Ferfar, N. Meksem, M. R. Djebar
Abstract:
Wheat is the most widely consumed cereal in the world. In Algeria, domestic production covers only 20 to 25% of the country's needs; the rest is imported. To improve the efficiency and productivity of durum wheat, farmers turn to pesticides: herbicides, fungicides and insecticides. However, this use often entails product losses of varying importance, contaminating the environment and the entire food chain. Herbicides are substances developed to control or destroy plants considered unwanted. Whether natural or synthesized by humans, herbicides absorbed and metabolized by plants cause the death of those plants. In this work, our goal was to evaluate the effect of a sulfonylurea herbicide, CossackOD, at various concentrations (0, 2, 4 and 9 µg) on the Triticum durum variety Cirta. We evaluated plant growth by measuring leaf and root length relative to the control, as well as proline content, and analyzed the level of one of the antioxidative enzymes, catalase, after 14 days of treatment. Sulfonylureas are foliar and root herbicides that inhibit acetolactate synthase (ALS), a plant enzyme essential to development; this inhibition arrests growth and then causes death. The results show a decrease in the average length of leaves and roots, which can be explained by the fact that ALS inhibitors are most active in the young, growing regions of the plant, inhibiting cell division and thus limiting foliar and root growth. We also recorded a highly significant increase in proline levels and a stimulation of catalase activity. The increase in antioxidative mechanisms in the wheat cultivar Cirta in response to increasing herbicide concentrations suggests that the high sensitivity of Cirta to this sulfonylurea herbicide is related to the enhanced production of reactive oxygen species and the oxidative damage they cause.
Keywords: sulfonylurea, triticum durum, oxidative stress, toxicity
Procedia PDF Downloads 418
1668 The Rule of Architectural Firms in Enhancing Building Energy Efficiency in Emerging Countries: Processes and Tools Evaluation of Architectural Firms in Egypt
Authors: Mahmoud F. Mohamadin, Ahmed Abdel Malek, Wessam Said
Abstract:
Achieving energy-efficient architecture in general, and in emerging countries in particular, is a challenging process that requires the contribution of various governmental, institutional, and individual entities. The role of architectural design is essential in this process, as it is one of the earliest steps on the road to sustainability. Architectural firms have a moral and professional responsibility to respond to these challenges and deliver buildings that consume less energy. This study evaluates the design processes and tools used in practice by Egyptian architectural firms, based on a limited survey, to investigate whether their processes and methods can lead to projects that meet the Egyptian Code of Energy Efficiency Improvement. A case study of twenty architectural firms in Cairo was selected and categorized according to scale: large-scale, medium-scale, and small-scale. A questionnaire was designed and distributed to the firms, and personal meetings with the firms' representatives took place. The questionnaire addressed three main points: the design processes adopted, the usage of performance-based simulation tools, and the usage of BIM tools for energy efficiency purposes. The results of the study revealed that only a small percentage of the large-scale firms have clear strategies for building energy efficiency in their building design, and even then the application is limited to certain project types or to cases where the client requests it. The percentage is much lower among medium-scale firms, and such strategies are almost absent in the small-scale ones. This demonstrates the urgent need to enhance the awareness of the Egyptian architectural design community of the great importance of implementing these methods starting from the early stages of building design. Finally, the study proposes recommendations for such firms to be able to create a healthy built environment and improve the quality of life in emerging countries.
Keywords: architectural firms, emerging countries, energy efficiency, performance-based simulation tools
Procedia PDF Downloads 286
1667 Green Economy and Environmental Protection Economic Policy Challenges in Georgia
Authors: Gulnaz Erkomaishvili
Abstract:
Introduction: One of the most important issues of state economic policy in the 21st century is the problem of environmental protection. The Georgian government considers the green economy one of the most important means of sustainable economic development and has taken the initiative to implement voluntary measures to promote sustainable development. In this context, it is important to promote the development of ecosystem services, clean production, environmental education and green jobs. The development of the green economy significantly reduces the inefficient use of natural resources, waste generation, emissions into the atmosphere and the discharge of untreated water into bodies of water. It is, therefore, an important instrument in the environmental orientation of sustainable development. Objectives: The aim of the paper is to analyze the current status of the green economy in Georgia and identify effective ways to improve the environmental economic policy of sustainable development. Methodologies: This paper uses general and specific methods, in particular analysis, synthesis, induction, deduction, scientific abstraction, comparative and statistical methods, as well as expert evaluation. Bibliographic research of scientific works and reports of organizations was conducted; publications of the National Statistics Office of Georgia are used to determine the regularity between analytical and statistical estimations. Theoretical and applied research of international organizations and scientist-economists is also used. Contributions: The country should implement an economic policy that ensures the transition to a green economy, in particular by revising water, air and waste laws, strengthening existing environmental management tools and introducing new tools (including economic tools), and by perfecting the regulatory legal framework of the environmental impact assessment system, which includes the harmonization of Georgian legislation with the requirements of the European Union. To ensure the protection and rational use of Georgia's forests, emphasis should be placed on sustainable forestry and the protection and restoration of forests.
Keywords: green economy, environmental protection, environmental protection economic policy, environmental protection policy challenges
Procedia PDF Downloads 70
1666 Association of Clostridium difficile Infection and Bone Cancer
Authors: Daniela Prado, Lexi Frankel, Amalia Ardeljan, Lokesh Manjani, Matthew Cardeiro, Omar Rashid
Abstract:
Background: Clostridium difficile (C. diff) is a gram-positive bacterium known to cause life-threatening diarrhea and severe inflammation of the colon. It originates from an alteration of the gut microbiome and can be transmitted through spores. Recent studies have shown a high association between the development of C. diff in cancer patients and extensive hospitalization. However, research is lacking regarding C. diff's association with the causation or prevention of cancer. The objective of this study was therefore to assess the correlation between Clostridium difficile infection (CDI) and the incidence of bone cancer. Methods: This retrospective analysis used data provided by a Health Insurance Portability and Accountability Act (HIPAA) compliant national database to compare patients infected versus not infected with C. diff, using ICD-10 and ICD-9 codes. Access to the database was granted by Holy Cross Health, Fort Lauderdale, for the purpose of academic research. Standard statistical methods were used. Results: Between January 2010 and December 2019, the query resulted in 78,863 patients in each of the infected and control groups. The two groups were matched by age range and CCI score. The incidence of bone cancer was 659 patients (0.835%) in the C. diff group compared to 1,941 patients (2.461%) in the control group. The difference was statistically significant, with a p-value < 2.2x10^-16 and an odds ratio (OR) = 0.33 (0.31-0.37) at a 95% confidence interval (CI). Antibiotic treatment for CDI was analyzed for both the infected and non-infected populations: 91 out of 16,676 (0.55%) patients with a prior C. diff infection who were treated with antibiotics were compared with 275 out of 16,676 (1.65%) control patients with no history of CDI who received antibiotic treatment. Results remained statistically significant, with a p-value < 2.2x10^-16 and an OR = 0.42 (0.37-0.48) at a 95% CI. Conclusion: The study shows a statistically significant correlation between C. diff and a reduced incidence of bone cancer. Further evaluation is recommended to assess the potential of C. difficile in reducing bone cancer incidence.
Keywords: bone cancer, colitis, clostridium difficile, microbiome
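As a check on the arithmetic, the reported odds ratio and confidence interval can be reproduced from the counts stated in the abstract. The sketch below assumes the standard 2x2 layout (cases and non-cases in each cohort of 78,863) and a Wald interval; both are our assumptions about how the figures were computed.

```python
import math

# 2x2 table reconstructed from the abstract (assumed layout):
#                 bone cancer   no bone cancer
# C. diff         a = 659       b = 78863 - 659
# no C. diff      c = 1941      d = 78863 - 1941
a, c = 659, 1941
n = 78863
b, d = n - a, n - c

odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval on the log-odds scale
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_lo = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_hi = math.exp(math.log(odds_ratio) + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI = ({ci_lo:.2f}, {ci_hi:.2f})")
# Reproduces the reported OR = 0.33 (0.31-0.37)
```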
Procedia PDF Downloads 283
1665 Rainfall Estimation over Northern Tunisia by Combining Meteosat Second Generation Cloud Top Temperature and Tropical Rainfall Measuring Mission Microwave Imager Rain Rates
Authors: Saoussen Dhib, Chris M. Mannaerts, Zoubeida Bargaoui, Ben H. P. Maathuis, Petra Budde
Abstract:
In this study, a new method to delineate rain areas in northern Tunisia is presented. The proposed approach is based on blending the geostationary Meteosat Second Generation (MSG) infrared (IR) channel with the low-earth-orbiting passive Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI). Blending these two products requires two main steps. First, the rainy pixels must be identified. This is achieved with a classification using the MSG IR 10.8 channel and the water vapor channel WV 0.62, applying a threshold on the temperature difference of less than 11 Kelvin, which approximates clouds that have a high likelihood of precipitation. The second step consists of fitting the relation between IR cloud top temperature and the TMI rain rates. The correlation between these two variables is negative, meaning that with decreasing temperature there is an increase in rainfall intensity. The fitted equation is then applied to the full day of 15-minute-interval MSG images, which are summed. To validate this combined product, daily extreme rainfall events that occurred during the period 2007-2009 were selected, using a threshold criterion of large rainfall depth (> 50 mm/day) occurring at least at one rainfall station. The inverse distance interpolation method was applied to generate rainfall maps for the drier summer season (May to October) and the wet winter season (November to April). The evaluation results of the estimated rainfall combining MSG and TMI were very encouraging: all the events were detected as rainy, and the correlation coefficients were much better than those of previously evaluated products over the study area, such as the MSG MPE and PERSIANN products. The combined product showed better performance during the wet season. We also note an overestimation of the maximum estimated rainfall for many events.
Keywords: combination, extreme, rainfall, TMI-MSG, Tunisia
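The station-to-map interpolation step used in the validation can be illustrated with a minimal inverse distance weighting routine. The sketch below is generic: the station coordinates and rainfall depths are illustrative values, not data from the study.

```python
import numpy as np

def idw_interpolate(xy_stations, values, xy_grid, power=2.0):
    """Inverse distance weighting of station rainfall onto grid points.

    xy_stations : (n, 2) station coordinates
    values      : (n,)   rainfall depths at the stations (mm/day)
    xy_grid     : (m, 2) target grid points
    """
    # Pairwise distances between every grid point and every station
    d = np.linalg.norm(xy_grid[:, None, :] - xy_stations[None, :, :], axis=2)
    d = np.where(d == 0, 1e-12, d)  # guard: grid node coincides with a station
    w = 1.0 / d**power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Illustrative values only
stations = np.array([[1.0, 1.0], [4.0, 2.0], [2.0, 5.0]])
rain = np.array([62.0, 18.0, 40.0])            # mm/day
grid = np.array([[2.0, 2.0], [3.0, 4.0]])
print(idw_interpolate(stations, rain, grid))
```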
Procedia PDF Downloads 178
1664 The Challenge of Assessing Social AI Threats
Authors: Kitty Kioskli, Theofanis Fotis, Nineta Polemi
Abstract:
Article 9 of the European Union (EU) Artificial Intelligence (AI) Act requires that risk management of AI systems include both technical and human oversight, while the NIST AI RMF (Appendix C) and ENISA AI Framework recommendations state that further research is needed to understand the current limitations of social threats and human-AI interaction. AI threats within social contexts significantly affect the security and trustworthiness of AI systems; they are interrelated and trigger technical threats as well. For example, lack of explainability (e.g., the complexity of models can be challenging for stakeholders to grasp) leads to misunderstandings, biases, and erroneous decisions, which in turn impact the privacy, security and accountability of the AI systems. Based on NIST's four fundamental criteria for explainability, explainability threats can be classified into four (4) sub-categories: a) Lack of supporting evidence: AI systems must provide supporting evidence or reasons for all their outputs. b) Lack of understandability: explanations offered by systems should be comprehensible to individual users. c) Lack of accuracy: the provided explanation should accurately represent the system's process of generating outputs. d) Out of scope: the system should only function within its designated conditions or when it possesses sufficient confidence in its outputs. Biases may also stem from historical data reflecting undesired behaviors; when present in the data, biases can permeate the models trained on them, thereby influencing the security and trustworthiness of the AI systems. Social AI threats are recognized by various initiatives (e.g., the EU Ethics Guidelines for Trustworthy AI), standards (e.g., ISO/IEC TR 24368:2022 on AI ethical concerns, ISO/IEC AWI 42105 on guidance for human oversight of AI systems) and EU legislation (e.g., the General Data Protection Regulation 2016/679, the NIS 2 Directive 2022/2555, the Directive on the Resilience of Critical Entities 2022/2557, the EU AI Act, the Cyber Resilience Act). Measuring social threats, estimating the risks to AI systems associated with these threats, and mitigating them is a research challenge. This paper presents the efforts of two European Commission projects (FAITH and THEMIS) from the Horizon Europe programme that analyse social threats by building cyber-social exercises in order to study human behaviour, traits, cognitive ability, personality, attitudes, interests, and other socio-technical profile characteristics. The research in these projects also includes the development of measurements and scales (psychometrics) for human-related vulnerabilities that can be used to estimate vulnerability severity more realistically, enhancing the CVSS 4.0 measurement.
Keywords: social threats, artificial intelligence, mitigation, social experiment
Procedia PDF Downloads 69
1663 Inelastic and Elastic Taping in Plantar Pressure of Runners Pronators: Clinical Trial
Authors: Liana Gomide, Juliana Rodrigues
Abstract:
The morphology of the foot defines its mode of operation, and a balanced biomechanical arrangement is indispensable for a symmetrical distribution of plantar pressures, so that no component is overloaded in isolation. High plantar pressures at specific points in the foot may be a causal factor in several orthopedic disorders that affect the feet, such as pain and stress fractures. With digital baropodometry equipment, one can observe the intensity of pressures along the entire foot and quantify some of the movements, such as the subtalar pronation present in the midfoot region, which, when excessive, is involved in microtraumas. In clinical practice, excessive movement has been limited with the use of different taping techniques applied to the plantar arch. Thus, the objective of the present study was to analyze and compare the influence of inelastic and elastic taping on the distribution of plantar pressure of runners who pronate. This is a randomized, blind, crossover clinical trial. Twenty (20) male subjects, mean age 33 ± 7 years, mean body mass 71 ± 7 kg, mean height 174 ± 6 cm, were included in the study. Data collection was carried out by a single researcher using baropodometry equipment (Tekscan, model F-Scan Mobile). The tests were performed at three different times. In the first, an initial baropodometric evaluation was performed without taping, running at a speed of 9.0 km/h. In the second and third, the inelastic or elastic taping was applied, in the order defined by randomization. As results, it was observed that both inelastic and elastic taping provided significant reductions in contact pressure and peak pressure values when compared to the condition without taping. However, elastic taping was more effective in decreasing contact pressure (no taping = 714 ± 201, elastic taping = 690 ± 210 and inelastic taping = 716 ± 180) and peak pressure in the midfoot region (no taping = 1490 ± 42, elastic taping = 1273 ± 323 and inelastic taping = 1487 ± 437). It is possible to conclude that elastic taping reduced pressure in the midfoot region, thereby reducing subtalar pronation during running.
Keywords: elastic taping, inelastic taping, running, subtalar pronation
Procedia PDF Downloads 158
1662 An Evaluation of the Influence of Corn Cob Ash on the Strength Parameters of Lateritic Soils
Authors: O. A. Apampa, Y. A. Jimoh
Abstract:
The paper reports the investigation of corn cob ash (CCA) as a chemical stabilizing agent for laterite soils. Corn cob feedstock was obtained from Maya, a rural community in the derived savannah agro-ecological zone of South-Western Nigeria, and burnt to ashes of pozzolanic quality. Reddish-brown silty clayey sand, characterized as AASHTO A-2-6(3) lateritic material, was obtained from a borrow pit in Abeokuta and subjected to strength characterization tests according to BS 1377:2000. The soil was subsequently mixed with CCA in varying percentages of 0-7.5% at 1.5% intervals. The influence of CCA on the stabilized soil was determined for the Atterberg limits, compaction characteristics, CBR and unconfined compressive strength. The tests were repeated on a laterite-cement mixture in order to establish a basis for comparison. The results show a similarity in the compaction characteristics of soil-cement and soil-CCA. With increasing addition of binder from 1.5% to 7.5%, the Maximum Dry Density progressively declined while the OMC steadily increased. For the CBR, the maximum positive impact was observed at 1.5% CCA addition, at a value of 85% compared to the control value of 65% for cement stabilization, but the CBR declined steadily thereafter with increasing addition of CCA, while that of soil-cement continued to increase with increasing addition of cement beyond 1.5%, though at a relatively slow rate. Similar behavior was observed in the UCS values for the soil-CCA mix, increasing from a control value of 0.4 MN/m2 to 1.0 MN/m2 at 1.5% CCA and declining thereafter, while that for soil-cement continued to increase with increasing cement addition, but at a slower rate. This paper demonstrates that CCA is effective for the chemical stabilization of a typical Nigerian AASHTO A-2-6 lateritic soil at a maximum stabilizer content of 1.5%, and therefore recommends its use as a way of finding further applications for agricultural waste products and achieving environmental sustainability, in line with the ideals of the Millennium Development Goals, given the economic and technical feasibility of processing the cobs from corn.
Keywords: corn cob ash, pozzolan, cement, laterite, stabilizing agent, cation exchange capacity
Procedia PDF Downloads 303
1661 The Impact of Heat Waves on Human Health: State of Art in Italy
Authors: Vito Telesca, Giuseppina A. Giorgio
Abstract:
The Earth system is subject to a wide range of human activities that have changed ecosystems more rapidly and extensively in the last five decades than in any comparable previous period. These global changes have a large impact on human health. The relationships between extreme weather events and mortality are widely documented in different studies. In particular, a number of studies have investigated the relationship between climatological variations and the cardiovascular and respiratory systems. Researchers have become interested in evaluating the effect of environmental variations on the occurrence of different diseases (such as infarction, ischemic heart disease, asthma, respiratory problems, etc.) and on mortality. Among changes in weather conditions, heat waves have been used for investigating the association between weather conditions and cardiovascular and cerebrovascular events, using thermal indices which combine air temperature, relative humidity, and wind speed. The effects of heat waves on human health are mainly found in urban areas, and they are aggravated by the presence of atmospheric pollution. The consequences of these changes for human health are of growing concern. In particular, meteorological conditions are a relevant environmental aspect because cardiovascular diseases are more common among the elderly population, and such people are more sensitive to weather changes. In addition, heat waves, or extreme heat events, are predicted to increase in frequency, intensity, and duration with climate change. In this context, the connections between public health and climate change are increasingly being recognized by medical research, because they might help in informing the public at large. Policy experts claim that a growing awareness of the relationships between public health and climate change could be key in breaking through political logjams impeding action on mitigation and adaptation. The aims of this study are to investigate the importance of interactions between weather variables and their effects on human health, focusing on Italy, and to highlight the need to define strategies and practical actions for monitoring, adaptation and mitigation of the phenomenon.
Keywords: climate change, illness, Italy, temperature, weather
Procedia PDF Downloads 250
1660 Investigating Early Markers of Alzheimer’s Disease Using a Combination of Cognitive Tests and MRI to Probe Changes in Hippocampal Anatomy and Functionality
Authors: Netasha Shaikh, Bryony Wood, Demitra Tsivos, Michael Knight, Risto Kauppinen, Elizabeth Coulthard
Abstract:
Background: Effective treatment of dementia will require early diagnosis, before significant brain damage has accumulated. Memory loss is an early symptom of Alzheimer's disease (AD). The hippocampus, a brain area critical for memory, degenerates early in the course of AD. The hippocampus comprises several subfields. In contrast to healthy aging, where CA3 and the dentate gyrus are the hippocampal subfields with the most prominent atrophy, in AD the CA1 and subiculum are thought to be affected early. Conventional clinical structural neuroimaging is not sufficiently sensitive to identify preferential atrophy in individual subfields. Here, we explore the sensitivity of new magnetic resonance imaging (MRI) sequences designed to interrogate medial temporal regions as an early marker of Alzheimer's. As a combination of tests is likely to predict early Alzheimer's disease (AD) better than any single test, we look at the potential efficacy of such imaging alone and in combination with standard and novel cognitive tasks of hippocampus-dependent memory. Methods: 20 patients with mild cognitive impairment (MCI), 20 with mild-moderate AD and 20 age-matched healthy elderly controls (HC) are being recruited to undergo 3T MRI (with sequences designed to allow volumetric analysis of hippocampal subfields) and a battery of cognitive tasks (including Paired Associates Learning from CANTAB, the Hopkins Verbal Learning Test and a novel hippocampus-dependent abstract word memory task). AD participants and healthy controls are tested just once, whereas patients with MCI are tested twice, a year apart. We will compare subfield size between groups and correlate subfield size with cognitive performance on our tasks. In the MCI group, we will explore the relationship between subfield volume, cognitive test performance and deterioration in clinical condition over a year. Results: Preliminary data (currently on 16 participants: 2 AD; 4 MCI; 9 HC) have revealed subfield size differences between subject groups. Patients with AD perform with less accuracy on tasks of hippocampus-dependent memory, and MCI patient performance and reaction times also differ from healthy controls. With further testing, we hope to delineate how subfield-specific atrophy corresponds with changes in cognitive function, and to characterise how this progresses over the time course of the disease. Conclusion: Novel sequences on MRI scanners such as those in routine clinical use can be used to delineate hippocampal subfields in patients with and without dementia. Preliminary data suggest that such subfield analysis, perhaps in combination with cognitive tasks, may be an early marker of AD.
Keywords: Alzheimer's disease, dementia, memory, cognition, hippocampus
Procedia PDF Downloads 574
1659 A Method to Assess Aspect of Sustainable Development: Walkability
Authors: Amna Ali Al-Saadi, Riken Homma, Kazuhisa Iki
Abstract:
Despite the fact that many places have succeeded in achieving some aspects of sustainable urban development, there are no scientific facts to convince decision makers, and each solution was developed to fulfill the needs of a specific city only. Therefore, an objective method to generate solutions from successful cases is the aim of this research. The questions were: how to learn the lesson from each case study; how to distinguish the potential criteria from the negative ones; and how to quantify their effects on future development. Walkability was selected as the goal, because it has been found to be a route to a healthy lifestyle as well as to social, environmental and economic sustainability, and because it is as complex as any other aspect of sustainable development. This research stands on a quantitative-comparative methodology for assessing pedestrian-oriented development. Three analyzed areas (AAs) were selected. One site is located in Oman, hypothesized to be motorized-oriented development, while two sites are in Japan, where the development is pedestrian-friendly. The study used the multi-criteria evaluation method (MCEM). Initially, MCEM stands on the analytic hierarchy process (AHP). The latter was structured into the main goal (walkability), objectives (functions and layout) and attributes (the urban form criteria). Secondly, GIS was used to evaluate the attributes in multi-criteria maps. Since each criterion has a different scale of measurement, all results were standardized by z-score, as shown in the sketch below, and used to measure the correlations among criteria. As a result, a different scenario was generated from each AA. MCEM (AHP-OWA)-GIS measured the walkability score and determined the priority of criteria development in the non-walker-friendly environment. The comparison of criteria z-scores presented a measurably distinguishable orientation of development. This result was used to prove that Oman is a motorized environment while Japan is walkable. It also identified the powerful criteria and the weak criteria regardless of the AA, and was used to generalize the priorities for walkable development. In conclusion, the method was found successful in generating a scientific base for policy decisions.
Keywords: walkability, policy decisions, sustainable development, GIS
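Since each criterion map is measured on a different scale, the standardization step can be sketched as follows. The criterion names and values below are hypothetical, chosen only to illustrate the z-score transformation; they are not data from the study.

```python
import numpy as np

# Standardize each criterion to z-scores so criteria measured on
# different scales can be compared and correlated.
criteria = {
    "intersection_density": np.array([4.1, 6.3, 2.8, 5.5]),    # per km^2
    "land_use_mix":         np.array([0.42, 0.71, 0.33, 0.58]),
    "sidewalk_coverage":    np.array([55.0, 82.0, 40.0, 73.0]) # percent
}
z_scores = {name: (v - v.mean()) / v.std(ddof=0)
            for name, v in criteria.items()}
for name, z in z_scores.items():
    print(name, np.round(z, 2))
```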
Procedia PDF Downloads 442
1658 Evaluation of κ-Carrageenan Hydrogel Efficiency in Wound-Healing
Authors: Ali Ayatic, Emad Mozaffari, Bahareh Tanhaei, Maryam Khajenoori, Saeedeh Movaghar Khoshkho, Ali Ayati
Abstract:
The abuse of antibiotics, such as tetracycline (TC), is a great global threat, and the use of topical antibiotics is a promising tactic that can help to solve this problem. Antibiotic therapy is often appropriate and necessary for acute wound infections, while topical tetracycline can be highly efficient in improving the wound healing process in diabetics. Due to the advantages of drug-loaded hydrogels as wound dressings, such as ease of handling, high moisture resistance, excellent biocompatibility, and the ability to activate immune cells to speed wound healing, they are considered an ideal wound treatment. In this work, tetracycline-loaded hydrogels combining agar (AG) and κ-carrageenan (k-CAR) as polymer materials were prepared, in which Span 60 surfactant was introduced as a drug carrier. Field emission scanning electron microscopy (FESEM) and Fourier-transform infrared spectroscopy (FTIR) were employed to provide detailed information on the morphology, composition and structure of the fabricated drug-loaded hydrogels; their mechanical properties and water vapor permeability were investigated as well. Two types of bacteria, gram-negative and gram-positive, were used to explore the antibacterial properties of the prepared tetracycline-containing hydrogels. Their swelling and drug release behavior was studied by varying factors such as the polysaccharide ratio (MAG/MCAR), the Span 60 surfactant concentration, the potassium chloride (KCl) concentration and the release medium (deionized water (DW), phosphate-buffered saline (PBS), and simulated wound fluid (SWF)) at different times. Finally, the kinetic behavior of hydrogel swelling was studied, and the experimental data of TC release into DW, PBS and SWF were evaluated using various mathematical models, such as Higuchi, Korsmeyer-Peppas, zero-order and first-order, in linear and nonlinear modes.
Keywords: drug release, hydrogel, tetracycline, wound healing
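Of the release models named above, the Korsmeyer-Peppas fit can be sketched as follows. The time points and release fractions are illustrative stand-ins, not the study's data, and the interpretation thresholds for the exponent are the commonly cited ones.

```python
import numpy as np
from scipy.optimize import curve_fit

# Korsmeyer-Peppas model: Mt/Minf = k * t^n
# (conventionally fitted to the first ~60% of cumulative release)
def korsmeyer_peppas(t, k, n):
    return k * t**n

# Illustrative release data: fraction released vs. time (hours)
t = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])
frac = np.array([0.09, 0.14, 0.22, 0.33, 0.41, 0.48])

(k, n), _ = curve_fit(korsmeyer_peppas, t, frac, p0=(0.1, 0.5))
print(f"k = {k:.3f}, n = {n:.3f}")
# Commonly cited reading: n <= 0.45 suggests Fickian diffusion,
# 0.45 < n < 0.89 anomalous (non-Fickian) transport.
```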
Procedia PDF Downloads 87
1657 Integration of Educational Data Mining Models to a Web-Based Support System for Predicting High School Student Performance
Authors: Sokkhey Phauk, Takeo Okazaki
Abstract:
The challenging task in educational institutions is to maximize the high performance of students and minimize the failure rate of poor-performing students. An effective way to approach this task is to understand student learning patterns and their most influential factors, and to obtain an early prediction of student learning outcomes at a timely stage for setting up improvement policies. Educational data mining (EDM) is an emerging discipline at the intersection of data mining, statistics, and machine learning, concerned with extracting useful knowledge and information for the sake of improvement and development in the education environment. The aim of this work is to propose techniques in EDM and integrate them into a web-based system for predicting poor-performing students. A comparative study of prediction models is conducted, and high-performing models are subsequently developed to achieve higher performance. The hybrid random forest (Hybrid RF) produces the most successful classification. For the context of intervention and improving learning outcomes, a feature selection method, MICHI, which is a combination of the mutual information (MI) and chi-square (CHI) algorithms based on ranked feature scores, is introduced to select a dominant feature set that improves the performance of prediction; the obtained dominant set is then used as information for intervention. Using the proposed EDM techniques, an academic performance prediction system (APPS) is subsequently developed for educational stakeholders to obtain an early prediction of student learning outcomes for timely intervention. Experimental outcomes and evaluation surveys report the effectiveness and usefulness of the developed system. The system is used to help educational stakeholders and related individuals intervene and improve student performance.
Keywords: academic performance prediction system, educational data mining, dominant factors, feature selection method, prediction model, student performance
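The abstract describes MICHI only as a combination of mutual information and chi-square ranked feature scores, so the rank-sum combination below is our assumption, shown on a synthetic stand-in dataset rather than the student data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif, chi2
from sklearn.preprocessing import MinMaxScaler

# Synthetic stand-in for the student dataset
X, y = make_classification(n_samples=500, n_features=20, n_informative=6,
                           random_state=0)
X = MinMaxScaler().fit_transform(X)   # chi2 requires non-negative features

# Rank features by mutual information and by chi-square, then combine ranks
mi = mutual_info_classif(X, y, random_state=0)
chi_stats, _ = chi2(X, y)
rank_mi = np.argsort(np.argsort(-mi))         # 0 = most informative
rank_chi = np.argsort(np.argsort(-chi_stats))
combined_rank = rank_mi + rank_chi            # lower = more dominant

dominant = np.argsort(combined_rank)[:8]      # keep the top-ranked subset
clf = RandomForestClassifier(random_state=0).fit(X[:, dominant], y)
print("dominant features:", sorted(dominant.tolist()))
```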
Procedia PDF Downloads 111
1656 An Investigation on Opportunities and Obstacles on Implementation of Building Information Modelling for Pre-fabrication in Small and Medium Sized Construction Companies in Germany: A Practical Approach
Authors: Nijanthan Mohan, Rolf Gross, Fabian Theis
Abstract:
The conventional method used in the construction industry often results in significant rework, since most decisions are taken on site under the pressure of project deadlines and due to improper information flow, which results in ineffective coordination. However, today's architecture, engineering, and construction (AEC) stakeholders demand faster and more accurate deliverables, efficient buildings, and smart processes, which turns out to be a tall order. Hence, the building information modelling (BIM) concept was developed as a solution to fulfill the above-mentioned necessities. Even though BIM is successfully implemented in much of the world, it is still in its early stages in Germany, since stakeholders are sceptical of its reliability and efficiency. Due to the huge capital requirement, small and medium-sized construction companies are still reluctant to implement the BIM workflow in their projects. The purpose of this paper is to analyse the opportunities and obstacles in implementing BIM for prefabrication. Among all the other advantages of BIM, prefabrication is chosen for this paper because it plays a vital role in creating an impact on both the time and cost factors of a construction project. The positive impact of prefabrication can be explicitly observed by the project stakeholders and participants, which enables a breakthrough against the skepticism of small-scale construction companies. The analysis consists of the development of a process workflow for implementing prefabrication in building construction, followed by a practical approach executed in two case studies. The first case study represents on-site prefabrication, and the second off-site prefabrication. It was planned in such a way that the first case study gives the workers at the site first-hand experience with the BIM model, so that they can make full use of the created BIM model, which is a better representation compared to the traditional 2D plan. The main aim of the first case study is to create belief in the implementation of BIM models, which was followed by the execution of off-site prefabrication in the second case study. Based on the case studies, a cost and time analysis was made, and it is inferred that the implementation of BIM for prefabrication can reduce construction time and ensure minimal or no waste, better accuracy, and less problem-solving at the construction site. It is also observed that this process requires more planning time and better communication and coordination between different disciplines such as mechanical, electrical, plumbing, and architecture, which was the major obstacle to successful implementation. This paper was written from the perspective of small and medium-sized mechanical contracting companies in the private building sector in Germany.
Keywords: building information modelling, construction wastes, pre-fabrication, small and medium sized company
Procedia PDF Downloads 119
1655 Covid Medical Imaging Trial: Utilising Artificial Intelligence to Identify Changes on Chest X-Ray of COVID
Authors: Leonard Tiong, Sonit Singh, Kevin Ho Shon, Sarah Lewis
Abstract:
Investigation into the use of artificial intelligence in radiology continues to develop at a rapid rate. During the coronavirus pandemic, the combination of an exponential increase in chest x-rays and unpredictable staff shortages resulted in a huge strain on the department's workload. The World Health Organisation estimates that two-thirds of the global population does not have access to diagnostic radiology. Therefore, there could be demand for a program that could detect acute changes in imaging compatible with infection to assist with screening. We generated a convolutional neural network and tested its efficacy in recognizing changes compatible with coronavirus infection. Following ethics approval, a deidentified set of 77 normal chest x-rays and 77 abnormal chest x-rays from patients with confirmed coronavirus infection was used to generate an algorithm that could train, validate and then test itself. DICOM and PNG image formats were selected due to their lossless file formats. The model was trained with 100 images (50 positive, 50 negative), validated against 28 samples (14 positive, 14 negative), and tested against 26 samples (13 positive, 13 negative). The initial training of the model involved teaching a convolutional neural network what constituted a normal study and what changes on x-rays were compatible with coronavirus infection. The weightings were then modified, and the model was executed again. The training samples were in batch sizes of 8 and underwent 25 epochs of training. The results trended towards an 85.71% true positive/true negative detection rate and an area under the curve trending towards 0.95, indicating approximately 95% accuracy in detecting changes on chest x-rays compatible with coronavirus infection. Study limitations include access to only a small dataset and no specificity in the diagnosis. Following a discussion with our programmer, there are areas where modifications in the weighting of the algorithm can be made in order to improve the detection rates. Given the high detection rate of the program and the potential ease of implementation, it would be effective in assisting staff not trained in radiology in detecting otherwise subtle changes that might not be appreciated on imaging. Limitations include the lack of a differential diagnosis and of application of the appropriate clinical history, although this may be less of a problem in day-to-day clinical practice. It is nonetheless our belief that implementing this program and widening its scope to detecting multiple pathologies, such as lung masses, will greatly assist both the radiology department and our colleagues in increasing workflow and detection rate.
Keywords: artificial intelligence, COVID, neural network, machine learning
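A minimal sketch of the training setup described (binary normal/COVID labels, batch size 8, 25 epochs, accuracy and AUC as metrics) might look like the following in Keras. The architecture, input size, and layer choices are assumptions, as the abstract does not specify them.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Small binary classifier for resized single-channel chest x-rays
model = models.Sequential([
    layers.Input(shape=(224, 224, 1)),       # assumed input size
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # P(COVID-compatible changes)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])

# With train/val/test splits of 100/28/26 images as in the abstract:
# model.fit(x_train, y_train, batch_size=8, epochs=25,
#           validation_data=(x_val, y_val))
# model.evaluate(x_test, y_test)
```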
Procedia PDF Downloads 100
1654 Production Process for Diesel Fuel Components Polyoxymethylene Dimethyl Ethers from Methanol and Formaldehyde Solution
Authors: Xiangjun Li, Huaiyuan Tian, Wujie Zhang, Dianhua Liu
Abstract:
Polyoxymethylene dimethyl ethers (PODEn) as a clean diesel additive can improve the combustion efficiency and quality of diesel fuel and alleviate the problem of atmospheric pollution. Considering synthetic routes, PODE production from methanol and formaldehyde is regarded as the most economical and promising. However, the methanol used for synthesizing PODE produces water, which causes the loss of active centers of the catalyst and hydrolysis of PODEn during production. A macroporous strong-acid cation exchange resin catalyst was prepared, which has comparative advantages over other common solid acid catalysts in terms of stability and catalytic efficiency for synthesizing PODE. Catalytic reactions were carried out at 353 K, 1 MPa and 3 mL·gcat-1·h-1 in a fixed-bed reactor. Methanol conversion and PODE3-6 selectivity reached 49.91% and 23.43%, respectively. Catalyst lifetime evaluation showed that the resin catalyst retained its catalytic activity for 20 days without significant changes, and the catalytic activity of the completely deactivated resin catalyst could essentially be restored to its previous level by simple acid regeneration. The acid exchange capacities of the original and deactivated catalysts were 2.5191 and 0.0979 mmol·g-1, respectively, while the regenerated catalyst reached 2.0430 mmol·g-1, indicating that the main reason for resin catalyst deactivation is that the Brønsted acid sites of the original resin catalyst were temporarily replaced by non-hydrogen-ion cations. A separation process consisting of extraction and distillation of the PODE3-6 product was designed to separate water and unreacted formaldehyde from the reaction mixture and to purify PODE3-6. The concentration of PODE3-6 in the final product can reach up to 97%. These results indicate that the scale-up production of PODE3-6 from methanol and formaldehyde solution is feasible.
Keywords: inactivation, polyoxymethylene dimethyl ethers, separation process, sulfonic cation exchange resin
Procedia PDF Downloads 140
1653 Comparative Evaluation of Seropositivity and Patterns Distribution Rates of the Anti-Nuclear Antibodies in the Diagnosis of Four Different Autoimmune Collagen Tissue Diseases
Authors: Recep Kesli, Onur Turkyilmaz, Cengiz Demir
Abstract:
Objective: Autoimmune collagen diseases occur when immune reactions against the body's own cells or tissues cause inflammation and damage tissues and organs. In this study, it was aimed to compare the seropositivity rates and patterns of anti-nuclear antibodies (ANA) in the diagnosis of four different autoimmune collagen tissue diseases (rheumatoid arthritis, RA; systemic lupus erythematosus, SLE; scleroderma, SSc; and Sjogren syndrome, SS). Methods: One hundred eighty-eight patients who applied to different clinics in Afyon Kocatepe University ANS Practice and Research Hospital between 11.07.2014 and 14.07.2015 with suspected collagen diseases such as RA, SLE, SSc and SS participated in the study retrospectively. All data obtained from the patients were evaluated according to the inclusion criteria. The historical archives belonging to the patients were screened and assessed in terms of ANA positivity. The obtained data were analysed using descriptive statistics, the chi-squared test and Fisher's exact test. The evaluations were performed with SPSS version 20.0, and p < 0.05 was considered significant. Results: The distribution of the 188 patients according to diagnosis was as follows: 82 (43.6%) RA, 38 (20.2%) SLE, 22 (11.7%) SSc, and 46 (24.5%) SS. ANA positivity rates according to collagen tissue disease were: RA 54 (65.9%), SLE 36 (94.7%), SSc 18 (81.8%), and SS 43 (93.5%). Rheumatoid arthritis should be evaluated and classified differently from the other three investigated autoimmune illnesses: ANA positivity rates were higher (91.5%) in SLE, SSc and SS than in RA (65.9%). The difference in ANA positivity rates between RA and the other three diseases was statistically significant (p = 0.015). Conclusions: Systemic autoimmune illnesses show a broad spectrum. ANA positivity was found to be an important predictor in the diagnosis of rheumatologic illnesses. ANA positivity should be evaluated as a more valuable and sensitive predictive diagnostic marker in the laboratory findings of SLE, SSc and SS than in RA.
Keywords: antinuclear antibody (ANA), rheumatoid arthritis, scleroderma, Sjogren syndrome, systemic lupus erythematosus
Procedia PDF Downloads 246
1652 Cross-Validation of the Data Obtained for ω-6 Linoleic and ω-3 α-Linolenic Acids Concentration of Hemp Oil Using Jackknife and Bootstrap Resampling
Authors: Vibha Devi, Shabina Khanam
Abstract:
Hemp (Cannabis sativa) possesses a rich content of ω-6 linoleic and ω-3 α-linolenic essential fatty acids in the ratio of 3:1, a rare and highly desired ratio that enhances the quality of hemp oil. These components are beneficial for cell and body growth and development, strengthen the immune system, possess anti-inflammatory action, lower the risk of heart problems owing to their anti-clotting property, and are a remedy for arthritis and various disorders. The present study employs a supercritical fluid extraction (SFE) approach on hemp seed at various parameter conditions: temperature (40-80) °C, pressure (200-350) bar, flow rate (5-15) g/min, particle size (0.430-1.015) mm and amount of co-solvent (0-10) % of solvent flow rate, through a central composite design (CCD). CCD suggested 32 sets of experiments, which were carried out. As the SFE process includes a large number of variables, the present study recommends the application of resampling techniques for cross-validation of the obtained data. Cross-validation refits the model on each resample to obtain information regarding the error, variability, deviation, etc. Bootstrap and jackknife are the most popular resampling techniques; they create a large number of datasets through resampling from the original dataset and analyze these data to check the validity of the obtained data. Jackknife resampling is based on eliminating one observation from the original sample of size N without replacement: for jackknife resampling, the sample size is 31 (eliminating one observation), repeated 32 times. Bootstrap is a frequently used statistical approach for estimating the sampling distribution of an estimator by resampling with replacement from the original sample: for bootstrap resampling, the sample size is 32, repeated 100 times. The estimands for these resampling techniques are the mean, standard deviation, coefficient of variation and standard error of the mean. For ω-6 linoleic acid concentration, the mean value was approx. 58.5 for both resampling methods, which is the average (central value) of the sample means of all data points. Similarly, for ω-3 α-linolenic acid concentration, the mean was observed as 22.5 through both resamplings. Variance exhibits the spread of the data from its mean; a greater variance indicates a larger range of output data, which is 18 for ω-6 linoleic acid (ranging from 48.85 to 63.66%) and 6 for ω-3 α-linolenic acid (ranging from 16.71 to 26.2%). Further, the low standard deviation (approx. 1%), low standard error of the mean (< 0.8) and low coefficient of variation (< 0.2) reflect the accuracy of the sample for prediction. All the estimator values of the coefficient of variation, standard deviation and standard error of the mean are found within the 95% confidence interval.
Keywords: resampling, supercritical fluid extraction, hemp oil, cross-validation
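The two resampling schemes described (leave-one-out jackknife over the 32 CCD runs, and 100 bootstrap resamples with replacement) can be sketched as follows. The sample itself is a synthetic stand-in generated around the reported mean of 58.5%, not the measured concentrations.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for the 32 measured linoleic acid concentrations (%)
sample = rng.normal(58.5, 1.0, size=32)

# Jackknife: leave one observation out, giving N replicates of size N-1
jack_means = np.array([np.delete(sample, i).mean()
                       for i in range(len(sample))])

# Bootstrap: 100 replicates of size N, drawn with replacement
boot_means = np.array([rng.choice(sample, size=len(sample),
                                  replace=True).mean()
                       for _ in range(100)])

# Estimands: mean, standard deviation, coefficient of variation, SE of mean
for name, est in [("jackknife", jack_means), ("bootstrap", boot_means)]:
    mean, sd = est.mean(), est.std(ddof=1)
    print(f"{name}: mean={mean:.2f}, sd={sd:.3f}, "
          f"cv={sd/mean:.4f}, se={sd/np.sqrt(len(est)):.4f}")
```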
Procedia PDF Downloads 143
1651 Vehicles Analysis, Assessment and Redesign Related to Ergonomics and Human Factors
Authors: Susana Aragoneses Garrido
Abstract:
Every day, the roads are the scene of numerous accidents involving vehicles, producing thousands of deaths and serious injuries all over the world. Investigations have revealed that human factors (HF) are one of the main causes of road accidents in modern societies. Distracted driving (including external or internal aspects of the vehicle), which is considered a human factor, is a serious and emergent risk to road safety. Consequently, further analysis of this issue is essential due to its significance for today's society. The objectives of this investigation are the detection and assessment of HF in order to provide solutions (including better vehicle design) which might mitigate road accidents. The methodology of the project is divided into different phases. First, a statistical analysis of public databases from Spain and the UK is provided. Second, the data are classified in order to analyse the major causes involved in road accidents. Third, a simulation of different paths and vehicles is presented. The causes related to HF are assessed by failure mode and effects analysis (FMEA). Fourth, different car models are evaluated using the Rapid Upper Limb Assessment (RULA). Additionally, the Siemens PLM JACK tool is used with the intention of evaluating the human factor causes and informing the redesign of the vehicles. Finally, improvements in the car design are proposed with the intention of reducing the implication of HF in traffic accidents. The results from the statistical analysis, the simulations and the evaluations confirm that accidents are an important issue in today's society, especially accidents caused by HF such as distractions. The results explore the reduction of external and internal HF through a global risk analysis of vehicle accidents. Moreover, the evaluation of the different car models using the RULA method and the Siemens PLM JACK tool proves the importance of a good adjustment of the driver's seat in order to avoid harmful postures and therefore distractions. For this reason, a car redesign is proposed to allow the driver to acquire the optimum position, consequently reducing the human factors in road accidents.
Keywords: vehicle analysis, assessment, ergonomics, car redesign
Procedia PDF Downloads 342
1650 Development and Validation of a Carbon Dioxide TDLAS Sensor for Studies on Fermented Dairy Products
Authors: Lorenzo Cocola, Massimo Fedel, Dragiša Savić, Bojana Danilović, Luca Poletto
Abstract:
An instrument for the detection and evaluation of gaseous carbon dioxide in the headspace of closed containers has been developed in the context of the Packsensor Italian-Serbian joint project. The device is based on tunable diode laser absorption spectroscopy (TDLAS) with a wavelength modulation spectroscopy (WMS) technique in order to accomplish non-invasive measurements inside closed containers of fermented dairy products (yogurts and fermented cheese in cups and bottles). The purpose of this instrument is the continuous monitoring of carbon dioxide concentration during incubation and storage of products over the whole shelf life of the product, in the presence of different microorganisms. The instrument's optical front end has been designed to be integrated into a thermally stabilized incubator. An embedded computer provides processing of spectral artifacts and storage of an arbitrary set of calibration data, allowing properly calibrated measurements on many samples (cups and bottles) of the different shapes and sizes commonly found in retail distribution. A calibration protocol has been developed in order to be able to calibrate the instrument in the field, including on containers which are notoriously difficult to seal properly. This calibration protocol is described and evaluated against reference measurements obtained through an industry-standard (sampling) carbon dioxide metering technique. Several sets of validation test measurements on different containers are reported. Two test recordings of carbon dioxide concentration evolution are shown as examples of instrument operation. The first demonstrates the ability to monitor rapid yeast growth in a contaminated sample through the increase of headspace carbon dioxide. The other experiment shows the dissolution transient with a non-saturated liquid medium in the presence of a carbon-dioxide-rich headspace atmosphere.
Keywords: TDLAS, carbon dioxide, cups, headspace, measurement
Procedia PDF Downloads 327
1649 Parametric Evaluation for the Optimization of Gastric Emptying Protocols Used in Health Care Institutions
Authors: Yakubu Adamu
Abstract:
The aim of this research was to assess the factors contributing to the need for optimisation of gastric emptying protocols in nuclear medicine and molecular imaging (SNMMI) procedures. The objective is to determine whether optimisation is possible and to provide supporting evidence for the current imaging protocols of the gastric emptying examination used in nuclear medicine. The research involved selected patients with 30 dynamic series, processed using ImageJ; in this way, the calculated half-time and the retention fractions for the 60 x 1-minute, 5-minute and 10-minute protocols and other sampling intervals were obtained. Results from the study IDs for gastric emptying clearance half-time were classified into normal, abnormally fast, and abnormally slow categories. In the normal category, which represents 50% of the total gastric emptying image IDs processed, the clearance half-time was within the range of 49.5 to 86.6 minutes of the mean counts. Under the abnormally fast category, clearance half-times fell between 21 and 43.3 minutes of the mean counts, representing 30% of the total gastric emptying image IDs processed, and the abnormally slow category had clearance half-times around 138.6 minutes of the mean counts, representing 20%. The results indicated that the retention fraction values calculated from the 1-, 5- and 10-minute sampling curves and the measured gastric emptying retention fraction values from the sampling curves of the study IDs showed a normal retention fraction of <60%, decreasing exponentially with increasing time, as evidenced by low retention fraction ratios of <10% after 4 hours. Thus, the study IDs do not change categories, suggesting that these values could feasibly be used instead of having to acquire actual images. Findings from the study suggest that the current gastric emptying protocol can be optimized by acquiring fewer images. The study recommended that gastric emptying studies be performed with imaging at a minimum of 0, 1, 2, and 4 hours after meal ingestion.
Keywords: gastric emptying, retention fraction, clearance halftime, optimisation, protocol
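The clearance half-time in such studies is commonly obtained from a mono-exponential fit to the stomach counts; the sketch below shows that calculation. The fitting model is our assumption, as the abstract does not state one, and the frame times and counts are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# Mono-exponential gastric emptying: counts(t) = C0 * exp(-lam * t),
# so the clearance half-time is T1/2 = ln(2) / lam.
def mono_exp(t, c0, lam):
    return c0 * np.exp(-lam * t)

# Illustrative frame times (minutes) and decay-corrected stomach ROI counts
t = np.array([0, 10, 20, 30, 45, 60], dtype=float)
counts = np.array([1000, 910, 830, 760, 660, 575], dtype=float)

(c0, lam), _ = curve_fit(mono_exp, t, counts, p0=(counts[0], 0.01))
t_half = np.log(2) / lam
retention_60 = mono_exp(60, c0, lam) / c0   # fitted retention fraction
print(f"T1/2 = {t_half:.1f} min, retention at 60 min = {retention_60:.0%}")
```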
Procedia PDF Downloads 19
1648 Explore and Reduce the Performance Gap between Building Modelling Simulations and the Real World: Case Study
Authors: B. Salehi, D. Andrews, I. Chaer, A. Gillich, A. Chalk, D. Bush
Abstract:
With the rapid increase of energy consumption in buildings in recent years, especially with the rise in population and growing economies, the importance of energy savings in buildings becomes more critical. One of the key factors in ensuring energy consumption is controlled and kept at a minimum is to utilise building energy modelling at the very early stages of design. Thus, building modelling and simulation is a growing discipline. During the design phase of construction, modelling software can be used to estimate a building's projected energy consumption, as well as building performance. The growth in the use of building modelling software packages opens the door for improvements in the design and also in the modelling itself, by introducing novel methods such as building information modelling-based software packages, which promote conventional building energy modelling into the digital building design process. To understand the most effective implementation tools, research projects should include elements of real-world experiments and not just rely on theoretical and simulated approaches. A review of related studies shows that they are mostly based on modelling and simulation, which can be due to various reasons, such as the more expensive and time-consuming nature of real-time data-based studies. Taking into account the recent rise of building energy modelling software packages and the increasing number of studies utilising these methods, the accuracy and reliability of these packages has become even more crucial and critical. The energy performance gap refers to the discrepancy between the predicted energy savings and the realised actual savings, especially after buildings implement energy-efficient technologies. There are many different software packages available, either free or with commercial versions. In this study, IES VE (Integrated Environmental Solutions Virtual Environment) is used, as it is a common building energy modelling and simulation software in the UK. This paper describes a study that compares real-time results with those of a virtual model to illustrate this gap. The subject of the study is a north-west (345°) facing, naturally ventilated conservatory within a domestic building in London, monitored during summer to capture real-time data. These results are then compared to the virtual results of IES VE. In this project, the effect of the wrong position of blinds on overheating is studied, as well as providing new evidence of the performance gap. Furthermore, the challenges of modelling the input of solar shading products in IES VE are considered.
Keywords: building energy modelling and simulation, integrated environmental solutions virtual environment, IES VE, performance gap, real time data, solar shading products
Procedia PDF Downloads 142
1647 Evaluation of Neonicotinoids Against Sucking Insect Pests of Cotton in Laboratory and Field Conditions
Authors: Muhammad Sufyan, Muhammad D. Gogi, Muhammad Arshad, Ahmad Nawaz, Muhammad Usman
Abstract:
Cotton (Gossypium hirsutum), universally known as silver fiber, is one of the most important cash crops of Pakistan. A wide array of pests constrains cotton production, among which sucking insect pests cause serious losses. New-chemistry insecticides are mostly used to control a wide variety of insect pests, including sucking insect pests. In the present study, the efficacy of different neonicotinoids was evaluated against sucking insect pests of cotton in the field and, for the red and dusky cotton bugs, in the laboratory. The experiment was conducted at the Entomology Research Station, University of Agriculture Faisalabad, in a randomized complete block design (RCBD). A field trial was conducted to evaluate the efficacy of Confidence Ultra (imidacloprid) 70% SL, Confidor (imidacloprid) 20% SL, Kendo (lambda-cyhalothrin) 24.7 SC, Actara (thiamethoxam) 25% WG, Forcast (tebufenozide + emamectin benzoate) 8.8 EW and Timer (emamectin benzoate) 1.9 EC at their recommended doses. Data were collected on a per-leaf basis for thrips, aphid, jassid and whitefly 24 hours before spraying, and post-treatment data were recorded after 24, 48 and 72 hours. After data analysis, all the insecticides were found effective against sucking pests. Confidence Ultra was highly effective against aphid, jassid and whitefly and gave maximum mortality, while it showed non-significant results against thrips. In the case of aphid, the plot treated with Kendo 24.7 SC showed significant mortality 72 hours after pesticide application. Similar trends were found under laboratory conditions with all these treatments at different concentrations, which had a significant impact on dusky cotton bug and red cotton bug populations 24, 48 and 72 hours after application.
Keywords: cotton, laboratory and field conditions, neonicotinoids, sucking insect pests
Procedia PDF Downloads 248
1646 The Integrated Methodological Development of Reliability, Risk and Condition-Based Maintenance in the Improvement of the Thermal Power Plant Availability
Authors: Henry Pariaman, Iwa Garniwa, Isti Surjandari, Bambang Sugiarto
Abstract:
The availability of a complex system such as a thermal power plant is strongly influenced by the reliability of spare parts and by maintenance management policies. Reliability-centered maintenance (RCM) is an established method of analysis and the main reference for maintenance planning. This method considers the consequences of failure in its implementation, but does not deal with the further risk of downtime associated with failures, loss of production or high maintenance costs. The risk-based maintenance (RBM) technique provides support strategies to minimize the risks posed by failure and to derive maintenance tasks in a cost-effective way. Meanwhile, condition-based maintenance (CBM) focuses on condition monitoring, which allows maintenance or other actions to be planned and scheduled so that the risk of failure is avoided before time-based maintenance falls due. Implementation of RCM, RBM or CBM alone, or of RCM combined with RBM or with CBM, is the maintenance practice used in thermal power plants. Implementing these three techniques in an integrated way will increase the availability of thermal power plants compared to using the techniques individually or in combinations of two. This study uses reliability-, risk- and condition-based maintenance in an integrated manner to increase the availability of a thermal power plant. The method generates an MPI (Priority Maintenance Index), which is the RPN (Risk Priority Number) multiplied by the RI (Risk Index), and an FDT (Failure Defense Task), which can generate monitoring and condition-assessment tasks in addition to maintenance tasks. Both the MPI and the FDT, obtained from the development of a functional tree, failure mode and effects analysis, fault-tree analysis, and risk analysis (risk assessment and risk evaluation), were then used to develop and implement a maintenance, monitoring and condition-assessment plan and schedule, and ultimately to perform an availability analysis. The results of this study indicate that reliability-, risk- and condition-based maintenance methods, applied in an integrated manner, can increase the availability of thermal power plants. Keywords: integrated maintenance techniques, availability, thermal power plant, MPI, FDT
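The abstract states that the MPI is the RPN multiplied by the RI. A minimal sketch of that arithmetic follows; the severity x occurrence x detection composition of the RPN is the standard FMEA convention, and the scales and example values are assumptions, not figures from the study.

def rpn(severity, occurrence, detection):
    """Risk Priority Number per standard FMEA practice (scales assumed)."""
    return severity * occurrence * detection

def mpi(rpn_value, risk_index):
    """Priority Maintenance Index as described: RPN multiplied by RI."""
    return rpn_value * risk_index

# Hypothetical failure mode of a boiler feed pump
r = rpn(severity=8, occurrence=5, detection=6)  # -> 240
print("MPI:", mpi(r, risk_index=3))             # -> 720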
Procedia PDF Downloads 799
1645 Erosion Modeling of Surface Water Systems for Long Term Simulations
Authors: Devika Nair, Sean Bellairs, Ken Evans
Abstract:
Flow and erosion modeling provides an avenue for simulating fine suspended sediment in surface water systems such as streams and creeks. Fine suspended sediment is highly mobile, and many contaminants released by any sort of catchment disturbance attach themselves to these sediments; knowledge of fine suspended sediment transport is therefore important in assessing contaminant transport. The CAESAR-Lisflood landform evolution model, which includes a hydrologic model (TOPMODEL) and a hydraulic model (Lisflood), is being used to assess sediment movement in tropical streams resulting from a disturbance in the catchment of a creek, and to determine the dynamics of sediment quantity in the creek through the years by simulating future years. The accuracy of future simulations depends on calibrating and validating the model against past and present events. Calibration and validation involve finding a combination of model parameters which, when applied and simulated, gives model outputs similar to those observed at the real site for the corresponding input data. Calibrating the sediment output of the CAESAR-Lisflood model at the catchment level and using it to study the equilibrium conditions of the landform is an area yet to be explored. Therefore, the aim of the study was to calibrate and then validate the CAESAR-Lisflood model so that it could be run in future simulations to study how the landform evolves over time. To achieve this, the model was run for a rainfall event with a set of parameters, plus discharge and sediment data for the input point of the catchment, and the model output was compared with the discharge and sediment data observed at the output point of the catchment. The model parameters were then adjusted until the model closely approximated the real site values. It was then validated by running the model for a different set of events and checking that it gave similar results to the real site values. The outcomes demonstrated that while the model can be calibrated well for hydrology (discharge output) throughout the year, the sediment output calibration could be slightly improved by the ability to change parameters to account for the seasonal vegetation growth at the start and end of the wet season. This study is important for assessing hydrology and sediment movement in seasonal biomes. The understanding of sediment-associated metal dispersion processes in rivers can be used in a practical way to help river basin managers more effectively control and remediate catchments affected by present and historical metal mining. Keywords: erosion modelling, fine suspended sediments, hydrology, surface water systems
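The abstract does not name the goodness-of-fit criterion used to decide when the model "closely approximated" the observed values. The Nash-Sutcliffe efficiency is a common choice for discharge calibration in hydrology, so the sketch below is one plausible yardstick, applied to hypothetical data.

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit; values <= 0 mean
    the model predicts no better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [3.2, 5.8, 12.4, 7.1, 4.0]  # hypothetical observed discharge (m3/s)
sim = [2.9, 6.3, 11.0, 7.8, 4.4]  # hypothetical CAESAR-Lisflood output
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")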
Procedia PDF Downloads 90
1644 Development of an Appropriate Method for the Determination of Multiple Mycotoxins in Pork Processing Products by UHPLC-TCFLD
Authors: Jason Gica, Yi-Hsieng Samuel Wu, Deng-Jye Yang, Yi-Chen Chen
Abstract:
Mycotoxins, harmful secondary metabolites produced by certain fungal species, pose significant risks to animals and humans worldwide. Their stability leads to contamination during grain harvesting, transportation and storage, as well as in processed food products. The prevalence of mycotoxin contamination has attracted significant attention due to its adverse impact on food safety and global trade. The secondary contamination pathway through animal products has been identified as an important route of exposure, posing health risks for livestock and for humans consuming contaminated products. Pork, one of the most highly consumed meat products in Taiwan according to the National Food Consumption Database, plays a critical role in the nation's diet and economy. Given its substantial consumption, pork processing products are a significant component of the food supply chain and a potential source of mycotoxin contamination. This study is paramount for formulating effective regulations and strategies to mitigate mycotoxin-related risks in the food supply chain. By establishing a reliable analytical method, this research contributes to safeguarding public health and enhancing the quality of pork processing products. The findings will serve as valuable guidance for policymakers, food industries and consumers to ensure a safer food supply chain in the face of emerging mycotoxin challenges. An innovative and efficient analytical approach is proposed using Ultra-High Performance Liquid Chromatography coupled with a Temperature Control Fluorescence Light Detector (UHPLC-TCFLD) to determine multiple mycotoxins in pork meat samples, chosen for its exceptional capacity to detect multiple mycotoxins at the lowest concentration levels, which makes it highly sensitive and reliable for comprehensive mycotoxin analysis. In addition, its ability to detect multiple mycotoxins simultaneously in a single run significantly reduces the time and resources required for analysis, making it a cost-effective solution for monitoring mycotoxin contamination in pork processing products. The research aims to optimize an efficient QuEChERS mycotoxin extraction method and to rigorously validate its accuracy and precision. The results will provide crucial insights into mycotoxin levels in pork processing products. Keywords: multiple-mycotoxin analysis, pork processing products, QuEChERS, UHPLC-TCFLD, validation
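Validation of accuracy and precision in methods of this kind typically rests on spike-recovery and repeatability statistics. The sketch below illustrates those two calculations under that assumption; the analyte, spike level and replicate values are hypothetical, not results from the study.

from statistics import mean, stdev

def recovery_percent(measured, spiked):
    """Mean recovery of a spiked standard, a measure of trueness."""
    return 100.0 * mean(measured) / spiked

def rsd_percent(measured):
    """Relative standard deviation, a measure of precision."""
    return 100.0 * stdev(measured) / mean(measured)

# Hypothetical replicates for a 10 ug/kg aflatoxin B1 spike
replicates = [9.4, 9.9, 10.3, 9.6, 10.1]
print(f"Recovery: {recovery_percent(replicates, spiked=10.0):.1f}%")
print(f"RSD: {rsd_percent(replicates):.1f}%")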
Procedia PDF Downloads 82
1643 Exploring the Design of Prospective Human Immunodeficiency Virus Type 1 Reverse Transcriptase Inhibitors through a Comprehensive Approach of Quantitative Structure Activity Relationship Study, Molecular Docking, and Molecular Dynamics Simulations
Authors: Mouna Baassi, Mohamed Moussaoui, Sanchaita Rajkhowa, Hatim Soufi, Said Belaaouad
Abstract:
The objective of this paper is to address the challenging task of targeting Human Immunodeficiency Virus type 1 Reverse Transcriptase (HIV-1 RT) in the treatment of AIDS. Reverse Transcriptase inhibitors (RTIs) have limitations because Reverse Transcriptase mutations develop and lead to treatment resistance. In this study, a combination of statistical analysis and bioinformatics tools was adopted to develop a mathematical model that relates the structure of compounds to their inhibitory activities against HIV-1 Reverse Transcriptase. Our approach was based on a series of compounds recognized for their HIV-1 RT enzymatic inhibitory activities. These compounds were designed in software, with their descriptors computed using multiple tools. The most statistically promising model was chosen, and its domain of applicability was ascertained. Furthermore, compounds exhibiting biological activity comparable to existing drugs were identified as potential inhibitors of HIV-1 RT. The compounds were evaluated on their absorption, distribution, metabolism, excretion and toxicity (ADMET) properties and on their adherence to Lipinski's rule. Molecular docking techniques were employed to examine the interaction between the Reverse Transcriptase (wild type and mutant type) and the ligands, including a known drug available on the market. Molecular dynamics simulations were also conducted to assess the stability of the RT-ligand complexes. Our results reveal some of the new compounds to be promising candidates for effectively inhibiting HIV-1 Reverse Transcriptase, matching the potency of the established drug; this warrants further experimental validation. Beyond its immediate results, this study provides a methodological foundation for future endeavors aiming to discover and design new inhibitors targeting HIV-1 Reverse Transcriptase. Keywords: QSAR, ADMET properties, molecular docking, molecular dynamics simulation, reverse transcriptase inhibitors, HIV type 1
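The abstract does not name the software used for the Lipinski screen. The sketch below assumes RDKit, a widely used open-source cheminformatics toolkit, and applies the conventional rule-of-five thresholds; aspirin's SMILES is used purely as a placeholder molecule, not as one of the study's candidates.

from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def passes_lipinski(smiles):
    """Check Lipinski's rule of five for a candidate structure."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError("invalid SMILES")
    return (Descriptors.MolWt(mol) <= 500
            and Descriptors.MolLogP(mol) <= 5
            and Lipinski.NumHDonors(mol) <= 5
            and Lipinski.NumHAcceptors(mol) <= 10)

print(passes_lipinski("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin placeholder -> True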
Procedia PDF Downloads 96
1642 Water-Controlled Fracturing with Fuzzy-Ball Fluid in Tight Gas Reservoirs of Deep Coal Measures in Sulige
Authors: Xiangchun Wang, Lihui Zheng, Maozong Gan, Peng Zhang, Tong Wu, An Chang
Abstract:
The deep coal measure tight gas reservoir in Sulige is usually developed by fracturing. Because the reservoir thickness is small, fractures can easily communicate with water layers during fracturing, which leads to water production and lower output from gas wells. It is therefore necessary to control water during fracturing in deep coal measure tight gas reservoirs. Water-control fracturing with fuzzy-ball fluid can not only increase gas output but also reduce water output. Fuzzy-ball fluid was prepared indoors for evaluation experiments. The fuzzy-ball fluid was mixed in equal volumes with the pad fluid and with formation water to test its compatibility. A core displacement device was used to measure the breakthrough of gas and water through matrix and fractured cores plugged by the fuzzy-ball fluid; the breakthrough pressure across the core plug indicates its water-blocking performance. The experimental results show no precipitation after the fuzzy-ball fluid is mixed with the pad fluid or with the formation water. The breakthrough pressure gradients of gas and water after the fuzzy-ball fluid plugged the fractures were 0.02 MPa/cm and 0.04 MPa/cm, respectively, and after it plugged the matrix were 0.03 MPa/cm and 0.2 MPa/cm, respectively, which meet the requirements of field operation. Two wells, A and B, in the Sulige Gas Field were used to implement water-control fracturing on site. After the pad fluid was injected into the two wells, 50 m3 of fuzzy-ball fluid was pumped to plug the water. The operation went smoothly. After water-control fracturing, the average daily output over 161 days was increased by 13.71% and 6.99% compared with adjacent wells in the same layer. The adjacent wells required foaming 3 times and 63 times, respectively, while no liquid accumulation occurred in wells A and B. The results show that fuzzy-ball fluid is a water plugging material suitable for water-control fracturing in tight gas wells, and its water control mechanism can also provide a new idea for the development of water-control fracturing materials. Keywords: coal seam, deep layer, fracking, fuzzy-ball fluid, reservoir reconstruction
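The reported gradients are presumably the measured breakthrough pressures divided by the length of the plugged core. The sketch below shows that arithmetic; the 5 cm core length and 1.0 MPa pressure are hypothetical values chosen only to reproduce the reported 0.2 MPa/cm matrix figure.

def breakthrough_gradient(pressure_mpa, core_length_cm):
    """Breakthrough pressure gradient (MPa/cm) across a plugged core."""
    return pressure_mpa / core_length_cm

# Hypothetical plugged-matrix core: water breaks through at 1.0 MPa over 5 cm
print(breakthrough_gradient(1.0, 5.0))  # -> 0.2 MPa/cm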
Procedia PDF Downloads 235
1641 Evaluation of Microwave-Assisted Pretreatment for Spent Coffee Grounds
Authors: Shady S. Hassan, Brijesh K. Tiwari, Gwilym A. Williams, Amit K. Jaiswal
Abstract:
Waste materials from a wide range of agro-industrial processes may be used as substrates for microbial growth and, subsequently, for the production of a range of high-value products and bioenergy. In addition, utilization of these agro-residues in bioprocesses has the dual advantage of providing alternative substrates and solving their disposal problems. Spent coffee grounds (SCG) are a by-product (45%) of coffee processing. SCG is a lignocellulosic material composed mainly of cellulose, hemicelluloses and lignin; a pretreatment process is therefore required to facilitate efficient enzymatic hydrolysis of these carbohydrates. In this context, microwave pretreatment of lignocellulosic biomass without the addition of harsh chemicals represents a green technology. Moreover, microwave treatment has a high heating efficiency and is easy to implement, so microwave pretreatment of SCG without the addition of harsh chemicals was investigated as a green route to enhance enzymatic hydrolysis. In the present work, microwave pretreatment experiments were conducted on SCG at varying power levels (100, 250, 440, 600 and 1000 W) for 60 s. As microwave power increases up to a certain level (which varies with the biomass), the reducing sugar yield increases; beyond this level, the yield from the biomass starts to decrease. Microwave pretreatment of SCG for 60 s followed by enzymatic hydrolysis resulted in total reducing sugars of 91.6 ± 7.0 mg/g of biomass (at a microwave power of 100 W). Fourier transform infrared spectroscopy (FTIR) was employed to investigate changes in the functional groups of the biomass after pretreatment, while high-performance liquid chromatography (HPLC) was employed for the determination of glucose. Microwave pretreatment of lignocellulose was found to be an effective and energy-efficient technology to improve saccharification and glucose yield. Energy performance will be evaluated for the microwave pretreatment, and the enzyme hydrolysate will be used as a media component substitute for the production of ethanol and other high-value products. Keywords: lignocellulose, microwave, pretreatment, spent coffee grounds
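One simple way to compare the tested power levels on an energy basis is the specific energy input, i.e. power multiplied by exposure time per unit of biomass. The sketch below uses the abstract's power levels and 60 s duration; the 10 g sample mass is a hypothetical assumption.

def specific_energy_kj_per_g(power_w, time_s, biomass_g):
    """Specific microwave energy delivered to the sample (kJ/g)."""
    return power_w * time_s / (1000.0 * biomass_g)

# Power levels from the study, 60 s exposure, hypothetical 10 g SCG sample
for power in (100, 250, 440, 600, 1000):
    print(power, "W ->", specific_energy_kj_per_g(power, 60, 10.0), "kJ/g")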
Procedia PDF Downloads 423
1640 From Name-Calling to Insidious Rhetoric: Construction and Evolution of the Transgender Imagery in News Discourse, 1953-2016
Authors: Hsiao-Yung Wang
Abstract:
This essay aims to examine how transgender imagery has been constructed in the Taiwanese news media and how it evolved from 1953 to 2016. It also explores the discourse patterns and rhetorical strategies in transgender-related coverage that contributed to evaluating and framing transgender people as 'social deviance.' Samples for analysis were selected from mainstream newspapers, including China Times, United Daily and Apple Daily. The time frame for sample selection runs from August 1953 (when the first transgender case was reported in Taiwan) to June 2016. To better understand media representation as nominalistic-based, the author draws on Raymie McKerrow, a representative figure of critical rhetoric, and his study of remembrance and forgetfulness in public discourse (especially his model of the 'critique of domination'), thereby categorizing the 64 years of transgender discourse into five periods: (1) transgender as 'intersex' subject to surgical-reparative medical treatment; (2) transgender as 'freak gender-bender' associated with criminal behavior; (3) transgender as 'ladyboy' ('katoey' in Thai) among bar girls or sex workers; (4) transgender as 'cross-dresser' in transvestite performance; and (5) transgender as a 'life-style or human right' of spontaneous gender identification. Based on the research findings, this essay argues that transgender reporting served as a site for the production of compulsory sexism and gender stereotypes through specific forms of name-calling. The evolution of the word-images attached to transgender issues also pinpoints the media as a reflection of the fashion of the day. While transgender imagery may be crystallized as 'still social problems' or 'gender transgression' in insidious rhetoric, and while the so-called 'phobia' persistently embodied in media discourse exercises name-calling in an ambiguous (rather than openly bullying) way or under the cover of humanist-liberalist rationales, these emergent rhetorical dilemmas should be resolved without delay. Keywords: critical rhetoric, media representation, McKerrow, nominalistic, social deviance, transgender
Procedia PDF Downloads 317