Search results for: forest law and regulation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2427


1497 Cognitive Models of Future in Political Texts

Authors: Solopova Olga

Abstract:

The present paper briefly recalls the theoretical preconditions for investigating cognitive-discursive models of the future in political discourse. The author reviews the theories and methods used for strengthening a future focus in this discourse, working out two main tools: a model of the future and a metaphorical scenario. The paper examines the implications of metaphorical analogies for modeling the future in mass media. It argues that metaphor is not merely a rhetorical ornament in the political discourse of media regulation but a conceptual model that legislates and regulates our understanding of the future.

Keywords: cognitive approach, future research, political discourse, model, scenario, metaphor

Procedia PDF Downloads 395
1496 Safety Evaluation of Post-Consumer Recycled PET Materials in Chilean Industry by Overall Migration Tests

Authors: Evelyn Ilabaca, Ximena Valenzuela, Alejandra Torres, María José Galotto, Abel Guarda

Abstract:

One of the biggest problems in the food packaging industry, especially with plastic materials, is that these materials are usually obtained from non-renewable resources and remain as waste after use, causing environmental issues. This is an international concern, and particular attention is given to reduction, reuse and recycling strategies for decreasing the waste from the plastic packaging industry. In general, polyethylenes represent most plastic waste, and the recycling of post-consumer polyethylene terephthalate (PCR-PET) has been widely studied. The US Food and Drug Administration (FDA), the European Food Safety Authority (EFSA) and the Southern Common Market (MERCOSUR) have issued legislative documents to control the use of PCR-PET in the production of plastic packaging intended for direct food contact, in order to ensure that the recycling process can remove possible contaminants that could migrate into food. Consequently, it is necessary to demonstrate by challenge test that the recycling process is able to remove specific contaminants, yielding a recycled plastic that is safe for human health. These documents establish that the concentration limit for substitute contaminants in PET is 220 ppb (µg/kg) and that the specific migration limit is 10 ppb (µg/kg) for each contaminant, in addition to ensuring that the sensorial characteristics of the food are not affected. Moreover, under Commission Regulation (EU) No 10/2011 on plastic materials and articles intended to come into contact with food, the overall migration limit is 10 mg of substances per 1 dm² of surface area of the plastic material. Thus, the aim of this work is to determine the safety of PCR-PET-containing food packaging materials in Chile by measuring their overall migration and comparing it with the limits established at the international level.
This information will serve as a basis for a regulation to control the use of recycled plastic materials in the manufacture of plastic packaging intended to be in direct contact with food. The methodology used involves a procedure according to EN 1186:2002 with some modifications. The food simulants used were 10 % (v/v) ethanol and 3 % (v/v) acetic acid as aqueous food simulants, and 95 % (v/v) ethanol and isooctane as substitutes for fatty food simulants. In this study, preliminary results showed that Chilean food packaging plastics with different PCR-PET percentages comply with European legislation for aqueous foods.
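As a rough illustration of the screening logic implied above, the three limits quoted in the abstract can be encoded in a small compliance check. This is only a sketch: the constant and function names are illustrative and not part of the cited standards, and real conformity assessment follows the full EN 1186 and EU 10/2011 procedures.

```python
# Hedged sketch: the three limit values come from the abstract; names are illustrative.
SUBSTITUTE_LIMIT_UG_PER_KG = 220.0        # challenge-test limit for substitute contaminants in PET
SPECIFIC_MIGRATION_LIMIT_UG_PER_KG = 10.0  # per-contaminant migration limit into food
OVERALL_MIGRATION_LIMIT_MG_PER_DM2 = 10.0  # overall migration limit under (EU) No 10/2011

def complies(substitute_ug_per_kg, specific_migrations_ug_per_kg, overall_mg_per_dm2):
    """True only if all three limits quoted in the abstract are met."""
    return (substitute_ug_per_kg <= SUBSTITUTE_LIMIT_UG_PER_KG
            and all(m <= SPECIFIC_MIGRATION_LIMIT_UG_PER_KG
                    for m in specific_migrations_ug_per_kg)
            and overall_mg_per_dm2 <= OVERALL_MIGRATION_LIMIT_MG_PER_DM2)
```

A sample passing 100 µg/kg substitute contaminant, per-contaminant migrations of 1 and 2.5 µg/kg, and 4 mg/dm² overall migration would comply; exceeding any single limit fails the check.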

Keywords: contaminants, polyethylene terephthalate, plastic food packaging, recycling

Procedia PDF Downloads 276
1495 Three-Level Converters Back-To-Back DC Bus Control for Torque Ripple Reduction of Induction Motor

Authors: T. Abdelkrim, K. Benamrane, B. Bezza, Aeh Benkhelifa, A. Borni

Abstract:

This paper proposes a regulation method for back-to-back connected three-level converters in order to reduce torque ripple in an induction motor. The first part is dedicated to the feedback control of a three-level PWM rectifier. In the second part, a DC bus balancing algorithm for the three-level NPC voltage source inverter is presented. A theoretical analysis with a complete simulation of the system is presented to prove the excellent performance of the proposed technique.

Keywords: back-to-back connection, feedback control, neutral-point balance, three-level converter, torque ripple

Procedia PDF Downloads 497
1494 Preliminary Result on the Impact of Anthropogenic Noise on Understory Bird Population in Primary Forest of Gaya Island

Authors: Emily A. Gilbert, Jephte Sompud, Andy R. Mojiol, Cynthia B. Sompud, Alim Biun

Abstract:

Gaya Island of Sabah is known for its wildlife and marine biodiversity, and it has marked itself as one of the top destinations for tourists from all around the world. Gaya Island's tourism activities have contributed to Sabah's economic revenue through the high number of tourists visiting the island. However, this has increased the anthropogenic noise derived from tourism activities, which may greatly interfere with animals such as understory birds that rely on acoustic signals as a tool for communication. Many studies in other parts of the region reveal that anthropogenic noise decreases the species richness of avian communities. In Malaysia, however, published research regarding the impact of anthropogenic noise on understory birds is still scarce. This study was conducted to fill that gap, aiming to investigate the impact of anthropogenic noise on the understory bird population. Three sites within the primary forest of Gaya Island were chosen to sample the level of anthropogenic noise in relation to the understory bird population. Noise mapping was used to measure the anthropogenic noise level and to identify zones with high (> 60 dB) and low (< 60 dB) anthropogenic noise levels based on the standard noise threshold. Mist netting and ring banding were used, as these methods can determine the diversity of the understory bird population in Gaya Island. The preliminary study was conducted from 15th to 26th April and 5th to 10th May 2015, with 2 mist nets set up in each zone within the selected sites. The data were analyzed using descriptive analysis, presence-absence analysis, diversity indices and a diversity t-test, with the PAST software used to analyze the obtained data.
A total of 60 individuals, consisting of 12 species from 7 families of understory birds, were recorded at the three sites in Gaya Island. The Shannon-Wiener index shows that the diversity of species in the high and low anthropogenic noise zones was 1.573 and 2.009, respectively. However, the statistical analysis shows no significant difference between these zones. Nevertheless, the presence-absence analysis shows that species richness in the low anthropogenic noise zone was higher than in the high anthropogenic noise zone. This result indicates an impact of anthropogenic noise on the population diversity of understory birds. An in-depth study with a larger sample size at the selected sites is still urgently needed to fully understand the impact of anthropogenic noise on the understory bird population, so that it can be incorporated into wildlife management for a sustainable environment in Gaya Island.
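The Shannon-Wiener index reported above is a standard calculation from species counts; a minimal sketch (with hypothetical counts, since the paper's raw capture data are not given) is:

```python
import math

def shannon_index(counts):
    """Shannon-Wiener diversity H' = -sum(p_i * ln(p_i)),
    where p_i is the proportion of individuals belonging to species i."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)
```

For example, four species captured in equal numbers give H' = ln(4) ≈ 1.386, while a community dominated by one species gives a lower value, matching the pattern of lower diversity in the noisier zone.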

Keywords: anthropogenic noise, biodiversity, Gaya Island, understory bird

Procedia PDF Downloads 365
1493 Net Neutrality and Asymmetric Platform Competition

Authors: Romain Lestage, Marc Bourreau

Abstract:

In this paper we analyze the interplay between access to the last-mile network and net neutrality in the market for Internet access. We consider two Internet Service Providers (ISPs), which act as platforms between Internet users and Content Providers (CPs). One of the ISPs is vertically integrated and provides access to its last-mile network to the other (non-integrated) ISP. We show that a lower access price increases the integrated ISP's incentives to charge CPs positive termination fees (i.e., to deviate from net neutrality), and decreases the non-integrated ISP's incentives to charge positive termination fees.

Keywords: net neutrality, access regulation, internet access, two-sided markets

Procedia PDF Downloads 377
1492 Competences for Learning beyond the Academic Context

Authors: Cristina Galván-Fernández

Abstract:

Students differentiate between the different contexts of their lives, such as employment, hobbies or studies. In higher education, it is necessary to transfer experiential knowledge into theory and vice versa. However, it is difficult to get students to use their personal experiences and social readings to produce learning evidence. In a study with 178 education students from Chile and Spain, we used an e-portfolio system and a methodology for 4 years with the aims of helping them to: 1) self-regulate their learning process and 2) use social networks and professional experiences to produce learning evidence. These two objectives were monitored through interviews with the same students at different moments and two questionnaires. The results of this study show that students recognize ownership of their learning and progress in planning and reflecting on their own learning.

Keywords: competences, e-portfolio, higher education, self-regulation

Procedia PDF Downloads 301
1491 The Development of Private Housing Schemes to Address the Housing Problem: A Case Study of Islamabad

Authors: Zafar Iqbal Zafar, Abdul Waheed

Abstract:

The Capital Development Authority (CDA) Ordinance 1960 requires the CDA to acquire land for the provision of housing in Islamabad. However, the pace of residential development was slow while the demand for housing was increasing rapidly. To resolve the growing housing problem, the CDA involved the private sector in the development of housing schemes. Detailed bylaws for the regulation of private housing schemes, called the "Modalities & Procedures", were prepared. This paper explains how the Modalities and Procedures of the CDA have been successful in regulating the development of private housing schemes in Islamabad.

Keywords: housing schemes, master plan, development works, zoning regulations

Procedia PDF Downloads 203
1490 Fire Risk Information Harmonization for Transboundary Fire Events between Portugal and Spain

Authors: Domingos Viegas, Miguel Almeida, Carmen Rocha, Ilda Novo, Yolanda Luna

Abstract:

Forest fires along the more than 1200 km of the Spanish-Portuguese border are increasingly frequent, currently reaching around 2000 fire events per year. Some of these events develop into large international wildfires requiring concerted operations based on information shared between the two countries. The fire event of Valencia de Alcantara (2003), causing several fatalities and more than 13000 ha burnt, is a reference example of these international events. Currently, Portugal and Spain have a specific cross-border cooperation protocol on wildfire response for a strip of about 30 km (15 km on each side). Public authorities recognize the success of this collaboration, but it is also acknowledged that the cooperation should include more functionality, such as a common risk information system for transboundary fire events. Since Portuguese and Spanish authorities use different approaches to determine the fire risk index inputs and different methodologies to assess fire risk, joint firefighting operations are sometimes jeopardized: the information is not harmonized, and the civil protection agents from the two countries do not share a common understanding of the situation. Thus, a methodology aimed at harmonizing the calculation and perception of fire risk by Portuguese and Spanish civil protection authorities is presented here, together with the final results. The fire risk index used in this work is the Canadian Fire Weather Index (FWI), which is based on meteorological data. The FWI is limited in its application, as it does not take into account other important factors with great effect on fire ignition and development. The combination of these factors is very complex since, besides meteorology, it involves several parameters from different fields, namely sociology, topography, vegetation and soil cover.
Therefore, the meaning of FWI values differs from region to region, according to the specific characteristics of each region. In this work, a methodology for FWI calibration based on the number of fire occurrences and on the burnt area in the transboundary regions of Portugal and Spain is proposed, in order to assess fire risk based on calibrated FWI values. As previously mentioned, cooperative firefighting operations require a common perception of the shared information. Therefore, a common classification of fire risk for fire events occurring in the transboundary strip is proposed, with the objective of harmonizing this type of information. This work is part of the ECHO project SpitFire - Spanish-Portuguese Meteorological Information System for Transboundary Operations in Forest Fires, which aims to develop a web platform for sharing information and decision-support tools to be used in international fire events involving Portugal and Spain.
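One simple way to realize the kind of regional calibration described above is to derive class breakpoints from the distribution of FWI values observed on historical fire days, so that the same danger class corresponds to a comparable fire likelihood on both sides of the border. The sketch below is an assumption about one plausible implementation (percentile-based breakpoints), not the paper's actual procedure, which also uses burnt area:

```python
import numpy as np

def calibrate_fwi_breakpoints(fwi_on_fire_days, percentiles=(25, 50, 75, 90)):
    """Derive region-specific FWI class breakpoints from the FWI values
    observed on historical fire days in that region."""
    return np.percentile(np.asarray(fwi_on_fire_days, dtype=float), percentiles)

def danger_class(fwi, breakpoints):
    """Map a raw FWI value to a harmonized danger class 0..len(breakpoints)."""
    return int(np.searchsorted(breakpoints, fwi))
```

Two regions with different fire climates then get different raw-FWI breakpoints but a shared, comparable class scale.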

Keywords: data harmonization, FWI, international collaboration, transboundary wildfires

Procedia PDF Downloads 254
1489 Combining Shallow and Deep Unsupervised Machine Learning Techniques to Detect Bad Actors in Complex Datasets

Authors: Jun Ming Moey, Zhiyaun Chen, David Nicholson

Abstract:

Bad actors are often hard to detect in data that records their behaviour patterns because they are comparatively rare events embedded in non-bad-actor data. An unsupervised machine learning framework is applied here to detect bad actors in financial crime datasets that record millions of transactions undertaken by hundreds of actors (<0.01% bad). Specifically, the framework combines ‘shallow’ (PCA, Isolation Forest) and ‘deep’ (autoencoder) methods to detect outlier patterns. Detection performance is analyzed for both the individual methods and their combination.
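To make the 'shallow' side of such a framework concrete, one common PCA-based scheme scores each record by its reconstruction error after projecting onto the top principal components: rare rows that break the dominant linear structure of the bulk data receive large scores. This is a minimal sketch of that one component only; the Isolation Forest and autoencoder parts, and the paper's actual feature pipeline, are omitted.

```python
import numpy as np

def pca_outlier_scores(X, n_components=1):
    """Score each row of X by its PCA reconstruction error.
    Rows that do not fit the dominant low-rank structure score high."""
    Xc = X - X.mean(axis=0)                      # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T                      # top principal directions
    X_hat = Xc @ V @ V.T                         # rank-k reconstruction
    return np.linalg.norm(Xc - X_hat, axis=1)    # per-row residual norm
```

On data lying near a line in feature space, a single planted off-structure row receives by far the largest score, mimicking how a rare bad actor stands out.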

Keywords: detection, machine learning, deep learning, unsupervised, outlier analysis, data science, fraud, financial crime

Procedia PDF Downloads 97
1488 A Fishery Regulation Model: Bargaining over Fishing Pressure

Authors: Duplan Yves Jamont Junior

Abstract:

The Diamond-Mortensen-Pissarides model widely used in labor economics is adapted here to fisheries. A fishing function is defined to depict the fishing technology, and Bellman equations are established to describe the behavior of fishermen and conservationists. On this basis, a Nash bargaining negotiation takes place over the upper limit of the fishing pressure between the political representative groups of fishermen and conservationists. The existence and uniqueness conditions of the Nash-bargained fishing pressure are established. Given the biomass evolution equation, the dynamics of the model variables (fishing pressure, biomass, fish need) are studied.
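A Nash bargaining outcome of the kind described can be illustrated numerically: pick the pressure cap that maximizes the product of the two parties' surpluses over their disagreement payoffs. The payoff functions below are purely hypothetical stand-ins (diminishing returns for fishermen, convex ecological cost for conservationists), not the paper's Bellman-equation values:

```python
import numpy as np

def fishermen_surplus(p):
    return np.sqrt(p)            # hypothetical: diminishing returns to pressure

def conservationist_surplus(p):
    return 1.0 - p ** 2          # hypothetical: ecological cost convex in pressure

def nash_bargained_pressure(grid=None):
    """Pressure cap maximizing the Nash product of both surpluses
    (disagreement payoffs normalized to zero)."""
    if grid is None:
        grid = np.linspace(0.0, 1.0, 10001)
    product = fishermen_surplus(grid) * np.clip(conservationist_surplus(grid), 0.0, None)
    return float(grid[product.argmax()])
```

With these illustrative payoffs the first-order condition gives p* = 1/√5 ≈ 0.447, which the grid search recovers.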

Keywords: conservation, fishery, fishing, Nash bargaining

Procedia PDF Downloads 260
1487 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow

Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat

Abstract:

Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detect student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data labeled by objective performance outcomes to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a type of continuous performance test is introduced, the Seek-X type. Nine features were extracted including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best classification results. Using random forest, 93.3% classification for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-Nearest Neighbor, naïve Bayes, neural network, and support vector machine. We showed that using a multisensor approach achieved higher accuracy than using features from any reduced set of sensors. We found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature to the classification of engagement and distraction was shown to be eye gaze. 
It has been shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that does not rely on periodic human observation, is not subject to inter-rater reliability issues, and does not depend on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.
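The leave-one-out cross-validation protocol used above is simple to sketch: each sample is held out once, the classifier is trained on the rest, and accuracy is the fraction of held-out samples predicted correctly. A nearest-centroid rule stands in here for the random forest and the other classifiers actually compared; the features and labels are hypothetical.

```python
def nearest_centroid_predict(train_X, train_y, x):
    """Assign x to the class whose feature-wise mean (centroid) is closest."""
    rows_by_class = {}
    for xi, yi in zip(train_X, train_y):
        rows_by_class.setdefault(yi, []).append(xi)
    centroids = {y: [sum(col) / len(rows) for col in zip(*rows)]
                 for y, rows in rows_by_class.items()}
    return min(centroids,
               key=lambda y: sum((u - v) ** 2 for u, v in zip(centroids[y], x)))

def leave_one_out_accuracy(X, y):
    """Hold out each sample once, train on the rest, count correct predictions."""
    hits = 0
    for i in range(len(X)):
        pred = nearest_centroid_predict(X[:i] + X[i + 1:], y[:i] + y[i + 1:], X[i])
        hits += pred == y[i]
    return hits / len(X)
```

Any classifier with the same fit/predict shape, such as a random forest, can be dropped into the same harness.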

Keywords: affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, student engagement

Procedia PDF Downloads 95
1486 A Review on the Re-Usage of Single-Use Medical Devices

Authors: Lucas B. Naves, Maria José Abreu

Abstract:

Reprocessing single-use devices has attracted interest in the medical environment over the last decades. Reprocessing is sought in order to reduce the cost of purchasing new medical devices, since a new device can cost almost double the price of a reprocessed product. In this manuscript, we present a literature review on the reuse of medical devices originally designed for single use only, whose reprocessing procedures have become more and more effective. We also cover the applicable regulation, the countries which allow this procedure, the classification of these devices, and the most important issue concerning the reuse of medical devices: how to minimize the risk from gram-positive and gram-negative bacteria and avoid cross-contamination with hepatitis B virus (HBV), hepatitis C virus (HCV) and human immunodeficiency virus (HIV).

Keywords: reusing, reprocessing, single-use medical device, HIV, hepatitis B and C

Procedia PDF Downloads 394
1485 Tuberculosis and Associated Transient Hyperglycaemia in Peri-Urban South Africa: Implications for Diabetes Screening in High Tuberculosis/HIV Burden Settings

Authors: Mmamapudi Kubjane, Natacha Berkowitz, Rene Goliath, Naomi S. Levitt, Robert J. Wilkinson, Tolu Oni

Abstract:

Background: South Africa remains a high tuberculosis (TB) burden country globally, and the burden of diabetes, a TB risk factor, is growing rapidly. As an infectious disease, TB also induces transient hyperglycaemia. Therefore, screening for diabetes in newly diagnosed tuberculosis patients may result in the misclassification of transient hyperglycaemia as diabetes. Objective: The objective of this study was to determine and compare the prevalence of hyperglycaemia (diabetes and impaired glucose regulation (IGR)) in TB patients and to assess the cross-sectional association between TB and hyperglycaemia at enrolment and after three months of follow-up. Methods: Consecutive adult TB and non-TB participants presenting at a TB clinic in Cape Town were enrolled in this cross-sectional study with follow-up between July 2013 and August 2015. Diabetes was defined as self-reported diabetes, fasting plasma glucose (FPG) ≥ 7.0 mmol·L⁻¹ or glycated haemoglobin (HbA1c) ≥ 6.5%. IGR was defined as FPG 5.5 to < 7.0 mmol·L⁻¹ or HbA1c 5.7 to < 6.5%. TB patients initiated treatment. After three months, all participants were followed up and screened for diabetes again. The association between TB and hyperglycaemia was assessed using logistic regression, adjusting for potential confounders including sex, age, income, hypertension, waist circumference, previous imprisonment, marital status, work status and HIV status. Results: Diabetes screening was performed in 852 participants (414 TB and 438 non-TB) at enrolment and in 639 (304 TB and 335 non-TB) at the three-month follow-up. The prevalence of HIV-1 infection was 69.6% (95% confidence interval (CI), 64.9-73.8%) among TB patients and 58.2% (95% CI, 53.5-62.8%) among non-TB participants. Glycaemic levels were much higher in TB patients than in non-TB participants but decreased over time.
Among TB patients, the prevalence of IGR was 65.2% (95% CI, 60.1-69.9%) at enrolment and 21.5% (95% CI, 17.2-26.5%) at follow-up; among non-TB participants it was 50% (95% CI, 45.1-54.9%) and 32% (95% CI, 27.9-38.0%), respectively. The prevalence of diabetes in TB patients was 12.5% (95% CI, 9.69-16.12%) at enrolment and 9.2% (95% CI, 6.43-13.03%) at follow-up; among non-TB participants it was 10.04% (95% CI, 7.55-13.24%) and 8.06% (95% CI, 5.58-11.51%), respectively. The association between TB and IGR was significant at enrolment (adjusted odds ratio (OR) 2.26; 95% CI, 1.55-3.31) but disappeared at follow-up (OR 0.84; 95% CI, 0.53-1.36). However, the TB-diabetes association remained positive and significant both at enrolment (OR 2.41; 95% CI, 1.3-4.34) and at follow-up (OR 3.31; 95% CI, 1.5-7.25). Conclusion: Transient hyperglycaemia exists during tuberculosis. This has implications for diabetes screening in TB patients and suggests a need for diabetes confirmation tests during or after TB treatment. Nonetheless, the association between TB and diabetes noted at enrolment persisted at 3 months, highlighting the importance of diabetes control and prevention for TB control. Further research is required to investigate the impact of hyperglycaemia (transient or otherwise) on TB outcomes to ascertain the clinical significance of hyperglycaemia at enrolment.
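The odds ratios above come from adjusted logistic regression; for readers unfamiliar with the measure, the crude (unadjusted) analogue from a 2x2 exposure-by-outcome table, with a Woolf-type confidence interval, can be sketched as follows. The cell counts in the example are hypothetical, not the study's data.

```python
import math

def crude_odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    odds_ratio = (a * d) / (b * c)
    log_se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(odds_ratio) - z * log_se)
    upper = math.exp(math.log(odds_ratio) + z * log_se)
    return odds_ratio, lower, upper
```

A CI that excludes 1, as for the TB-diabetes association at both time points, indicates a statistically significant association; the adjusted regression additionally removes confounding by the listed covariates.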

Keywords: diabetes, impaired glucose regulation, transient hyperglycaemia, tuberculosis

Procedia PDF Downloads 165
1484 Labile and Humified Carbon Storage in Natural and Anthropogenically Affected Luvisols

Authors: Kristina Amaleviciute, Ieva Jokubauskaite, Alvyra Slepetiene, Jonas Volungevicius, Inga Liaudanskiene

Abstract:

The main task of this research was to investigate the chemical composition of differently used soils across their profiles. To identify differences among the soils, soil organic carbon (SOC) and its fractional composition: dissolved organic carbon (DOC) and mobile humic acids (MHA), as well as the C-to-N ratio, were investigated in natural and anthropogenically affected Luvisols. Research object: natural and anthropogenically affected Luvisols, Akademija, Kedainiai distr., Lithuania. Chemical analyses were carried out at the Chemical Research Laboratory of the Institute of Agriculture, LAMMC. Soil samples for chemical analyses were taken from the genetic soil horizons. SOC was determined by the Tyurin method modified by Nikitin, measured with a Cary 50 (VARIAN) spectrometer at 590 nm wavelength using glucose standards. For mobile humic acids (MHA) determination, extraction was carried out using a 0.1 M NaOH solution. Dissolved organic carbon (DOC) was analyzed using a SKALAR ion chromatograph. pH was measured in 1M H2O. Total N was determined by the Kjeldahl method. Results: Based on the obtained results, it can be stated that the transformation of the chemical composition proceeds through the genetic soil horizons. The morphology of the upper layers of the soil profile formed under natural conditions has been changed by anthropogenic (agrogenic, urbogenic, technogenic and other) processes. Anthropogenic activities and mechanical and biochemical disturbances destroy the natural characteristics of soil formation and complicate the interpretation of soil development. Due to intensive cultivation, the pH curve evens out compared with the natural Luvisol (the acidification characteristic of the E horizon disappears). Luvisols affected by agricultural activities were characterized by a decrease in the absolute amount of humic substances in separate horizons, but more sustainable, higher carbon sequestration and a thicker humic horizon were observed compared with the forest Luvisol.
However, the average content of humic substances across the soil profile was lower. The soil organic carbon content in the anthropogenic Luvisols was lower compared with the natural forest soil, but it was more evenly distributed over a thicker accumulative horizon. These data suggest that geo-ecological organization declines and agroecological organization increases in Luvisols. Acknowledgement: This work was supported by the National Science Program ‘The effect of long-term, different-intensity management of resources on the soils of different genesis and on other components of the agro-ecosystems’ [grant number SIT-9/2015], funded by the Research Council of Lithuania.

Keywords: agrogenization, dissolved organic carbon, luvisol, mobile humic acids, soil organic carbon

Procedia PDF Downloads 237
1483 Idiocentrism to Innovative Action: A Multi-Level Perspective on Moderating Effects of Emotional Self-Regulation, Trust and CSR

Authors: Shuhong Wang, Xiang Yi

Abstract:

Through a survey of approximately 340 employees from four Chinese companies, employing cross-level analysis, this study reveals that certain cultural syndromes may exert both direct and indirect influences on innovative behaviors. Notably, individuals with a strong individualistic self-concept are more inclined towards innovative actions than their less individualistic counterparts. The research also identifies several moderating factors; for instance, trust amplifies the positive relationship between individualism and innovative actions, particularly at higher trust levels. The paper concludes by highlighting its theoretical contributions and practical implications and suggesting directions for future research.

Keywords: innovation, self-determination theory, trust, team dynamics, allocentrism

Procedia PDF Downloads 60
1482 'CardioCare': A Cutting-Edge Fusion of IoT and Machine Learning to Bridge the Gap in Cardiovascular Risk Management

Authors: Arpit Patil, Atharav Bhagwat, Rajas Bhope, Pramod Bide

Abstract:

This research integrates IoT and ML to predict heart failure risk using the Framingham dataset. IoT devices gather real-time physiological data, focusing on heart rate dynamics, while ML, specifically a random forest, predicts heart failure. Rigorous feature selection enhances accuracy, achieving a prediction rate of over 90%. This amalgamation marks a transformative step in proactive healthcare, highlighting the critical role of early detection in cardiovascular risk mitigation. Challenges persist, necessitating continual refinement for improved predictive capabilities.

Keywords: cardiovascular diseases, internet of things, machine learning, cardiac risk assessment, heart failure prediction, early detection, cardio data analysis

Procedia PDF Downloads 14
1481 Effects of Exercise Training in the Cold on Browning of White Fat in Obese Rats

Authors: Xiquan Weng, Chaoge Wang, Guoqin Xu, Wentao Lin

Abstract:

Objective: Cold exposure and exercise serve as two powerful physiological stimuli for launching the conversion of fat-accumulating white adipose tissue (WAT) into energy-dissipating brown adipose tissue (BAT). So far, it remains to be elucidated whether exercise plus cold exposure produces an additive effect in promoting WAT browning. Methods: 64 SD rats were subjected to high-fat and high-sugar diets for 9 weeks to establish an obesity model. They were randomly divided into groups: normal control (NC), normal exercise (NE), continuous cold control (CC), continuous cold exercise (CE), intermittent cold control (IC) and intermittent cold exercise (IE). For continuous cold exposure, the rats stayed in a cold environment all day; for intermittent cold exposure, the rats were exposed to cold for only 4 h per day. The treadmill exercise protocol was as follows: 25 m/min (speed), 0° (slope), 30 min per bout, a 10 min interval between two bouts, twice every two days, lasting for 5 weeks. Sampling was conducted at the end of the 5th week. The body length and weight of the rats were measured, and Lee's index was calculated. The visceral fat rate (VFR), subcutaneous fat rate (SFR), brown fat rate (BrFR) and body fat rate (BoFR) were measured with a Micro-CT LCT200, and the expression of UCP1 protein in inguinal fat was examined by Western blot. SPSS 22.0 was used for statistical analysis of the experimental results, with ANOVA performed between groups (P < 0.05 considered significant). Results: (1) Compared with the NC group, the weight of obese rats declined significantly in the NE, CE and IE groups (P < 0.05), and Lee's index declined significantly in the CE group (P < 0.05). Compared with the NE group, the weight of obese rats declined significantly in the CE and IE groups (P < 0.05).
(2) Compared with the NC group, the VFR and BoFR of the rats declined significantly in the NE, CE and IE groups (P < 0.05), the SFR declined significantly in the CE and IE groups (P < 0.05), and the BrFR was significantly higher in the CC and IC groups (P < 0.05). Compared with the NE group, the VFR and BoFR declined significantly in the CE group (P < 0.05), the SFR was significantly higher in the CC and IC groups (P < 0.05), and the BrFR was significantly higher in the IC group (P < 0.05). (3) Compared with the NC group, UCP1 protein expression in the inguinal fat of the rats was significantly up-regulated in the NE, CC, CE, IC and IE groups (P < 0.05). Compared with the NE group, UCP1 protein expression in the inguinal fat was significantly up-regulated in the CC, CE and IE groups (P < 0.05). Conclusions: Exercise in continuous and intermittent cold, especially the former, can effectively reduce the weight and body fat rate of obese rats. This is related to the effect of cold and exercise on the browning of white fat in rats.
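The between-group comparisons above rest on the one-way ANOVA F statistic, which can be sketched in a few lines; the example group data below are hypothetical, not the study's measurements.

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across independent groups:
    ratio of between-group to within-group mean squares."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F relative to the F distribution with (k-1, n-k) degrees of freedom yields the small P values (P < 0.05) reported above; SPSS computes the same quantity internally.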

Keywords: cold, browning of white fat, exercise, obesity

Procedia PDF Downloads 134
1480 Design of an Electric Arc Furnace for the Production of Metallurgical Grade Silicon

Authors: M. Barbouche, M. Hajji, H. Ezzaouia

Abstract:

This project is a step towards manufacturing solar-grade silicon. It consists in designing an electric arc furnace to produce metallurgical-grade silicon (Mg-Si) from carbon and high-purity silica. It covers, first, the development of a functional analysis, a mechanical design and a thermodynamic study. Our study also covers the design of the temperature control system and of the electrical diagrams. The furnace works correctly. A LabVIEW interface was developed to control all parameters and to supervise the operation of the furnace. Characterization tests with the X-ray technique and Raman spectroscopy allow us to confirm the production of metallurgical silicon.

Keywords: arc furnace, electrical design, silicon manufacturing, regulation, x-ray characterization

Procedia PDF Downloads 496
1479 Meta-Analysis of Previously Unsolved Cases of Aviation Mishaps Employing Molecular Pathology

Authors: Michael Josef Schwerer

Abstract:

Background: Analyzing any aircraft accident is mandatory based on the regulations of the International Civil Aviation Organization and the respective country’s criminal prosecution authorities. Legal medicine investigations are unavoidable when fatalities involve the flight crew or when doubts arise concerning the pilot’s aeromedical health status before the event. As a result of the frequently tremendous blunt and sharp force trauma accompanying the impact of the aircraft with the ground, consecutive blast or fire exposure of the occupants, or putrefaction of the dead bodies in cases of delayed recovery, relevant findings can be masked or destroyed and are therefore inaccessible in standard pathology practice, which comprises just forensic autopsy and histopathology. Such cases are at considerable risk of remaining unsolved, without legal consequences for those responsible. Further, no lessons can be drawn from these scenarios to improve flight safety and prevent future mishaps. Aims and Methods: To learn from previously unsolved aircraft accidents, re-evaluations of the investigation files and modern molecular pathology studies were performed. Genetic testing involved predominantly PCR-based analysis of gene regulation, studying DNA promoter methylation, RNA transcription and post-transcriptional regulation. In addition, the presence or absence of infective agents, particularly DNA and RNA viruses, was studied. Technical adjustments of molecular genetic procedures were necessary when working with archived sample material. Standards for the proper interpretation of the respective findings had to be settled. Results and Discussion: Additional molecular genetic testing significantly contributes to the quality of forensic pathology assessment in aviation mishaps. Previously undetected cardiotropic viruses potentially explain, e.g., a pilot’s sudden incapacitation resulting from cardiac failure or myocardial arrhythmia. In contrast, negative results for infective agents help rule out concerns about an accident pilot’s fitness to fly and support the aeromedical examiner’s preceding decision to issue him or her an aeromedical certificate. Care must be taken in the interpretation of genetic testing for pre-existing diseases such as hypertrophic cardiomyopathy or ischemic heart disease. Molecular markers such as mRNAs or miRNAs, which can establish these diagnoses in clinical patients, might be misleading in flight crew members because of adaptive changes in their tissues resulting, for instance, from repeated mild hypoxia during flight. Military pilots especially demonstrate significant physiological adjustments to their somatic burdens in flight, such as cardiocirculatory stress and air combat maneuvers. Their non-pathogenic alterations in gene regulation and expression will likely be mistaken for genuine disease by inexperienced investigators. Conclusions: The growing influence of molecular pathology on legal medicine practice has found its way into aircraft accident investigation. Provided that appropriate quality standards for laboratory work and data interpretation are in place, forensic genetic testing supports the medico-legal analysis of aviation mishaps and potentially reduces the number of unsolved events in the future.

Keywords: aviation medicine, aircraft accident investigation, forensic pathology, molecular pathology

Procedia PDF Downloads 47
1478 Machine Learning for Disease Prediction Using Symptoms and X-Ray Images

Authors: Ravija Gunawardana, Banuka Athuraliya

Abstract:

Machine learning has emerged as a powerful tool for disease diagnosis and prediction. The use of machine learning algorithms has the potential to improve the accuracy of disease prediction, thereby enabling medical professionals to provide more effective and personalized treatments. This study focuses on developing a machine-learning model for disease prediction using symptoms and X-ray images. The importance of this study lies in its potential to assist medical professionals in accurately diagnosing diseases, thereby improving patient outcomes. Respiratory diseases are a significant cause of morbidity and mortality worldwide, and chest X-rays are commonly used in the diagnosis of these diseases. However, accurately interpreting X-ray images requires significant expertise and can be time-consuming, making it difficult to diagnose respiratory diseases in a timely manner. By incorporating machine learning algorithms, we can significantly enhance disease prediction accuracy, ultimately leading to better patient care. The study utilized the Mask R-CNN algorithm, which is a state-of-the-art method for object detection and segmentation in images, to process chest X-ray images. The model was trained and tested on a large dataset of patient information, which included both symptom data and X-ray images. The performance of the model was evaluated using a range of metrics, including accuracy, precision, recall, and F1-score. The results showed that the model achieved an accuracy rate of over 90%, indicating that it was able to accurately detect and segment regions of interest in the X-ray images. In addition to X-ray images, the study also incorporated symptoms as input data for disease prediction. The study used three different classifiers, namely Random Forest, K-Nearest Neighbor and Support Vector Machine, to predict diseases based on symptoms. These classifiers were trained and tested using the same dataset of patient information as the X-ray model. 
The results showed promising accuracy rates for predicting diseases using symptoms, with ensemble learning techniques significantly improving the accuracy of disease prediction. The model developed in this study has the potential to assist medical professionals in diagnosing respiratory diseases more accurately and efficiently. However, it is important to note that the accuracy of the model can be affected by several factors, including the quality of the X-ray images, the size of the dataset used for training, and the complexity of the disease being diagnosed. In conclusion, the study demonstrated the potential of machine learning algorithms for disease prediction using symptoms and X-ray images. The use of these algorithms can improve the accuracy of disease diagnosis, ultimately leading to better patient care. Further research is needed to validate the model's accuracy and effectiveness in a clinical setting and to expand its application to other diseases.
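As a minimal sketch of the ensemble step described above — assuming simple hard voting over the three symptom classifiers, since the abstract does not specify the combination rule — the predictions of the Random Forest, K-Nearest Neighbor and Support Vector Machine models could be merged like this (all labels and predictions below are hypothetical):

```python
from collections import Counter

def majority_vote(predictions_per_model):
    """Hard-voting ensemble: combine per-model label predictions by
    majority vote. Ties go to the label seen first in model order."""
    n_samples = len(predictions_per_model[0])
    combined = []
    for i in range(n_samples):
        votes = [preds[i] for preds in predictions_per_model]
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# Hypothetical predictions from the three symptom classifiers on five patients
rf_preds  = ["flu", "covid", "flu", "cold", "covid"]
knn_preds = ["flu", "flu",   "flu", "cold", "covid"]
svm_preds = ["cold", "covid", "flu", "cold", "flu"]
print(majority_vote([rf_preds, knn_preds, svm_preds]))
# ['flu', 'covid', 'flu', 'cold', 'covid']
```

In practice this corresponds to hard voting as offered by libraries such as scikit-learn; soft voting over class probabilities is a common alternative.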

Keywords: K-nearest neighbor, mask R-CNN, random forest, support vector machine

Procedia PDF Downloads 157
1477 Analysis of Spatial and Temporal Data Using Remote Sensing Technology

Authors: Kapil Pandey, Vishnu Goyal

Abstract:

Spatial and temporal data analysis is very well known in the field of satellite image processing. When spatial data are correlated with time series analysis, significant results are obtained in change detection studies. In this paper, GIS and remote sensing techniques have been used to detect changes using time-series satellite imagery of Uttarakhand state during the years 1990-2010. Natural vegetation, urban area, forest cover, etc. were chosen as the main landuse classes to study. Landuse/landcover classes for several years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work; finally, a landuse change index was generated, and graphical models were used to present the changes.
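The maximum likelihood supervised classification mentioned above assigns each pixel to the landuse class whose training statistics make it most probable. A minimal single-band sketch, assuming per-class Gaussian distributions and hypothetical class statistics (real workflows use multi-band means and covariance matrices):

```python
import math

def gaussian_log_likelihood(x, mean, var):
    # Log of the univariate normal density; logs avoid numeric underflow
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def ml_classify(pixel, class_stats):
    """Maximum likelihood classification of one pixel value (single band):
    pick the class whose Gaussian, estimated from training samples,
    gives the highest likelihood."""
    return max(class_stats,
               key=lambda c: gaussian_log_likelihood(pixel, *class_stats[c]))

# Hypothetical single-band statistics (mean, variance) per landuse class
stats = {"forest": (40.0, 25.0), "vegetation": (80.0, 36.0), "urban": (150.0, 100.0)}
print(ml_classify(45.0, stats))   # forest
print(ml_classify(140.0, stats))  # urban
```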

Keywords: GIS, landuse/landcover, spatial and temporal data, remote sensing

Procedia PDF Downloads 433
1476 Diagnosis of Diabetes Using Computer Methods: Soft Computing Methods for Diabetes Detection Using Iris

Authors: Piyush Samant, Ravinder Agarwal

Abstract:

Complementary and Alternative Medicine (CAM) techniques are quite popular and effective for chronic diseases. Iridology is a more than 150-year-old CAM technique that analyzes the patterns, tissue weakness, color, shape, structure, etc. of the iris for disease diagnosis. The objective of this paper is to validate the use of iridology for the diagnosis of diabetes. The suggested model was applied to a systemic disease with ocular effects. Data from 200 subjects, 100 diabetic and 100 non-diabetic, were evaluated. The complete procedure was kept very simple and free from the involvement of any iridologist. From the normalized iris, the region of interest was cropped. In all, 63 features were extracted using statistical methods, texture analysis, and the two-dimensional discrete wavelet transform. A comparison of the accuracies of six different classifiers is presented. The results show 89.66% accuracy with the random forest classifier.
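The wavelet part of the feature extraction can be illustrated with a single-level 2-D Haar transform followed by one simple statistic per sub-band. This is a minimal stand-in under stated assumptions — the abstract names neither the wavelet family nor the exact 63 features:

```python
def haar1d(v):
    # Pairwise averages (low-pass) then pairwise differences (high-pass)
    lo = [(v[i] + v[i + 1]) / 2 for i in range(0, len(v), 2)]
    hi = [(v[i] - v[i + 1]) / 2 for i in range(0, len(v), 2)]
    return lo + hi

def haar2d(block):
    # Transform every row, then every column of the row-transformed block
    rows = [haar1d(list(r)) for r in block]
    cols = [haar1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

def subband_features(block):
    """Mean absolute coefficient of each sub-band (LL, LH, HL, HH) after
    one level of the 2-D Haar transform on an even-sized square block."""
    t = haar2d(block)
    h = len(t) // 2
    bands = {"LL": [t[i][j] for i in range(h) for j in range(h)],
             "LH": [t[i][j] for i in range(h) for j in range(h, len(t))],
             "HL": [t[i][j] for i in range(h, len(t)) for j in range(h)],
             "HH": [t[i][j] for i in range(h, len(t)) for j in range(h, len(t))]}
    return {b: sum(abs(c) for c in cs) / len(cs) for b, cs in bands.items()}

print(subband_features([[10, 12], [14, 16]]))
# {'LL': 13.0, 'LH': 1.0, 'HL': 2.0, 'HH': 0.0}
```

Libraries such as PyWavelets provide the full multi-level 2-D DWT used in studies like this one.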

Keywords: complementary and alternative medicine, classification, iridology, iris, feature extraction, disease prediction

Procedia PDF Downloads 408
1475 Application of Fuzzy Multiple Criteria Decision Making for Flooded Risk Region Selection in Thailand

Authors: Waraporn Wimuktalop

Abstract:

This research selects regions which are vulnerable to flooding at different levels. Mathematical principles are systematically and rationally utilized as a tool to solve the region-selection problem. The method chosen is Multiple Criteria Decision Making (MCDM), with two analysis techniques: TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) and AHP (Analytic Hierarchy Process). Three criteria are considered in this research. The first criterion is climate, namely rainfall. The second is geography, namely height above mean sea level. The last is land utilization, covering both forest and agricultural use. The study found that the South has the highest risk of flooding, followed by the East, the Centre, the North-East, the West and the North, respectively.
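The TOPSIS step can be sketched as follows, with AHP assumed only to supply the criterion weights; the regions, criterion values, and weights below are hypothetical, not the study's data:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives (regions) by closeness to the ideal solution.
    matrix:  rows = regions, columns = criterion values
    weights: one weight per criterion (e.g. derived with AHP)
    benefit: True if larger values mean higher flood risk for that criterion
    """
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((x - a) ** 2 for x, a in zip(row, ideal)))
        d_neg = math.sqrt(sum((x - b) ** 2 for x, b in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))  # 1 = riskiest, 0 = least risky
    return scores

# Hypothetical regions scored on rainfall (mm, more = riskier) and mean
# elevation above sea level (m, more = safer)
regions = [[1800.0, 20.0], [1200.0, 150.0], [900.0, 600.0]]
print(topsis(regions, [0.6, 0.4], [True, False]))  # descending flood risk
```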

Keywords: multiple criteria decision making, TOPSIS, analytic hierarchy process, flooding

Procedia PDF Downloads 236
1474 Comprehensive Longitudinal Multi-omic Profiling in Weight Gain and Insulin Resistance

Authors: Christine Y. Yeh, Brian D. Piening, Sarah M. Totten, Kimberly Kukurba, Wenyu Zhou, Kevin P. F. Contrepois, Gucci J. Gu, Sharon Pitteri, Michael Snyder

Abstract:

Three million deaths worldwide are attributed to obesity. However, the biomolecular mechanisms that describe the link between adiposity and subsequent disease states are poorly understood. Insulin resistance characterizes approximately half of obese individuals and is a major cause of obesity-mediated diseases such as Type II diabetes, hypertension and other cardiovascular diseases. This study makes use of longitudinal, quantitative, high-throughput multi-omics (genomics, epigenomics, transcriptomics, glycoproteomics, etc.) methodologies on blood samples to develop multigenic and multi-analyte signatures associated with weight gain and insulin resistance. Participants of this study underwent a 30-day period of weight gain via excessive caloric intake followed by a 60-day period of restricted dieting and return to baseline weight. Blood samples were taken at three different time points per patient: baseline, peak weight and post weight loss. Patients were characterized as either insulin resistant (IR) or insulin sensitive (IS) before having their samples processed via longitudinal multi-omic technologies. This comparative study revealed a wealth of biomolecular changes associated with weight gain, identified using methods from machine learning, clustering, network analysis, etc. Pathways of interest included those involved in lipid remodeling, acute inflammatory response and glucose metabolism. Some of these biomolecules returned to baseline levels as the patients returned to normal weight, whilst some remained elevated. IR patients exhibited key differences in inflammatory response regulation in comparison to IS patients at all time points. These signatures suggest differential metabolism and inflammatory pathways between IR and IS patients. Biomolecular differences associated with weight gain and insulin resistance were identified on various levels: in gene expression, epigenetic change, transcriptional regulation and glycosylation. This study not only contributed new biology that could be of use in preventing or predicting obesity-mediated diseases, but also matured novel biomedical informatics technologies to produce and process data on many comprehensive omics levels.

Keywords: insulin resistance, multi-omics, next generation sequencing, proteogenomics, type ii diabetes

Procedia PDF Downloads 429
1473 DeepNIC a Method to Transform Each Tabular Variable into an Independant Image Analyzable by Basic CNNs

Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.

Abstract:

Introduction: Deep Learning (DL) is a very powerful tool for analyzing image data. But for tabular data, it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (Convolutional Neural Networks)? Will DL be the absolute tool for data classification? All current solutions consist of repositioning the variables in a 2-D matrix using their correlation proximity. In doing so, one obtains an image whose pixels are the variables. We implement a technology, DeepNIC, that offers the possibility of obtaining an image for each variable, which can be analyzed by simple CNNs. Materials and methods: The 'ROP' (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision trees, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in 3 dimensions: performance, complexity and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on 2 super-parameters used in the Neurops. By varying these 2 super-parameters, we obtain a 2-D matrix of probabilities for each NIC. We can combine these 10 NICs with the functions AND, OR, and XOR. The total number of combinations is greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels. The intensity of the pixels is proportional to the probability of the associated NIC, and the color depends on the associated NIC. This image actually contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the public GSE22513 data (an omics dataset of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison across several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format. This opens up great perspectives in the analysis of metadata.
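The mapping from NIC probabilities to pixels can be illustrated as follows; since the NIC computation itself is specific to DeepNIC, this sketch only shows the generic step of turning probability matrices into grey levels and tiling them into one per-variable image:

```python
def nic_to_grey(prob_matrix):
    """Map a matrix of NIC probabilities (values in [0, 1]) to 8-bit grey
    levels: pixel intensity proportional to the NIC probability."""
    return [[round(p * 255) for p in row] for row in prob_matrix]

def tile_horizontally(images):
    """Concatenate equally sized grey images side by side, one per NIC
    (or NIC combination), into a single per-variable image for the CNN."""
    return [sum((img[r] for img in images), []) for r in range(len(images[0]))]

print(nic_to_grey([[0.0, 0.5], [1.0, 0.2]]))
# [[0, 128], [255, 51]]
```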

Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification

Procedia PDF Downloads 128
1472 Market Solvency Capital Requirement Minimization: How Non-linear Solvers Provide Portfolios Complying with Solvency II Regulation

Authors: Abraham Castellanos, Christophe Durville, Sophie Echenim

Abstract:

In this article, a portfolio optimization problem is solved in a Solvency II context: it illustrates how advanced optimization techniques can help to tackle complex operational pain points around the monitoring, control, and stability of the Solvency Capital Requirement (SCR). The market SCR of a portfolio is calculated as a combination of SCR sub-modules. These sub-modules are the results of stress tests on interest rate, equity, property, credit and FX factors, as well as concentration on counterparties. The market SCR is non-convex and non-differentiable, which does not make it a natural candidate optimization criterion. In the SCR formulation, correlations between sub-modules are fixed, whereas risk-driven portfolio allocation is usually driven by the dynamics of the actual correlations. Implementing a portfolio construction approach that is efficient from both a regulatory and an economic standpoint is not straightforward. Moreover, the challenge for insurance portfolio managers is not only to achieve a minimal SCR to reduce non-invested capital but also to ensure the stability of the SCR. Some optimizations have already been performed in the literature, simplifying the standard formula into a quadratic function. But to our knowledge, this is the first time that the standard formula of the market SCR has been used in an optimization problem. Two solvers are combined: a bundle algorithm for convex non-differentiable problems, and a BFGS (Broyden-Fletcher-Goldfarb-Shanno)-SQP (Sequential Quadratic Programming) algorithm to cope with non-convex cases. A market SCR minimization is then performed with historical data. This approach results in a significant reduction of the capital requirement, compared to a classical Markowitz approach based on historical volatility. A comparative analysis of different optimization models (equi-risk-contribution portfolio, minimum-volatility portfolio and minimum-value-at-risk portfolio) is performed, and the impact of these strategies on risk measures, including the market SCR and its sub-modules, is evaluated. A lack of diversification of the market SCR is observed, especially for equities. This was expected, since the market SCR strongly penalizes this type of financial instrument. It was shown that this direct effect of the regulation can be attenuated by implementing constraints in the optimization process or by minimizing the market SCR together with the historical volatility, demonstrating the value of a portfolio construction approach that can incorporate such features. The present results are further explained by the market SCR modelling.
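The "combination of SCR sub-modules" referred to above is the standard-formula aggregation, which combines the sub-module SCRs through a fixed correlation matrix: SCR_mkt = sqrt(sum_ij Corr_ij · SCR_i · SCR_j). A minimal sketch with illustrative figures (not the official Solvency II correlation parameters):

```python
import math

def aggregate_scr(sub_scrs, corr):
    """Standard-formula aggregation of sub-module SCRs:
    SCR_mkt = sqrt(sum_ij corr[i][j] * SCR_i * SCR_j)."""
    n = len(sub_scrs)
    total = sum(corr[i][j] * sub_scrs[i] * sub_scrs[j]
                for i in range(n) for j in range(n))
    return math.sqrt(total)

# Two sub-modules (say, interest rate and equity) with illustrative SCRs
# of 30 and 40 and an assumed 0.5 correlation
scrs = [30.0, 40.0]
corr = [[1.0, 0.5], [0.5, 1.0]]
print(aggregate_scr(scrs, corr))  # ~60.83, below the 70.0 simple sum
```

The square root over the correlation-weighted quadratic form is what makes the criterion non-linear in the sub-module SCRs, which are themselves non-smooth functions of the allocation — hence the need for the non-differentiable solvers described above.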

Keywords: financial risk, numerical optimization, portfolio management, solvency capital requirement

Procedia PDF Downloads 117
1471 Cross-border Data Transfers to and from South Africa

Authors: Amy Gooden, Meshandren Naidoo

Abstract:

Genetic research and transfers of big data are not confined to a particular jurisdiction, but there is a lack of clarity regarding the legal requirements for importing and exporting such data. Using direct-to-consumer genetic testing (DTC-GT) as an example, this research assesses the status of data sharing into and out of South Africa (SA). While SA laws cover the sending of genetic data out of SA, prohibiting such transfer unless a legal ground exists, the position where genetic data comes into the country depends on the laws of the country from where it is sent – making the legal position less clear.

Keywords: cross-border, data, genetic testing, law, regulation, research, sharing, South Africa

Procedia PDF Downloads 127
1470 XAI Implemented Prognostic Framework: Condition Monitoring and Alert System Based on RUL and Sensory Data

Authors: Faruk Ozdemir, Roy Kalawsky, Peter Hubbard

Abstract:

Accurate estimation of the remaining useful life (RUL) provides a basis for effective predictive maintenance, reducing unexpected downtime for industrial equipment. However, while models such as the Random Forest have effective predictive capabilities, they are so-called 'black box' models, whose limited interpretability is an obstacle to the critical diagnostic decisions involved in industries such as aviation. The purpose of this work is to present a prognostic framework that embeds Explainable Artificial Intelligence (XAI) techniques in order to provide essential transparency into the decision-making mechanisms of machine learning methods based on sensor data, with the objective of procuring actionable insights for the aviation industry. Sensor readings have been gathered from critical equipment such as a turbofan jet engine and landing gear, and the prediction of the RUL is done by a Random Forest model. The work involves steps such as data gathering, feature engineering, model training, and evaluation. Models are independently trained and evaluated on each critical component's dataset. While the predictions served are suitable and their performance metrics reasonably good, such complex models obscure the reasoning behind their predictions and may even undermine the confidence of the decision-maker or the maintenance teams. This is followed in the second phase by global explanations using SHAP and local explanations using LIME to bridge the gap in reliability within industrial contexts. These tools analyze model decisions, highlighting feature importance and explaining how each input variable affects the output. This dual approach offers a general comprehension of the overall model behavior and detailed insight into specific predictions. The proposed framework, in its third component, incorporates causal analysis in the form of Granger causality tests in order to move beyond correlation toward causation. This not only allows the model to predict failures but also to present reasons, from the key sensor features linked to possible failure mechanisms, to the relevant personnel. Establishing causality between sensor behaviors and equipment failures creates much value for maintenance teams through better root-cause identification and effective preventive measures. This step makes the system more explainable. Surrogate modeling: several simple models, including Decision Trees and Linear Models, are used in yet another stage to approximately represent the complex Random Forest model. These simpler models act as backups, replicating the important aspects of the original model's behavior. If the feature explanations obtained from the surrogate model are cross-validated with the primary model, the insights derived are more reliable and provide an intuitive sense of how the input variables affect the predictions. We then create an iterative, explainable feedback loop, where the knowledge learned from the explainability methods feeds back into the training of the models. This creates a cycle of continuous improvement in both model accuracy and interpretability over time. By systematically integrating new findings, the model is expected to adapt to changed conditions and further develop its prognostic capability. These components are then presented to decision-makers through the development of a fully transparent condition monitoring and alert system. The system provides a holistic tool for maintenance operations by leveraging RUL predictions, feature importance scores, persistent sensor threshold values, and autonomous alert mechanisms. Since the system provides explanations for its predictions, along with active alerts, maintenance personnel can make informed decisions regarding the correct interventions to extend the life of critical machinery.
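The surrogate stage can be illustrated with the simplest possible case: fitting a one-feature decision stump to mimic the primary Random Forest's failure predictions and measuring its fidelity (agreement rate with the primary model). The sensor values and predictions below are hypothetical:

```python
def fit_stump_surrogate(feature, primary_preds):
    """Fit a one-feature decision stump (predict 1 when feature >= t) that
    best mimics the primary model's binary failure predictions.
    Returns (threshold, fidelity); fidelity = agreement rate with primary."""
    best_t, best_fid = None, 0.0
    for t in sorted(set(feature)):
        preds = [1 if x >= t else 0 for x in feature]
        fid = sum(p == q for p, q in zip(preds, primary_preds)) / len(preds)
        if fid > best_fid:
            best_t, best_fid = t, fid
    return best_t, best_fid

# Hypothetical vibration readings and Random Forest failure predictions
vibration = [0.2, 0.3, 0.5, 0.7, 0.9, 1.1]
rf_failure = [0, 0, 0, 1, 1, 1]
print(fit_stump_surrogate(vibration, rf_failure))  # (0.7, 1.0)
```

High fidelity on held-out data is what justifies reading the surrogate's structure (here, a single threshold) as an explanation of the primary model.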

Keywords: predictive maintenance, explainable artificial intelligence, prognostic, RUL, machine learning, turbofan engines, C-MAPSS dataset

Procedia PDF Downloads 9
1469 Strategic Environmental Assessment and Climate Change: From European Experiences to Brazilian Needs

Authors: Amália S. Botter Fabbri

Abstract:

This paper analyzes the Strategic Environmental Assessment (SEA) in relation to the three pillars of sustainable development, highlighting its particular importance in combating climate change. Theoretical and practical examples from Europe show how SEA has been implemented under the SEA Directive in recent years, while the Brazilian case study shows a situation in which no regulation on SEA has been implemented, despite the strong demand for it revealed by past experiences and future planning needs. In the end, some aspects of the formulation of a SEA Act are suggested, in an attempt to contribute to better Brazilian environmental governance in relation to the future plans, programmes and policies required for the reduction of greenhouse gas emissions.

Keywords: Brazil, climate change, Europe, strategic environmental assessment

Procedia PDF Downloads 270
1468 Transformable Lightweight Structures for Short-term Stay

Authors: Anna Daskalaki, Andreas Ashikalis

Abstract:

This is a conceptual project that suggests an alternative type of summer camp in the Rouvas forest on the island of Crete. Taking into account the feasts organised by locals or mountaineering clubs near the church of St. John, we created a network of lightweight timber structures that serve the needs of the visitor. These structures are transformable and satisfy the needs for rest, food, and sleep: a seat, a table and a tent are embodied in each structure. The structures blend in with the environment, as they are installed according to the following parameters: (a) the local relief, (b) the clusters of trees, and (c) the existing paths. Each timber structure can be considered a module that is either totally independent or part of a bigger construction. The design showcases the advantages of a timber structure, as it can be quite adaptive to the needs of the project; timber is also a sustainable and environmentally friendly material that can be recycled. Finally, it is important to note that the basic goal of this project is minimal alteration of the natural environment.

Keywords: lightweight structures, timber, transformable, tent

Procedia PDF Downloads 171