Search results for: europium dinuclear complex


1983 Evaluation of Model-Based Code Generation for Embedded Systems – Mature Approach for Development in Evolution

Authors: Nikolay P. Brayanov, Anna V. Stoynova

Abstract:

The model-based development approach is gaining support and acceptance. Its higher abstraction level simplifies the description of systems, allowing domain experts to do their best work without particular knowledge of programming. The different levels of simulation support rapid prototyping, verifying and validating the product even before it exists physically. Nowadays, the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, which brings extra automation to the expensive device certification process and especially to software qualification. Some companies using it report cost savings and quality improvements, while others claim no major changes or even cost increases. This publication assesses the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using tools from The MathWorks, Inc. The model, created with Simulink, Stateflow and MATLAB, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the automatically generated embedded code with that of a manually developed one. The measurements show that, generally, the code generated by the automatic approach is no worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.

Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development

Procedia PDF Downloads 242
1982 Facilitating Active Reading Strategies through CAPS Chart to Foster Elementary EFL Learners’ Reading Skills and Reading Competency

Authors: Michelle Bulawan, Mei-Hua Chen

Abstract:

Reading comprehension is crucial for acquiring information, analyzing critically, and achieving academic proficiency. However, there is a lack of growth in reading comprehension skills beyond fourth grade, the stage at which the developmental shift from "learning to read" to "reading to learn" occurs. Factual knowledge and diverse views in articles enhance reading comprehension abilities. Nevertheless, some students face difficulties due to evolving textual requirements, such as expanding vocabulary and longer, more complex terminology. Most research on reading strategies has been conducted at the tertiary and secondary levels, while few studies have focused on the elementary level. Furthermore, the use of character, ask, problem, solution (CAPS) charts in teaching reading has hardly been explored. Thus, the researcher decided to explore the facilitation of active reading strategies through the CAPS chart and address the following research questions: a) What differences existed in the reading competency of elementary EFL learners who engaged in active reading strategies and those who did not? b) What metacognitive skills do learners who engage in active reading strategies show compared with those who do not, and what are their effects on reading competency? c) For those participants who engage in active reading activities, what are their perceptions of incorporating these activities into their English classroom learning? Two groups of elementary EFL learners, each with 18 students of the same level of English proficiency, participated in this study. Group A served as the control group, while Group B served as the experimental group. Two teachers also participated in this research; one of them was the researcher, who handled the experimental group. The treatment lasted one whole semester, or seventeen weeks. In addition to the CAPS chart, the researcher also used the Metacognitive Awareness of Reading Strategies Inventory (MARSI) and a ten-item, five-point Likert scale survey.

Keywords: active reading, EFL learners, metacognitive skills, reading competency, student’s perception

Procedia PDF Downloads 89
1981 Deubiquitinase USP35 Regulates Mitosis Progression by Blocking CDH1-Mediated Degradation of Aurora B

Authors: Jinyoung Park, Eun Joo Song

Abstract:

Introduction: Deubiquitinating enzymes (DUBs) are proteases that cleave ubiquitin or ubiquitin-like modifications from substrates. Deubiquitination regulates cellular physiology, such as signal transduction, DNA damage and repair, and cell cycle progression. Although more than 100 DUBs are encoded in the human genome and their importance has been recognized, the functions of most DUBs remain unknown. This study aims to identify, for the first time, the molecular mechanism by which the deubiquitinating enzyme USP35 regulates cell cycle progression. Methods: USP35 RNAi was mainly used to identify the function of USP35 in cell cycle progression. To find substrates of USP35, we analyzed protein-protein interactions using LC-MS. Several biological methods, such as ubiquitination assays, cell synchronization, immunofluorescence, and immunoprecipitation assays, were used to investigate the exact mechanism by which USP35 affects successful completion of mitosis. Results: USP35 knockdown caused not only a reduction in mitotic cell number but also an induction of mitotic cells with abnormal spindle formation. Accordingly, cell proliferation was decreased by USP35 knockdown. Interestingly, we found that loss of USP35 decreased the stability and expression of Aurora B, a member of the chromosomal passenger complex (CPC), and the phosphorylation of its substrate. Indeed, USP35 interacted with Aurora B and deubiquitinated it. In addition, USP35 knockdown induced abnormal localization of Aurora B in mitotic cells. Finally, CDH1-mediated ubiquitination of Aurora B was rescued by overexpression of USP35, but not of the inactive form, USP35 C450A. Discussion: Our findings suggest that USP35 regulates Aurora B-mediated mitotic spindle assembly and the G2-M transition by blocking CDH1-induced degradation of Aurora B.

Keywords: USP35, HSP90, Aurora B, cell cycle progression

Procedia PDF Downloads 356
1980 Advancements in Autonomous Drones for Enhanced Healthcare Logistics

Authors: Bhaargav Gupta P., Vignesh N., Nithish Kumar R., Rahul J., Nivetha Ruvah D.

Abstract:

Delivering essential medical supplies to rural and underserved areas is challenging due to infrastructure limitations and logistical barriers, often resulting in inefficiencies and delays. Traditional delivery methods are hindered by poor road networks, long distances, and difficult terrains, compromising timely access to vital resources, especially in emergencies. This paper introduces an autonomous drone system engineered to optimize last-mile delivery. By utilizing advanced navigation and object-detection algorithms, such as region-based convolutional neural networks (R-CNN), our drones efficiently avoid obstacles, identify safe landing zones, and adapt dynamically to varying environments. Equipped with high-precision GPS and autonomous capabilities, the drones effectively navigate complex, remote areas with minimal dependence on established infrastructure. The system includes a dedicated mobile application for secure order placement and real-time tracking, and a secure payload box with OTP verification ensures tamper-resistant delivery to authorized recipients. This project demonstrates the potential of automated drone technology in healthcare logistics, offering a scalable and eco-friendly approach to enhance accessibility and service delivery in underserved regions. By addressing logistical gaps through advanced automation, this system represents a significant advancement toward sustainable, accessible healthcare in remote areas.
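
To make the detection step concrete, here is a minimal sketch of region-based object detection of the kind the abstract describes, using torchvision's pretrained Faster R-CNN. The model choice, image path, and score threshold are illustrative assumptions, not the authors' implementation:

```python
# Minimal R-CNN-family inference sketch (assumed setup, not the paper's code).
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = to_tensor(Image.open("landing_zone.jpg").convert("RGB"))  # hypothetical drone frame
with torch.no_grad():
    prediction = model([image])[0]  # dict with 'boxes', 'labels', 'scores'

# Keep only confident detections, e.g., to flag obstacles in a candidate landing zone.
keep = prediction["scores"] > 0.8
print(prediction["boxes"][keep], prediction["labels"][keep])
```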

Keywords: region-based convolutional neural network, one time password, global positioning system, autonomous drones, healthcare logistics

Procedia PDF Downloads 0
1979 Analysis of Pressure Drop in a Concentrated Solar Collector with Direct Steam Production

Authors: Sara Sallam, Mohamed Taqi, Naoual Belouaggadia

Abstract:

Solar thermal power plants using parabolic trough collectors (PTC) are currently a powerful technology for generating electricity. Most of these solar power plants use thermal oils as the heat transfer fluid. The oil is heated in the solar field and transfers the absorbed heat to an oil-water heat exchanger for the production of the steam driving the turbines of the power plant. Currently, we are seeking to develop PTCs with direct steam generation (DSG). This process consists of circulating water under pressure in the receiver tube to generate steam directly in the solar loop. This makes it possible to reduce the investment and maintenance costs of the PTCs (the oil-water exchangers are removed) and to avoid the environmental risks associated with the use of thermal oils. The pressure drops in these systems are an important parameter for ensuring their proper operation. Determining these losses is complex because of the presence of the two phases, and most often they are described by models using empirical correlations. A comparison of these models with experimental data was performed. Our calculations focused on the evolution of the pressure of the liquid-vapor mixture along the receiver tube of a PTC-DSG for pressure values and inlet flow rates ranging respectively from 3 to 10 MPa and from 0.4 to 0.6 kg/s. The comparison of the numerical results with experimental data allows us to assess the validity of some models according to the pressures and flow rates at the inlet of the PTC-DSG receiver tube. The analysis of the effects of these two parameters on the evolution of the pressure along the receiver tube shows that increasing the inlet pressure and decreasing the flow rate lead to minimal pressure losses.
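
For context, the empirical models compared in such studies are typically variants of the standard two-phase pressure-gradient decomposition; in its homogeneous form (a textbook formulation, not necessarily the exact correlations used in the paper) it reads:

$$ -\frac{dp}{dz} = \underbrace{\frac{2 f_{tp}\, G^{2}}{D\, \rho_{tp}}}_{\text{friction}} + \underbrace{\rho_{tp}\, g \sin\theta}_{\text{gravity}} + \underbrace{G^{2}\, \frac{d}{dz}\!\left(\frac{1}{\rho_{tp}}\right)}_{\text{acceleration}}, \qquad \frac{1}{\rho_{tp}} = \frac{x}{\rho_{v}} + \frac{1-x}{\rho_{l}} $$

where G is the mass flux, D the tube diameter, x the vapor quality, f_tp the two-phase friction factor, and ρ_v, ρ_l the vapor and liquid densities. The empirical correlations differ mainly in how f_tp and the mixture density are evaluated.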

Keywords: direct steam generation, parabolic trough collectors, pressure drop, empirical models

Procedia PDF Downloads 138
1978 TAXAPRO, A Streamlined Pipeline to Analyze Shotgun Metagenomes

Authors: Sofia Sehli, Zainab El Ouafi, Casey Eddington, Soumaya Jbara, Kasambula Arthur Shem, Islam El Jaddaoui, Ayorinde Afolayan, Olaitan I. Awe, Allissa Dillman, Hassan Ghazal

Abstract:

The ability to promptly sequence whole genomes at a relatively low cost has revolutionized the way we study the microbiome. Microbiologists are no longer limited to studying what can be grown in a laboratory and instead are given the opportunity to rapidly identify the makeup of microbial communities in a wide variety of environments. Analyzing whole genome sequencing (WGS) data is a complex process that involves multiple moving parts and might be rather unintuitive for scientists who don't typically work with this type of data. Thus, to help lower the barrier for less computationally inclined individuals, TAXAPRO was developed at the first Omics Codeathon, held virtually by the African Society for Bioinformatics and Computational Biology (ASBCB) in June 2021. TAXAPRO is an advanced metagenomics pipeline that seamlessly combines WGS analysis tools to automatically process raw WGS data and present organism abundance information in both a tabular and a graphical format. TAXAPRO was evaluated using COVID-19 patient gut microbiome data. Analysis performed by TAXAPRO demonstrated a high abundance of Clostridia and Bacteroidia and a low abundance of Proteobacteria relative to others in the gut microbiome of patients hospitalized with COVID-19, consistent with the original findings derived using a different analysis methodology. This provides crucial evidence that the TAXAPRO workflow delivers reliable organism abundance information overnight without the hassle of performing the analysis manually.
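
As a small sketch of the tabular/graphical output stage described above, the following turns per-taxon read counts into relative abundances and a bar chart; the file name and column names are hypothetical stand-ins, not TAXAPRO's actual interface:

```python
# Hypothetical abundance-table post-processing (not TAXAPRO's actual code).
import pandas as pd
import matplotlib.pyplot as plt

counts = pd.read_csv("taxon_counts.tsv", sep="\t", index_col="taxon")  # hypothetical file
rel_abundance = counts["reads"] / counts["reads"].sum()

table = rel_abundance.sort_values(ascending=False)
print(table.head(10))                                   # tabular format
table.head(10).plot.bar(ylabel="relative abundance")    # graphical format
plt.tight_layout()
plt.savefig("abundance.png")
```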

Keywords: metagenomics, shotgun metagenomic sequence analysis, COVID-19, pipeline, bioinformatics

Procedia PDF Downloads 217
1977 Ontology-Based Fault Detection and Diagnosis System: Querying and Reasoning Examples

Authors: Marko Batic, Nikola Tomasevic, Sanja Vranes

Abstract:

One of the strongholds in the ubiquitous efforts related to energy conservation and energy efficiency improvement is the retrofit of high energy consumers in buildings. In general, HVAC systems represent the highest energy consumers in buildings. However, they usually suffer from mal-operation and/or malfunction, causing even higher energy consumption than necessary. Various Fault Detection and Diagnosis (FDD) systems can be successfully employed for this purpose, especially when it comes to application at a single device/unit level. In the case of more complex systems, where multiple devices operate in the context of the same building, significant energy efficiency improvements can only be achieved through the application of comprehensive FDD systems relying on additional higher-level knowledge, such as the devices' geographical location, served area, and their intra- and inter-system dependencies. This paper presents a comprehensive FDD system that relies on a common knowledge repository storing all critical information. The discussed system is deployed as a test-bed platform at the Fiumicino and Malpensa airports in Italy. This paper presents the advantages of implementing the knowledge base through the utilization of an ontology and demonstrates the improved functionality of such a system through examples of typical queries and reasoning that enable the derivation of high-level energy conservation measures (ECM). Therefore, key SPARQL queries and SWRL rules, based on the two instantiated airport ontologies, are elaborated. The detection of high-level irregularities in the operation of airport heating/cooling plants is discussed, and an estimation of energy savings is reported.
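
To make the querying concrete, a query of the general shape described above can be run with a library such as rdflib; the ontology file, namespace, and class/property names below are hypothetical stand-ins for the actual airport ontologies:

```python
# Illustrative SPARQL query over a hypothetical airport FDD ontology.
from rdflib import Graph

g = Graph()
g.parse("airport_ontology.ttl", format="turtle")  # hypothetical instantiated ontology

# Find pairs of HVAC units serving the same area, a typical precondition
# for detecting redundant simultaneous heating/cooling.
query = """
PREFIX fdd: <http://example.org/airport-fdd#>
SELECT ?unitA ?unitB ?area WHERE {
    ?unitA fdd:serves ?area .
    ?unitB fdd:serves ?area .
    FILTER (?unitA != ?unitB)
}
"""
for row in g.query(query):
    print(row.unitA, row.unitB, row.area)
```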

Keywords: airport ontology, knowledge management, ontology modeling, reasoning

Procedia PDF Downloads 536
1976 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0

Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini

Abstract:

Nowadays, companies face many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which can create challenges around data management and governance. Furthermore, companies are also challenged to integrate data from multiple systems and technologies. Despite these pains, companies are still pursuing digitalization, because by embracing advanced technologies they can improve efficiency, quality, decision-making, and customer experience while also creating different business models and revenue streams. This paper focuses on the issue that data is stored in data silos with different schemas and structures. The conventional approaches to addressing this issue involve data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily focus on the grammar and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. In this work, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models, an efficient standard for communication in Industry 4.0. The paper highlights how our approach can facilitate the data mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates the Asset Administration Shell technology to model and map the company’s data and utilizes a knowledge graph for data storage and exploration.

Keywords: data interoperability in Industry 4.0, digital integration, industrial dictionary, semantic modeling

Procedia PDF Downloads 93
1975 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structure restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data from the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it closely represents real-world data and is voluminous. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually performing the node classification using those embeddings. The model predicts patient conditions ranging from common to rare situations. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
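
As a rough illustration of the node-classification stage described above, here is a minimal PyTorch Geometric sketch; the graph is a random stand-in for the EHR graph exported from TigerGraph, and the feature sizes and labels are invented:

```python
# Minimal GNN node-classification sketch (synthetic stand-in for the EHR graph).
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

num_nodes, num_features, num_conditions = 1000, 16, 5
x = torch.randn(num_nodes, num_features)              # tensorized node features
edge_index = torch.randint(0, num_nodes, (2, 4000))   # tensorized edges
y = torch.randint(0, num_conditions, (num_nodes,))    # condition labels per node
data = Data(x=x, edge_index=edge_index, y=y)

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(num_features, 32)
        self.conv2 = GCNConv(32, num_conditions)

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        return self.conv2(h, data.edge_index)

model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
for epoch in range(100):
    optimizer.zero_grad()
    loss = F.cross_entropy(model(data), data.y)
    loss.backward()
    optimizer.step()
print("predicted conditions:", model(data).argmax(dim=1)[:10])
```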

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 85
1974 An Ensemble System of Classifiers for Computer-Aided Volcano Monitoring

Authors: Flavio Cannavo

Abstract:

Continuous evaluation of the status of potentially hazardous volcanoes plays a key role for civil protection purposes. Monitoring volcanic activity, especially energetic paroxysms that usually come with tephra emissions, is crucial not only because of the exposure of the local population but also for airline traffic. Presently, real-time surveillance of most volcanoes worldwide is essentially delegated to one or more human experts in volcanology, who interpret data coming from different kinds of monitoring networks. Unfortunately, the high nonlinearity of the complex and coupled volcanic dynamics leads to a large variety of different volcanic behaviors. Moreover, continuously measured parameters (e.g. seismic, deformation, infrasonic and geochemical signals) are often not able to fully explain the ongoing phenomenon, making fast volcano state assessment a very puzzling task for the personnel on duty in the control rooms. With the aim of aiding the personnel on duty in volcano surveillance, here we introduce a system based on an ensemble of data-driven classifiers to infer the ongoing volcano status automatically from all the available measurements. The system consists of a heterogeneous set of independent classifiers, each one built with its own data and algorithm. Each classifier gives an output about the volcanic status. The ensemble technique weights the single classifier outputs to combine all the classifications into a single status that maximizes the performance. We tested the model on the Mt. Etna (Italy) case study by considering a long record of multivariate data from 2011 to 2015 and cross-validated it. Results indicate that the proposed model is effective and of great value for decision-making purposes.
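
A toy version of such a weighted heterogeneous ensemble is sketched below: independent classifiers, each trained on its own data stream, combined by performance-weighted soft voting. The feature streams, labels, and weighting scheme are synthetic illustrations, not the paper's actual system:

```python
# Performance-weighted ensemble sketch on synthetic monitoring streams.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X_seismic = rng.normal(size=(500, 4))   # stand-in for seismic/deformation features
X_infra = rng.normal(size=(500, 3))     # stand-in for infrasonic features
y = rng.integers(0, 2, 500)             # volcano status: 0 = quiet, 1 = paroxysm

classifiers = [
    (RandomForestClassifier(n_estimators=100, random_state=0), X_seismic),
    (LogisticRegression(max_iter=1000), X_infra),
    (GaussianNB(), X_seismic),
]

# Weight each classifier by its cross-validated accuracy, then combine
# predicted probabilities into a single status estimate.
weights, probas = [], []
for clf, X in classifiers:
    weights.append(cross_val_score(clf, X, y, cv=5).mean())
    probas.append(clf.fit(X, y).predict_proba(X))
weights = np.array(weights) / np.sum(weights)
ensemble_proba = sum(w * p for w, p in zip(weights, probas))
print("ensemble status:", ensemble_proba.argmax(axis=1)[:10])
```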

Keywords: Bayesian networks, expert system, Mount Etna, volcano monitoring

Procedia PDF Downloads 246
1973 Consumer Preferences towards Sorbets: A Questionnaire Study

Authors: Kinga Topolska, Agnieszka Filipiak-Florkiewicz, Adam Florkiewicz, Daria Chechelska, Iwona Cieślik, Ewa Cieślik

Abstract:

Food choice is a complex human behaviour that is influenced by many interrelated factors. It is important to understand what consumers really want to eat. Nowadays, a growing popularity of frozen desserts is observed. Among them, sorbets are of special interest. They are made primarily of fruit juice or fruit purée, water and sugar. A questionnaire study was done to evaluate consumer preferences towards sorbets. A hundred respondents were included in the study. The respondents answered questions concerning, inter alia, their favourite taste of sorbet, additional ingredients (pieces of fruit, nuts etc.), the reason for choosing the product, and their opinion about potentially purchasing or not purchasing a new product. Women, more frequently than men, indicated strawberry sorbet or a product based on citrus fruits as their favourite. In turn, 14% of men chose the apple taste. Pieces of chocolate were chosen by most respondents. Men, more often than women, regarded raisins, alcohol and nuts as the most desirable additional ingredients of sorbets. Candied fruits and spices were indicated more frequently by women. Most respondents indicated taste as the major reason for buying sorbet. In turn, for 20% of women the most important determinant was care for their figure. It was observed that more than half of the women regarded sorbets as healthier than traditional ice cream. Answering the question 'If you had the opportunity to try a new sorbet, containing an ingredient with proven healthy properties, would you buy it?', significantly more men than women answered 'yes, because I like novelty'. Meanwhile, for 14% of respondents (independently of gender) it would be only a publicity stunt. Knowing what consumers desire when selecting a product is very important information for designing and offering them a new one. Sorbets could be an interesting alternative to ice cream.

Keywords: consumer, preferences, sorbets, questionnaire study

Procedia PDF Downloads 284
1972 Binding Mechanism of Synthesized 5β-Dihydrocortisol and 5β-Dihydrocortisol Acetate with Human Serum Albumin to Understand Their Role in Breast Cancer

Authors: Monika Kallubai, Shreya Dubey, Rajagopal Subramanyam

Abstract:

Our study concerns the biological interactions of synthesized 5β-dihydrocortisol (Dhc) and 5β-dihydrocortisol acetate (DhcA) molecules with the carrier protein Human Serum Albumin (HSA). The cytotoxicity study was performed on a breast cancer cell line (MCF-7) and a normal human embryonic kidney cell line (HEK293); the IC50 values for MCF-7 cells were 28 and 25 µM, respectively, whereas no toxicity in terms of cell viability was observed with the HEK293 cell line. A further experiment showed that Dhc and DhcA induced 35.6% and 37.7% early apoptotic cells and 2.5% and 2.9% late apoptotic cells, respectively. Morphological observation of cell death through the TUNEL assay revealed that Dhc and DhcA induced apoptosis in MCF-7 cells. The quenching in the HSA–Dhc and HSA–DhcA complexes was observed to be static, the binding constants (K) were 4.7±0.03×10⁴ M⁻¹ and 3.9±0.05×10⁴ M⁻¹, and the corresponding binding free energies were -6.4 and -6.16 kcal/mol, respectively. Displacement studies confirmed that lidocaine (1.4±0.05×10⁴ M⁻¹) displaced Dhc and phenylbutazone (1.5±0.05×10⁴ M⁻¹) displaced DhcA, which indicates that domain I and domain II are the binding sites for Dhc and DhcA, respectively. Further, CD results revealed that the secondary structure of HSA was altered in the presence of Dhc and DhcA. Furthermore, atomic force microscopy and transmission electron microscopy showed that dimensions such as the height and molecular size of the HSA–Dhc and HSA–DhcA complexes were larger compared to HSA alone. Detailed analysis through molecular dynamics simulations also supported the greater stability of the HSA–Dhc and HSA–DhcA complexes, and root-mean-square fluctuation identified the binding site of Dhc as domain IB and that of DhcA as domain IIA. This information is valuable for the further development of steroid derivatives with improved pharmacological significance as novel anti-cancer drugs.
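
The quoted binding free energies follow from the binding constants through the standard relation ΔG = -RT ln K; as a check, with assumed T = 298 K and R = 1.987×10⁻³ kcal mol⁻¹ K⁻¹:

$$ \Delta G_{\mathrm{HSA\text{-}Dhc}} = -RT\ln K = -(1.987\times10^{-3})(298)\,\ln(4.7\times10^{4}) \approx -6.4\ \mathrm{kcal/mol} $$

which matches the value reported for the HSA–Dhc complex.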

Keywords: apoptosis, dihydrocortisol, fluorescence quenching, protein conformations

Procedia PDF Downloads 129
1971 Improvement of Artemisinin Production by P. indica in Hairy Root Cultures of A. annua L.

Authors: Seema Ahlawat, Parul Saxena, Malik Zainul Abdin

Abstract:

Malaria is a major health problem in many developing countries. The parasite responsible for the vast majority of fatal malaria infections is Plasmodium falciparum. Unfortunately, most Plasmodium strains, including P. falciparum, have become resistant to most antimalarials, including chloroquine, mefloquine, etc. To combat this problem, the WHO has recommended the use of artemisinin and its derivatives in artemisinin-based combination therapy (ACT). Due to this use, the global demand for artemisinin is increasing continuously. However, the relatively low yield of artemisinin in A. annua L. plants and the unavailability of economically viable synthetic protocols are the major bottlenecks for its commercial production and clinical use; chemical synthesis of artemisinin is very complex and uneconomical. A hairy root system, using the Agrobacterium rhizogenes LBA 9402 strain to enhance the production of artemisinin in A. annua L., was developed in our laboratory. The transgenic nature of the hairy root lines and the copy number of the transgene (rol B) were confirmed using PCR and Southern blot analyses, respectively. The effects of different concentrations of Piriformospora indica on artemisinin production in hairy root cultures were evaluated. 3% P. indica resulted in a 1.97-fold increase in artemisinin production in comparison to control cultures. The effects of P. indica on artemisinin production were positively correlated with the regulatory genes of the MVA, MEP and artemisinin biosynthetic pathways, viz. hmgr, ads, cyp71av1, aldh1, dxs, dxr and dbr2, in hairy root cultures of A. annua L. Mass-scale cultivation of A. annua L. hairy roots by plant tissue culture technology may be an alternative route for the production of artemisinin. A comprehensive investigation of the hairy root system of A. annua L. would help in developing a viable process for the production of artemisinin. The efficiency of the scaling-up systems still needs optimization before industrial exploitation becomes viable.

Keywords: A. annua L., artemisinin, hairy root cultures, malaria

Procedia PDF Downloads 413
1970 Application of RayMan Model in Quantifying the Impacts of the Built Environment and Surface Properties on Surrounding Temperature

Authors: Maryam Karimi, Rouzbeh Nazari

Abstract:

Introduction: Understanding the thermal distribution in the micro-urban climate has become necessary for urban planners and designers due to the impact of complex micro-scale features of the Urban Heat Island (UHI) on the built environment and public health. Hence, understanding the interrelation between urban components and the thermal pattern can assist planners in the proper addition of vegetation to the built environment, which can minimize the UHI impact. To characterize the need for urban green infrastructure (UGI) through better urban planning, this study proposes the use of the RayMan model to measure the impact of air quality and increased temperature based on urban morphology in selected metropolitan cities. This project measures the impact of the built environment for urban and regional planning using human-biometeorological evaluations (mean radiant temperature, Tmrt). Methods: We utilized the RayMan model to estimate the Tmrt in an urban environment, incorporating the location and height of buildings and trees, as a supplemental tool in urban planning and street design. The estimated Tmrt value is compared with existing surface and air temperature data to find the actual temperature felt by pedestrians. Results: Our current results suggest a strong relationship between the sky-view factor (SVF) and increased surface temperature in megacities based on current urban morphology. Conclusion: This study will help with quantifying the impacts of the built environment and surface properties on the surrounding temperature, identifying priority urban neighborhoods by analyzing Tmrt and air quality data at the pedestrian level, and characterizing the need for the cooling potential of urban green infrastructure.
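
For reference, the quantity RayMan outputs, the mean radiant temperature, is conventionally defined from the mean radiant flux density S_str absorbed by the human body via the Stefan-Boltzmann law (this is the standard biometeorological definition, not a formula specific to this study):

$$ T_{mrt} = \left( \frac{S_{str}}{\varepsilon_{p}\,\sigma} \right)^{1/4} - 273.15 $$

where ε_p ≈ 0.97 is the emissivity of the human body and σ = 5.67×10⁻⁸ W m⁻² K⁻⁴ is the Stefan-Boltzmann constant.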

Keywords: built environment, urban planning, urban cooling, extreme heat

Procedia PDF Downloads 122
1969 Investigating Elements of Identity of Traditional Neighborhoods in Isfahan and Using These Elements in the Design of Modern Neighborhoods

Authors: Saman Keshavarzi

Abstract:

The process of planning, designing and building neighborhoods is a complex and multidimensional part of urban planning. Understanding the elements that give a neighborhood a sense of identity can lead to successful city planning and result in a cohesive and functional community where people feel a sense of belonging. These factors are important in ensuring that the needs of the urban population are met to live in a safe, pleasant and healthy society. This research paper aims to identify the elements of identity of traditional neighborhoods in Isfahan and analyzes ways of using these elements in the design of modern neighborhoods to increase social interaction between communities and the cultural reunification of people. The neighborhood of Jolfa in Isfahan has a unique socio-cultural identity, as it dates back to the Safavid Dynasty of the 16th century and most of its inhabitants are Christian Armenians, a religious minority. The elements of the identity of Jolfa were analyzed through field observations, the distribution of questionnaires, and qualitative analysis. A qualitative research method, based on questionnaires that respondents filled out in response to a series of research questions, was used to further understand the Jolfa neighborhood and deconstruct the identity image that residents associate with their respective neighborhoods. From these qualitative data, the major finding was that traditional neighborhoods with elements of identity embedded in them have closer-knit communities whose residents have strong societal ties. This area of study in urban planning is vital to ensuring that new neighborhoods are built with concepts of social cohesion, community and inclusion in mind, as these are what lead to strong, connected, and prosperous societies.

Keywords: development, housing, identity, neighborhood, policy, urbanization

Procedia PDF Downloads 173
1968 Stabilization of Pb, Cr, Cd, Cu and Zn in Solid Waste and Sludge Pyrolysis by Modified Vermiculite

Authors: Yuxuan Yang, Zhaoping Zhong

Abstract:

Municipal solid waste and sludge are important sources of waste energy, and their proper disposal is of great importance. Pyrolysis can fully decompose solid wastes and sludge, and the pyrolysis products (char, oil and gas) have important recovery value. Due to the complex composition of solid wastes and sludge, the pyrolysis process at high temperatures is prone to heavy metal emissions, which are harmful to humans and the environment and reduce the safety of the pyrolysis products. In this paper, heavy metal emissions during pyrolysis of municipal sewage sludge, paper mill sludge, municipal domestic waste, and aged refuse at 450-650°C were investigated, and the emissions and hazards of the heavy metals (Pb, Cr, Cd, Cu and Zn) were effectively reduced by adding modified vermiculite as an additive. The vermiculite was modified by intercalation with cetyltrimethylammonium bromide, which more than doubled the original layer spacing of the vermiculite. Afterward, the intercalated vermiculite was made into vermiculite flakes by exfoliation modification. After that, the expansion rate of the vermiculite flakes was increased by Mg²⁺ modification and thermal activation. The expanded vermiculite flakes were acidified to improve the textural characteristics of the vermiculite. The modified vermiculite was analysed by XRD, FT-IR, BET and SEM to clarify the modification effect. The incorporation of modified vermiculite resulted in more than 80% retention of all heavy metals at 450°C. Cr, Cu and Zn were better retained than Pb and Cd. The incorporation of modified vermiculite effectively reduced the risk posed by the heavy metals, with low risks for Pb, Cr, Cu and Zn. The toxicity of all heavy metals was greatly reduced by the incorporation of modified vermiculite, and the speciation of the heavy metals was transformed from the exchangeable and acid-soluble (F1) and reducible (F2) fractions to the oxidizable (F3) and residual (F4) fractions. In addition, the increase in temperature favored the stabilization of the heavy metal forms. This study provides new insight into the cleaner use of energy and the safe management of solid waste.

Keywords: heavy metal, pyrolysis, vermiculite, solid waste

Procedia PDF Downloads 67
1967 Reliability Qualification Test Plan Derivation Method for Weibull Distributed Products

Authors: Ping Jiang, Yunyan Xing, Dian Zhang, Bo Guo

Abstract:

The reliability qualification test (RQT) is widely used in product development to qualify whether a product meets predetermined reliability requirements, which are mainly described in terms of reliability indices, for example, MTBF (Mean Time Between Failures). In engineering practice, RQT plans are mandatorily drawn from standards such as MIL-STD-781 or GJB899A-2009. But these conventional RQT plans are not preferred, as they often require long test times or carry high risks for both producer and consumer, because the methods in the standards only use the test data of the product itself. The standards also usually assume that the product lifetime is exponentially distributed, which is not suitable for complex products other than electronics. So it is desirable to develop an RQT plan derivation method that safely shortens test time while keeping the two risks under control. To this end, an RQT plan derivation method is developed for products whose lifetime follows a Weibull distribution. The merit of the method is that expert judgment is taken into account. This is implemented by applying the Bayesian method, which translates the expert judgment into prior information on product reliability. The producer's risk and the consumer's risk are then calculated accordingly. The procedures to derive RQT plans are also proposed in this paper. As extra information and expert judgment are added to the derivation, the derived test plans have the potential to shorten the required test time and to carry satisfactorily low risks for both producer and consumer, compared with conventional test plans. A case study is provided to show that when expert judgment is used in deriving product test plans, the proposed method is capable of finding ideal test plans that not only reduce the two risks but also shorten the required test time.
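
To illustrate the two risks for a Weibull-distributed product, the following is a simplified sketch of how a fixed-duration, bounded-failure test plan can be evaluated. It is not the paper's Bayesian method: the shape parameter, characteristic lives, and plan parameters are hypothetical, and the prior information step is omitted:

```python
# Simplified Weibull test-plan risk evaluation (assumed parameters throughout).
import numpy as np
from scipy.stats import binom

def acceptance_probability(n_units, test_time, c_failures, beta, eta):
    """P(accept) = P(at most c failures among n units tested for test_time),
    assuming unit lifetimes follow a Weibull(beta, eta) distribution."""
    p_fail = 1.0 - np.exp(-(test_time / eta) ** beta)  # Weibull CDF at test_time
    return binom.cdf(c_failures, n_units, p_fail)

beta = 2.0                            # hypothetical Weibull shape
eta_good, eta_bad = 5000.0, 2500.0    # hypothetical acceptable / unacceptable lives (h)
n, t, c = 20, 1000.0, 1               # hypothetical plan: 20 units, 1000 h, accept on <= 1 failure

# Producer's risk: rejecting a product that meets the target life;
# consumer's risk: accepting one that only reaches the unacceptable life.
producer_risk = 1.0 - acceptance_probability(n, t, c, beta, eta_good)
consumer_risk = acceptance_probability(n, t, c, beta, eta_bad)
print(f"producer's risk = {producer_risk:.3f}, consumer's risk = {consumer_risk:.3f}")
```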

Keywords: expert judgment, reliability qualification test, test plan derivation, producer’s risk, consumer’s risk

Procedia PDF Downloads 136
1966 Relevance of Reliability Approaches to Predict Mould Growth in Biobased Building Materials

Authors: Lucile Soudani, Hervé Illy, Rémi Bouchié

Abstract:

Mould growth in living environments has been widely reported for decades throughout the world. A higher level of moisture in housing can lead to building degradation and chemical component emissions from construction materials, as well as enhancing mould growth within the envelope elements or on internal surfaces. Moreover, a significant number of studies have highlighted the link between mould presence and the prevalence of respiratory diseases. In recent years, the proportion of bio-based materials used in construction has been increasing, seen as an effective lever to reduce the environmental impact of the building sector. Besides, bio-based materials are also hygroscopic materials: when in contact with the humid air of a surrounding environment, their porous structures enable a better capture of water molecules, thus providing a more suitable substrate for mould growth. Many studies have been conducted to develop reliable models to predict mould appearance, growth, and decay over many building materials and external exposures. Some of them require information about temperature and/or relative humidity, exposure times, material sensitivities, etc. Nevertheless, several studies have highlighted a large disparity between predictions and actual mould growth in experimental settings as well as in occupied buildings. The difficulty of considering the influence of all parameters appears to be the most challenging issue. As many complex phenomena take place simultaneously, a preliminary study has been carried out to evaluate the feasibility of adopting a reliability approach rather than a deterministic approach. Both epistemic and random uncertainties were identified specifically for the prediction of mould appearance and growth. Several studies published in the literature, from the agri-food and automotive sectors, were selected and analysed, as the methodologies deployed there appeared promising.

Keywords: bio-based materials, mould growth, numerical prediction, reliability approach

Procedia PDF Downloads 45
1965 Practical Ways to Acquire the Arabic Language through Electronic Means

Authors: Hondozi Jahja

Abstract:

There is an obvious need to learn the Arabic language and teach it to speakers of other languages through new curricula. The idea is to bridge the gap between theory and practice. To that end, we have sought to offer some means of help to master the Arabic language, in addition to our efforts to apply these means, enriching the culture of the student and developing his vocabulary. There is no doubt that taking care of the practical aspect of grammar was our constant goal, and this particular aspect is what builds the student's positive values, refines his taste and develops his language. In addressing these issues, we have adopted a school-based approach based primarily on the active and positive participation of the student. The theoretical linguistic issues, in our opinion, are not a primary goal; the goal is for students to use them in speaking and in application. Among the objectives of this research is to establish the basic language skills of the students using new means that help the student acquire these skills and apply them in the various subjects relevant to his progress and development. Unfortunately, some of our students consider grammar 'difficult', 'complex' and 'heavy' in itself. This is one of the obstacles that stand in the way of their desired results. As a consequence, they end up talking, mumbling, about the difficulties they face in applying those rules. Therefore, some of our students complete their university studies and are unable to express what they feel using the language correctly. For this purpose, we have sought in this research to follow a new integrated approach: to study the grammar of the language through modern means, consolidating the principle of functional language, in which rules serve to govern speech and linguistic expression properly. This research is the result of practical experience as a teacher of Arabic language to non-native speakers at the 'Hassan Pristina' University, located in Pristina, the capital of Kosovo, and at the Qatar Training Center since its establishment in 2012.

Keywords: Arabic, applied methods, acquisition, learning

Procedia PDF Downloads 158
1964 [Keynote Talk]: Caught in the Tractorbeam of Larger Influences: The Filtration of Innovation in Education Technology Design

Authors: Justin D. Olmanson, Fitsum Abebe, Valerie Jones, Eric Kyle, Xianquan Liu, Katherine Robbins, Guieswende Rouamba

Abstract:

The history of education technology, and of designing, adapting, and adopting technologies for use in educational spaces, is nuanced, complex, and dynamic. Yet, despite a range of continually emerging technologies, the design and development process often yields results that appear quite similar in terms of affordances and interactions. Through this study we (1) verify the extent to which designs have been constrained, (2) consider what might account for it, and (3) offer a way forward in terms of how we might identify and strategically sidestep these influences, thereby increasing the diversity of our designs with a given technology or within a particular learning domain. We begin our inquiry from the perspective that a host of co-influencing elements, fields, and meta narratives converge on the education technology design process to exert a tangible, often homogenizing effect on the resultant designs. We identify several elements that influence design in often implicit or unquestioned ways (e.g. curriculum, learning theory, economics, learning context, pedagogy), describe our methodology for identifying the elemental positionality embedded in a design, direct our analysis to a particular subset of technologies in the field of literacy, and unpack our findings. Our early analysis suggests that the majority of education technologies designed for use in, or used in, US public schools are heavily influenced by a handful of mainstream theories and meta narratives. These findings have implications for how we approach the education technology design process, and we use them to suggest alternative methods for designing and developing with emerging technologies. Our analytical process and reconceptualized design process hold the potential to diversify the ways emerging and established technologies get incorporated into our designs.

Keywords: curriculum, design, innovation, meta narratives

Procedia PDF Downloads 508
1963 Predicting Ecological Impacts of Sea-Level Change on Coastal Conservation Areas in India

Authors: Mohammad Zafar-ul Islam, Shaily Menon, Xingong Li, A. Townsend Peterson

Abstract:

In addition to the mounting empirical data on the direct implications of climate change for natural and human systems, evidence is increasing for other, indirect climate change phenomena such as sea-level rise. Rising sea levels and associated marine intrusion into terrestrial environments are predicted to be among the most serious eventual consequences of climate change. The many complex and interacting factors affecting sea levels create considerable uncertainty in sea-level rise projections: conservative estimates are on the order of 0.5-1.0 m globally, while other estimates are much higher, approaching 6 m. Marine intrusion associated with 1-6 m of sea-level rise will severely impact species and habitats in coastal ecosystems. Examining the areas most vulnerable to such impacts may allow the design of appropriate adaptation and mitigation strategies. We present an overview of the potential effects of 1 and 6 m of sea-level rise on coastal conservation areas in the Indian Subcontinent. In particular, we examine the projected magnitude of areal losses in relevant biogeographic zones, ecoregions, protected areas (PAs), and Important Bird Areas (IBAs). In addition, we provide a more detailed and quantitative analysis of the likely effects of marine intrusion on 22 coastal PAs and IBAs that provide critical habitat for birds in the form of breeding areas, migratory stopover sites, and overwintering habitats. Several coastal PAs and IBAs are predicted to experience greater than 50% losses to marine intrusion. We explore the consequences of such inundation levels for species and habitat in these areas.

Keywords: sea-level change, coastal inundation, marine intrusion, biogeographic zones, ecoregions, protected areas, important bird areas, adaptation, mitigation

Procedia PDF Downloads 256
1962 Comparison of the Results of a Parkinson’s Holter Monitor with Patient Diaries, in Real Conditions of Use: A Sub-Analysis of the MoMoPa-EC Clinical Trial

Authors: Alejandro Rodríguez-Molinero, Carlos Pérez-López, Jorge Hernández-Vara, Àngels Bayes-Rusiñol, Juan Carlos Martínez-Castrillo, David A. Pérez-Martínez

Abstract:

Background: Monitoring motor symptoms in Parkinson's patients is often a complex and time-consuming task for clinicians, as Hauser's diaries are often poorly completed by patients. Recently, new automatic devices (Parkinson's holter: STAT-ON®) have been developed that are capable of monitoring patients' motor fluctuations. The MoMoPa-EC clinical trial (NCT04176302) investigates which of the two methods produces better clinical results. In this sub-analysis, the concordance between the two methods is analyzed. Methods: The MoMoPa-EC clinical trial will include 164 patients with moderate-severe Parkinson's disease and at least two hours a day in Off. At the time of patient recruitment, all of them completed a seven-day motor fluctuation diary at home (Hauser's diary) while wearing the Parkinson's holter. This sub-analysis includes the 71 patients with complete data for the purpose of this comparison. The intraclass correlation coefficient between the patient diary entries and the Parkinson's holter data was calculated for time in On, time in Off, and time with dyskinesias. Results: The intraclass correlation coefficient of the two methods was 0.57 (95% CI: 0.3-0.74) for daily time in Off (%), 0.48 (95% CI: 0.14-0.68) for daily time in On (%), and 0.37 (95% CI: -0.04-0.62) for daily time with dyskinesias (%). Conclusions: The two methods show moderate agreement with each other. We will have to wait for the results of the MoMoPa-EC project to estimate which of them has the greatest clinical benefits. Acknowledgment: This work is supported by AbbVie S.L.U., the Instituto de Salud Carlos III [DTS17/00195], and the European Fund for Regional Development, 'A way to make Europe'.
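
A sketch of this kind of agreement analysis, computed with the pingouin package, is shown below; the data frame layout (one row per patient-method pair) and column names are illustrative, not the trial's actual dataset:

```python
# Intraclass correlation between diary and holter measurements (toy data).
import pandas as pd
import pingouin as pg

df = pd.DataFrame({
    "patient": [1, 1, 2, 2, 3, 3, 4, 4],
    "method": ["diary", "holter"] * 4,
    "off_time_pct": [30, 26, 45, 50, 12, 15, 38, 33],  # hypothetical daily % Off
})

icc = pg.intraclass_corr(data=df, targets="patient", raters="method",
                         ratings="off_time_pct")
print(icc[["Type", "ICC", "CI95%"]])
```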

Keywords: Parkinson, sensor, motor fluctuations, dyskinesia

Procedia PDF Downloads 226
1961 Advancing Urban Sustainability through Data-Driven Machine Learning Solutions

Authors: Nasim Eslamirad, Mahdi Rasoulinezhad, Francesco De Luca, Sadok Ben Yahia, Kimmo Sakari Lylykangas, Francesco Pilla

Abstract:

With ongoing urbanization, cities face increasing environmental challenges impacting human well-being. To tackle these issues, data-driven approaches in urban analysis have gained prominence, leveraging urban data to promote sustainability. Integrating Machine Learning techniques enables researchers to analyze and predict complex environmental phenomena, such as Urban Heat Island occurrences, in urban areas. This paper demonstrates the implementation of a data-driven approach and interpretable Machine Learning algorithms, together with interpretability techniques, to conduct comprehensive data analyses for sustainable urban design. The developed framework and algorithms are demonstrated for Tallinn, Estonia, to develop sustainable urban strategies to mitigate urban heat waves. Geospatial data, preprocessed and labeled with UHI levels, are used to train various ML models, with Logistic Regression emerging as the best-performing model based on evaluation metrics. Its coefficients are used to derive a mathematical equation separating areas with and without UHI effects, providing insight into UHI occurrence based on buildings and urban features. The derived formula highlights the importance of building volume, height, area, and shape length in creating an urban environment with UHI impact. The data-driven approach and the derived equation inform mitigation strategies and sustainable urban development in Tallinn and offer valuable guidance for other locations with varying climates.
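
A minimal sketch of this modeling step follows: a transparent logistic regression whose coefficients yield an explicit equation for UHI occurrence. The feature names mirror those highlighted in the abstract, but the data and coefficients are synthetic:

```python
# Interpretable logistic regression sketch on synthetic urban-feature data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
features = ["building_volume", "building_height", "building_area", "shape_length"]
X = rng.normal(size=(1000, 4))
y = (X @ np.array([1.2, 0.8, 0.5, 0.4]) + rng.normal(0, 1, 1000) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# The fitted model is itself the "mathematical equation":
# log-odds(UHI) = b0 + b1*volume + b2*height + b3*area + b4*shape_length
terms = " + ".join(f"{c:.2f}*{name}" for c, name in zip(model.coef_[0], features))
print(f"log-odds(UHI) = {model.intercept_[0]:.2f} + {terms}")
```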

Keywords: data-driven approach, machine learning transparent models, interpretable machine learning models, urban heat island effect

Procedia PDF Downloads 37
1960 Applications of Evolutionary Optimization Methods in Reinforcement Learning

Authors: Rahul Paul, Kedar Nath Das

Abstract:

The paradigm of Reinforcement Learning (RL) has become prominent in training intelligent agents to make decisions in environments that are both dynamic and uncertain. The primary objective of RL is to optimize the policy of an agent in order to maximize the cumulative reward it receives throughout a given period. Nevertheless, the process of optimization presents notable difficulties as a result of the inherent trade-off between exploration and exploitation, the presence of extensive state-action spaces, and the intricate nature of the dynamics involved. Evolutionary Optimization Methods (EOMs) have garnered considerable attention as a supplementary approach to tackle these challenges, providing distinct capabilities for optimizing RL policies and value functions. The ongoing advancement of research in both RL and EOMs presents an opportunity for significant advancements in autonomous decision-making systems. The convergence of these two fields has the potential to have a transformative impact on various domains of artificial intelligence (AI) applications. This article highlights the considerable influence of EOMs in enhancing the capabilities of RL. Taking advantage of evolutionary principles enables RL algorithms to effectively traverse extensive action spaces and discover optimal solutions within intricate environments. Moreover, this paper emphasizes the practical implementations of EOMs in the field of RL, specifically in areas such as robotic control, autonomous systems, inventory problems, and multi-agent scenarios. The article highlights the utilization of EOMs in facilitating RL agents to effectively adapt, evolve, and uncover proficient strategies for complex tasks that may pose challenges for conventional RL approaches.
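
One compact way EOMs optimize an RL policy is an evolution strategy: a population of perturbed parameter vectors is evaluated by episode return, and the policy moves toward the better-performing perturbations. The toy control task and hyperparameters below are invented for illustration and are not drawn from the paper:

```python
# Evolution-strategy policy search on a toy one-dimensional control task.
import numpy as np

def episode_return(theta, steps=50):
    """Toy task: keep a drifting state near zero with a linear policy."""
    state, total = 1.0, 0.0
    for _ in range(steps):
        action = theta[0] * state + theta[1]
        state = 0.9 * state + 0.5 * action + np.random.normal(0, 0.05)
        total += -state ** 2          # reward: negative squared deviation
    return total

rng = np.random.default_rng(0)
theta = np.zeros(2)                   # policy parameters
sigma, lr, pop = 0.1, 0.05, 50        # ES hyperparameters
for generation in range(200):
    noise = rng.normal(size=(pop, 2))
    fitness = np.array([episode_return(theta + sigma * n) for n in noise])
    fitness = (fitness - fitness.mean()) / (fitness.std() + 1e-8)
    theta += lr / (pop * sigma) * noise.T @ fitness   # ES gradient estimate
print("evolved policy:", theta, "return:", episode_return(theta))
```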

Keywords: machine learning, reinforcement learning, loss function, optimization techniques, evolutionary optimization methods

Procedia PDF Downloads 79
1959 Predictive Modeling of Bridge Conditions Using Random Forest

Authors: Miral Selim, May Haggag, Ibrahim Abotaleb

Abstract:

The aging of transportation infrastructure presents significant challenges, particularly concerning the monitoring and maintenance of bridges. This study investigates the application of Random Forest algorithms for predictive modeling of bridge conditions, utilizing data from the US National Bridge Inventory (NBI). The research is significant as it aims to improve bridge management through data-driven insights that can enhance maintenance strategies and contribute to overall safety. Random Forest is chosen for its robustness, ability to handle complex, non-linear relationships among variables, and its effectiveness in feature importance evaluation. The study begins with comprehensive data collection and cleaning, followed by the identification of key variables influencing bridge condition ratings, including age, construction materials, environmental factors, and maintenance history. Random Forest is utilized to examine the relationships between these variables and the predicted bridge conditions. The dataset is divided into training and testing subsets to evaluate the model's performance. The findings demonstrate that the Random Forest model effectively enhances the understanding of factors affecting bridge conditions. By identifying bridges at greater risk of deterioration, the model facilitates proactive maintenance strategies, which can help avoid costly repairs and minimize service disruptions. Additionally, this research underscores the value of data-driven decision-making, enabling better resource allocation to prioritize maintenance efforts where they are most necessary. In summary, this study highlights the efficiency and applicability of Random Forest in predictive modeling for bridge management. Ultimately, these findings pave the way for more resilient and proactive management of bridge systems, ensuring their longevity and reliability for future use.
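
A skeletal version of the workflow described above, using scikit-learn, is sketched below; the column names are invented stand-ins for NBI fields, while the real study uses the actual inventory data:

```python
# Random Forest bridge-condition workflow sketch (hypothetical NBI columns).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("nbi_bridges.csv")          # hypothetical cleaned NBI extract
features = ["age", "material", "adt", "climate_zone", "years_since_rehab"]
X = pd.get_dummies(df[features])             # encode categorical variables
y = df["condition_rating"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)
model = RandomForestClassifier(n_estimators=300, random_state=0)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
# Feature importance evaluation, one of the reasons Random Forest was chosen.
importance = pd.Series(model.feature_importances_, index=X.columns)
print(importance.sort_values(ascending=False).head(10))
```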

Keywords: data analysis, random forest, predictive modeling, bridge management

Procedia PDF Downloads 20
1958 Classification System for Soft Tissue Injuries of Face: Bringing Objectiveness to Injury Severity

Authors: Garg Ramneesh, Uppal Sanjeev, Mittal Rajinder, Shah Sheerin, Jain Vikas, Singla Bhupinder

Abstract:

Introduction: Despite advances in trauma care, a classification system for soft tissue injuries of the face still needs to be objectively defined. Aim: To develop a classification system for soft tissue injuries of the face that is objective, easy to remember, reproducible, universally applicable, aids in surgical management, and helps to build structured data for future use. Material and Methods: This classification system covers patients who need surgical management of facial injuries. Associated underlying bony fractures have been intentionally excluded. Depending upon the severity of the soft tissue injury, injuries are graded from 0 to IV (0-abrasions, I-lacerations, II-avulsion injuries with no skin loss, III-avulsion injuries with skin loss that would need graft or flap cover, and IV-complex injuries). Anatomically, the face is divided into three zones (Zones 1/2/3), as per aesthetic subunits. Zone 1e stands for injury of the eyebrows; Zones 2 a/b/c stand for the nose, upper eyelid and lower eyelid, respectively; Zones 3 a/b/c stand for the upper lip, lower lip and cheek, respectively. Suffixes R and L stand for the right or left involved side, B for the presence of a foreign body like glass or pellets, C for extensive contamination, and D for depth, which can be graded as D 1/2/3 if the depth reaches fat, muscle or bone, respectively. I stands for damage to the facial nerve or parotid duct. Results and Conclusions: This classification system is easy to remember, clinically applicable, and would help in the standardization of the surgical management of soft tissue injuries of the face. Certain inherent limitations of this classification system are the inability to classify sutured wounds, hematomas, and injuries along or against Langer's lines.

Keywords: soft tissue injuries, face, avulsion, classification

Procedia PDF Downloads 382
1957 Corrosion Resistance of 17-4 Precipitation Hardenable Stainless Steel Fabricated by Selective Laser Melting

Authors: Michella Alnajjar, Frederic Christien, Krzysztof Wolski, Cedric Bosch

Abstract:

Additive manufacturing (AM) has gained increasing interest in the past few years because it allows 3D parts, often with complex geometry, to be fabricated directly, layer by layer, according to a CAD model. One of the AM techniques is selective laser melting (SLM), which is based on powder bed fusion. In this work, the corrosion resistance of 17-4 PH steel obtained by SLM is investigated. Wrought 17-4 PH steel is a martensitic precipitation-hardenable stainless steel. It is widely used in a variety of applications, such as the aerospace, medical and food industries, due to its high strength and relatively good corrosion resistance. However, the combined findings of X-ray diffraction and electron backscatter diffraction (EBSD) proved that SLM-ed 17-4 PH steel has a fully ferritic microstructure, more specifically δ ferrite. The microstructure consists of coarse ferritic grains elongated along the build direction, with a pronounced solidification crystallographic texture. These results were associated with the high cooling and heating rates experienced throughout the SLM process (10⁵-10⁶ K/s), which suppressed the austenite formation and produced a 'by-passing' of this phase during the numerous thermal cycles. Furthermore, EDS measurements revealed a uniform distribution of elements without any dendritic structure. The extremely high cooling kinetics induced a diffusionless solidification, resulting in a homogeneous elemental composition. Consequently, the corrosion properties of this steel differ from those of conventional ones. Using electrochemical means, it was found that SLM-ed 17-4 PH is more resistant to general corrosion than the wrought steel. However, the SLM-ed material exhibits metastable pitting due to its high porosity density. In addition, the hydrogen embrittlement of SLM-ed 17-4 PH steel is investigated, and a correlation between its behavior and the observed microstructure is made.

Keywords: corrosion resistance, 17-4 PH stainless steel, selective laser melting, hydrogen embrittlement

Procedia PDF Downloads 140
1956 Trend Analysis of Annual Total Precipitation Data in Konya

Authors: Naci Büyükkaracığan

Abstract:

Hydroclimatic observations are used in the planning of water resources projects, and climate variables are among the first values used in such planning. The climate system is a complex and interactive system involving the atmosphere, land surfaces, snow and ice, the oceans and other bodies of water. The amount and distribution of precipitation, an important climate parameter, is a limiting environmental factor for living things. Trend analysis is applied to detect the presence of a pattern or trend in a data set. Many trend studies in different parts of the world are made for the determination of climate change. The detection and attribution of past trends and variability in climatic variables is essential for explaining potential future alterations resulting from anthropogenic activities. Parametric and non-parametric tests are used for determining the trends in climatic variables. In this study, trend tests were applied to annual total precipitation data obtained in the period 1972-2012 in the Konya Basin. Non-parametric trend tests (Sen's T, Spearman's Rho, Mann-Kendall, Sen's T trend, Wald-Wolfowitz) and a parametric test (mean square) were applied to the annual total precipitation of 15 stations for trend analysis. The linear slopes (change per unit time) of the trends are calculated using a non-parametric estimator developed by Sen. The beginning of the trends is determined using the Mann-Kendall rank correlation test. In addition, the homogeneity of the precipitation trends is tested using a method developed by Van Belle and Hughes. As a result of the tests, negative linear slopes were found in the annual total precipitation in Konya.
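
For concreteness, the Mann-Kendall test and Sen's slope estimator named above can be computed as follows; the sketch omits the tie correction, and the precipitation series is synthetic rather than the Konya station data:

```python
# Mann-Kendall trend test and Sen's slope (simplified, no tie correction).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return the Mann-Kendall S statistic, normalized Z, and two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

def sens_slope(x):
    """Sen's slope: median of all pairwise slopes (change per time step)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
    return np.median(slopes)

# Hypothetical annual precipitation series (mm), 41 years as in 1972-2012.
rng = np.random.default_rng(0)
precip = 320 - 0.8 * np.arange(41) + rng.normal(0, 30, 41)
print(mann_kendall(precip), sens_slope(precip))
```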

Keywords: trend analysis, precipitation, hydroclimatology, Konya

Procedia PDF Downloads 215
1955 Oxidosqualene Cyclase: A Novel Inhibitor

Authors: Devadrita Dey Sarkar

Abstract:

Oxidosqualene cyclase (OSC) is a membrane-bound enzyme which helps in the formation of the steroid scaffold in higher organisms. In a highly selective cyclization reaction, oxidosqualene cyclase forms lanosterol, with seven chiral centres, starting from the linear substrate 2,3-oxidosqualene. In human cholesterol biosynthesis, OSC represents a target for the discovery of novel anticholesteraemic drugs that could complement the widely used statins. The enzyme oxidosqualene:lanosterol cyclase (OSC) represents a novel target for the treatment of hypercholesterolemia. OSC catalyzes the cyclization of the linear 2,3-monoepoxysqualene to lanosterol, the initial four-ringed sterol intermediate in the cholesterol biosynthetic pathway. OSC also catalyzes the formation of 24(S),25-epoxycholesterol, a ligand activator of the liver X receptor. Inhibition of OSC reduces cholesterol biosynthesis and selectively enhances 24(S),25-epoxycholesterol synthesis. Through this dual mechanism, OSC inhibition decreases plasma levels of low-density lipoprotein (LDL) cholesterol and prevents cholesterol deposition within macrophages. The recent crystallization of OSC identifies the mechanism of action for this complex enzyme, setting the stage for the design of OSC inhibitors with improved pharmacological properties for cholesterol lowering and the treatment of atherosclerosis. While studying and designing inhibitors of oxidosqualene cyclase, I worked on PDB ID 1W6K, the most studied structure for this target, and used several methods, techniques and software tools to identify and validate the top-ranking molecules that could act as inhibitors of oxidosqualene cyclase. Thus, by partial blockage of this enzyme, both an inhibition of lanosterol and subsequently of cholesterol formation, as well as a concomitant effect on HMG-CoA reductase, can be achieved. Both effects complement each other and lead to an effective control of cholesterol biosynthesis. It is therefore concluded that 2,3-oxidosqualene cyclase plays a crucial role in the regulation of intracellular cholesterol homeostasis. 2,3-Oxidosqualene cyclase inhibitors offer an attractive approach for novel lipid-lowering agents.

Keywords: anticholesteraemic, crystallization, statins, homeostasis

Procedia PDF Downloads 349
1954 An Experimental Approach to the Influence of Tipping Points and Scientific Uncertainties in the Success of International Fisheries Management

Authors: Jules Selles

Abstract:

The Atlantic and Mediterranean bluefin tuna fishery has been considered the archetype of an overfished and mismanaged fishery. This crisis has demonstrated the role of public awareness and the importance of the interactions between science and management regarding scientific uncertainties. This work aims at investigating the policy-making process associated with a regional fisheries management organization. We propose a contextualized, computer-based experimental approach in order to explore the effects of key factors on the cooperation process in a complex straddling-stock management setting. Namely, we analyze the effects of the introduction of a socio-economic tipping point and of the uncertainty surrounding the estimation of the resource level. Our approach is based on a Gordon-Schaefer bio-economic model which explicitly represents the decision-making process. Each participant plays the role of a stakeholder of ICCAT, represents a coalition of fishing nations involved in the fishery, and unilaterally decides a harvest policy for the coming year. The context of the experiment induces the incentives for exploitation and collaboration to achieve common sustainable harvest plans at the scale of the Atlantic bluefin tuna stock. Our rigorous framework allows testing how stakeholders who plan the exploitation of a fish stock (a common-pool resource) respond to two kinds of effects: i) the inclusion of a drastic shift in the management constraints (beyond a socio-economic tipping point) and ii) increasing uncertainty in the scientific estimation of the resource level.
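
For reference, the Gordon-Schaefer model named above takes the standard textbook form (the experiment's actual parameterization is not given in the abstract):

$$ \frac{dB}{dt} = rB\left(1 - \frac{B}{K}\right) - qEB, \qquad H = qEB, \qquad \pi = pH - cE $$

where B is the stock biomass, r the intrinsic growth rate, K the carrying capacity, q the catchability coefficient, E the fishing effort, H the harvest, and π the rent obtained from price p and effort cost c. Setting dB/dt = 0 yields the sustainable yield curve Y(E) = qKE(1 - qE/r), against which the participants' harvest policies can be compared.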

Keywords: economic experiment, fisheries management, game theory, policy making, Atlantic Bluefin tuna

Procedia PDF Downloads 253