Search results for: effective Lagrangian approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21250
16360 Mitigating Biofouling on Reverse Osmosis Membranes: Applying Greener Preservatives to Biofilm Treatment

Authors: Anna Curtin, Matthew Thibodeau, Heather Buckley

Abstract:

Water scarcity is characterized by a lack of access to clean and affordable drinking water, as well as water for hygienic and economic needs. The number of people affected by water scarcity is expected to increase in the coming years due to climate change, population growth, and pollution, among other factors. In response, scientists are pursuing cost-effective drinking water treatment methods, often with a focus on alternative water sources. Desalination of seawater via reverse osmosis (RO) is one promising alternative method; however, it is significantly limited by biofouling of the filtration membrane. Biofouling is the buildup of microorganisms in a biofilm at the water-membrane interface. It clogs the membrane, decreasing filtration efficiency and consequently increasing operational and maintenance costs. Although effective, existing chemical treatment methods can damage the membrane, shortening its lifespan; create antibiotic resistance; and harm humans and the environment if they pass through the membrane into the permeate. The current project applies safer preservatives used in home and personal care products to RO membranes to investigate their biofouling treatment efficacy. Currently, many of these safer preservatives have only been tested on cells in the planktonic phase in suspension cultures, not on cells in biofilms. The results of suspension culture tests are not applicable to biofouling scenarios because planktonic organisms in suspension cultures exhibit different morphological, chemical, and metabolic characteristics than those in a biofilm. Testing the antifoulant efficacy of safer preservatives on biofilms will provide results more applicable to biofouling on RO membranes. To do this, biofilms will be grown on 96-well plates, and minimum inhibitory concentrations (MIC90) and log-reductions will be calculated for various safer preservatives. Results from these tests will be used to guide doses for tests of safer preservatives in a bench-scale RO system.
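The log-reduction and MIC90 metrics named above follow standard conventions and are simple to compute. The sketch below uses one common reading of them (log10 reduction against an untreated control, and MIC90 as the lowest tested dose achieving at least 90% growth inhibition); the function names and the 0-1 inhibition scale are illustrative assumptions, not details from the study.

```python
import math

def log_reduction(n_control: float, n_treated: float) -> float:
    """Log10 reduction in viable cell count relative to an untreated control."""
    return math.log10(n_control / n_treated)

def mic90(doses, inhibitions):
    """Lowest tested concentration achieving >= 90% growth inhibition.

    doses: concentrations in ascending order; inhibitions: fractional
    inhibition (0-1) measured at each dose. Returns None if no tested
    dose reaches the 90% threshold.
    """
    for dose, inhibition in zip(doses, inhibitions):
        if inhibition >= 0.90:
            return dose
    return None
```

For example, a treatment that reduces a biofilm from 10^6 to 10^3 viable cells achieves a 3-log reduction.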

Keywords: reverse osmosis, biofouling, preservatives, antimicrobial, safer alternative, green chemistry

Procedia PDF Downloads 142
16359 A Framework for Designing Complex Product-Service Systems with a Multi-Domain Matrix

Authors: Yoonjung An, Yongtae Park

Abstract:

Offering a Product-Service System (PSS) is a well-accepted strategy that companies may adopt to provide a set of systemic solutions to customers. PSSs were initially provided in a simple form but now take diversified and complex forms involving multiple services, products and technologies. With the growing interest in the PSS, frameworks for PSS development have been introduced by many researchers. However, most of the existing frameworks fail to examine the various relations existing in a complex PSS. Since designing a complex PSS involves full integration of multiple products and services, it is essential to identify not only product-service relations but also product-product and service-service relations. It is equally important to specify how they are related for a better understanding of the system. Moreover, as customers tend to view their purchase from a more holistic perspective, a PSS should be developed based on the whole system's requirements, rather than focusing only on the product requirements or service requirements. Thus, we propose a framework to develop a complex PSS that is fully coordinated with the requirements of both worlds. Specifically, our approach adopts a multi-domain matrix (MDM). An MDM identifies not only inter-domain relations but also intra-domain relations, so it helps to design a PSS that includes highly desired and closely related core functions and features. The various dependency types and rating schemes proposed in our approach also support the integration process.
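As a hypothetical illustration of how an MDM separates intra-domain relations (product-product, service-service) from inter-domain relations (product-service), consider a toy PSS with two products and two services. The element names and relation weights below are invented for the example; they are not from the paper.

```python
# Elements of a hypothetical PSS, tagged by domain.
elements = {"engine": "product", "sensor": "product",
            "monitoring": "service", "maintenance": "service"}

# Relation strengths (0-3); keys are (row, column) element pairs.
mdm = {("engine", "sensor"): 2,          # product-product (intra-domain)
       ("monitoring", "maintenance"): 1, # service-service (intra-domain)
       ("sensor", "monitoring"): 3,      # product-service (inter-domain)
       ("engine", "maintenance"): 2}     # product-service (inter-domain)

def split_relations(mdm, elements):
    """Separate intra-domain from inter-domain relations of an MDM."""
    intra, inter = {}, {}
    for (a, b), weight in mdm.items():
        (intra if elements[a] == elements[b] else inter)[(a, b)] = weight
    return intra, inter
```

Scanning the inter-domain block for the strongest weights is one way to surface the closely related core functions and features the framework aims to include.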

Keywords: inter-domain relations, intra-domain relations, multi-domain matrix, product-service system design

Procedia PDF Downloads 638
16358 Knowledge Loss Risk Assessment for Departing Employees: An Exploratory Study

Authors: Muhammad Saleem Ullah Khan Sumbal, Eric Tsui, Ricky Cheong, Eric See To

Abstract:

Organizations face the threat of losing valuable knowledge when employees leave, whether due to retirement, resignation, job change, or death and disability. Due to changing economic conditions, globalization, and an aging workforce, organizations face challenges regarding the retention of valuable knowledge. On the one hand, a large number of employees are going to retire; on the other hand, the younger generation does not want to work in a company for a long time, and there is an increasing trend of frequent job changes among the new generation. Because of these factors, organizations need to make sure that they capture an employee's knowledge before they walk out of the door. The first step in this process is to know what type of knowledge the employee possesses and whether this knowledge is important for the organization. The literature reveals that despite the serious consequences of knowledge loss in terms of organizational productivity and competitive advantage, little work has been done in the area of knowledge loss assessment of departing employees. An important step in the knowledge retention process is to determine the critical 'at risk' knowledge. Thus, knowledge loss risk assessment is a process by which organizations can gauge the importance of the knowledge of a departing employee. The purpose of this study is to explore the topic of knowledge loss risk assessment by conducting a qualitative study in the oil and gas sector. By engaging in dialogues with managers and executives of the organizations through in-depth interviews and adopting a grounded methodology approach, the research will explore: (i) Are there any measures adopted by organizations to assess the risk of knowledge loss from departing employees? (ii) Which factors are crucial for knowledge loss assessment in the organizations? (iii) How can employees be prioritized for knowledge retention according to their criticality?
A grounded theory approach is used when little knowledge is available in the area under research; new knowledge about the topic is generated through in-depth exploration using methods such as interviews and a systematic approach to analyzing the data. The outcome of the study will be a model of knowledge loss risk based on factors such as the likelihood of knowledge loss, the consequence/impact of knowledge loss, and the quality of the knowledge of departing employees. Initial results show that knowledge loss assessment is quite crucial for organizations, as it helps determine what types of knowledge employees possess, e.g., organizational knowledge, subject matter expertise, or relationship knowledge. Based on that, it can be assessed which employees are more important for the organization and how to prioritize the knowledge retention process for departing employees.
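The three factors named above (likelihood of loss, consequence/impact, and quality of knowledge) suggest a simple composite risk score for prioritizing departing employees. The multiplicative form and the 1-5 scales in the sketch below are illustrative assumptions, not the model the study will ultimately produce.

```python
def knowledge_loss_risk(likelihood: int, impact: int, quality: int) -> int:
    """Composite risk score; each factor rated on a hypothetical 1-5 scale."""
    return likelihood * impact * quality

def prioritize(employees):
    """Rank employees by descending knowledge-loss risk score."""
    return sorted(employees,
                  key=lambda e: knowledge_loss_risk(
                      e["likelihood"], e["impact"], e["quality"]),
                  reverse=True)
```

An additive weighted sum would be an equally defensible choice; the multiplicative form simply makes a low rating on any one factor pull the whole score down sharply.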

Keywords: knowledge loss, risk assessment, departing employees, Hong Kong organizations

Procedia PDF Downloads 403
16357 Design and Development of Graphene Oxide Modified by Chitosan Nanosheets Showing pH-Sensitive Surface as a Smart Drug Delivery System for Control Release of Doxorubicin

Authors: Parisa Shirzadeh

Abstract:

Traditional drug delivery systems, in which drugs are administered in multiple stages at specified intervals by patients, do not meet the needs of modern drug delivery. In today's world, we are dealing with a huge number of recombinant peptide and protein drugs and analogues of the body's hormones, most of which are made with genetic engineering techniques. Most of these drugs are used to treat critical diseases such as cancer. Due to the limitations of the traditional method, researchers sought ways to solve its problems to a large extent. Following these efforts, controlled drug release systems were introduced, which have many advantages: with controlled release, the drug concentration in the body is kept at a certain level and can be delivered at a higher rate over a shorter time. Graphene is biodegradable and non-toxic; compared to carbon nanotubes, its price is lower, making it cost-effective for industrialization. Moreover, the highly reactive and wide surfaces of graphene plates make graphene easier to modify than carbon nanotubes. Graphene oxide is often synthesized using concentrated oxidizers such as sulfuric acid, nitric acid, and potassium permanganate based on the Hummers method. In comparison with pristine graphene, the resulting graphene oxide is heavier and carries carboxyl, hydroxyl, and epoxy groups. Graphene oxide is therefore very hydrophilic, dissolves easily in water, and forms a stable solution. Because the hydroxyl, carboxyl, and epoxy groups created on the surface are highly reactive, they can connect with other functional groups such as amines, esters, and polymers, bringing new features to the surface of graphene.
In fact, the creation of hydroxyl, carboxyl, and epoxy groups, i.e., graphene oxidation, is the first step in creating other functional groups on the surface of graphene. Chitosan is a natural polymer that does not cause toxicity in the body. Due to its chemical structure and its OH and NH groups, it is suitable for binding to graphene oxide and increasing its solubility in aqueous solutions. Here, graphene oxide (GO) covalently modified by chitosan (CS) was developed for the controlled release of doxorubicin (DOX). In this study, GO is produced by the Hummers method under acidic conditions. It is then chlorinated by oxalyl chloride to increase its reactivity toward amines. After that, an amidation reaction in the presence of chitosan forms the amide linkage, and doxorubicin is attached to the carrier surface by π-π interaction in phosphate buffer. GO, GO-CS, and GO-CS-DOX were characterized by FT-IR, Raman, TGA, and SEM. Loading and release capacities were determined by UV-visible spectroscopy. The loading results showed a high DOX absorption capacity (99%), and pH-dependent DOX release from the GO-CS nanosheet was identified at pH 5.3 and 7.4, with a faster release rate under acidic conditions.
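The 99% loading capacity and the cumulative release figures above rest on two standard mass-balance calculations, sketched below. Conversion of UV-Vis absorbance to drug concentration via a calibration curve is assumed to have been done already; the function names are illustrative.

```python
def loading_efficiency(drug_added_mg: float, drug_unbound_mg: float) -> float:
    """Percent of the added drug bound to the carrier (mass balance):
    everything not recovered free in the supernatant is counted as loaded."""
    return 100.0 * (drug_added_mg - drug_unbound_mg) / drug_added_mg

def cumulative_release_pct(released_mg: float, loaded_mg: float) -> float:
    """Percent of the loaded drug released into the medium so far."""
    return 100.0 * released_mg / loaded_mg
```

Comparing cumulative release curves computed this way at pH 5.3 and pH 7.4 is what establishes the pH dependence reported in the abstract.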

Keywords: graphene oxide, chitosan, nanosheet, controlled drug release, doxorubicin

Procedia PDF Downloads 118
16356 Segmentation of the Liver and Spleen From Abdominal CT Images Using Watershed Approach

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

Segmentation is an important step in the processing and interpretation of medical images. In this paper, we focus on the segmentation of the liver and spleen from abdominal computed tomography (CT) images. The importance of our study comes from the fact that segmenting regions of interest (ROIs) from CT images is usually a difficult task: the gray levels of these organs are similar to those of neighboring organs, and the ROIs are connected to the ribs, heart, kidneys, etc. Our proposed method is based on anatomical information and the mathematical morphology tools used in the image processing field. First, we remove the surrounding and connected organs and tissues by applying morphological filters; this step makes the extraction of the regions of interest easier. The second step improves the quality of the image gradient: we propose a method that reduces gradient deficiencies by applying spatial filters followed by morphological filters. Thereafter, we proceed to the segmentation of the liver and spleen. To validate the proposed segmentation technique, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with manual segmentation performed by an expert. The experimental results are described in the last part of this work. The system has been evaluated by computing the sensitivity and specificity between the semi-automatically segmented (liver and spleen) contours and the contours manually traced by radiological experts.
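The three-step pipeline described above (morphological cleanup, gradient enhancement, marker-driven watershed) can be sketched with SciPy's ndimage module. The filter sizes, the Sobel-based gradient, and the two-seed marker scheme below are illustrative choices, not the authors' exact parameters.

```python
import numpy as np
from scipy import ndimage as ndi

def segment_organ(ct_slice, seed_inside, seed_outside):
    """Watershed segmentation sketch for one 2-D CT slice (uint8 array).

    seed_inside / seed_outside: (row, col) marker points placed inside
    the target organ and in the background, respectively.
    """
    # Step 1: morphological opening removes small bright structures.
    cleaned = ndi.grey_opening(ct_slice, size=(3, 3))
    # Step 2: spatial smoothing, then a Sobel gradient magnitude so that
    # organ boundaries become the watershed ridge lines.
    smoothed = ndi.gaussian_filter(cleaned.astype(float), sigma=1.0)
    grad = ndi.generic_gradient_magnitude(smoothed, ndi.sobel)
    grad_u8 = (255.0 * grad / max(grad.max(), 1e-9)).astype(np.uint8)
    # Step 3: flood from the seed markers (label 1 = organ, 2 = background).
    markers = np.zeros(ct_slice.shape, dtype=np.int8)
    markers[seed_inside] = 1
    markers[seed_outside] = 2
    return ndi.watershed_ift(grad_u8, markers)
```

A real pipeline would place many seeds from the anatomical priors mentioned in the abstract rather than two hand-picked points.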

Keywords: CT images, liver and spleen segmentation, anisotropic diffusion filter, morphological filters, watershed algorithm

Procedia PDF Downloads 490
16355 Dynamic Modeling of Energy Systems Adapted to Low Energy Buildings in Lebanon

Authors: Nadine Yehya, Chantal Maatouk

Abstract:

Low energy buildings have been developed to achieve global climate commitments to reduce energy consumption. They comprise energy-efficient buildings, zero-energy buildings, positive-energy buildings, and passive house buildings. The reduced energy demands of low energy buildings call for advanced building energy modeling that focuses on studying active building systems such as heating, cooling, and ventilation, improving system performance, and developing control systems. Modeling and building simulation have expanded to cover different modeling approaches, i.e., detailed physical models, dynamic empirical models, and hybrid approaches, which are adopted by various simulation tools. This paper uses DesignBuilder with the EnergyPlus simulation engine in order to: first, study the impact of efficiency measures on building energy behavior by comparing a low energy residential model to a conventional one in Beirut, Lebanon; second, choose the appropriate energy systems for the studied case, which is characterized by an important cooling demand; and third, study dynamic modeling of the Variable Refrigerant Flow (VRF) system in EnergyPlus, chosen due to its advantages over other systems and its availability in the Lebanese market. Finally, simulating different energy system models with different modeling approaches is necessary to compare the approaches and to investigate the interaction between energy systems and the building envelope that affects the total energy consumption of low energy buildings.

Keywords: physical model, variable refrigerant flow heat pump, dynamic modeling, EnergyPlus, the modeling approach

Procedia PDF Downloads 216
16354 Inducing Cryptobiosis State of Tardigrades in Cyanobacteria Synechococcus elongatus for Effective Preservation

Authors: Nilesh Bandekar, Sumita Dasgupta, Luis Alberto Allcahuaman Huaya, Souvik Manna

Abstract:

Cryptobiosis is a dormant state in which all measurable metabolic activities come to a halt, allowing an organism to survive extreme conditions such as low temperature (cryobiosis) or extreme drought (anhydrobiosis). This phenomenon is observed especially in tardigrades, which can retain this state for decades depending on abiotic environmental conditions. On returning to favorable conditions, tardigrades regain a metabolically active state. In this study, cyanobacteria are chosen as a model organism in which to induce cryptobiosis for effective long-term preservation. Preserving cyanobacteria using this strategy will have multiple space applications because of their ability to produce oxygen; in addition, research has shown the survivability of these organisms in space for a certain period of time. A few soil-dwelling cyanobacterial species, such as Microcoleus, are also able to survive extreme drought. This work focuses specifically on Synechococcus elongatus, an endolithic cyanobacterium with multiple benefits. It is capable of producing 25% of the oxygen in water bodies. It utilizes carbon dioxide to produce oxygen via photosynthesis and to form glucose via the Calvin cycle. There is a fair possibility of initiating cryptobiosis in such an organism by introducing certain proteins extracted from tardigrades, such as heat shock proteins (Hsp27 and Hsp30c) and/or hydrophilic late embryogenesis abundant (LEA) proteins. Existing methods like cryopreservation are difficult to execute in space given their cost and heavy instrumentation, and extensive freezing may cause cellular damage. Therefore, cryptobiosis-induced cyanobacteria, transported from Earth to Mars as part of future terraforming missions, would save resources and increase the effectiveness of preservation.
Finally, cyanobacteria species like Synechococcus elongatus can also produce oxygen and glucose on Mars under favorable conditions and hold a key to terraforming Mars.

Keywords: cryptobiosis, cyanobacteria, glucose, mars, Synechococcus elongatus, tardigrades

Procedia PDF Downloads 221
16353 Recognizing Customer Preferences Using Review Documents: A Hybrid Text and Data Mining Approach

Authors: Oshin Anand, Atanu Rakshit

Abstract:

The rapid growth of e-commerce ventures makes this area a prominent research stream. Besides several quantified parameters, the textual content of reviews is a storehouse of information that can educate companies and help them earn profit. This study is an attempt in this direction. The article categorizes data based on a computed metric that quantifies the influencing capacity of reviews, yielding two categories: highly influential and low-influence reviews. Further, each of these documents is studied to derive several product feature categories. Each of these categories, along with the computed metric, is converted to linguistic identifiers and used in an association mining model. The article makes a novel attempt to combine feature extraction with the quantified metric to categorize review text and finally provide frequent patterns that depict customer preferences. Frequent mentions in highly influential reviews depict customer likes, or preferred features in the product, whereas prominent patterns in low-influence reviews highlight what is not important to customers. This is achieved using a hybrid approach of text mining for feature and term extraction, sentiment analysis, a multi-criteria decision-making technique, and an association mining model.
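A minimal version of the frequent-pattern step described above can be written with the standard library alone: each review becomes a transaction of feature-sentiment tokens plus its influence label, and itemsets are kept when they reach a minimum support. The token vocabulary is invented for the example; a full pipeline would add the text mining, sentiment analysis, and MCDM stages first.

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support, max_size=2):
    """Return itemsets (as sorted tuples) appearing in at least
    min_support transactions, up to max_size items per set."""
    counts = Counter()
    for features in transactions:
        items = sorted(set(features))
        for size in range(1, max_size + 1):
            counts.update(combinations(items, size))
    return {itemset: c for itemset, c in counts.items() if c >= min_support}
```

Patterns that pair a feature-sentiment token with the "high" influence label then read directly as preferred product features, matching the interpretation in the abstract.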

Keywords: association mining, customer preference, frequent pattern, online reviews, text mining

Procedia PDF Downloads 385
16352 Comparative Evaluation of Root Uptake Models for Developing Moisture Uptake Based Irrigation Schedules for Crops

Authors: Vijay Shankar

Abstract:

In an era of water scarcity, effective use of water via irrigation requires good methods for determining crop water needs. Implementation of irrigation scheduling programs requires an accurate estimate of water use by the crop. Moisture depletion from the root zone represents the consequent crop evapotranspiration (ET). A numerical model for simulating soil water depletion in the root zone has been developed that takes into consideration soil physical properties, crop parameters, and climatic parameters. The governing differential equation for unsaturated flow of water in the soil is solved numerically using the fully implicit finite difference technique. The water uptake by plants is simulated using three different sink functions. The non-linear model predictions are in good agreement with field data, and thus it is possible to schedule irrigations more effectively. The present paper describes irrigation scheduling based on moisture depletion from the different layers of the root zone, obtained using different sink functions, for three cash, oil, and forage crops: cotton, safflower, and barley, respectively. The soil is considered to be at field capacity prior to planting. Two soil moisture regimes are then imposed for the irrigated treatment: one wherein irrigation is applied whenever soil moisture content is reduced to 50% of available soil water, and another wherein irrigation is applied whenever soil moisture content is reduced to 75% of available soil water. For both soil moisture regimes, the model incorporating a non-linear sink function, which provides the best agreement of computed root zone moisture depletion with field data, is found to be the most effective in scheduling irrigations.
Under the 50% and 75% moisture depletion regimes respectively, simulation runs with this moisture uptake function save 27.3-45.5% and 18.7-37.5% of irrigation water for cotton, 12.5-25% and 16.7-33.3% for safflower, and 16.7-33.3% and 20-40% for barley, relative to the other moisture uptake functions considered in the study. The simulation developed can be used for optimized irrigation planning for different crops, choosing a suitable soil moisture regime depending on irrigation water availability and crop requirements.
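The two moisture regimes above amount to a depletion-triggered refill rule. The sketch below simulates that rule with a simple bucket model and a constant daily ET in the usage example; the paper's model instead solves the full unsaturated-flow equation with layer-wise sink functions, so this is only the scheduling logic, not the physics.

```python
def schedule_irrigation(daily_et_mm, capacity_mm, trigger_fraction):
    """Refill the root zone to field capacity whenever stored available
    water falls to trigger_fraction of capacity (e.g. 0.5 for the 50% regime).

    daily_et_mm: daily evapotranspiration losses in mm.
    Returns (irrigation_days, total_water_applied_mm).
    """
    water = capacity_mm
    threshold = trigger_fraction * capacity_mm
    days, applied = [], 0.0
    for day, et in enumerate(daily_et_mm):
        water -= et
        if water <= threshold:
            applied += capacity_mm - water  # refill to field capacity
            days.append(day)
            water = capacity_mm
    return days, applied
```

With 100 mm of available water, a 10 mm/day ET, and the 50% trigger, irrigation fires every fifth day; raising the trigger fraction shifts irrigations earlier and more often, which is the trade-off the two regimes explore.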

Keywords: irrigation water, evapotranspiration, root uptake models, water scarcity

Procedia PDF Downloads 329
16351 Surface Elevation Dynamics Assessment Using Digital Elevation Models, Light Detection and Ranging, GPS and Geospatial Information Science Analysis: Ecosystem Modelling Approach

Authors: Ali K. M. Al-Nasrawi, Uday A. Al-Hamdany, Sarah M. Hamylton, Brian G. Jones, Yasir M. Alyazichi

Abstract:

Surface elevation dynamics have always responded to disturbance regimes. Creating digital elevation models (DEMs) to detect surface dynamics has led to the development of several methods, devices, and data clouds. DEMs can provide accurate and quick results cost-efficiently, in comparison to traditional geomatics survey techniques. Nowadays, remote sensing datasets, including LiDAR point clouds combined with GIS analytic tools, have become a primary source for creating DEMs. However, these data need to be tested for error detection and correction. This paper evaluates DEMs from different data sources over time for Apple Orchard Island, a coastal site in southeastern Australia, in order to detect surface dynamics. Subsequently, 30 chosen locations were examined in the field to test the error of the DEM surface detection using high-resolution global positioning systems (GPS). Results show significant surface elevation changes on Apple Orchard Island: accretion occurred on most of the island, while surface elevation loss due to erosion is limited to the northern and southern parts. Concurrently, the projected differential correction and validation method aimed to identify errors in the dataset. The resultant DEMs demonstrated a small error ratio (≤ 3%) for the gathered datasets when compared with the fieldwork survey using RTK-GPS. As modern modelling approaches need to become more effective and accurate, applying several tools to create different DEMs on a multi-temporal scale would allow easy predictions within time and cost constraints, with more comprehensive coverage and greater accuracy. With a DEM technique in the eco-geomorphic context, such insights into ecosystem dynamics at a coastal intertidal system would be valuable for assessing the accuracy of the predicted eco-geomorphic risk for sustainable conservation management.
Demonstrating this framework to evaluate the historical and current anthropogenic and environmental stressors on coastal surface elevation dynamics could be profitably applied worldwide.
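Two of the computations described above, the DEM-of-difference used to separate accretion from erosion and the error ratio against RTK-GPS checkpoints, can be sketched in a few lines. The grids are assumed to be co-registered and on the same cell size, and the checkpoint values in the test are invented.

```python
def dem_difference(dem_old, dem_new):
    """Cell-by-cell elevation change between two co-registered DEM grids
    (row-major lists of lists). Positive = accretion, negative = erosion."""
    return [[new - old for old, new in zip(row_old, row_new)]
            for row_old, row_new in zip(dem_old, dem_new)]

def error_ratio(dem_values, gps_values):
    """Mean relative error of DEM elevations against RTK-GPS checkpoints,
    e.g. the <= 3% figure quoted in the abstract."""
    errors = [abs(d - g) / abs(g) for d, g in zip(dem_values, gps_values)]
    return sum(errors) / len(errors)
```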

Keywords: DEMs, eco-geomorphic-dynamic processes, geospatial information science, remote sensing, surface elevation changes

Procedia PDF Downloads 266
16350 Wireless Backhauling for 5G Small Cell Networks

Authors: Abdullah A. Al Orainy

Abstract:

Small cell backhaul solutions need to be cost-effective, scalable, and easy to install. This paper presents an overview of small cell backhaul technologies. Wireless solutions including TV white space, satellite, sub-6 GHz radio wave, microwave and mmWave with their backhaul characteristics are discussed. Recent research on issues like beamforming, backhaul architecture, precoding and large antenna arrays, and energy efficiency for dense small cell backhaul with mmWave communications is reviewed. Recent trials of 5G technologies are summarized.
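One backhaul characteristic behind the mmWave design issues reviewed above (beamforming, large antenna arrays) is the steep free-space path loss at high carrier frequencies. The sketch below applies the standard FSPL formula with distance in kilometres and frequency in MHz; antenna gains, rain attenuation, and link margins are deliberately left out, so this is a first-order illustration only.

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB:
    FSPL = 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20.0 * math.log10(distance_km) + 20.0 * math.log10(freq_mhz) + 32.44
```

Over the same 1 km hop, a 60 GHz mmWave link loses about 26 dB more than a 3 GHz sub-6 link, which is why dense small cell backhaul at mmWave relies on the high-gain antenna arrays and beamforming discussed in the reviewed research.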

Keywords: backhaul, small cells, wireless, 5G

Procedia PDF Downloads 505
16349 Going beyond Stakeholder Participation

Authors: Florian Engel

Abstract:

Only by radically changing to an intrinsically motivated project team, giving employees the freedom of autonomy, mastery, and purpose, is it possible to develop excellent products. With these changes, combined with a rapid application development approach, the group of users serves as an important indicator for testing market needs, rather than only as stakeholders for requirements.

Keywords: intrinsic motivation, requirements elicitation, self-directed work, stakeholder participation

Procedia PDF Downloads 335
16348 Statistical Pattern Recognition for Biotechnological Process Characterization Based on High Resolution Mass Spectrometry

Authors: S. Fröhlich, M. Herold, M. Allmer

Abstract:

Early-stage quantitative analysis of host cell protein (HCP) variations is challenging yet necessary for comprehensive bioprocess development. High resolution mass spectrometry (HRMS) is a high-end technology providing accurate identification alongside quantitative information. Here we describe a flexible HRMS assay platform to quantify HCPs relevant in microbial expression systems such as E. coli, in both upstream and downstream development, by means of multivariate data analysis (MVDA) tools. Cell pellets were lysed and proteins extracted; purified samples were not further treated before applying the SMART tryptic digest kit. Peptide separation was optimized using an RP-UHPLC separation platform. HRMS-MS/MS analysis was conducted on an Orbitrap Velos Elite applying CID. Quantification was performed label-free, taking into account ionization properties and physicochemical peptide similarities. Results were analyzed using SIEVE 2.0 (Thermo Fisher Scientific) and SIMCA (Umetrics AG). The developed HRMS platform was applied to an E. coli expression set with varying productivity and the corresponding downstream process. Selected HCPs were successfully quantified within the fmol range. Analyzing HCP networks based on pattern analysis facilitated low-level quantification and enhanced validity. This approach is of high relevance for high-throughput screening experiments during upstream development, e.g., for titer determination, dynamic HCP network analysis, or product characterization. Considering the downstream purification process, physicochemical clustering of identified HCPs is relevant for adjusting buffer conditions accordingly. The technology thus provides an innovative approach to label-free MS-based quantification relying on statistical pattern analysis and comparison.
Absolute quantification based on physicochemical properties and peptide similarity scores requires no sophisticated sample preparation strategies and has proven straightforward, sensitive, and highly reproducible for product characterization.
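Label-free comparison across samples, as used above, generally requires an intensity normalization step before protein-level ratios are formed. The sketch below normalizes peptide intensities to total ion current and computes a fold change from one HCP's peptides; the normalization choice and the peptide names are illustrative, not the SIEVE/SIMCA workflow used in the study.

```python
def normalize_tic(intensities):
    """Normalize peptide intensities to total ion current (fractions of 1)."""
    total = sum(intensities.values())
    return {pep: i / total for pep, i in intensities.items()}

def fold_change(sample_a, sample_b, protein_peptides):
    """Ratio of summed, TIC-normalized peptide signals for one protein
    between two samples (e.g. two upstream process conditions)."""
    a = normalize_tic(sample_a)
    b = normalize_tic(sample_b)
    signal_a = sum(a.get(p, 0.0) for p in protein_peptides)
    signal_b = sum(b.get(p, 0.0) for p in protein_peptides)
    return signal_a / signal_b
```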

Keywords: process analytical technology, mass spectrometry, process characterization, MVDA, pattern recognition

Procedia PDF Downloads 245
16347 Inferring Influenza Epidemics in the Presence of Stratified Immunity

Authors: Hsiang-Yu Yuan, Marc Baguelin, Kin O. Kwok, Nimalan Arinaminpathy, Edwin Leeuwen, Steven Riley

Abstract:

Traditional syndromic surveillance for influenza has substantial public health value in characterizing epidemics. Because the relationship between syndromic incidence and true infection events can vary from one population to another and from one year to another, recent studies combine serological test results with syndromic data from traditional surveillance in epidemic models to make inferences about the epidemiological processes of influenza. However, despite the widespread availability of serological data, epidemic models have thus far not explicitly represented antibody titre levels and their correspondence with immunity. Most studies use dichotomized data with a threshold (typically a titre of 1:40) to classify individuals as likely recently infected or likely immune, and then estimate the cumulative incidence; such dichotomization can lead to underestimation of the influenza attack rate. To improve the use of serosurveillance data, we propose a refinement of the concept of stratified immunity within an epidemic model of influenza transmission, such that all individual antibody titre levels are enumerated explicitly and mapped onto a variable scale of susceptibility in different age groups. Haemagglutination inhibition titres were collected from 523 individuals during the pre-pandemic phase and 465 individuals during the post-pandemic phase of the 2009 pandemic in Hong Kong. The model was fitted to the serological data in an age-structured population using a Bayesian framework and was able to reproduce key features of the epidemics. The effects of age-specific antibody boosting and protection were explored in greater detail. RB was defined as the effective reproduction number in the presence of stratified immunity, and its temporal dynamics were compared to those of a traditional epidemic model using dichotomized seropositivity data.
The Deviance Information Criterion (DIC) was used to measure the fit of the model to the serological data under different mechanisms of serological response. The results demonstrated that a differential antibody response with age was present (ΔDIC = -7.0). Age-specific mixing patterns with children-specific transmissibility, rather than pre-existing immunity, were most likely to explain the high serological attack rates in children and the low serological attack rates in the elderly (ΔDIC = -38.5). Our results suggest that the disease dynamics and herd immunity of a population can be described more accurately for influenza when the distribution of immunity is explicitly represented, rather than relying only on the dichotomous states 'susceptible' and 'immune' defined by the threshold titre of 1:40 (ΔDIC = -11.5). During the outbreak, RB declined slowly from 1.22 [1.16-1.28] over the first four months after 1 May, then dropped rapidly below 1 during September and October, consistent with the observed epidemic peak in late September. One of the most important challenges for infectious disease control is to monitor disease transmissibility in real time with statistics such as the effective reproduction number. Once early estimates of antibody boosting and protection are obtained, disease dynamics can be reconstructed, which is valuable for infectious disease prevention and control.
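The idea of mapping titre levels onto a continuous susceptibility scale, rather than thresholding at 1:40, can be illustrated as follows. The reciprocal titre-to-susceptibility curve is a made-up stand-in for the scale actually estimated in the Bayesian fit, and the titre distributions are invented.

```python
def susceptibility(titre: float) -> float:
    """Illustrative mapping: susceptibility falls as HI titre rises,
    with the 1:40 reference titre giving a 50% reduction."""
    return 1.0 / (1.0 + titre / 40.0)

def effective_r(r0: float, titre_distribution: dict) -> float:
    """Sketch of R_B: a basic reproduction number scaled by the
    population's mean susceptibility over the stratified titre
    distribution ({titre: fraction of population})."""
    mean_s = sum(frac * susceptibility(t)
                 for t, frac in titre_distribution.items())
    return r0 * mean_s
```

Under the dichotomous convention, everyone below 1:40 would count as fully susceptible and everyone above as fully immune; the stratified version instead lets intermediate titres contribute partial protection, which is the refinement the abstract argues for.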

Keywords: effective reproductive number, epidemic model, influenza epidemic dynamics, stratified immunity

Procedia PDF Downloads 259
16346 Exploration and Evaluation of the Effect of Multiple Countermeasures on Road Safety

Authors: Atheer Al-Nuaimi, Harry Evdorides

Abstract:

Every day many people die or get disabled or injured on roads around the world, which necessitates more specific treatments for transportation safety issues. International road assessment program (iRAP) model is one of the comprehensive road safety models which accounting for many factors that affect road safety in a cost-effective way in low and middle income countries. In iRAP model road safety has been divided into five star ratings from 1 star (the lowest level) to 5 star (the highest level). These star ratings are based on star rating score which is calculated by iRAP methodology depending on road attributes, traffic volumes and operating speeds. The outcome of iRAP methodology are the treatments that can be used to improve road safety and reduce fatalities and serious injuries (FSI) numbers. These countermeasures can be used separately as a single countermeasure or mix as multiple countermeasures for a location. There is general agreement that the adequacy of a countermeasure is liable to consistent losses when it is utilized as a part of mix with different countermeasures. That is, accident diminishment appraisals of individual countermeasures cannot be easily added together. The iRAP model philosophy makes utilization of a multiple countermeasure adjustment factors to predict diminishments in the effectiveness of road safety countermeasures when more than one countermeasure is chosen. A multiple countermeasure correction factors are figured for every 100-meter segment and for every accident type. However, restrictions of this methodology incorporate a presumable over-estimation in the predicted crash reduction. This study aims to adjust this correction factor by developing new models to calculate the effect of using multiple countermeasures on the number of fatalities for a location or an entire road. Regression models have been used to establish relationships between crash frequencies and the factors that affect their rates. 
Multiple linear regression, negative binomial regression, and Poisson regression techniques were used to develop models that can quantify the effectiveness of using multiple countermeasures. Analyses conducted in R (The R Project for Statistical Computing) showed that a model developed with the negative binomial regression technique gives more reliable predictions of the number of fatalities after the implementation of multiple road safety countermeasures than the iRAP model. The results also showed that the negative binomial regression approach gives more precise results than the multiple linear and Poisson regression techniques because of overdispersion and standard error issues.

Keywords: international road assessment program, negative binomial, road multiple countermeasures, road safety

Procedia PDF Downloads 235
16345 Mindful Self-Compassion Training to Alleviate Work Stress and Fatigue in Community Workers: A Mixed Method Evaluation

Authors: Catherine Begin, Jeanne Berthod, Manon Truchon

Abstract:

In Quebec, there are more than 8,000 community organizations throughout the province, representing more than 72,000 jobs. Working in a community setting involves several particularities (e.g., contact with the suffering of users, feelings of powerlessness, institutional pressure, unstable funding, etc.), which can put workers at risk of fatigue, burnout, and psychological distress. A 2007 study showed that 52% of community workers surveyed had a high psychological distress index. The Ricochet project, founded in 2019, is an initiative aimed at providing various care and services to community workers in the Quebec City region, with a global health approach. Within this program, mindful self-compassion (MSC) training is offered at a low cost. MSC is one of the effective strategies proposed in the literature to help prevent and reduce burnout. Self-compassion is the recognition that suffering, failure, and inadequacy are inherent in the human experience and that everyone, including oneself, deserves compassion. MSC training targets several behavioral, cognitive, and emotional learnings (e.g., motivating oneself with caring, better managing difficult emotions, promoting resilience, etc.). A mixed-method evaluation was conducted with the participants in order to explore the effects of the training on community workers in the Quebec City region. The participants were community workers (managers or caregivers). Fifteen participants completed satisfaction and perceived-impact surveys, and 30 participated in structured interviews. Quantitative results showed that participants were generally completely satisfied or satisfied with the training (94%) and perceived that the training allowed them to develop new strategies for dealing with stress (87%). Participants perceived effects on their mood (93%), their contact with others (80%), and their stress level (67%). Some of the barriers raised were scheduling constraints, length of training, and guilt about taking time for oneself. 
The qualitative results show that individuals experienced long-term benefits, as they were able to apply the tools they received during the training in their daily lives. Some barriers were noted, such as difficulty in getting away from work or problems with the employer, which prevented enrollment. Overall, the results of this evaluation support the use of MSC (mindful self-compassion) training among community workers. Future research could support this evaluation by using a rigorous design and developing innovative ways to overcome the barriers raised.

Keywords: mindful self-compassion, community workers, work stress, burnout, wellbeing at work

Procedia PDF Downloads 115
16344 Effectiveness of Metacognitive Skills in Comprehension Instruction for Elementary Students

Authors: Mahdi Taheri Asl

Abstract:

Using a variety of strategies to read text plays an important role in making students independent, strategic, and metacognitive readers. Given the importance of comprehension instruction (CI), it is essential to support the fostering of comprehension skills in elementary-age students, particularly those who struggle with or dislike reading. One of the main components of CI is activating metacognitive skills, which serve a dual function for elementary students. Thus, it is important to evaluate implemented comprehension interventions to inform reading specialists and teachers. There has been limited review research in the area of CI, so further review research is required. The purpose of this review is to examine the effectiveness of metacognitive reading strategies in a regular classroom environment with elementary-aged students. We developed five inclusion criteria to identify research relevant to our question. First, the article had to be published in a peer-reviewed journal between 2000 and 2023. Second, the study had to include participants in elementary school and could include special education students. Third, the intervention had to involve metacognitive strategies. Fourth, the articles had to use an experimental or quasi-experimental design. Fifth, the study had to include measurement of reading performance pre- and post-intervention. We used databases such as ERIC, PsycINFO, and Google Scholar to search for articles that met these criteria, using the following search terms: comprehension instruction, metacognitive strategies, and elementary school. The next step was an ancestral search, reviewing the relevant studies cited in the articles found in the database search. We identified 30 studies in the initial searches. 
After coding agreement, we synthesized 13 studies with respect to the participants, setting, research design, dependent variables, measures, the intervention used by instructors, and general outcomes. The findings show that metacognitive strategies were effective in strengthening students' comprehension skills. They also show that linguistic instruction is more effective when combined with metacognitive strategies. The research provides a useful view of reading intervention. Despite the positive effect of metacognitive instruction on students' comprehension skills, it is not widely used in classrooms.

Keywords: comprehension instruction, metacognition, metacognitive skills, reading intervention

Procedia PDF Downloads 68
16343 Comparison of Number of Waves Surfed and Duration Using Global Positioning System and Inertial Sensors

Authors: João Madureira, Ricardo Lagido, Inês Sousa, Fraunhofer Portugal

Abstract:

Surf is an increasingly popular sport, and its performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for detection of wave rides, computing the number of waves ridden in a surfing session, the starting time of each wave, and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the Inertial Measurement Unit (IMU) of the smartphone to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated against similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, start time, and duration. This paper shows that it is feasible to use smartphones for quantification of performance metrics during surfing. In particular, the detection of waves ridden and their duration can be accurately determined using the smartphone GPS and IMU.
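The first, GPS-only approach can be sketched as a simple threshold pass over the speed trace. The threshold values, hysteresis logic, and synthetic trace below are illustrative assumptions, not the values used in the study:

```python
import numpy as np

def detect_wave_rides(speed_mps, timestamps, start_thr=2.5, end_thr=1.0):
    """Detect wave rides from a GPS speed trace (hypothetical thresholds).

    A ride starts when speed rises above start_thr (m/s) and ends when it
    falls back below end_thr. Returns a list of (start_time, duration).
    """
    rides, start = [], None
    for t, v in zip(timestamps, speed_mps):
        if start is None and v > start_thr:
            start = t
        elif start is not None and v < end_thr:
            rides.append((start, t - start))
            start = None
    return rides

# Synthetic 1 Hz trace: paddling (~0.8 m/s) with two faster wave rides.
t = np.arange(0, 60, 1.0)
v = np.full_like(t, 0.8)
v[10:18] = 4.0
v[35:41] = 4.5
print(detect_wave_rides(v, t))  # → [(10.0, 8.0), (35.0, 6.0)]
```

Using two thresholds (hysteresis) avoids counting a single ride twice when the speed briefly dips; the paper's second method additionally gates these detections with IMU information.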

Keywords: inertial measurement unit (IMU), global positioning system (GPS), smartphone, surfing performance

Procedia PDF Downloads 398
16342 Phytoremediation Waste Processing of Coffee in Various Concentration of Organic Materials Plant Using Kiambang

Authors: Siti Aminatu Zuhria

Abstract:

Wet coffee processing can improve the quality of coffee, but it generates liquid waste that can pollute the environment. Much of this liquid waste results from the pulping and washing of the coffee. This research addresses the treatment of liquid waste from coffee pulping through phytoremediation using kiambang plants. The purpose of this study was to determine the characteristics of the coffee liquid waste, to evaluate kiambang plants as phytoremediation agents at various concentrations of coffee liquid waste, and to determine the concentration at which the treated water most closely approaches the waste water quality standard. This research will be conducted in two stages, a preliminary study and a main study. The preliminary study aims to determine the ability of kiambang plants to survive as phytoremediation agents in well water, distilled water, and coffee liquid waste. In the main study, the coffee wastewater will be diluted to obtain variations in COD concentration. The expected outcome of this research is to establish the ability of kiambang plants to act as phytoremediation agents in wastewater treatment at various waste concentrations, and to identify the concentration that most effectively improves water quality toward the quality standard.

Keywords: wet coffee processing, phytoremediation, Kiambang plant, liquid waste concentration variation

Procedia PDF Downloads 302
16341 Online Think–Pair–Share in a Third-Age Information and Communication Technology Course

Authors: Daniele Traversaro

Abstract:

Problem: Senior citizens have been facing a challenging reality as a result of strict public health measures designed to protect people from the COVID-19 outbreak. These include the risk of social isolation due to the inability of the elderly to integrate with technology. Never before have information and communication technology (ICT) skills been so essential for their everyday life. Although third-age ICT education and lifelong learning are widely supported by universities and governments, there is a lack of literature on which teaching strategy/methodology to adopt in an entirely online ICT course aimed at third-age learners. This contribution aims to present an application of the Think-Pair-Share (TPS) learning method in an ICT third-age virtual classroom with an intergenerational approach to conducting online group labs and review activities. This collaborative strategy can help increase student engagement, promote active learning and online social interaction. Research Question: Is collaborative learning applicable and effective, in terms of student engagement and learning outcomes, for an entirely online third-age ICT introductory course? Methods: In the TPS strategy, a problem is posed by the teacher, students have time to think about it individually, and then they work in pairs (or small groups) to solve the problem and share their ideas with the entire class. We performed four experiments in the ICT course of the University of the Third Age of Genova (University of Genova, Italy) on the Microsoft Teams platform. The study cohort consisted of 26 students over the age of 45. Data were collected through two online questionnaires, one administered at the end of the first activity and another at the end of the course; they consisted of five and three closed-ended questions, respectively. 
The answers were on a Likert scale (from 1 to 4), except for two questions (which asked for the number of correct answers given individually and in groups) and a field for free comments/suggestions. Results: Results show that groups perform better than individual students (with group scores higher by more than an order of magnitude) and that most students found it helpful to work in groups and interact with their peers. Insights: From these early results, it appears that TPS is applicable to an online third-age ICT classroom and useful for promoting discussion and active learning. Nevertheless, our experimentation has a number of limitations. In particular, more data are needed to perform a statistical analysis that can determine the effectiveness of this methodology in terms of student engagement and learning outcomes; this is a future direction.

Keywords: collaborative learning, information technology education, lifelong learning, older adult education, think-pair-share

Procedia PDF Downloads 187
16340 Recovery of Polyphenolic Phytochemicals From Greek Grape Pomace (Vitis Vinifera L.)

Authors: Christina Drosou, Konstantina E. Kyriakopoulou, Andreas Bimpilas, Dimitrios Tsimogiannis, Magdalini C. Krokida

Abstract:

Rationale: Agiorgitiko is one of the most widely grown and commercially well-established red wine varieties in Greece. Each year, the viticulture industry produces a large amount of waste consisting of grape skins and seeds (pomace) during a short period. Grapes contain polyphenolic compounds which are partially transferred to wine during winemaking. Therefore, winery wastes could be an alternative cheap source for obtaining such compounds with important antioxidant activity. Specifically, red grape waste contains anthocyanins and flavonols which are characterized by multiple biological activities, including cardioprotective, anti-inflammatory, anti-carcinogenic, antiviral and antibacterial properties, attributed mainly to their antioxidant activity. Ultrasound assisted extraction (UAE) is considered an effective way to recover phenolic compounds, since it combines the advantage of a mechanical effect with low temperature. Moreover, green solvents can be used in order to recover extracts intended for use in the food and nutraceutical industry. Apart from the extraction, pre-treatment processes such as drying can play an important role in the preservation of the grape pomace and the enhancement of its antioxidant capacity. Objective: The aim of this study is to recover natural extracts from winery waste with high antioxidant capacity using green solvents so they can be exploited and utilized as enhancers in food or nutraceuticals. Methods: Agiorgitiko grape pomace was dehydrated by air drying (AD) and accelerated solar drying (ASD) in order to explore the effect of the pre-treatment on the recovery of bioactive compounds. UAE was applied to untreated and dried samples using water and water:ethanol (1:1) as solvents. The total antioxidant potential and phenolic content of the extracts were determined using the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging assay and the Folin-Ciocalteu method, respectively. 
Finally, the profile of anthocyanins and flavonols was specified using HPLC-DAD analysis. The efficiency of the processes was determined in terms of extraction yield, antioxidant activity, phenolic content and the anthocyanin and flavonol profile. Results & Discussion: The experiments indicated that the pre-treatment was essential for the recovery of highly nutritious compounds from the pomace, since the extract samples showed higher phenolic content and antioxidant capacity. Water:ethanol (1:1) was the more effective solvent for the recovery of phenolic compounds. Moreover, ASD grape pomace extracted with the solvent system exhibited the highest antioxidant activity (IC50=0.36±0.01mg/mL) and phenolic content (TPC=172.68±0.01mgGAE/g dry extract), followed by AD and untreated pomace. The major compounds recovered were malvidin3-O-glucoside and quercetin3-O-glucoside according to the HPLC analysis. Conclusions: Winery waste can be exploited for the recovery of nutritious compounds using green solvents such as water or ethanol. The pretreatment of the pomace can significantly affect the concentration of phenolic compounds, while UAE is considered a highly effective extraction process.
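For readers unfamiliar with the DPPH metric, the IC50 (the extract concentration that scavenges 50% of the radical) can be estimated by interpolating a measured dose-response curve. A minimal sketch on entirely hypothetical assay data:

```python
import numpy as np

# Hypothetical DPPH assay data: extract concentration (mg/mL)
# versus measured % inhibition of the radical.
conc = np.array([0.1, 0.2, 0.4, 0.8])
inhibition = np.array([18.0, 32.0, 55.0, 81.0])

# IC50 = concentration giving 50 % inhibition, by linear interpolation
# between the two bracketing measurements.
ic50 = np.interp(50.0, inhibition, conc)
print(f"IC50 = {ic50:.2f} mg/mL")
```

In practice, a sigmoidal dose-response fit is often preferred over linear interpolation, but the interpolation illustrates how a single IC50 value summarises the whole curve (lower IC50 means stronger antioxidant activity).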

Keywords: agiorgitiko grape pomace, antioxidants, phenolic compounds, ultrasound assisted extraction

Procedia PDF Downloads 389
16339 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm

Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn

Abstract:

Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining and pattern recognition to overcome the well-known phenomenon of the Curse of Dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
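A compact sketch of the B-SFLA search loop is given below. The fitness function here is only a stand-in for the fuzzy-rough dependency degree (a toy correlation-based measure), and all parameter values and the sigmoid binarisation rule are illustrative assumptions rather than the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(42)

def fitness(mask, X, y):
    """Toy evaluation measure standing in for the fuzzy-rough dependency
    degree: mean |correlation| of selected features with the target,
    minus a small penalty on subset size (illustration only)."""
    if mask.sum() == 0:
        return -1.0
    score = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1])
                     for j in np.flatnonzero(mask)])
    return score - 0.01 * mask.sum()

def binary_sfla(X, y, n_frogs=20, n_memeplexes=4, iters=30):
    n_feat = X.shape[1]
    frogs = rng.integers(0, 2, (n_frogs, n_feat))  # binary feature masks
    for _ in range(iters):
        fits = np.array([fitness(f, X, y) for f in frogs])
        frogs = frogs[np.argsort(-fits)]        # best frogs first
        for m in range(n_memeplexes):           # shuffle into memeplexes
            idx = np.arange(m, n_frogs, n_memeplexes)
            best, worst = idx[0], idx[-1]
            # worst frog leaps toward the memeplex best; a sigmoid maps
            # the continuous leap back to a binary mask
            step = frogs[best] - frogs[worst]
            prob = 1 / (1 + np.exp(-(frogs[worst] + rng.random(n_feat) * step)))
            frogs[worst] = (rng.random(n_feat) < prob).astype(int)
    fits = np.array([fitness(f, X, y) for f in frogs])
    return frogs[np.argmax(fits)]

# Synthetic data: only features 0 and 1 carry signal about y.
X = rng.normal(size=(100, 8))
y = X[:, 0] + X[:, 1]
best = binary_sfla(X, y)
print(best)
```

In the actual method, the toy fitness would be replaced by the FRDD computed from the lower approximations of the fuzzy-rough sets, and a wrapper classifier would evaluate the final reduct.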

Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct

Procedia PDF Downloads 222
16338 Aspiring to Achieve a Fairer Society

Authors: Bintou Jobe

Abstract:

Background: The research is focused on the concept of equality, diversity, and inclusion (EDI) and the need to achieve equity by treating individuals according to their circumstances and needs. The research is rooted in the UK Equality Act 2010, which emphasizes the importance of equal opportunities for all individuals regardless of their background and social life. However, inequality persists in society, particularly for those from minority backgrounds who face discrimination. Research Aim: The aim of this research is to promote equality, diversity, and inclusion by encouraging the regeneration of minds and the eradication of stereotypes. The focus is on promoting good Equality, Diversity and Inclusion practices in various settings, including schools, colleges, universities, and workplaces, to create environments where every individual feels a sense of belonging. Methodology: The research utilises a literature review approach to gather information on promoting inclusivity, diversity, and inclusion. Findings: The research highlights the significance of promoting equality, diversity, and inclusion practices to ensure that individuals receive the respect and dignity they deserve. It emphasises the importance of treating individuals based on their unique circumstances and needs rather than relying on stereotypes. The research also emphasises the benefits of diversity and inclusion in enhancing innovation, creativity, and productivity. The theoretical importance of this research is to raise awareness about the importance of regenerating minds, challenging stereotypes, and promoting equality, diversity, and inclusion. The emphasis is on treating individuals based on their circumstances and needs rather than relying on generalizations. Diversity and inclusion are beneficial in different settings, as highlighted by the research. 
By raising awareness about the importance of mind regeneration, eradicating stereotypes, and promoting equality, diversity, and inclusion, this research makes a significant contribution to the subject area. It emphasizes the necessity of treating individuals based on their unique circumstances instead of relying on generalizations. However, the methodology could be strengthened by incorporating primary research to complement the literature review approach. Data Collection and Analysis Procedures: The research utilised a literature review approach to gather relevant information on promoting inclusivity, diversity, and inclusion. The NVivo software application was used to analyse and synthesize the findings, identifying themes that support the research aim and objectives. Question Addressed: This research addresses the question of how to promote inclusivity, diversity, and inclusion and reduce the prevalence of stereotypes and prejudice. It explores the need to treat individuals based on their unique circumstances and needs rather than relying on generic assumptions. The research recommends encouraging individuals to adopt a more inclusive approach; providing managers with responsibility and training that help them understand the importance of their roles in shaping workplace culture; and having an equality, diversity, and inclusion manager from a majority background at the senior level who can speak up for underrepresented groups and flag any issues that need addressing. Conclusion: The research emphasizes the importance of promoting equality, diversity, and inclusion practices to create a fairer society. It highlights the need to challenge stereotypes, treat individuals according to their circumstances and needs, and promote a culture of respect and dignity.

Keywords: equality, fairer society, inclusion, diversity

Procedia PDF Downloads 44
16337 Promoting Social Advocacy through Digital Storytelling: The Case of Ocean Acidification

Authors: Chun Chen Yea, Wen Huei Chou

Abstract:

Many chemical changes in the atmosphere and the ocean are invisible to the naked eye, but they have profound impacts. These changes not only confirm the phenomenon of global carbon pollution, but also forewarn that more changes are coming. The carbon dioxide emitted from the burning of fossil fuels dissolves into the ocean and chemically reacts with seawater to form carbonic acid, which increases the acidity of the originally alkaline seawater. This gradual acidification is occurring at an unprecedented rate and will affect the formation of the carapaces of some marine organisms, such as corals and crustaceans, which are almost entirely composed of calcium carbonate; these carapaces will become more soluble. Acidified seawater not only threatens the survival of marine life, but also negatively impacts the global ecosystem via the food chain. Faced with the threat of ocean acidification, all humans are duty-bound. The industrial sector outputs the highest level of carbon dioxide emissions in Taiwan, and the petrochemical industry is the major contributor. Ever since the construction of Formosa Plastics Group's No. 6 Naphtha Cracker Plant in Yunlin County, there have been many environmental concerns such as air pollution and carbon dioxide emissions. The marine life along the coast of Yunlin is directly affected by ocean acidification arising from the carbon emissions. Societal change demands our willingness to act, which is what social advocacy promotes. This study uses digital storytelling, with ocean acidification as the subject of a visual narrative, to demonstrate the promotion of social advocacy. Storytelling can transform dull knowledge into an engaging narrative of the crisis faced by marine life. Digital dissemination is an effective social-work practice. The visualization promoting awareness of ocean acidification was disseminated via social media platforms such as Facebook and Instagram. 
Social media enables users to compose their own messages and share information across different platforms, which helps disseminate the core message of social advocacy.

Keywords: digital storytelling, visualization, ocean acidification, social advocacy

Procedia PDF Downloads 115
16336 Automated Evaluation Approach for Time-Dependent Question Answering Pairs on Web Crawler Based Question Answering System

Authors: Shraddha Chaudhary, Raksha Agarwal, Niladri Chatterjee

Abstract:

This work demonstrates a web crawler-based generalized end-to-end open domain Question Answering (QA) system. An efficient QA system requires a significant amount of domain knowledge to answer any question, with the aim of finding an exact and correct answer in the form of a number, a noun, a short phrase, or a brief piece of text for the user's questions. Analysis of the question, searching the relevant documents, and choosing an answer are three important steps in a QA system. This work uses a web scraper (Beautiful Soup) to extract K documents from the web. The value of K can be calibrated on the basis of a trade-off between time and accuracy. This is followed by a passage-ranking process using a model trained on the MS MARCO dataset of 500K queries to extract the most relevant text passage and thereby shorten the lengthy documents. Further, a QA system is used to extract the answers from the shortened documents based on the query and return the top 3 answers. For evaluation of such systems, accuracy is judged by the exact match between predicted answers and gold answers. But automatic evaluation methods fail due to the linguistic ambiguities inherent in the questions. Moreover, reference answers are often not exhaustive or are out of date. Hence correct answers predicted by the system are often judged incorrect according to the automated metrics. One such scenario arises from the original Google Natural Questions (GNQ) dataset, which was collected and made available in the year 2016. Use of any such dataset proves to be inefficient with respect to questions that have time-varying answers. For illustration, consider the query "Where will the next Olympics be?" The gold answer given in the GNQ dataset is "Tokyo". Since the dataset was collected in 2016, and the next Olympics after 2016 were held in Tokyo in 2020, that answer was correct at the time. But if the same question is asked in 2022, the answer is "Paris, 2024". 
Consequently, any evaluation based on the GNQ dataset will be incorrect. Such erroneous predictions are usually given to human evaluators for further validation, which is quite expensive and time-consuming. To address this erroneous evaluation, the present work proposes an automated approach for evaluating time-dependent question-answer pairs. In particular, it proposes a metric using the current timestamp along with the top-n predicted answers from a given QA system. To test the proposed approach, the GNQ dataset was used, and the system achieved an accuracy of 78% on a test dataset comprising 100 QA pairs. This test data was automatically extracted using an analysis-based approach from 10K QA pairs of the GNQ dataset. The results obtained are encouraging. The proposed technique appears to have the possibility of developing into a useful scheme for gathering precise, reliable, and specific information in a real-time and efficient manner. Our subsequent experiments will be directed towards establishing the efficacy of the above system for a larger set of time-dependent QA pairs.
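The metric itself is not specified in detail here, but the core idea of checking top-n predictions against the gold answer that is valid at the current timestamp can be sketched as follows. The data format, lookup rule, and example answers are assumptions for illustration only:

```python
from datetime import date

# Hypothetical time-stamped gold answers: each question maps to
# (valid_from, answer) pairs, most recent first.
GOLD = {
    "where will the next olympics be?": [
        (date(2022, 1, 1), "paris 2024"),
        (date(2016, 1, 1), "tokyo 2020"),
    ],
}

def gold_answer(question, today):
    """Return the gold answer valid at `today` (assumed data format)."""
    for valid_from, answer in GOLD[question]:
        if today >= valid_from:
            return answer
    return None

def time_aware_match(question, predictions, today):
    """Exact match of any top-n prediction against the time-valid gold."""
    gold = gold_answer(question, today)
    return gold is not None and gold in (p.lower() for p in predictions)

q = "where will the next olympics be?"
print(time_aware_match(q, ["Tokyo 2020"], date(2017, 6, 1)))          # True
print(time_aware_match(q, ["Tokyo 2020"], date(2023, 6, 1)))          # False
print(time_aware_match(q, ["Paris 2024", "Tokyo"], date(2023, 6, 1))) # True
```

The same prediction is thus scored correct or incorrect depending on the evaluation date, which is exactly the behaviour that a static gold set cannot provide.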

Keywords: web-based information retrieval, open domain question answering system, time-varying QA, QA evaluation

Procedia PDF Downloads 99
16335 The Impact of Physical Activity for Recovering Cancer Patients

Authors: Martyn Queen, Diane Crone, Andrew Parker, Saul Bloxham

Abstract:

Rationale: There is a growing body of evidence that supports the use of physical activity during and after cancer treatment. However, activity levels for patients remain low. As more cancer patients are treated successfully, and treatment costs continue to escalate, physical activity may be a promising adjunct to a person-centred healthcare approach to recovery. Aim: The aim was to further understand how physical activity may enhance the recovery process for a group of mixed-site cancer patients. Objectives: The research investigated longitudinal changes in physical activity and perceived quality of life between two and six months post-intervention. It also investigated the support systems that enabled patients to sustain these perceived changes. Method: The respondent cohort comprised 14 mixed-site cancer patients aged 43-70 (11 women, 3 men), who participated in a two-phase physical activity intervention that took place at a university in the South West of England. Phase 1 consisted of an eight-week structured physical activity programme; Phase 2 consisted of four months of non-supervised physical activity. Semi-structured interviews took place three times over six months with each participant. Grounded theory informed the data collection and analysis which, in turn, facilitated theoretical development. Findings: Our findings propose three theories on the impact of physical activity for recovering cancer patients: 1) Knowledge gained through a structured exercise programme can enable recovering cancer patients to independently sustain physical activity to four-month follow-up. 2) Sustaining physical activity for six months promotes positive changes in the quality of life indicators of chronic fatigue, self-efficacy, the ability to self-manage and energy levels. 
3) Peer support from patients facilitates adherence to a structured exercise programme and support from a spouse, or life partner facilitates independently sustained physical activity to four-month follow-up. Conclusions: This study demonstrates that qualitative research can provide an evidence base that could be used to support future care plans for cancer patients. Findings also demonstrate that a physical activity intervention can be effective at helping cancer patients recover from the side effects of their treatment, and recommends that physical activity should become an adjunct therapy alongside traditional cancer treatments.

Keywords: physical activity, health, cancer recovery, quality of life, support systems, qualitative, grounded theory, person-centred healthcare

Procedia PDF Downloads 287
16334 Consent and the Construction of Unlawfulness

Authors: Susanna Menis

Abstract:

The context of this study revolves around the theme of consent and the construction of unlawfulness in judicial decisions. It aims to explore the formation of societal perceptions of unlawfulness within the context of consensual sexual acts leading to harmful consequences. This study investigates how judges create legal rules that reflect social solidarity and protect against violence. Specifically, the research aims to understand the justification behind criminalising consensual sexual activity when categorised under different offences. The main question addressed in this study concerns the way judges create legal rules that they believe reflect social solidarity and protect against violence. The study employs a historical genealogy approach as its methodology. This approach allows for tracing back the original formation of societal perspectives on unlawfulness, thus highlighting the socially constructed nature of the present understanding. The data for this study will be collected through an extensive literature review, examining historical legal cases and documents that shape the understanding of unlawfulness. This will provide a comprehensive view of how social attitudes toward private sexual relations influenced the creation of legal rules. The theoretical importance of this research lies in its contribution to socio-legal scholarship. This study adds to the existing knowledge on the topic by exploring questions of unconscious bias and its origins. The findings shed light on how and why individuals possess unconscious biases, particularly within the judicial system. In conclusion, this study investigates judicial decisions concerning consensual sexual acts and the construction of unlawfulness. By employing a historical genealogy approach, the research sheds light on how judges create legal rules that reflect social solidarity and aim to protect against violence. 
The theoretical importance of this study lies in its contribution to understanding unconscious bias and its origins within the judicial system. Through data collection and analysis procedures, this study aims to provide valuable insights into the formation of social attitudes towards private sexual relations and its impact on legal rulings.

Keywords: consent, sexual offences, offences against the person, legal genealogy, social construct

Procedia PDF Downloads 57
16333 Decision Framework for Cross-Border Railway Infrastructure Projects

Authors: Dimitrios J. Dimitriou, Maria F. Sartzetaki

Abstract:

Transport infrastructure assets are key components of the national asset portfolio. The decision to invest in new transport infrastructure can take from a few years to several decades, mainly because of the need to commit substantial capital, the long payback period, the number of stakeholders involved in the decision process, and the often high investment and business risks. The decision assessment framework is therefore an essential challenge: it must link the key decision factors to stakeholder expectations, highlighting project trade-offs, financial risks, business uncertainties, and market limitations. This paper examines the decision process for new transport infrastructure projects in cross-border regions, where a wide range of stakeholders with different expectations is involved. Following a consequence-analysis systemic approach, the relationship between transport infrastructure development, economic system development, and stakeholder expectations is analyzed. Adopting a system-of-systems methodological approach, the decision-making framework, variables, inputs, and outputs are defined, highlighting the key stakeholders' roles and expectations. The application presents the proposed decision framework for a strategic railway project in northern Greece concerning the upgrade of the existing railway corridor connecting Greece, Turkey, and Bulgaria.
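The kind of trade-off the framework describes, weighing financial risks, business uncertainties, and market limitations against stakeholder expectations, can be illustrated with a simple weighted-scoring sketch. The criteria, weights, option names, and scores below are entirely hypothetical and are not taken from the paper; they only show how such a decision matrix aggregates factor scores into a ranking.

```python
def weighted_score(scores, weights):
    """Aggregate per-criterion scores (0-10 scale) with normalized weights."""
    total_w = sum(weights.values())
    return sum(scores[c] * w / total_w for c, w in weights.items())

# Hypothetical decision criteria and weights (illustrative only).
weights = {
    "financial risk": 0.30,        # higher score = lower risk
    "business uncertainty": 0.25,
    "market limitations": 0.20,
    "stakeholder alignment": 0.25,
}

# Two hypothetical project options with illustrative scores.
options = {
    "upgrade existing corridor": {"financial risk": 7, "business uncertainty": 6,
                                  "market limitations": 5, "stakeholder alignment": 8},
    "new alignment":             {"financial risk": 4, "business uncertainty": 5,
                                  "market limitations": 6, "stakeholder alignment": 6},
}

ranked = sorted(options, key=lambda o: weighted_score(options[o], weights),
                reverse=True)
print(ranked[0])  # the option with the highest aggregate score
```

In practice the weights and scores would come from the stakeholder-consultation and consequence-analysis stages the abstract describes, rather than being fixed by hand.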

Keywords: decision making, system of system, cross-border, infrastructure project

Procedia PDF Downloads 311
16332 Best Practical Technique to Drain Recoverable Oil from Unconventional Deep Libyan Oil Reservoir

Authors: Tarek Duzan, Walid Esayed

Abstract:

Fluid flow in porous media is fundamentally controlled by parameters set by depositional and post-depositional environments. After deposition, diagenetic events can act negatively on the reservoir and reduce the effective porosity, making the rock less permeable. Exploiting hydrocarbons from such resources therefore requires partially altering the rock properties to improve the long-term production rate and enhance recovery efficiency. In this study, we first address the phenomenon of permeability reduction in tight sandstone reservoirs and describe the procedures implemented to investigate its root causes; we then benchmark the candidate solutions at the field scale and recommend a mitigation strategy for the field development plan. Two investigations were carried out: subsurface analysis using production logging tool (PLT) surveys and laboratory tests on four candidate wells in the reservoir of interest. The PLT surveys showed that the contributing intervals are very limited relative to the total reservoir thickness. Alcohol treatment was the first option applied to well AA9; the well's productivity was partially restored, but not to its initial level. In the laboratory, alcohol treatment was effective and restored permeability in some plugs by 98%, but the operational challenge is distributing enough alcohol across the wellbore to attain the sweep efficiency obtained within a laboratory core plug. The second solution, fracturing the wells, has shown excellent results, especially for wells that had suffered a large drop in oil production. It is therefore suggested to frac and pack the already damaged wells in the Waha field to mitigate the damage and restore as much productivity as possible.
In addition, the critical fluid velocity and its effect on fine sand migration in the reservoir should be studied on core samples so that a suitable pressure drawdown can be applied in the reservoir to limit fine sand migration.
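The core-flood measurements behind the 98% restoration figure reduce, for linear single-phase flow, to Darcy's law: k = q·μ·L / (A·Δp), which gives permeability in darcies when the rate is in cm³/s, viscosity in cP, length in cm, area in cm², and pressure drop in atm. The sketch below is a minimal illustration with hypothetical plug dimensions and flow rates (not the study's actual data); only the 98% restoration ratio is taken from the abstract.

```python
def permeability_darcy(q_cc_s, mu_cp, length_cm, area_cm2, dp_atm):
    """Darcy's law for linear, single-phase core flow.

    Returns permeability in darcies for q [cm^3/s], mu [cP],
    L [cm], A [cm^2], and dp [atm].
    """
    return q_cc_s * mu_cp * length_cm / (area_cm2 * dp_atm)

# Hypothetical core-flood data for one plug (illustrative values only).
k_initial = permeability_darcy(0.50, 1.0, 5.0, 11.4, 2.0)  # undamaged baseline
k_damaged = permeability_darcy(0.10, 1.0, 5.0, 11.4, 2.0)  # after fines damage
k_treated = permeability_darcy(0.49, 1.0, 5.0, 11.4, 2.0)  # after alcohol treatment

restoration = 100.0 * k_treated / k_initial
print(f"restoration: {restoration:.0f}%")  # ratio chosen to match the reported 98%
```

The same ratio-of-permeabilities calculation is how a laboratory restoration figure like the abstract's 98% would be reported, independent of the plug's absolute permeability.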

Keywords: alcohol treatment, post-depositional environments, permeability, tight sandstone

Procedia PDF Downloads 64
16331 Machine Learning Assisted Selective Emitter Design for Solar Thermophotovoltaic System

Authors: Ambali Alade Odebowale, Andargachew Mekonnen Berhe, Haroldo T. Hattori, Andrey E. Miroshnichenko

Abstract:

Solar thermophotovoltaic systems (STPV) have emerged as a promising solution to overcome the Shockley-Queisser limit, a significant impediment in the direct conversion of solar radiation into electricity using conventional solar cells. The STPV system comprises essential components such as an optical concentrator, selective emitter, and a thermophotovoltaic (TPV) cell. The pivotal element in achieving high efficiency in an STPV system lies in the design of a spectrally selective emitter or absorber. Traditional methods for designing and optimizing selective emitters are often time-consuming and may not yield highly selective emitters, posing a challenge to the overall system performance. In recent years, the application of machine learning techniques in various scientific disciplines has demonstrated significant advantages. This paper proposes a novel nanostructure composed of four-layered materials (SiC/W/SiO2/W) to function as a selective emitter in the energy conversion process of an STPV system. Unlike conventional approaches widely adopted by researchers, this study employs a machine learning-based approach for the design and optimization of the selective emitter. Specifically, a random forest algorithm (RFA) is employed for the design of the selective emitter, while the optimization process is executed using genetic algorithms. This innovative methodology holds promise in addressing the challenges posed by traditional methods, offering a more efficient and streamlined approach to selective emitter design. The utilization of a machine learning approach brings several advantages to the design and optimization of a selective emitter within the STPV system. Machine learning algorithms, such as the random forest algorithm, have the capability to analyze complex datasets and identify intricate patterns that may not be apparent through traditional methods. 
This allows for a more comprehensive exploration of the design space, potentially leading to highly efficient emitter configurations. Moreover, the application of genetic algorithms in the optimization process enhances the adaptability and efficiency of the overall system. Genetic algorithms mimic the principles of natural selection, enabling the exploration of a diverse range of emitter configurations and facilitating the identification of optimal solutions. This not only accelerates the design and optimization process but also increases the likelihood of discovering configurations that exhibit superior performance compared to traditional methods. In conclusion, the integration of machine learning techniques in the design and optimization of a selective emitter for solar thermophotovoltaic systems represents a groundbreaking approach. This innovative methodology not only addresses the limitations of traditional methods but also holds the potential to significantly improve the overall performance of STPV systems, paving the way for enhanced solar energy conversion efficiency.
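The genetic-algorithm stage described above can be sketched compactly. In the sketch below, the trained random-forest surrogate is replaced by a simple analytic stand-in fitness function (a penalty for deviating from hypothetical target layer thicknesses for the SiC/W/SiO2/W stack), so the example is self-contained; the target thicknesses, bounds, and GA hyperparameters are all assumptions, not values from the paper.

```python
import random

def figure_of_merit(thicknesses):
    """Stand-in for the trained random-forest surrogate: rewards layer
    thicknesses (nm) close to hypothetical targets for SiC/W/SiO2/W."""
    targets = (120.0, 15.0, 90.0, 200.0)
    return -sum((t - g) ** 2 for t, g in zip(thicknesses, targets))

def crossover(a, b):
    """Uniform crossover: each layer thickness comes from either parent."""
    return tuple(random.choice(pair) for pair in zip(a, b))

def mutate(ind, rate=0.3, step=10.0):
    """Perturb each gene with probability `rate`, keeping thickness >= 5 nm."""
    return tuple(max(5.0, t + random.uniform(-step, step))
                 if random.random() < rate else t for t in ind)

def genetic_optimize(pop_size=40, generations=60, seed=0):
    random.seed(seed)
    pop = [tuple(random.uniform(5.0, 300.0) for _ in range(4))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=figure_of_merit, reverse=True)
        elite = pop[: pop_size // 4]  # elitist selection: keep top quarter
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=figure_of_merit)

best = genetic_optimize()
print([round(t, 1) for t in best])
```

In the actual workflow the abstract describes, `figure_of_merit` would instead query the random-forest model trained on simulated spectral responses, so each GA evaluation is cheap compared with a full electromagnetic simulation.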

Keywords: emitter, genetic algorithm, radiation, random forest, thermophotovoltaic

Procedia PDF Downloads 59