Search results for: conditional tabular GAN
105 EFL Teachers’ Metacognitive Awareness as a Predictor of Their Professional Success
Authors: Saeedeh Shafiee Nahrkhalaji
Abstract:
Metacognitive knowledge increases EFL students’ ability to be successful learners. Although this relationship has been investigated by a number of scholars, EFL teachers’ explicit awareness of their cognitive knowledge has not been sufficiently explored. The aim of this study was to examine the role of EFL teachers’ metacognitive knowledge in their pedagogical performance. Furthermore, the role played by their years of academic education and teaching experience was also studied. Fifty female EFL teachers were selected. They completed the Metacognitive Awareness Inventory (MAI), which assessed six components of metacognition: procedural knowledge, declarative knowledge, conditional knowledge, planning, evaluating, and management strategies. Near the end of the academic semester, the students of each class filled in the ‘Language Teacher Characteristics Questionnaire’ to evaluate their teachers’ pedagogical performance. Four elements of the MAI (declarative knowledge, planning, evaluating, and management strategies) were found to be significantly correlated with EFL teachers’ pedagogical success. Significant correlations were also established between metacognitive knowledge and EFL teachers’ years of academic education and teaching experience. The findings obtained from this research have important implications for EFL teacher educators. The discussion concludes by setting out directions for future research.
Keywords: metacognitive knowledge, pedagogical performance, language teacher characteristics questionnaire, metacognitive awareness inventory
Procedia PDF Downloads 329
104 Determinants of International Volatility Pass-Throughs of Agricultural Commodities: A Panel Analysis of Developing Countries
Authors: Tetsuji Tanaka, Jin Guo
Abstract:
The extant literature has not succeeded in uncovering the common determinants of price volatility transmissions of agricultural commodities from international to local markets, and, further, has rarely investigated the role of self-sufficiency measures in the context of national food security. We analyzed various factors to determine the degree of price volatility transmission of wheat, rice, and maize between world and domestic markets, using GARCH models with dynamic conditional correlation (DCC) specifications and panel feasible generalized least squares models. We found that a grain autarky system has the potential to diminish volatility pass-throughs for all three grain commodities. Furthermore, we discovered that substitutive consumption behavior between maize and wheat buffers the volatility transmissions of both, but rice does not function as a transmission-relieving element for the volatility of either wheat or maize. The effectiveness of grain consumption substitution in insulating domestic markets from global pass-throughs is greater than that of cereal self-sufficiency. These implications are highly beneficial for governments of developing countries seeking to protect their domestic food markets from uncertainty in foreign markets and, as such, to improve food security.
Keywords: food security, GARCH, grain self-sufficiency, volatility transmission
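The volatility-transmission machinery named here (univariate GARCH filtering followed by a dynamic conditional correlation step) can be illustrated compactly. Below is a minimal Python sketch using the arch package on synthetic series; the rolling correlation of standardized residuals is a simplified stand-in for full DCC estimation, and all series and parameters are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from arch import arch_model

# Synthetic world and domestic price returns (stand-ins for wheat series)
rng = np.random.default_rng(0)
world = pd.Series(rng.normal(0, 1.2, 1000))
domestic = pd.Series(0.5 * world + rng.normal(0, 1.0, 1000))

def std_resid(series):
    """Fit a GARCH(1,1) and return standardized residuals."""
    res = arch_model(series, vol="GARCH", p=1, q=1).fit(disp="off")
    return res.resid / res.conditional_volatility

# Simplified stand-in for DCC: rolling correlation of standardized residuals,
# tracking how volatility co-movement (transmission) evolves over time
corr = std_resid(world).rolling(60).corr(std_resid(domestic))
print(corr.dropna().describe())
```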
Procedia PDF Downloads 155
103 Internet of Things Edge Device Power Modelling and Optimization Simulator
Authors: Cian O'Shea, Ross O'Halloran, Peter Haigh
Abstract:
Wireless Sensor Networks (WSNs) are Internet of Things (IoT) edge devices. They are becoming widely adopted in many industries, including health care, building energy management, and condition monitoring. As the scale of WSN deployments increases, the cost and complexity of battery replacement and disposal become more significant and in time may become a barrier to adoption. Harvesting ambient energy provides a pathway to reducing dependence on batteries and in the future may lead to autonomously powered sensors. This work describes a simulation tool that enables the user to predict the battery life of a wireless sensor that uses energy harvesting to supplement the battery power. To create this simulator, all aspects of a typical WSN edge device were modelled, including sensors, transceiver, and microcontroller, as well as the energy source components (batteries, solar cells, thermoelectric generators (TEGs), supercapacitors, and DC/DC converters). The tool allows the user to plug and play different pre-characterized devices as well as add user-defined devices. The goal of this simulation tool is to predict the lifetime of a device and the scope for extending it using ambient energy sources.
Keywords: wireless sensor network, IoT, edge device, simulation, solar cells, TEG, supercapacitor, energy harvesting
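At its core, such a simulator performs energy-budget bookkeeping: the average load from duty-cycled operation, minus harvested power, divided into battery capacity. A minimal sketch follows; all device parameters are illustrative assumptions, not values from the tool.

```python
# Minimal energy-budget sketch for a WSN edge device (illustrative parameters).

def battery_life_days(battery_wh=2.0, sleep_mw=0.01, active_mw=60.0,
                      duty_cycle=0.001, harvest_mw=0.02):
    """Estimate battery life from average load minus harvested power."""
    avg_load_mw = active_mw * duty_cycle + sleep_mw * (1 - duty_cycle)
    net_mw = max(avg_load_mw - harvest_mw, 0.0)
    if net_mw == 0.0:
        return float('inf')  # harvester covers the load: autonomous operation
    hours = battery_wh * 1000.0 / net_mw  # Wh -> mWh, divided by mW draw
    return hours / 24.0

print(f"Estimated life: {battery_life_days():.0f} days")
```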
Procedia PDF Downloads 130
102 Solving One of the Variants of Necktie Paradox for Business Proposals
Authors: Natarajan Vijayarangan, Viswanath Kumar Ganesan, G. Kumudhavalli
Abstract:
This abstract sets out an uncertainty problem pertaining to the evaluation of business proposals or concept notes in an organisation. Consider the business proposal evaluation process (BPEP) for the execution of corporate research cum business projects in the organisation. Assume that two concept notes X and Y of the BPEP are approved: one of them, X, is a full-fledged type (100% financial approval given by the organisation), and the other, Y, is a conditional type (a partial financial approval given by the organisation). A penalty criterion is then introduced during the process. At the end of the annual appraisal, if both are completed as per the goals and objectives committed at the time of concept note submission, then both will get an incentive of $N from the organisation. If one of them does not fulfill the goals and objectives at the year-end appraisal, then a d% reduction or cut will be levied on the project budget for the next year. If X fulfills the goals and objectives and Y does not, then X gets a gain of d% on Y's previous year budget and Y incurs a loss of d% from its previous year budget for the next year, and vice versa. Further, an incentive of $N will be given to the one who gains. This process is a variant of the necktie paradox and inherits an uncertainty principle on whether X or Y can get more than $N even when X or Y performs well. Solving the above problem and generalizing it to finitely many concept notes will be a challenging task.
Keywords: concept notes, necktie paradox, annual appraisal, project budget and gain or loss
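The payoff scheme just described can be simulated directly. Below is a minimal Monte Carlo sketch of X's expected yearly payoff; the success probability, incentive, cut rate, and budget are illustrative assumptions, and the treatment of the case where both notes fail is an assumption, since the abstract does not fully specify it.

```python
import random

# Monte Carlo sketch of the X/Y concept-note incentive scheme
N = 1_000.0        # incentive ($), illustrative
d = 0.10           # budget cut/gain rate, illustrative
budget = 10_000.0  # each note's previous-year budget, illustrative
p = 0.7            # assumed probability a note meets its goals

def expected_payoff_x(trials=100_000):
    total = 0.0
    for _ in range(trials):
        x_ok, y_ok = random.random() < p, random.random() < p
        if x_ok and y_ok:
            total += N                 # both fulfill: both receive $N
        elif x_ok and not y_ok:
            total += N + d * budget    # X gains d% of Y's budget plus $N
        else:
            total -= d * budget        # X fails: assumed d% cut on X's budget
    return total / trials

print(f"Expected yearly payoff for X: ${expected_payoff_x():,.2f}")
```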
Procedia PDF Downloads 469
101 Analysis of Risks of Adopting Integrated Project Delivery: Application of Bayesian Theory
Abstract:
Integrated project delivery (IPD) is a project delivery method distinguished by a shared risk/reward mechanism and a multiparty agreement. IPD has drawn increasing attention from the construction industry due to its reliability in delivering high-performing buildings. However, the unavailability of IPD-specific insurance concerns the industry participants who are interested in IPD implementation. Even though risk management capability can be enhanced through the shared-risk mechanism, some risks may occur when the partners do not commit themselves to the integrated practices in the desired manner. This is because intense collaboration and close integration can not only create added value but also bring new opportunistic behaviors and disputes. This study aims to investigate the risks of implementing IPD using Bayesian theory. An IPD risk taxonomy is presented to identify all potential risks of implementing IPD, and a risk network map is developed to capture the interdependencies between IPD risks. The conditional relations between risk occurrences and the impacts of IPD risks on project performance are evaluated and simulated based on Bayesian theory. The probability of project outcomes is predicted by simulation. In addition, it is found that some risks caused by integration are among the most likely to occur. This study can help IPD project participants identify the critical risks of adopting IPD so as to improve project performance. It is also helpful for developing IPD-specific insurance once the pertinent risks have been identified.
Keywords: Bayesian theory, integrated project delivery, project risks, project performances
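The risk-network idea (conditional relations between risk occurrences propagated to outcomes, plus diagnostic updating) can be encoded as a discrete Bayesian network. A minimal sketch with the pgmpy library follows; the two-risk structure and every probability are illustrative assumptions, not values from the study.

```python
from pgmpy.models import BayesianNetwork  # older pgmpy versions: BayesianModel
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Illustrative chain: opportunistic behavior -> dispute -> cost overrun
model = BayesianNetwork([("Opportunism", "Dispute"), ("Dispute", "Overrun")])

cpd_opp = TabularCPD("Opportunism", 2, [[0.8], [0.2]])            # P(no), P(yes)
cpd_dis = TabularCPD("Dispute", 2, [[0.9, 0.4], [0.1, 0.6]],
                     evidence=["Opportunism"], evidence_card=[2])
cpd_ovr = TabularCPD("Overrun", 2, [[0.95, 0.5], [0.05, 0.5]],
                     evidence=["Dispute"], evidence_card=[2])
model.add_cpds(cpd_opp, cpd_dis, cpd_ovr)
assert model.check_model()

# Diagnostic analysis: update belief in opportunism after observing an overrun
infer = VariableElimination(model)
print(infer.query(["Opportunism"], evidence={"Overrun": 1}))
```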
Procedia PDF Downloads 300
100 The Effect of Relocating a Red Deer Stag on the Size of Its Home Range and Activity
Authors: Erika Csanyi, Gyula Sandor
Abstract:
In the course of the examination, we sought to answer the question of how, and to what extent, the home range and daily activity of a red deer stag relocated from its habitual surroundings change. We conducted the examination in two hunting areas in Hungary, about 50 km from one another. The control area was in the north of Somogy County, while the sample area, with similar features in terms of forest cover, tree stock, agricultural structure, altitude above sea level, climate, etc., was in the south of Somogy County. Three middle-aged red deer stags were captured with rocket nets, immobilized, and marked with GPS-Plus collars manufactured by Vectronic Aerospace GmbH. One captured stag was relocated. We monitored deer movements over 24-hour periods for three months. In the course of the examination, we analysed the behaviour of the relocated stag and of those that remained in their original habitat, as well as the temporal evolution of their behaviour. We examined the characteristics of the marked stags’ daily activities and the hourly distance they covered. We intended to find out the difference between the behaviour of stags remaining in their original habitat and that of a stag relocated to a more distant but similar habitat. In summary, based on our findings, it can be established that such enforced relocation to a different habitat (e.g., game relocation) significantly increases the animal's home range in the months following relocation. Home ranges were calculated using the full data set and the minimum convex polygon (MCP) method. Relocation did not increase the nocturnal and diurnal movement activity of the animal in question. Our research found that the home range of the relocated stag proved to be significantly larger than that of the stags that were not relocated. The results have been presented in tabular form and displayed on a map. Based on the results, it can be established that relocation inherently carries the risk of the animal falling victim to poaching or vehicle collision. It was only in the third month following relocation that the home range of the relocated stag subsided to the level of the stags that were not relocated. It is advisable to take these observations into consideration when relocating red deer for nature conservation or game management purposes.
Keywords: Cervus elaphus, home range, relocation, red deer stag
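The minimum convex polygon (MCP) home-range estimate used here is simply the area of the convex hull of the GPS fixes. Below is a minimal sketch, assuming fixes already projected to planar coordinates in metres; the data are synthetic.

```python
import numpy as np
from scipy.spatial import ConvexHull

# GPS fixes projected to planar coordinates (metres); illustrative data
rng = np.random.default_rng(0)
fixes = rng.normal(loc=[500_000, 5_200_000], scale=800, size=(200, 2))

# 100% MCP: area of the convex hull of all fixes
hull = ConvexHull(fixes)
area_ha = hull.volume / 10_000.0  # in 2-D, .volume is the polygon area; m^2 -> ha
print(f"MCP home range: {area_ha:.1f} ha from {len(fixes)} fixes")
```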
Procedia PDF Downloads 137
99 The International Monetary Fund’s Treatment Towards Argentina and Brazil During Financial Negotiations for Their First Adjustment Programs, 1958-64
Authors: Fernanda Conforto de Oliveira
Abstract:
The International Monetary Fund (IMF) has a central role in global financial governance as the world’s leading crisis lender. Its practice of conditional lending – conditioning loans on the implementation of economic policy adjustments – is the primary lever by which the institution interacts with and influences the policy choices of member countries, and it has been a key topic of interest to scholars and public opinion. However, empirical evidence about the economic and (geo)political determinants of IMF lending behavior remains inconclusive, and no model that explains IMF policies has been identified. This research moves beyond panel analysis to focus on the financial negotiations for the first IMF programs in Argentina and Brazil in the early post-war period. It seeks to understand why the negotiations achieved distinct outcomes: Argentinean officials cooperated and complied with IMF policies, whereas their Brazilian counterparts hesitated. Using qualitative and automated text analysis, this paper analyses the hypothesis that differential IMF treatment could help to explain these distinct outcomes. This paper contributes to historical studies on IMF-Latin America relations and to the broader literature in international political economy on IMF policies.
Keywords: international monetary fund, international history, financial history, Latin American economic history, natural language processing, sentiment analysis
Procedia PDF Downloads 63
98 Dynamic Risk Model for Offshore Decommissioning Using Bayesian Belief Network
Authors: Ahmed O. Babaleye, Rafet E. Kurt
Abstract:
The global oil and gas industry is beginning to witness an increase in the number of installations moving towards decommissioning. Decommissioning of offshore installations is a complex, costly and hazardous activity, making safety one of the major concerns. Among existing removal options, complete and partial removal pose the highest risks. Therefore, a dynamic risk model of the accidents arising from the two options is important for assessing the risks on an overall basis. In this study, a risk-based safety model is developed to conduct quantitative risk analysis (QRA) for jacket structure system failure. Firstly, the bow-tie (BT) technique is utilised to model the causal relationship between the system failure and potential accident scenarios. Subsequently, to overcome the shortcomings of BT, a Bayesian Belief Network (BBN) is established to dynamically assess the associated uncertainties and conditional dependencies. The BBN is developed through a similitude mapping of the developed bow-tie. The BBN is used to update the failure probabilities of the contributing elements through diagnostic analysis, thus providing a case-specific and more realistic safety analysis method than a bow-tie alone. This paper presents the application of dynamic safety analysis to guide the allocation of risk control measures and, consequently, drive down the avoidable cost of remediation.
Keywords: Bayesian belief network, offshore decommissioning, dynamic safety model, quantitative risk analysis
Procedia PDF Downloads 280
97 An Empirical Analysis of the Effects of Corporate Derivatives Use on the Underlying Stock Price Exposure: South African Evidence
Authors: Edson Vengesai
Abstract:
Derivative products have become essential instruments in portfolio diversification, price discovery, and, most importantly, risk hedging. Derivatives are complex instruments; their valuation, volatility implications, and real impact on the behaviour of the underlying assets are not well understood. Little is documented empirically, and conclusions conflict on how these instruments affect firm risk exposures. Given the growing interest in using derivatives in risk management and portfolio engineering, this study examines the practical impact of derivative usage on underlying stock price exposure and systematic risk. The paper uses data from South African listed firms. The study employs GARCH models to understand the effect of derivative use on conditional stock volatility. GMM models are used to estimate the effect of derivative use on stocks' systematic risk, as measured by beta, and on the total risk of stocks, as measured by the standard deviation of returns. The results provide evidence on whether derivative use is instrumental in reducing the systematic and total risk of stock returns. The results are subjected to numerous robustness controls, including financial leverage, firm size, growth opportunities, and macroeconomic effects.
Keywords: derivatives use, hedging, volatility, stock price exposure
Procedia PDF Downloads 108
96 Characterization of Aquifer Systems and Identification of Potential Groundwater Recharge Zones Using Geospatial Data and ArcGIS in Kagandi Water Supply System Well Field
Authors: Aijuka Nicholas
Abstract:
A research study was undertaken to characterize the aquifers and identify the potential groundwater recharge zones in the Kagandi district. Quantitative characterization of the hydraulic conductivities of aquifers is of fundamental importance to the study of groundwater flow and contaminant transport in aquifers. A conditional approach is used to represent the spatial variability of hydraulic conductivity. Briefly, it involves using qualitative and quantitative geologic borehole-log data to generate a three-dimensional (3D) hydraulic conductivity distribution, which is then adjusted through calibration of a 3D groundwater flow model using pumping-test data and historic hydraulic data. The approach consists of several steps. The study area was divided into five sub-watersheds on the basis of artificial drainage divides. A digital terrain model (DTM) was developed using ArcGIS to determine the general drainage pattern of the Kagandi watershed. Hydrologic characterization involved the determination of the various hydraulic properties of the aquifers. Potential groundwater recharge zones were identified by integrating various thematic maps pertaining to the digital elevation model, land use, and drainage pattern in ArcGIS and Surfer (Golden Software). The study demonstrates the potential of GIS in delineating groundwater recharge zones and shows that the developed methodology is applicable to other watersheds in Uganda.
Keywords: aquifers, ArcGIS, groundwater recharge, recharge zones
Procedia PDF Downloads 147
95 High Resolution Image Generation Algorithm for Archaeology Drawings
Authors: Xiaolin Zeng, Lei Cheng, Zhirong Li, Xueping Liu
Abstract:
Aiming at the problems of low accuracy and susceptibility to cultural relic deterioration in the generation of high-resolution archaeology drawings by current image generation algorithms, an archaeology drawing generation algorithm based on a conditional generative adversarial network is proposed. An attention mechanism is added to the high-resolution image generation network serving as the backbone, which enhances line feature extraction and improves the accuracy of line drawing generation. A dual-branch parallel architecture consisting of two backbone networks is implemented, where the semantic translation branch extracts semantic features from orthophotographs of cultural relics and the gradient screening branch extracts effective gradient features. Finally, a fusion fine-tuning module combines these two types of features to generate high-quality, high-resolution archaeology drawings. Experimental results on a self-constructed archaeology drawings dataset of grotto temple statues show that the proposed algorithm outperforms current mainstream image generation algorithms in terms of pixel accuracy (PA), structural similarity (SSIM), and peak signal-to-noise ratio (PSNR), and it can be used to assist in producing archaeology drawings.
Keywords: archaeology drawings, digital heritage, image generation, deep learning
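The training objective behind such a conditional GAN pairs an adversarial term with a reconstruction term, as popularized by pix2pix. Below is a minimal PyTorch sketch; the stand-in generator and discriminator, and the λ = 100 weight, are illustrative assumptions rather than the paper's dual-branch architecture.

```python
import torch
import torch.nn as nn

# Pix2pix-style cGAN objective: adversarial loss + L1 reconstruction loss.
# G and D are tiny placeholders; the paper's dual-branch generator would go here.
G = nn.Sequential(nn.Conv2d(3, 3, 3, padding=1))            # stand-in generator
D = nn.Sequential(nn.Conv2d(6, 1, 3, padding=1))            # patch discriminator
bce, l1, lam = nn.BCEWithLogitsLoss(), nn.L1Loss(), 100.0   # lam per pix2pix

photo = torch.randn(1, 3, 256, 256)      # orthophotograph of the relic
drawing = torch.randn(1, 3, 256, 256)    # ground-truth line drawing
fake = G(photo)

# The discriminator sees (condition, output) pairs, as in conditional GANs
d_fake = D(torch.cat([photo, fake], dim=1))
g_loss = bce(d_fake, torch.ones_like(d_fake)) + lam * l1(fake, drawing)
g_loss.backward()
print(f"generator loss: {g_loss.item():.3f}")
```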
Procedia PDF Downloads 58
94 On Flexible Preferences for Standard Taxis, Electric Taxis, and Peer-to-Peer Ridesharing
Authors: Ricardo Daziano
Abstract:
In the analysis and planning of the mobility ecosystem, preferences for ride-hailing over incumbent street-hailing services need better understanding. In this paper, a semi-nonparametric discrete choice model that allows for flexible preference heterogeneity is fitted with data from a discrete choice experiment among adult commuters in Montreal, Canada (N=760). Participants chose among Uber, Teo (a local electric ride-hailing service that was in operation when the data were collected in 2018), and a standard taxi when presented with information about cost, time (on-trip, waiting, walking), the powertrain of the car (gasoline/hybrid) for Uber and taxi, and whether the available electric Teo was a Tesla (which was one of the actual features of the Teo fleet). The fitted flexible model offers several behavioral insights. Waiting time for ride-hailing services is associated with a statistically significant but low marginal disutility. For the other time components, including on-ride time and street-hailing waiting and walking, the estimates of the value of time show an interesting pattern: whereas in a conditional logit on-ride time reductions are valued more highly, in the flexible LML specification the means of the value of time follow the expected pattern, with waiting and walking creating a higher disutility. At the same time, the LML estimates show the presence of important, multimodal unobserved preference heterogeneity.
Keywords: discrete choice, electric taxis, ridehailing, semiparametrics
Procedia PDF Downloads 162
93 Characterization of Molecular Weight of Polyamic Acids Using GPC Coupled with Multiple Detectors
Authors: Mei Hong, Wei Liu, Xuemin Dai, Yanxiong Pan, Xiangling Ji
Abstract:
Polyamic acid (PAA) is the precursor of polyimide (PI) prepared by the two-step method; its molecular weight and molecular weight distribution not only play an important role during preparation and processing but also influence the final performance of the PI. However, precise characterization of the molecular weight of PAA is still a challenge because of the very complicated interactions in the solution system, including electrostatic, hydrogen-bond, and dipole-dipole interactions. Thus, it is necessary to establish a suitable strategy that can completely suppress these complex effects and yield reasonable molecular weight data. Herein, gel permeation chromatography (GPC) coupled with differential refractive index (RI) and multi-angle laser light scattering (MALLS) detectors was applied to measure the molecular weight of (6FDA-DMB) PAA using different mobile phases: LiBr/DMF, LiBr/H3PO4/THF/DMF, LiBr/HAc/THF/DMF, and LiBr/HAc/DMF. It was found that the combination of LiBr with HAc can shield the above-mentioned complex interactions and is more conducive to the separation of PAA than the addition of LiBr to DMF alone. LiBr/HAc/DMF was employed for the first time as a mild mobile phase to effectively separate PAA and determine its molecular weight. After a series of condition-screening experiments, 0.02 M LiBr/0.2 M HAc/DMF was fixed as the optimized mobile phase to measure the relative and absolute molecular weights of the (6FDA-DMB) PAA prepared, and the Mw values obtained from GPC-MALLS and GPC-RI were 35,300 g/mol and 125,000 g/mol, respectively. Notably, such a mobile phase is also applicable to other PAA samples with different structures, and the final molecular weight results are reproducible.
Keywords: polyamic acids, polyelectrolyte effects, gel permeation chromatography, mobile phase, molecular weight
Procedia PDF Downloads 54
92 Underivatized Amino Acid Analyses Using Liquid Chromatography-Tandem Mass Spectrometry in Scalp Hair of Children with Autism Spectrum Disorder
Authors: Ayat Bani Rashaid, Zain Khasawneh, Mazin Alqhazo, Shreen Nusair, Mohammad El-Khateeb, Mahmoud Bashtawi
Abstract:
Autism spectrum disorder (ASD) is a psychiatric disorder with unknown etiology that mainly affects children in the first three years of life. Alterations of amino acid levels are believed to contribute to ASD. The levels of six essential amino acids (methionine, histidine, valine, leucine, threonine, and phenylalanine), five conditional amino acids (proline, tyrosine, glutamine, cysteine, and cystine), and five non-essential amino acids (asparagine, aspartic acid, alanine, serine, and glutamic acid) in hair samples of children with ASD (n = 25) were analyzed and compared to the corresponding levels in healthy age-matched controls (n = 25). The results showed that the levels of methionine, alanine, and asparagine were significantly lower in the hair samples of the ASD group than in those of the control group (p ≤ 0.05). However, the levels of glutamic acid were significantly higher in the ASD group than in the control group (p ≤ 0.05). The current findings could contribute to a further understanding of ASD etiology and provide specialists with a hair amino acid profile that can be utilized as a biomarker for early diagnosis of ASD. Such biomarkers could inform the future development of therapies that reduce ASD-related symptoms.
Keywords: autism spectrum disorder, amino acids, liquid chromatography-tandem mass spectrometry, human hair
Procedia PDF Downloads 138
91 Internet Use, Social Networks, Loneliness and Quality of Life among Adults Aged 50 and Older: Mediating and Moderating Effects
Authors: Rabia Khaliala, Adi Vitman-Schorr
Abstract:
Background: The increase in people's longevity, on the one hand, and the fact that social networks become increasingly narrow in later life, on the other, highlight the importance of Internet use in enhancing quality of life (QoL). However, whether Internet use increases or decreases social networks, loneliness and quality of life is not clear-cut. Purposes: To explore the direct and/or indirect effects of Internet use on QoL, and to examine whether ethnicity and the time the elderly spend with family moderate the mediation effect of Internet use on QoL through loneliness. Methods: This descriptive-correlational study was carried out in 2016 through structured interviews with a convenience sample of 502 respondents aged 50 and older living in northern Israel. Bootstrapping with resampling strategies was used to test the mediation model. Results: Use of the Internet was found to be positively associated with QoL. However, this relationship was mediated by loneliness and moderated by the time the elderly spent with family members. In addition, respondents' ethnicity significantly moderated the mediation effect between Internet use and loneliness. Conclusions: Internet use can enhance the QoL of older adults directly, or indirectly by reducing loneliness. However, these effects are conditional on other variables: the indirect effect is moderated by ethnicity, and the direct effect by the time the elderly spend with their families. Researchers and practitioners should be aware of these interactions, which can impact the loneliness and quality of life of older persons differently.
Keywords: internet use, loneliness, quality of life, social contacts
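The bootstrapped mediation test used here resamples the data, re-estimates the indirect (a×b) path each time, and reads off a percentile confidence interval. A minimal sketch on synthetic data follows; the variable names, effect sizes, and OLS path estimators are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 502  # sample size mirroring the study

# Synthetic data mimicking the design: Internet use -> loneliness -> QoL
internet = rng.normal(size=n)
lonely = -0.4 * internet + rng.normal(size=n)   # more use, less loneliness
qol = 0.2 * internet - 0.5 * lonely + rng.normal(size=n)

def indirect_effect(x, m, y):
    """a*b indirect effect: OLS of m on x, then y on (x, m), taking m's slope."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([x, m, np.ones_like(y)])
    b = np.linalg.lstsq(design, y, rcond=None)[0][1]
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                  # resample with replacement
    boot.append(indirect_effect(internet[idx], lonely[idx], qol[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")
```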
Procedia PDF Downloads 185
90 Risk Assessments of Longest Dry Spells Phenomenon in Northern Tunisia
Authors: Majid Mathlouthi, Fethi Lebdi
Abstract:
Throughout the world, the extent and magnitude of droughts have economic, social and environmental consequences. Today, climate change is increasingly felt, and it is likely to increase the frequency and duration of droughts. An event-based analysis of dry spells, drawn from series of daily rainfall observations, is carried out. A daily precipitation threshold value was set, and a catchment located in northern Tunisia, where the average rainfall is about 600 mm, was studied. A rainfall event is defined as an uninterrupted series of rainy days comprising at least one day with precipitation greater than or equal to the fixed threshold. A dry event consists of a series of dry days framed by two successive rainfall events. A rainfall event is thus a vector whose coordinates are the duration, the rainfall depth per event, and the duration of the associated dry event. Depth and duration are found to be correlated, so conditional probabilities are used to analyse the depth per event. The negative binomial distribution fits the dry event durations well, the duration of the rainfall event follows a geometric distribution, and the length of the climatic cycle fits an incomplete gamma distribution. The results of this analysis were used to study the effects of climate change on water resources and crops and to calibrate precipitation models where rainfall records are scarce. In response to long droughts in the basin, the drought management system is based on three phases, with different measures applied and executed during each phase: the first is before the drought (preparedness and early warning); the second is drought management (mitigation in the event of drought); and the last follows the drought, when it is over.
Keywords: dry spell, precipitation threshold, climate vulnerability, adaptation measures
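The two discrete fits named above have simple closed-form or moment estimators. Below is a minimal sketch on synthetic spell lengths: the geometric parameter by maximum likelihood and the negative binomial by the method of moments; both the data and the estimators chosen are illustrative, not the study's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
wet_spells = rng.geometric(p=0.45, size=300)                   # durations (days)
dry_spells = rng.negative_binomial(n=2, p=0.15, size=300) + 1  # dry-spell lengths

# Geometric fit for rainfall event duration: MLE is p_hat = 1 / mean
p_hat = 1.0 / wet_spells.mean()

# Negative binomial fit for dry event duration, by method of moments
m, v = dry_spells.mean(), dry_spells.var()
p_nb = m / v                    # valid when v > m (overdispersed counts)
r_nb = m ** 2 / (v - m)

print(f"geometric: p = {p_hat:.3f}")
print(f"negative binomial: r = {r_nb:.2f}, p = {p_nb:.3f}")
```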
Procedia PDF Downloads 84
89 Learning a Bayesian Network for Situation-Aware Smart Home Service: A Case Study with a Robot Vacuum Cleaner
Authors: Eu Tteum Ha, Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
The smart home environment, backed by IoT (Internet of Things) technologies, enables intelligent services based on awareness of the situation a user is currently in. One convenient sensor for recognizing situations within a home is the smart meter, which can monitor the status of each electrical appliance in real time. This paper aims at learning a Bayesian network that models the causal relationship between user situations and the status of the electrical appliances. Using such a network, we can infer the current situation based on the observed status of the appliances. However, learning the conditional probability tables (CPTs) of the network requires many training examples, which cannot be obtained unless the user's situations are closely monitored by some means. This paper proposes a method for learning the CPT entries of the network that relies only on user feedback generated occasionally. In our case study with a robot vacuum cleaner, feedback comes in whenever the user gives the robot an order that runs counter to its preprogrammed setting. Given a network with randomly initialized CPT entries, our proposed method uses this feedback information to adjust the relevant CPT entries in the direction of increasing the probability of recognizing the desired situations. Simulation experiments show that our method can rapidly improve the recognition performance of the Bayesian network using a relatively small number of feedback events.
Keywords: Bayesian network, IoT, learning, situation-awareness, smart home
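One way to picture the feedback-driven adjustment is as a small step that shifts probability mass in a CPT column toward the situation implied by the feedback, followed by renormalization. The sketch below illustrates this idea; the update rule, learning rate, and situation labels are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

# One CPT column: P(situation | observed appliance status), randomly initialized
rng = np.random.default_rng(7)
cpt = rng.random(3)
cpt /= cpt.sum()                     # situations: 0=away, 1=asleep, 2=active
LR = 0.2                             # feedback learning rate (assumed)

def feedback_update(cpt, desired, lr=LR):
    """Shift probability mass toward the situation implied by user feedback."""
    target = np.zeros_like(cpt)
    target[desired] = 1.0
    new = (1 - lr) * cpt + lr * target
    return new / new.sum()

# User overrides the robot in situations where 'asleep' should be recognized
for _ in range(10):
    cpt = feedback_update(cpt, desired=1)
print(np.round(cpt, 3))              # mass concentrates on situation 1
```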
Procedia PDF Downloads 523
88 Valuing Public Urban Street Trees and Their Environmental Spillover Benefits
Authors: Sofia F. Franco, Jacob Macdonald
Abstract:
This paper estimates the value of urban public street trees and their complementary and substitution value with other, broader urban amenities and disamenities via the residential housing market. We estimate a lower-bound value on a city’s tree amenities under instrumental variable and geographic regression discontinuity approaches, with an application to Lisbon, Portugal. For completeness, we also explore how urban trees, and in particular public street trees, impact house prices across the city. Finally, we jointly analyze the planting and maintenance costs and the benefits of urban street trees. The estimated value of all public trees in Lisbon is €8.84M. When considering specifically trees planted alongside roads and in public squares, the value is €6.06M, or €126.64 per tree. This value is conditional on the distribution of trees in terms of their broader density, with higher effects coming from the overall greening of larger areas of the city compared to the greening of the immediate neighborhood. Detrimental impacts are found when the number of trees is higher near street canyons, where they may exacerbate the stagnation of air pollution from traffic. Urban street trees also have important spillover benefits due to pollution mitigation of around €6.21 million, or an additional €129.93 per tree. There are added benefits of €26.32 and €28.58 per tree in terms of flooding and heat mitigation, respectively. With significant resources and policies aimed at urban greening, the value obtained is shown to be important for discussions on the benefits of urban trees as compared to the mitigation and abatement costs undertaken by a municipality.
Keywords: urban public goods, urban street trees, spatial boundary discontinuities, geospatial and remote sensing methods
Procedia PDF Downloads 177
87 Recommendations for Teaching Word Formation for Students of Linguistics Using Computer Terminology as an Example
Authors: Svetlana Kostrubina, Anastasia Prokopeva
Abstract:
This research presents a comprehensive study of word formation processes in computer terminology within the English and Russian languages and provides learners with a system of exercises for training these skills. Its originality lies in the comparative approach, which reveals both general patterns and specific features of English and Russian computer term formation. The key contribution is the development of a system of exercises for training computer terminology based on Bloom’s taxonomy. The data contain 486 units (228 English terms from the Glossary of Computer Terms and 258 Russian terms from the Terminological Dictionary-Reference Book). The objective is to identify the main affixation models in English and Russian computer term formation and to develop exercises. To achieve this goal, the authors employed Bloom’s taxonomy as a methodological framework to create a systematic exercise program aimed at enhancing students’ cognitive skills in analyzing, applying, and evaluating computer terms. The exercises are appropriate for various levels of learning, from basic recall of definitions to higher-order thinking skills, such as synthesizing new terms and critically assessing their usage in different contexts. The methodology also includes: a method of scientific and theoretical analysis for systematizing linguistic concepts and clarifying the conceptual and terminological apparatus; a method of nominative and derivative analysis for identifying word-formation types; a method of word-formation analysis for organizing linguistic units; a classification method for determining the structural types of abbreviations applicable to the field of computer communication; a quantitative analysis technique for determining the productivity of methods of forming computer vocabulary abbreviations in English and Russian; a technique of tabular data processing for visual presentation of the results obtained; and a technique of interlingual comparison for identifying common and distinct features of computer term abbreviations in the Russian and English languages. The research shows that affixation retains its productivity in English and Russian computer term formation. Bloom’s taxonomy allows us to plan a training program and predict its effectiveness based on an assessment of the teaching methods used.
Keywords: word formation, affixation, computer terms, Bloom's taxonomy
Procedia PDF Downloads 12
86 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA
Authors: Marek Dosbaba
Abstract:
Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using an SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or segment. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy-dispersive spectroscopy. Re-evaluation of existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.
Keywords: TESCAN, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data
Procedia PDF Downloads 109
85 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions
Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen
Abstract:
Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data and to provide marginalized covariate effect estimates, a marginalized two-part joint model (MTJM) has previously been developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes nearly all of the most frequently used distributions, such as the gamma, exponential, Weibull, and log-normal. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide a marginal interpretation for the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran, and we provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case in real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or log-normal alternatives.
Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma
Procedia PDF Downloads 176
84 Optimal Portfolio of Multi-Service Provision Based on Stochastic Model Predictive Control
Authors: Yifu Ding, Vijay Avinash, Malcolm McCulloch
Abstract:
With the proliferation of decentralized energy systems, the UK power system allows small-scale entities such as microgrids (MGs) to tender multiple energy services, including energy arbitrage and frequency response (FR). However, such operation requires balancing uncertain renewable generation against loads in real time, and the contracted services must be provided continuously during the agreed time window; otherwise, the provider is penalized for under-delivery. To hedge against risks due to these uncertainties and to maximize the economic benefits, we propose a stochastic model predictive control (SMPC) framework to optimize operation for multi-service provision. Distinguished from previous works, we include a detailed economic degradation model of the lithium-ion battery to quantify the costs of different service provisions, as well as to accurately describe the changing dynamics of the battery. Considering a set of load and generation scenarios and battery aging, we formulate a risk-averse cost function using conditional value at risk (CVaR). It aims to achieve the maximum expected net revenue while avoiding severe losses. The framework is demonstrated on a case study of a PV-battery grid-tied microgrid in the UK with real-life data. To highlight its performance, the framework is compared with the case without the degradation model and with the deterministic formulation.
Keywords: model predictive control (MPC), battery degradation, frequency response, microgrids
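The CVaR term in such a risk-averse cost function is the expected cost over the worst (1 − α) fraction of scenarios, typically blended with the plain expectation. A minimal sketch follows; the scenario costs, confidence level, and risk-aversion weight are illustrative assumptions.

```python
import numpy as np

def cvar(costs, alpha=0.95):
    """CVaR_alpha: expected cost in the worst (1 - alpha) tail of scenarios."""
    costs = np.asarray(costs)
    var = np.quantile(costs, alpha)          # value at risk at level alpha
    return costs[costs >= var].mean()

rng = np.random.default_rng(3)
scenario_costs = rng.normal(100.0, 25.0, size=1000)   # per-scenario net cost (£)

beta = 0.5   # risk-aversion weight (assumed)
objective = (1 - beta) * scenario_costs.mean() + beta * cvar(scenario_costs)
print(f"expected cost {scenario_costs.mean():.1f}, "
      f"CVaR95 {cvar(scenario_costs):.1f}, risk-averse objective {objective:.1f}")
```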
Procedia PDF Downloads 122
83 Deaf Inmates in Canadian Prisons: Addressing Discrimination through Staff Training Videos with Deaf Actors
Authors: Tracey Bone
Abstract:
Deaf inmates, whose first or preferred language is a signed language, face barriers in accessing the necessary two-way communication with correctional staff and the educational and social programs that would enhance their eligibility for conditional release from the federal prison system in Canada. The development of visual content to enhance the knowledge and skills of correctional staff is a contemporary strategy intended to significantly improve the correctional experience for deaf inmates. This presentation reports on the development of two distinct training videos created to enhance staff’s understanding of the needs of deaf inmates: the first a two-part simulation of an interaction with a deaf inmate, the second an interview with a deaf academic. Part one of the first video demonstrates the challenges and misunderstandings inherent in communicating across languages without a qualified sign language interpreter; the second part demonstrates the ease of communication when communication needs are met. The second video incorporates the experiences of a deaf academic to provide the cultural grounding necessary to educate staff in the unique experiences associated with being a visual language user. Lack of staff understanding or awareness of deaf culture and language must not be an acceptable reason for the inadequate treatment of deaf visual language users in federal prisons. This paper demonstrates a contemporary approach to meeting the human rights and needs of this unique and often ignored inmate subpopulation. The deaf community supports this visual approach to enhancing staff understanding of the unique needs of this population. A study of its effectiveness is currently underway.
Keywords: accommodations, American Sign Language (ASL), deaf inmates, sensory deprivation
Procedia PDF Downloads 149
82 Introducing Principles of Land Surveying by Assigning a Practical Project
Abstract:
A practical project is used in an engineering surveying course to expose sophomore and junior civil engineering students to several important issues related to the use of the basic principles of land surveying. The project, the design of a two-lane rural highway connecting two arbitrary points, requires students to draw the profile of the proposed highway along with the existing ground level. The areas of all cross-sections are then computed to enable quantity computations between them. Lastly, the mass-haul diagram is drawn, with all important parts and features shown on it for clarity. At the beginning, students faced challenges getting started on the project. They had to spend time and effort thinking of the best way to proceed and how the work would flow. It was even more challenging when they had to visualize cut, fill and mixed cross-sections in three dimensions before they could draw them to complete the necessary computations. These difficulties were somewhat overcome with the help of the instructor and thorough discussions among team members and/or between different teams. The method of assessment used in this study was a well-prepared end-of-semester questionnaire distributed to students after the completion of the project and the final exam. The survey contained a wide spectrum of questions, from students' learning experience when this course development was implemented to students' satisfaction with the class instruction provided to them and the instructor's competency in presenting the material and helping with the project. It also covered the adequacy of the project as a sample of a real-life civil engineering application and whether implementing this idea added any excitement. At the end of the questionnaire, students had the chance to provide constructive comments and suggestions for future improvements of the land surveying course. Outcomes are presented graphically and in tabular format. Graphs provide a visual explanation of the results, while tables summarize the numerical values together with descriptive statistics, such as the mean, standard deviation, and coefficient of variation, for each student and each question. In addition to gaining experience in teamwork, communications, and customer relations, students felt the benefit of being assigned such a project. They saw the practical side of civil engineering work and how theories are utilized in real-life engineering applications. Students even recommended that such a project be assigned every time this course is offered so future students can have the same learning opportunity.
Keywords: land surveying, highway project, assessment, evaluation, descriptive statistics
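The quantity computations the project calls for can be sketched directly: end-area volumes between stations by the average end area method, accumulated into mass-haul ordinates. The station data below are illustrative.

```python
# Earthwork quantities between stations by the average end area method,
# followed by cumulative mass-haul ordinates. Station data are illustrative.

stations = [0, 100, 200, 300, 400]            # ft along the centerline
areas = [0.0, 120.0, 80.0, -60.0, -150.0]     # ft^2; + = cut, - = fill

volumes = []
for i in range(len(stations) - 1):
    length = stations[i + 1] - stations[i]
    avg_area = (areas[i] + areas[i + 1]) / 2.0
    volumes.append(avg_area * length / 27.0)  # ft^3 -> yd^3

# Mass-haul ordinate at each station: running sum of cut (+) and fill (-)
mass_haul = [0.0]
for v in volumes:
    mass_haul.append(mass_haul[-1] + v)

for sta, m in zip(stations, mass_haul):
    print(f"station {sta:>3}: mass ordinate {m:>8.1f} yd^3")
```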
Procedia PDF Downloads 229
81 From Responses of Macroinvertebrate Metrics to the Definition of Reference Thresholds
Authors: Hounyèmè Romuald, Mama Daouda, Argillier Christine
Abstract:
The present study focused on the use of benthic macrofauna to define the reference state of an anthropized lagoon (Nokoué, Benin) from the responses of relevant metrics to pressure proxies. The approach used is a combination of a joint species distribution model and Bayesian networks. The joint species distribution model was used to select the relevant metrics and generate posterior probabilities, which were then converted into posterior response probabilities for each of the quality classes (pressure levels); these constitute the conditional probability tables of the probabilistic graph representing the causal relationships between metrics and pressure proxies. For the definition of the reference thresholds, the predicted responses for low pressure levels were read from probability density diagrams. Observations collected during high- and low-water periods spanning three consecutive years (2004-2006), covering 33 macroinvertebrate taxa present in all seasons and at all sampling points, together with measurements of 14 environmental parameters, were used as application data. The study demonstrated reliable inferences, the selection of seven relevant metrics, and the definition of quality thresholds for each environmental parameter. The relevance of the metrics and of the reference thresholds for ecological assessment, despite the small sample size, suggests the potential for wider applicability of the approach in aquatic ecosystem monitoring and assessment programs in developing countries, which are generally characterized by a lack of monitoring data.
Keywords: pressure proxies, Bayesian inference, bioindicators, acadjas, functional traits
Procedia PDF Downloads 83
80 The Impact of Hospital Strikes on Patient Care: Evidence from 135 Strikes in the Portuguese National Health System
Authors: Eduardo Costa
Abstract:
Hospital strikes in the Portuguese National Health Service (NHS) are becoming increasingly frequent, raising concerns about patient safety. In fact, data show that mortality rates for patients admitted during strikes are up to 30% higher than for patients admitted on other days. This paper analyses the effects of hospital strikes on patient outcomes. Specifically, it analyses the impact of different strikes (physicians, nurses and other health professionals) on in-hospital mortality rates, readmission rates and length of stay. The paper uses patient-level data containing all NHS hospital admissions in mainland Portugal from 2012 to 2017, together with a comprehensive strike dataset comprising over 250 strike days (19 physician strike days, 150 nurse strike days and 50 other health professional strike days) from 135 different strikes. The paper uses a linear probability model and controls for hospital and regional characteristics, time trends, and changes in patient composition and diagnoses. Preliminary results suggest a 6-7% increase in in-hospital mortality rates for patients exposed to physicians' strikes. The effect is smaller for patients exposed to nurses' strikes (2-5%). Patients exposed to nurses' strikes during their stay have, on average, higher 30-day urgent readmission rates (4%). Length of stay also seems to increase for patients exposed to any strike. The results, conditional on further testing, namely with non-linear models, suggest that hospital operations and service levels are partially disrupted during strikes.
Keywords: health sector strikes, in-hospital mortality rate, length of stay, readmission rate
Procedia PDF Downloads 135
79 Data Augmentation for Early-Stage Lung Nodules Using Deep Image Prior and Pix2pix
Authors: Qasim Munye, Juned Islam, Haseeb Qureshi, Syed Jung
Abstract:
Lung nodules are commonly identified in computed tomography (CT) scans by experienced radiologists at a relatively late stage. Early diagnosis can greatly increase survival. We propose using a pix2pix conditional generative adversarial network to generate realistic images simulating early-stage lung nodule growth. We applied deep image prior to 2,341 slices from 895 computed tomography (CT) scans from the Lung Image Database Consortium (LIDC) dataset to generate pseudo-healthy medical images. From these images, 819 were chosen to train a pix2pix network. We observed that for most of the images, the pix2pix network was able to generate images in which the nodule increased in size and intensity across epochs. To evaluate the images, 400 generated images were chosen at random and shown to a medical student beside their corresponding original images. Of these 400 generated images, 384 were judged satisfactory, meaning they resembled a nodule and were visually similar to the corresponding image. We believe that this generated dataset could be used as training data for neural networks to detect lung nodules at an early stage, or to improve the accuracy of such networks. This is particularly significant as datasets containing the growth of early-stage nodules are scarce. This project shows that the combination of deep image prior and generative models could potentially open the door to creating larger datasets than currently possible and has the potential to increase the accuracy of medical classification tasks.
Keywords: medical technology, artificial intelligence, radiology, lung cancer
Procedia PDF Downloads 68
78 Network Conditioning and Transfer Learning for Peripheral Nerve Segmentation in Ultrasound Images
Authors: Harold Mauricio Díaz-Vargas, Cristian Alfonso Jimenez-Castaño, David Augusto Cárdenas-Peña, Guillermo Alberto Ortiz-Gómez, Alvaro Angel Orozco-Gutierrez
Abstract:
Precise identification of nerves is a crucial task performed by anesthesiologists for effective peripheral nerve blocking (PNB). Anesthesiologists currently use ultrasound imaging equipment to guide the PNB and detect nerve structures. However, visual identification of nerves from ultrasound images is difficult, even for trained specialists, due to artifacts and low contrast. Recent advances in deep learning make neural networks a potential tool for accurate nerve segmentation systems that address the above issues from raw data. The widely used U-Net yields pixel-by-pixel segmentation by encoding the input image and decoding the resulting feature vector into a semantic image. This work proposes a conditioning approach and encoder pre-training to enhance the nerve segmentation of traditional U-Nets. Conditioning is achieved by one-hot encoding the kind of target nerve at the network input, while pre-training considers five well-known deep networks for image classification. The proposed approach is tested on a collection of 619 US images, where the best C-UNet architecture yields an 81% Dice coefficient, outperforming the 74% of the best traditional U-Net. The results show that pre-trained models with the conditional approach outperform their equivalent baselines by supporting the learning of new features and enriching the discriminative capability of the tested networks.
Keywords: nerve segmentation, U-Net, deep learning, ultrasound imaging, peripheral nerve blocking
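The conditioning step described here amounts to broadcasting the one-hot nerve class over the spatial grid and concatenating it to the image as extra input channels. Below is a minimal PyTorch sketch; the tiny stand-in network, class count, and image sizes are illustrative assumptions rather than the paper's C-UNet.

```python
import torch
import torch.nn as nn

NUM_NERVE_TYPES = 3  # e.g., ulnar, median, sciatic (illustrative)

# Stand-in for a U-Net: input = 1 grayscale channel + one channel per class
net = nn.Sequential(
    nn.Conv2d(1 + NUM_NERVE_TYPES, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 1),  # 1-channel logit map for pixel-wise segmentation
)

def conditioned_input(image, nerve_id):
    """Concatenate a one-hot condition map to the ultrasound image."""
    b, _, h, w = image.shape
    onehot = torch.zeros(b, NUM_NERVE_TYPES, h, w)
    onehot[:, nerve_id] = 1.0          # broadcast the class over the full map
    return torch.cat([image, onehot], dim=1)

us_image = torch.randn(2, 1, 128, 128)         # batch of ultrasound frames
logits = net(conditioned_input(us_image, nerve_id=1))
print(logits.shape)                             # torch.Size([2, 1, 128, 128])
```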
Procedia PDF Downloads 106
77 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria
Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov
Abstract:
This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model
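SCM validation by conditional independence testing (CIT) can be illustrated with a partial-correlation test: if the model says X and Y are independent given Z, the partial correlation of X and Y controlling for Z should be near zero. A minimal sketch on synthetic data follows; the Fisher z-test and all variables are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 500

# Synthetic admission-style data: Z (entry score) drives both X and Y,
# so X and Y are dependent marginally but independent given Z
z = rng.normal(size=n)
x = 0.8 * z + rng.normal(size=n)
y = 0.7 * z + rng.normal(size=n)

def ci_test(x, y, z):
    """Fisher z-test of X _||_ Y | Z via first-order partial correlation."""
    rxy, rxz, ryz = (np.corrcoef(a, b)[0, 1] for a, b in ((x, y), (x, z), (y, z)))
    pr = (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))
    stat = 0.5 * np.log((1 + pr) / (1 - pr)) * np.sqrt(len(x) - 4)
    return 2 * (1 - stats.norm.cdf(abs(stat)))   # two-sided p-value

print(f"p-value for X _||_ Y | Z: {ci_test(x, y, z):.3f}")  # large p: CI holds
```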
Procedia PDF Downloads 62
76 Fight against Money Laundering with Optical Character Recognition
Authors: Saikiran Subbagari, Avinash Malladhi
Abstract:
Anti-Money Laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess and mitigate the risks associated with money laundering and to report any suspicious transactions to the governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. Amid the rise in financial crimes, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and delve into the various OCR techniques employed in AML processes. These techniques encompass template-based, feature-based, and neural network-based approaches, natural language processing (NLP), hidden Markov models (HMMs), conditional random fields (CRFs), binarization, pattern matching and the stroke width transform (SWT). We evaluate each technique, discussing its strengths and constraints. We also emphasize how OCR can improve the accuracy of customer identity verification by comparing the extracted text with the Office of Foreign Assets Control (OFAC) watchlist. We further discuss how OCR helps to overcome language barriers in AML compliance. Finally, we address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on data from previous research studies, which illustrate the effectiveness of OCR-based AML.
Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition
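The watchlist-screening step just described can be sketched as OCR extraction followed by fuzzy name matching. Below is a minimal Python example using pytesseract and difflib; the toy watchlist, file name, and match threshold are illustrative assumptions, and real OFAC screening uses far more sophisticated matching. Note that pytesseract requires the external Tesseract engine to be installed.

```python
import difflib
import pytesseract
from PIL import Image

WATCHLIST = ["JOHN DOE", "ACME TRADING LLC"]   # toy stand-in for the OFAC list
MATCH_THRESHOLD = 0.85                          # assumed fuzzy-match cutoff

def screen_document(path):
    """Extract text from a scanned document and flag watchlist matches."""
    text = pytesseract.image_to_string(Image.open(path)).upper()
    hits = []
    for name in WATCHLIST:
        for line in text.splitlines():
            score = difflib.SequenceMatcher(None, name, line.strip()).ratio()
            if score >= MATCH_THRESHOLD:
                hits.append((name, line.strip(), round(score, 2)))
    return hits

# Example: flag any suspicious names found on a scanned wire-transfer form
print(screen_document("wire_transfer.png"))
```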
Procedia PDF Downloads 144