Search results for: protection methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16823

15173 The Legal Nature of Grading Decisions and the Implications for Handling of Academic Complaints in or out of Court: A Comparative Legal Analysis of Academic Litigation in Europe

Authors: Kurt Willems

Abstract:

This research examines complaints against grading in higher education institutions in four different European regions: England and Wales, Flanders, the Netherlands, and France. The aim of the research is to examine the correlation between the applicable type of complaint handling on the one hand, and selected qualities of the higher education landscape and of public law on the other hand. All selected regions report a rising number of complaints against grading decisions, not only in internal complaint handling within the institution but also judicially if the dispute persists. Some regions deem their administrative court system appropriate to deal with grading disputes (France) or have even established a specialized administrative court to facilitate access (Flanders, the Netherlands). However, at the same time, different types of (governmental) dispute resolution bodies have been established outside of the judicial court system (England and Wales, and to a lesser extent France and the Netherlands). These dispute procedures do not seem coincidental. Public law issues such as the underlying legal nature of the education institution and, ultimately, of the grading decision itself have an impact on the way academic complaint procedures are developed. Indeed, in most of the selected regions, contractual disputes enjoy different legal protection than administrative decisions, making the legal qualification of the relationship between student and higher education institution highly relevant. At the same time, the scope of government competence over different types of higher education institutions, whether direct or indirect (e.g., through financing and quality control), is also relevant to understanding why certain dispute handling procedures have been established for students. To answer the above questions, the doctrinal and comparative legal method is used. 
The normative framework is distilled from the relevant national legislative rules and their preparatory texts, the legal literature, the (published) case law of academic complaints and the available governmental reports. The research is mainly theoretical in nature, examining different topics of public law (mainly administrative law) and procedural law in the context of grading decisions. The internal appeal procedure within the education institution is largely left out of the scope of the research, as well as different types of non-governmental-imposed cooperation between education institutions, given the public law angle of the research questions. The research results in the categorization of different academic complaint systems, and an analysis of the possibility to introduce each of those systems in different countries, depending on their public law system and higher education system. By doing so, the research also adds to the debate on the public-private divide in higher education systems, and its effect on academic complaints handling.

Keywords: higher education, legal qualification of education institution, legal qualification of grading decisions, legal protection of students, academic litigation

Procedia PDF Downloads 224
15172 A Review on Bone Grafting, Artificial Bone Substitutes and Bone Tissue Engineering

Authors: Kasun Gayashan Samarawickrama

Abstract:

Bone diseases, defects, and fractures are commonly seen in modern life. Although bone is a dynamic, regenerating living tissue that undergoes a natural healing process, it cannot recover from major injuries, diseases, and defects on its own. To overcome these, the bone grafting technique was introduced. The gold standard, autologous bone grafting, was the method of choice for past decades; owing to its limitations, alternative methods have been implemented. Beyond these, artificial bone substitutes and bone tissue engineering have become the emerging, technology-driven methods for bone grafting. Many bone diseases and defects may be healed permanently with these promising techniques in the future.

Keywords: bone grafting, gold standard, bone substitutes, bone tissue engineering

Procedia PDF Downloads 289
15171 Fuzzy Expert Approach for Risk Mitigation on Functional Urban Areas Affected by Anthropogenic Ground Movements

Authors: Agnieszka A. Malinowska, R. Hejmanowski

Abstract:

A number of European cities are strongly affected by ground movements caused by anthropogenic activities or post-anthropogenic metamorphosis, mainly water pumping, current mining operations, the collapse of post-mining underground voids, or mining-induced earthquakes. These activities lead to large- and small-scale ground displacements and ground ruptures. Ground movements occurring in urban areas can considerably affect the stability and safety of structures and infrastructure. The complexity of the ground deformation phenomenon, in relation to the vulnerability of structures and infrastructure, leads to considerable constraints in assessing the threat to those objects. However, increasing access to free software and satellite data could pave the way for developing new methods and strategies for environmental risk mitigation and management. Open source geographical information systems (OS GIS) may support data integration, management, and risk analysis. Recently developed methods, based on fuzzy logic and expert methods, for assessing the risk of damage to buildings and infrastructure could be integrated into OS GIS. These methods were verified by back analysis, proving their accuracy. Moreover, they can be supported by ground displacement observations: based on freely available data from the European Space Agency and free software, ground deformation can be estimated. The main innovation presented in the paper is the application of open source software (OS GIS) for integrating the developed models and assessing the threat to urban areas. These approaches are reinforced by an analysis of ground movement based on free satellite data, which supports the verification of ground movement prediction models. Moreover, satellite data enable the mapping of ground deformation in urbanized areas. The developed models and methods have been implemented in one of the urban areas threatened by underground mining activity. 
Vulnerability maps supported by satellite ground movement observation would mitigate the hazards of land displacements in urban areas close to mines.

Keywords: fuzzy logic, open source geographic information science (OS GIS), risk assessment on urbanized areas, satellite interferometry (InSAR)

Procedia PDF Downloads 155
15170 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic

Authors: Diogen Babuc

Abstract:

The motivation for this work is the question of how many devote themselves to discovery in a world of science where much has been discerned and revealed, but where, at the same time, much remains unknown. Methods: The algorithm builds on insightful elements of the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between a and b = a + 3. The obtained value is stored in a variable so that it remains constant during the run of the algorithm. According to the given key, the string is divided into several groups of substrings, each of length k characters. The next step encodes each substring in the list of existing substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters; however, k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b + 1, it returns to its initial value. The algorithm proceeds in the same way until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The character x is not used when the number of characters in a substring does not match the expected length. The algorithm is simple to implement, but it remains questionable whether it outperforms the other methods in terms of execution time and storage space.
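A minimal Python sketch of the scheme as described above (the handling of non-letter characters and the padding behaviour are assumptions the abstract leaves open):

```python
import random
import string

def latin_djokovic_encrypt(text, a=3, seed=None):
    """Sketch of the described polyalphabetic scheme: split the text into
    blocks of k characters and Caesar-shift each block, incrementing the
    shift by 1 per block and wrapping back to the initial key."""
    rng = random.Random(seed)
    b = a + 3
    k0 = rng.randint(a, b)            # random key between a and b = a + 3
    blocks = [text[i:i + k0] for i in range(0, len(text), k0)]
    shift = k0
    out = []
    for block in blocks:
        enc = []
        for ch in block:
            if ch in string.ascii_lowercase:
                enc.append(chr((ord(ch) - ord('a') + shift) % 26 + ord('a')))
            else:
                enc.append(ch)        # non-letters pass through (assumption)
        out.append(''.join(enc))
        shift += 1                    # increment the shift for the next block
        if shift > b + 1:             # reset once the shift exceeds b + 1
            shift = k0
    return ''.join(out)

def latin_djokovic_decrypt(cipher, a=3, seed=None):
    """Decryption mirrors encryption with negated shifts; the same seed
    must be used so the random initial key matches."""
    rng = random.Random(seed)
    b = a + 3
    k0 = rng.randint(a, b)
    blocks = [cipher[i:i + k0] for i in range(0, len(cipher), k0)]
    shift = k0
    out = []
    for block in blocks:
        dec = []
        for ch in block:
            if ch in string.ascii_lowercase:
                dec.append(chr((ord(ch) - ord('a') - shift) % 26 + ord('a')))
            else:
                dec.append(ch)
        out.append(''.join(dec))
        shift += 1
        if shift > b + 1:
            shift = k0
    return ''.join(out)
```

In a real system the random key would be exchanged explicitly rather than re-derived from a shared seed; the seed is used here only to keep the sketch self-contained.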

Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison

Procedia PDF Downloads 95
15169 An Advanced Match-Up Scheduling Under Single Machine Breakdown

Authors: J. Ikome, M. Ndeley

Abstract:

When a machine breakdown forces a modified flow shop (MFS) out of the prescribed state, the proposed strategy reschedules part of the initial schedule to match up with the preschedule at some point. The objective is to create a new schedule that is consistent with the other production planning decisions, such as material flow, tooling, and purchasing, by utilizing the time-critical decision-making concept. We propose a new rescheduling strategy and a match-up point determination procedure with a feedback mechanism to increase both schedule quality and stability. The proposed approach is compared with alternative reactive scheduling methods under different experimental settings.

Keywords: advanced critical task methods, modified flow shop (MFS), manufacturing, experiment, determination

Procedia PDF Downloads 402
15168 Heuristic Methods for the Capacitated Location-Allocation Problem with Stochastic Demand

Authors: Salinee Thumronglaohapun

Abstract:

The proper number and appropriate locations of service centers can save cost, raise revenue, and increase customer satisfaction. Service centers are costly to establish and difficult to relocate. Over long planning periods, several factors may affect the service; one of the most critical is the uncertain demand of customers. The opened service centers need to be capable of serving customers and making a profit even though demand changes in each period. In this work, the capacitated location-allocation problem with stochastic demand is considered. A mathematical model is formulated to determine suitable locations of service centers and their allocation so as to maximize total profit over multiple planning periods. Two heuristic methods, a local search and a genetic algorithm, are used to solve this problem. For the local search, five different probabilities of choosing each type of move are applied. For the genetic algorithm, three different replacement strategies are considered. The results of applying each method to numerical examples are compared. Both methods reach the same best-found solution in most examples, but the genetic algorithm provides better solutions in some cases.
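A minimal sketch of the local-search component, assuming a greedy allocation rule and a single-flip neighbourhood (the paper's five move types, GA replacement strategies, and stochastic demand handling are not reproduced; all instance data below are illustrative):

```python
import random

def total_profit(open_set, demand, capacity, revenue, open_cost):
    """Greedy allocation: serve each customer from the open facilities
    with the highest unit revenue, up to each facility's capacity."""
    remaining = {f: capacity[f] for f in open_set}
    profit = -sum(open_cost[f] for f in open_set)
    for c, d in demand.items():
        for f in sorted(open_set, key=lambda f: -revenue[f][c]):
            served = min(d, remaining[f])
            profit += served * revenue[f][c]
            remaining[f] -= served
            d -= served
            if d == 0:
                break
    return profit

def local_search(sites, demand, capacity, revenue, open_cost,
                 iters=200, seed=0):
    """Flip one facility open/closed per move; keep improving moves only."""
    rng = random.Random(seed)
    current = set(rng.sample(sites, max(1, len(sites) // 2)))
    best = total_profit(current, demand, capacity, revenue, open_cost)
    for _ in range(iters):
        f = rng.choice(sites)
        cand = current ^ {f}          # toggle facility f
        if not cand:
            continue                  # keep at least one facility open
        p = total_profit(cand, demand, capacity, revenue, open_cost)
        if p > best:
            current, best = cand, p
    return current, best
```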

Keywords: location-allocation problem, stochastic demand, local search, genetic algorithm

Procedia PDF Downloads 117
15167 Towards a Standardization in Scheduling Models: Assessing the Variety of Homonyms

Authors: Marcel Rojahn, Edzard Weber, Norbert Gronau

Abstract:

Terminology is a critical instrument for every researcher. Different terminologies for the same research object may arise in different research communities, and through this inconsistency many synergistic effects are lost. Theories and models would be more understandable and reusable if a common terminology were applied. This paper examines the terminological (in)consistency of the research field of job-shop scheduling through a literature review. There is an enormous variety in the choice of terms and mathematical notation for the same concept. The comparability, reusability, and combinability of scheduling methods are unnecessarily hampered by the arbitrary use of homonyms and synonyms. The community's acceptance of the variables and notation forms in use is measured by means of a compliance quotient, derived from an evaluation of 240 scientific publications on planning methods.

Keywords: job-shop scheduling, terminology, notation, standardization

Procedia PDF Downloads 102
15166 FT-NIR Method to Determine Moisture in Gluten Free Rice-Based Pasta during Drying

Authors: Navneet Singh Deora, Aastha Deswal, H. N. Mishra

Abstract:

Pasta is one of the most widely consumed food products around the world. Rapid determination of the moisture content in pasta will assist food processors in providing online quality control of pasta during large-scale production. A rapid Fourier transform near-infrared (FT-NIR) method was developed for determining moisture content in pasta. A calibration set of 150 samples, a validation set of 30 samples, and a prediction set of 25 samples of pasta were used. The diffuse reflection spectra of different types of pasta were measured by an FT-NIR analyzer in the 4,000-12,000 cm-1 spectral range. Calibration and validation sets were designed for the construction and evaluation of the method's adequacy over a moisture content range of 10 to 15 percent (w.b.). Prediction models based on partial least squares (PLS) regression were developed in the near-infrared region. Conventional criteria such as R2, the root mean square error of cross-validation (RMSECV), the root mean square error of estimation (RMSEE), and the number of PLS factors were considered for the selection among three pre-processing methods (vector normalization, minimum-maximum normalization, and multiplicative scatter correction). Spectra of pasta samples were treated with different mathematical pre-treatments before being used to build models between the spectral information and moisture content. The moisture content in pasta predicted by the FT-NIR method correlated very well with the values determined via traditional methods (R2 = 0.983), which clearly indicates that FT-NIR methods can be used as an effective tool for rapid determination of moisture content in pasta. The best calibration model was developed with min-max normalization (MMN) spectral pre-processing (R2 = 0.9775); MMN was found to be the most suitable method, and a maximum coefficient of determination (R2) of 0.9875 was obtained for the final calibration model.
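Two of the named pre-processing steps can be sketched in a few lines (a minimal NumPy illustration under generic assumptions; the paper's exact pre-treatment settings are not specified in the abstract):

```python
import numpy as np

def min_max_normalize(spectra):
    """Min-max normalization (MMN): rescale each spectrum (row) so its
    values span the [0, 1] range."""
    spectra = np.asarray(spectra, dtype=float)
    lo = spectra.min(axis=1, keepdims=True)
    hi = spectra.max(axis=1, keepdims=True)
    return (spectra - lo) / (hi - lo)

def msc(spectra):
    """Multiplicative scatter correction: regress each spectrum against
    the mean spectrum and remove the fitted offset and slope."""
    spectra = np.asarray(spectra, dtype=float)
    ref = spectra.mean(axis=0)
    corrected = np.empty_like(spectra)
    for i, s in enumerate(spectra):
        slope, intercept = np.polyfit(ref, s, 1)
        corrected[i] = (s - intercept) / slope
    return corrected
```

The pre-processed spectra would then be fed to a PLS regression (e.g. scikit-learn's `PLSRegression`) against the reference moisture values.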

Keywords: FT-NIR, pasta, moisture determination, food engineering

Procedia PDF Downloads 251
15165 Computer Aided Analysis of Breast Based Diagnostic Problems from Mammograms Using Image Processing and Deep Learning Methods

Authors: Ali Berkan Ural

Abstract:

This paper presents the analysis, evaluation, and pre-diagnosis of early-stage breast diagnostic problems (breast cancer, nodules or lumps) by a Computer Aided Diagnosis (CAD) system using mammogram radiological images. According to the statistics, the time factor is crucial for discovering the disease in the patient (especially in women) as early and as fast as possible. In the study, a new algorithm is developed using advanced image processing and deep learning methods to detect and classify the problem at an early stage with more accuracy. The system first applies image processing methods (image acquisition, noise removal, region growing segmentation, morphological operations, breast border extraction, advanced segmentation, obtaining regions of interest (ROIs), etc.) to segment the area of interest of the breast, and then analyzes the obtained regions for cancer/lump detection in order to diagnose the disease. After segmentation, using spectrogram images, five different deep learning based methods (CNN-based AlexNet, ResNet50, VGG16, DenseNet, and Xception) are applied to classify the breast problems.
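One of the listed image processing steps, region growing segmentation, can be sketched on a toy intensity array (a minimal illustration; the 4-connectivity and fixed tolerance threshold are assumptions, not the paper's settings):

```python
import numpy as np
from collections import deque

def region_grow(image, seed, tol=10):
    """Flood-fill style region growing: starting from a seed pixel, absorb
    4-connected neighbours whose intensity is within `tol` of the seed."""
    image = np.asarray(image, dtype=float)
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_val = image[seed]
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < h and 0 <= nc < w and not mask[nr, nc]
                    and abs(image[nr, nc] - seed_val) <= tol):
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask
```

On a mammogram, the seed would typically be placed inside a suspected ROI and the resulting mask passed on to the classification stage.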

Keywords: computer aided diagnosis, breast cancer, region growing, segmentation, deep learning

Procedia PDF Downloads 80
15163 Advances in Artificial Intelligence Using Speech Recognition

Authors: Khaled M. Alhawiti

Abstract:

This research study aims to present a retrospective study of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. More precisely, speech recognition facilitates its users and helps them perform their daily routine tasks in a more convenient and effective manner. This research presents an illustration of recent technological advancements associated with artificial intelligence. Recent research has revealed that the decoding of speech is the foremost issue affecting speech recognition. To overcome this, researchers have developed different statistical models; some of the most prominent are the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMMs). The research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, utilized for realistic decoding tasks and constrained artificial languages; these include pattern recognition, acoustic-phonetic, and artificial intelligence approaches. Artificial intelligence has been recognized as the most efficient and reliable of the methods used in speech recognition.
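For illustration, the forward algorithm, the core HMM computation used to score an observation sequence in such decoders, can be sketched in a few lines (a generic NumPy sketch, not tied to any particular recognizer):

```python
import numpy as np

def forward(obs, pi, A, B):
    """Forward algorithm: probability of an observation sequence under an
    HMM with initial distribution pi, transition matrix A (A[i, j] =
    P(state j | state i)), and emission matrix B (B[i, o] = P(obs o | state i))."""
    alpha = pi * B[:, obs[0]]          # initialize with the first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate one step and re-weight
    return alpha.sum()
```

In a recognizer, this score (usually computed in log space for numerical stability) is combined with the language model to rank candidate word sequences.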

Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human machine performance

Procedia PDF Downloads 467
15163 The Use of Simulation Programs of Leakage of Harmful Substances for Crisis Management

Authors: Jiří Barta

Abstract:

The paper deals with simulation programs for the spread of harmful substances. Air pollution has a direct impact on the quality of human life, and environmental protection is currently a very hot topic. Therefore, the paper focuses on the simulation of releases of harmful substances. The first part of the article deals with perspectives on, and possibilities of, incorporating the outputs of simulation programs into the education and practical training of management staff for emergency events within critical infrastructure. The last part presents the practical testing and evaluation of the simulation programs. Of the tested simulation software, Symos97 was selected. The tool offers advanced features for configuring a leak, allowing the user step by step to model the terrain, the location, and the manner of escape of harmful substances.

Keywords: computer simulation, Symos97, spread, simulation software, harmful substances

Procedia PDF Downloads 280
15162 Diagnosis of Avian Pathology in the East of Algeria

Authors: Khenenou Tarek, Benzaoui Hassina, Melizi Mohamed

Abstract:

Diagnosis requires a background of current knowledge in the field as well as complementary means, among which the laboratory occupies the central place for a thorough investigation. A correct diagnosis makes it possible to establish the most appropriate treatment as soon as possible and avoids both the economic losses associated with mortality and the growth retardation often observed in poultry; furthermore, it may reduce the high cost of treatment. Epidemiological surveys, hematological studies, and histopathological studies are three aspects of diagnosis heavily used in both human and veterinary pathology, and advances in human medicine can be exploited and applied in veterinary medicine with appropriate modification. In eastern Algeria, however, diagnostic methods are limited to clinical signs and necropsy findings. Therefore, the diagnosis is often based simply on the success or failure of the therapeutic methods (therapeutic diagnosis).

Keywords: chicken, diagnosis, hematology, histopathology

Procedia PDF Downloads 619
15161 Effect of Outliers in Assessing Significant Wave Heights Through a Time-Dependent GEV Model

Authors: F. Calderón-Vega, A. D. García-Soto, C. Mösso

Abstract:

Recorded significant wave heights sometimes exhibit large, uncommon values (outliers) that can be associated with extreme phenomena such as hurricanes and cold fronts. In this study, some extremely large wave heights recorded by NOAA buoys (National Data Buoy Center, noaa.gov) are used to investigate their effect on the prediction of future wave heights associated with given return periods. Extreme waves are predicted through a time-dependent model based on the generalized extreme value (GEV) distribution. It is found that the outliers do affect the estimated wave heights. It is concluded that a detailed inspection of outliers is warranted to determine whether they are genuine recorded values, since this impacts the definition of design wave heights for coastal protection purposes.
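The stationary building block of such an analysis can be sketched with SciPy's GEV implementation (a minimal sketch only; the paper's model is time-dependent with seasonality, which this does not reproduce):

```python
import numpy as np
from scipy.stats import genextreme

def gev_return_level(maxima, return_period):
    """Fit a stationary GEV to block maxima (e.g. annual maximum
    significant wave heights) by maximum likelihood, then return the
    level exceeded on average once per `return_period` blocks."""
    c, loc, scale = genextreme.fit(maxima)
    # isf(1/T) is the (1 - 1/T) quantile, i.e. the T-block return level
    return genextreme.isf(1.0 / return_period, c, loc=loc, scale=scale)
```

Refitting with and without a suspected outlier and comparing the resulting return levels gives a direct measure of the outlier's influence on the design wave height.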

Keywords: GEV model, non-stationary, seasonality, outliers

Procedia PDF Downloads 188
15160 Sustainable Underground Structures Through Soil-Driven Bio-Protection of Concrete

Authors: Abdurahim Abogdera, Omar Hamza, David Elliott

Abstract:

Soil bacteria can be affected by factors such as pH, calcium ion concentration, and electrical conductivity. Fresh concrete has a high pH, between 11 and 13, and these values prevent the bacteria from producing the CO₂ that combines with the calcium ions released from the concrete to form calcite. In this study, 15% and 25% of the cement was replaced with fly ash, as fly ash reduces the pH of the concrete. The main goal of this study was to investigate whether bacteria can be used in the soil rather than in the concrete, to avoid the challenges and limitations of containing bacteria inside the concrete. This was achieved by incubating cracked cement mortar specimens in fully saturated sterilized and non-sterilized soil. The crack sealing that developed in the specimens during the incubation period in both soil conditions was evaluated and compared. Visual inspection, a water absorption test, scanning electron microscopy (SEM), and energy-dispersive X-ray (EDX) analysis were conducted to evaluate the healing process.

Keywords: pH, calcium ions, MICP, salinity

Procedia PDF Downloads 103
15159 Fault Location Identification in High Voltage Transmission Lines

Authors: Khaled M. El Naggar

Abstract:

This paper introduces a digital method for fault section identification in transmission lines. The method uses a digitized set of measured short-circuit currents to locate faults in electrical power systems. The digitized current is used to construct an overdetermined system of equations. The problem is then formulated and solved using the proposed digital optimization technique to find the fault distance. The proposed optimization methodology is an application of the simulated annealing optimization technique. The method is tested on a practical case study to evaluate its performance. The accurate results obtained show that the algorithm can be used as a powerful tool in the area of power system protection.
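A minimal sketch of simulated annealing over a single fault-distance parameter; the quadratic residual objective, the linear cooling schedule, and the step size below are illustrative assumptions, not the paper's formulation:

```python
import math
import random

def simulated_annealing(objective, lo, hi, iters=5000, t0=1.0, seed=0):
    """Generic simulated annealing over a scalar parameter (here, the
    fault distance): random perturbations, always accept improvements,
    accept worse moves with probability exp(-delta / T) under cooling."""
    rng = random.Random(seed)
    x = rng.uniform(lo, hi)
    fx = objective(x)
    best_x, best_f = x, fx
    for i in range(iters):
        t = t0 * (1 - i / iters) + 1e-9               # linear cooling
        cand = min(hi, max(lo, x + rng.gauss(0, (hi - lo) * 0.05)))
        fc = objective(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc                          # accept the move
            if fx < best_f:
                best_x, best_f = x, fx                # track the best seen
    return best_x, best_f
```

In the fault-location setting, `objective` would be the least-squares residual of the overdetermined system of equations built from the digitized current samples.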

Keywords: optimization, estimation, faults, measurement, high voltage, simulated annealing

Procedia PDF Downloads 387
15158 Monte Carlo Methods and Statistical Inference of Multitype Branching Processes

Authors: Ana Staneva, Vessela Stoimenova

Abstract:

Parametric estimation of multitype branching processes (MBPs) with offspring distributions from the power series family is considered in this paper. The MLE for the parameters is obtained in the case when the observable data are incomplete and consist only of the generation sizes of the family tree of the MBP. The parameter estimates are computed using the Monte Carlo EM algorithm; the posterior distribution and the offspring distribution parameters are estimated using the Bayesian approach and the Gibbs sampler. The article presents various examples with bivariate branching processes, together with computational results, simulations, and an implementation in R.
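A hedged Monte Carlo sketch of the kind of data the estimation works on, the generation sizes of a bivariate branching process (the Poisson offspring law and mean matrix here are illustrative members of the power series family; the paper's own implementation is in R):

```python
import numpy as np

def simulate_mbp(m, generations, z0=(1, 1), rng=None):
    """Simulate a bivariate branching process with Poisson offspring:
    m[i][j] is the mean number of type-j children per type-i individual.
    Returns the generation sizes, the observable data for the MLE."""
    rng = np.random.default_rng(rng)
    m = np.asarray(m, dtype=float)
    z = np.array(z0, dtype=int)
    sizes = [z.copy()]
    for _ in range(generations):
        nxt = np.zeros(len(z), dtype=int)
        for i in range(len(z)):
            if z[i] == 0:
                continue
            for j in range(len(z)):
                # total type-j offspring of the z[i] type-i parents
                nxt[j] += rng.poisson(m[i, j], size=z[i]).sum()
        z = nxt
        sizes.append(z.copy())
        if z.sum() == 0:          # extinction: nothing left to branch
            break
    return sizes
```

With the mean matrix below the critical threshold (spectral radius < 1), the process goes extinct almost surely, which the EM/Gibbs machinery must accommodate when fitting real family trees.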

Keywords: Bayesian, branching processes, EM algorithm, Gibbs sampler, Monte Carlo methods, statistical estimation

Procedia PDF Downloads 408
15157 Fire Risk Information Harmonization for Transboundary Fire Events between Portugal and Spain

Authors: Domingos Viegas, Miguel Almeida, Carmen Rocha, Ilda Novo, Yolanda Luna

Abstract:

Forest fires along the more than 1200 km of the Spanish-Portuguese border are more and more frequent, currently reaching around 2000 fire events per year. Some of these events develop into large international wildfires requiring concerted operations based on information shared between the two countries. The fire event of Valencia de Alcantara (2003), which caused several fatalities and burnt more than 13000 ha, is a reference example of such international events. Currently, Portugal and Spain have a specific cross-border cooperation protocol on wildfire response for a strip of about 30 km (15 km on each side). Public authorities recognize the success of this collaboration; however, it is also acknowledged that the cooperation should include more functionality, such as the development of a common risk information system for transboundary fire events. Since Portuguese and Spanish authorities use different approaches to determine the inputs of their fire risk indexes and different methodologies to assess fire risk, joint firefighting operations are sometimes jeopardized: the information is not harmonized, and civil protection agents from the two countries do not share a single understanding of the situation. Thus, a methodology aiming at the harmonization of fire risk calculation and perception by the Portuguese and Spanish civil protection authorities is presented here, together with the final results. The fire risk index used in this work is the Canadian Fire Weather Index (FWI), which is based on meteorological data. The FWI is limited in its application, as it does not take into account other important factors with a great effect on fire ignition and development. The combination of these factors is very complex since, besides meteorology, it involves parameters from several fields, namely sociology, topography, vegetation, and soil cover. 
Therefore, the meaning of FWI values differs from region to region, according to the specific characteristics of each region. In this work, a methodology for FWI calibration based on the number of fire occurrences and on the burnt area in the transboundary regions of Portugal and Spain is proposed, in order to assess fire risk based on calibrated FWI values. As previously mentioned, cooperative firefighting operations require a common perception of the shared information. Therefore, a common classification of fire risk for fire events occurring in the transboundary strip is proposed, with the objective of harmonizing this type of information. This work is part of the ECHO project SpitFire (Spanish-Portuguese Meteorological Information System for Transboundary Operations in Forest Fires), which aims to develop a web platform for information sharing and decision support tools to be used in international fire events involving Portugal and Spain.

Keywords: data harmonization, FWI, international collaboration, transboundary wildfires

Procedia PDF Downloads 241
15156 Effect of Distance Education on Student Motivation in the Turkish Language and Literature Course

Authors: Meva Apaydin, Fatih Apaydin

Abstract:

The role of education in the development of society is great. Teaching and training began at the start of history, and different methods and techniques have been applied over time with the aim of raising the level of learning. In addition to traditional teaching methods, technology has been used in recent years. With the introduction of the internet into education, some problems that could not previously be solved have been addressed, and it has become possible to educate learners using contemporary methods as well as traditional ones. As a benefit of technological developments, distance education is a system that enables students to be educated individually, wherever and whenever they like, without the need for a physical school environment. Distance education has become prevalent because of physical inadequacies in education institutions; as a result, disadvantageous circumstances such as social complexities, individual differences, and especially geographical distance disappear. Moreover, rapid feedback between teachers and learners, improved student motivation owing to the absence of time constraints, low cost, and objective measurement and evaluation come to the fore. Although distance education offers teaching benefits, it also has limitations: problems that arise may not be solved in time; the lack of eye contact between teacher and learner means trustworthy feedback cannot be obtained; and further problems stem from inadequate technological infrastructure. Courses are conducted via distance education in many departments of the universities in our country. 
In recent years, delivering courses such as Turkish Language, English, and History in the first years of academic departments has become a constantly expanding practice. This study examines the delivery of the Turkish Language course via the distance education system by analyzing the advantages and disadvantages of internet-based distance education.

Keywords: distance education, Turkish language, motivation, benefits

Procedia PDF Downloads 430
15155 Water Management of Polish Agriculture and Adaptation to Climate Change

Authors: Dorota M. Michalak

Abstract:

The agricultural sector, due to the growing demand for food and over-exploitation of the natural environment, contributes to the deepening of climate change on the one hand; on the other hand, shrinking freshwater resources, as a negative effect of climate change, threaten the food security of every country. Adaptation measures to climate change should therefore take into account effective water management and seek solutions that ensure food production at an unchanged or higher level, while not burdening the environment or worsening the negative consequences of climate change. The problems of Poland's water management result not only from relatively small natural water resources but, to a large extent, from the low efficiency of their use. Appropriate agricultural practices and state solutions in this field can contribute to significant benefits in terms of economical water management in agriculture, providing a greater amount of water that could also be used for other purposes, including those related to environmental protection. The aim of the article is to determine the level of use of water resources in Polish agriculture and the advancement of measures aimed at adapting the water management of Polish agriculture to climate change. The study provides knowledge about Polish legal regulations and water management tools, the shaping of the water policy of Polish agriculture against the background of EU countries and other sources of water intake, and the measures, run by state budget institutions, supporting Polish agricultural holdings in the effective management of water resources. To achieve these goals, the author used research tools such as the analysis of existing sources and a survey conducted among five groups of entities, i.e. 
agricultural advisory centers and departments; agricultural, rural, and environmental protection departments; regional water management boards; provincial agricultural chambers; and agencies for the restructuring and modernization of agriculture. The main conclusion of the analyses is the low use of water in Polish agriculture in relation to other EU countries, to other sources of intake in Poland, and to irrigation. The analysis also reveals another problem: the lack of reporting and data collection, which is extremely important from the point of view of the effectiveness of adaptation measures to climate change. The survey results indicate a very low level of support from government institutions for the implementation of adaptation measures to climate change and for the water management of Polish farms. Basic problems of the climate change adaptation policy with regard to water management in Polish agriculture include a lack of knowledge regarding climate change, the possibilities of adapting, the available tools, and ways to rationalize the use of water resources, as well as disordered procedures, an unclear division of responsibility among territorial units, non-functioning channels of information flow and, in practice, weak effects.

Keywords: water management, adaptation policy, agriculture, climate change

Procedia PDF Downloads 132
15154 Innovations in the Implementation of Preventive Strategies and Measuring Their Effectiveness Towards the Prevention of Harmful Incidents to People with Mental Disabilities who Receive Home and Community Based Services

Authors: Carlos V. Gonzalez

Abstract:

Background: Providers of in-home and community-based services strive to eliminate preventable harm to the people under their care as well as to the employees who support them. Traditional models of safety and protection from harm have assumed that the absence of incidents of harm is a good indicator of safe practices. However, this model creates an illusion of safety that is easily shaken by sudden and inadvertent harmful events. As an alternative, we have developed and implemented an evidence-based resilient model of safety known as C.O.P.E. (Caring, Observing, Predicting and Evaluating). Within this model, safety is defined not by the absence of harmful incidents but by the presence of continuous monitoring, anticipation, learning, and rapid response to events that may lead to harm. Objective: The objective was to evaluate the effectiveness of the C.O.P.E. model for the reduction of harm to individuals with mental disabilities who receive home and community-based services. Methods: Over the course of 2 years, we counted the number of incidents of harm and near misses. We trained employees on strategies to eliminate incidents before they fully escalated and to track patient status on a scale from 0 to 10. Additionally, we provided direct support professionals and supervisors with customized smartphone applications to track and notify the team of changes in that status every 30 minutes. Finally, the information collected was saved in a private computer network that analyzes and graphs the outcome of each incident. Results and conclusions: The use of the C.O.P.E. model resulted in: a reduction in incidents of harm; a reduction in the use of restraints and other physical interventions; an increase in Direct Support Professionals' ability to detect and respond to health problems; improved employee alertness, with less sleeping on duty;
improved caring and positive interaction between Direct Support Professionals and the person who is supported; and the development of a method to globally measure and assess the effectiveness of harm-prevention plans. Future applications of the C.O.P.E. model for the reduction of harm to people who receive home and community-based services are discussed.
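The half-hourly 0-10 status tracking described above can be sketched in code. The fragment below is a hypothetical illustration only: the threshold value, the category names, and the `classify_status` function are assumptions of this sketch, not the authors' actual application logic.

```python
# Hypothetical sketch of C.O.P.E.-style status tracking on a 0-10 scale.
# Threshold, categories, and intervals are illustrative assumptions.

ALERT_THRESHOLD = 7  # assumed: statuses at or above this trigger escalation

def classify_status(level: int) -> str:
    """Map a 0-10 status observation to an action category."""
    if not 0 <= level <= 10:
        raise ValueError("status must be between 0 and 10")
    if level >= ALERT_THRESHOLD:
        return "escalate"   # notify the team immediately
    if level >= 4:
        return "monitor"    # re-check at the next 30-minute interval
    return "routine"

# A shift of half-hourly observations for one person:
shift = [2, 3, 5, 8, 6]
actions = [classify_status(s) for s in shift]
```

In a deployment like the one described, each observation would also be timestamped and pushed to the shared record so the team is notified of changes.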

Keywords: harm, patients, resilience, safety, mental illness, disability

Procedia PDF Downloads 439
15153 TDApplied: An R Package for Machine Learning and Inference with Persistence Diagrams

Authors: Shael Brown, Reza Farivar

Abstract:

Persistence diagrams capture valuable topological features of datasets that other methods cannot uncover. Still, their adoption in data pipelines has been limited by the lack of publicly available tools in R (and Python) for analyzing groups of them with machine learning and statistical inference. In an easy-to-use and scalable R package called TDApplied, we implement several applied analysis methods tailored to groups of persistence diagrams. The two main contributions of our package are comprehensiveness (most functions do not have implementations elsewhere) and speed (demonstrated through benchmarking against other R packages). We demonstrate the tools on simulated data to illustrate how easily practical analyses of any dataset can be enhanced with topological information.
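TDApplied itself is an R package; as a language-agnostic illustration of the kind of computation such tools build on, here is a brute-force sketch in Python of the bottleneck distance between two small persistence diagrams, where points may be matched to each other or to their diagonal projections. The function names are mine, and the O(n!) matching search is only viable for toy diagrams.

```python
from itertools import permutations

def diag_proj(p):
    """Nearest point on the diagonal y = x to p = (birth, death)."""
    m = (p[0] + p[1]) / 2
    return (m, m)

def linf(p, q):
    """L-infinity distance between two diagram points."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

def bottleneck(D1, D2):
    """Brute-force bottleneck distance between two persistence diagrams."""
    # Augment each diagram with the diagonal projections of the other's
    # points, so unmatched points can be "paid for" by their distance
    # to the diagonal.
    A = list(D1) + [diag_proj(q) for q in D2]
    B = list(D2) + [diag_proj(p) for p in D1]
    if not A:
        return 0.0
    best = float("inf")
    for perm in permutations(range(len(B))):
        cost = max(linf(A[i], B[j]) for i, j in enumerate(perm))
        best = min(best, cost)
    return best
```

For real diagrams one would use an efficient matching algorithm rather than enumeration; packages such as TDApplied wrap optimized implementations of these distances for groups of diagrams.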

Keywords: machine learning, persistence diagrams, R, statistical inference

Procedia PDF Downloads 73
15152 Government Payments to Minority American Producers

Authors: Anil K. Giri, Dipak Subedi, Kathleen Kassel, Ashok Mishra

Abstract:

The United States Department of Agriculture's programs have been accused of discriminating on the basis of the farmer's race in the past, especially against African-American producers. This study examines whether there was racial discrimination in payments from the most recent USDA programs, including those made in response to the pandemic. It uses analysis of variance (ANOVA) on payments normalized relative to cash receipts to test whether discrimination exists in the payments received. Three programs are investigated: (i) the Coronavirus Food Assistance Program (CFAP), (ii) the Market Facilitation Program (MFP), and (iii) the Paycheck Protection Program (PPP). The PPP was administered by the Small Business Administration, whereas the other two were designed and implemented by the USDA. The PPP made forgivable loans to small businesses and was initially heavily criticized for not reaching minority businesses in general; the Small Business Administration then initiated a second draw of PPP loans, prioritizing minority-owned businesses. This study compares attributes of PPP loans made to African-American farming businesses and to other farming businesses in the two draws of the PPP. We find that the number of African-American farming businesses participating in the second draw decreased significantly from the first draw; however, the average PPP loan amount to African-American farming businesses increased in the second draw. In the first draw, the average cost of jobs reported per loan was higher for African-American farming businesses than for other producers; in the second draw, it was significantly higher for other farming businesses. The share of PPP loans forgiven for African-American farming businesses is significantly below the national rate of 89 percent.
The rate of forgiveness for PPP loans made to African-American producers is unlikely to increase significantly without policy changes, which can increase future financial burdens on farm operations run by African-Americans. Finally, we conclude that the initial goal of increasing minority participation in the second draw of PPP loans, at least among African-Americans in the agricultural sector, was not met. CFAP made almost $600 million in direct payments to minority producers, of which Black or African-American producers received more than $52 million. CFAP payments were proportional to the value of agricultural commodities sold for most minority producers. The 2017 Census of Agriculture showed that the majority of minority producers, including African-American producers but excluding Asian producers, raised livestock, and CFAP made the highest payments to minority livestock producers.
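The ANOVA the authors apply to normalized payments can be illustrated with a minimal sketch. The one-way F statistic below is standard; the sample payment-to-cash-receipts ratios are purely illustrative, not USDA data.

```python
def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: between-group over within-group mean squares."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    means = [sum(g) / len(g) for g in groups]
    # Between-group sum of squares, weighted by group size:
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    # Within-group sum of squares:
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Illustrative payment-to-cash-receipts ratios for two producer groups:
f_stat = one_way_anova_f([0.10, 0.12, 0.14], [0.11, 0.13, 0.15])
```

A small F statistic (compared against the F distribution's critical value) would indicate no detectable difference between groups once payments are normalized by farm size.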

Keywords: United States department of agriculture (USDA), coronavirus food assistance program (CFAP), paycheck protection program (PPP), African-American producers, minority American producers

Procedia PDF Downloads 85
15151 A Review of Digital Twins to Reduce Emission in the Construction Industry

Authors: Zichao Zhang, Yifan Zhao, Samuel Court

Abstract:

The carbon emission problem of the traditional construction industry has long been a pressing issue. With the growing emphasis on environmental protection and the advancement of science and technology, the organic integration of digital technology and emission reduction has gradually become a mainstream solution. Among various sophisticated digital technologies, digital twins, which involve creating virtual replicas of physical systems or objects, have gained enormous attention in recent years as tools to improve productivity, optimize management, and reduce carbon emissions. However, the relatively high implementation costs of digital twins in finances, time, and manpower have limited their widespread adoption, and most current applications are concentrated within a few industries. In addition, the creation of digital twins relies on large amounts of data and requires designers to possess strong skills in information collection, organization, and analysis, capabilities that are often lacking in the traditional construction industry. Furthermore, as a relatively new concept, digital twins are expressed and used differently across industries; this lack of standardized practice poses a challenge to creating a high-quality digital twin framework for construction. This paper first reviews the current academic studies and industrial practices focused on reducing greenhouse gas emissions in the construction industry using digital twins. It then identifies the challenges that may be encountered during the design and implementation of a digital twin framework specific to this industry and proposes potential directions for future research. This study shows that digital twins possess substantial potential for enhancing the working environment of the traditional construction industry, particularly in their ability to support decision-making processes.
It shows that digital twins can improve the work efficiency and energy utilization of related machinery while helping the industry save energy and reduce emissions. This work will help scholars in this field better understand the relationship between digital twins and energy conservation and emission reduction, and it serves as a conceptual reference for practitioners implementing related technologies.

Keywords: digital twins, emission reduction, construction industry, energy saving, life cycle, sustainability

Procedia PDF Downloads 81
15150 Coagulase Negative Staphylococci: Phenotypic Characterization and Antimicrobial Susceptibility Pattern

Authors: Lok Bahadur Shrestha, Narayan Raj Bhattarai, Basudha Khanal

Abstract:

Introduction: Coagulase-negative staphylococci (CoNS) are normal commensals of human skin and mucous membranes. Objectives: to study the prevalence of CoNS among clinical isolates, to characterize clinically significant isolates up to species level, to compare three phenotypic methods for the detection of biofilm formation, and to study the antimicrobial susceptibility pattern of the isolates. Methods: CoNS isolates were obtained from various clinical samples over a period of 1 year. Characterization up to species level was done using biochemical tests, and biofilm formation was studied by the tube adherence, Congo red agar, and tissue culture plate methods. Results: Among 71 CoNS isolates, seven species were identified. S. epidermidis was the most common species, followed by S. saprophyticus and S. haemolyticus. Antimicrobial susceptibility testing documented resistance to ampicillin in 90% of isolates; resistance to cefoxitin and ceftriaxone was observed in 55%. We detected biofilm formation in 71.8% of isolates. The sensitivity of the tube adherence method was 82%, while that of the Congo red agar method was 78%. Conclusion: Among the 71 CoNS isolates, S. epidermidis was the most common, followed by S. saprophyticus and S. haemolyticus. Biofilm formation was detected in 71.8% of the isolates, and all of the methods were effective at detecting biofilm-producing CoNS strains. Biofilm-forming strains are more resistant to antibiotics than biofilm non-formers.
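Taking the tissue culture plate method as the reference standard (an assumption consistent with how the sensitivities above are reported), the sensitivity of a screening method reduces to a simple ratio. A minimal sketch, with illustrative data:

```python
def sensitivity(test_results, reference_results):
    """Sensitivity = TP / (TP + FN): the fraction of reference-positive
    isolates that the screening test also flags as positive."""
    tp = sum(1 for t, r in zip(test_results, reference_results) if t and r)
    fn = sum(1 for t, r in zip(test_results, reference_results) if not t and r)
    return tp / (tp + fn)

# Illustrative data: 5 reference-positive isolates, tube adherence misses one.
tube_adherence = [1, 1, 1, 1, 0]
tcp_reference = [1, 1, 1, 1, 1]
```

Specificity would be computed analogously from the reference-negative isolates.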

Keywords: CoNS, congo red agar, bloodstream infections, foreign body-related infections, tissue culture plate

Procedia PDF Downloads 188
15149 Comparing Community Detection Algorithms in Bipartite Networks

Authors: Ehsan Khademi, Mahdi Jalili

Abstract:

Despite their special features, bipartite networks are common in many systems. Real-world bipartite networks may show community structure, similar to what one finds in one-mode networks; however, the interpretation of community structure in bipartite networks differs from the one-mode case. In this manuscript, we compare a number of available methods that are frequently used to discover the community structure of bipartite networks. These methods fall into two broad classes: those that first transform the network into a one-mode network and then apply standard community detection algorithms, and those developed specifically for bipartite networks. The algorithms are applied to a model network with prescribed community structure.
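The first class of methods, which projects the bipartite network onto one node set before running community detection, can be sketched as follows. The function below is a generic weighted projection (edge weights count shared neighbours); its name and interface are of my own choosing, not from any particular paper in the comparison.

```python
from collections import defaultdict

def project_bipartite(edges, side="top"):
    """Weighted one-mode projection of a bipartite edge list.

    edges: (u, v) pairs, u from the 'top' node set and v from the 'bottom' set.
    Returns {(a, b): number_of_shared_neighbours} for node pairs on one side;
    a standard one-mode community detection algorithm can then be run on
    this projected graph.
    """
    neigh = defaultdict(set)
    for u, v in edges:
        if side == "top":
            neigh[u].add(v)
        else:
            neigh[v].add(u)
    nodes = sorted(neigh)
    proj = {}
    for i, a in enumerate(nodes):
        for b in nodes[i + 1:]:
            w = len(neigh[a] & neigh[b])
            if w:
                proj[(a, b)] = w
    return proj
```

The projection loses information (many bipartite configurations map to the same one-mode graph), which is one reason the second class of methods, operating directly on the bipartite structure, exists.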

Keywords: community detection, bipartite networks, co-clustering, modularity, network projection, complex networks

Procedia PDF Downloads 612
15148 Multichannel Surface Electromyography Trajectories for Hand Movement Recognition Using Intrasubject and Intersubject Evaluations

Authors: Christina Adly, Meena Abdelmeseeh, Tamer Basha

Abstract:

This paper proposes a system for hand movement recognition using multichannel surface EMG (sEMG) signals obtained from 40 subjects performing 40 different exercises, available in the Ninapro (Non-Invasive Adaptive Prosthetics) database. First, we applied processing methods to the raw sEMG signals to convert them to their amplitudes. Second, we used deep learning methods, passing the preprocessed signals to fully connected neural networks (FCNN) and to recurrent neural networks (RNN) with Long Short-Term Memory (LSTM). Using intrasubject evaluation, the accuracy of the FCNN is 72%, with a training time of around 76 minutes, while the RNN's accuracy is 79.9%, with a training time of 8 minutes and 22 seconds. Third, we applied post-processing methods to improve the accuracy, namely majority voting (MV) and the Movement Error Rate (MER). The accuracy after applying MV is 75% for the FCNN and 86% for the RNN. The MER value has an inverse relationship with the prediction delay as the window length for measuring the MV is varied. The final part uses the RNN with intersubject evaluation. The experimental results showed that, to obtain good accuracy in testing with reasonable processing time, around 20 subjects should be used.
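The majority-voting post-processing step can be sketched as a sliding window over per-frame class predictions. The window size and function name below are illustrative, not the paper's exact configuration.

```python
from collections import Counter

def majority_vote(predictions, window):
    """Smooth a sequence of per-frame class predictions by replacing each
    prediction with the most common label in the trailing window."""
    smoothed = []
    for i in range(len(predictions)):
        start = max(0, i - window + 1)
        votes = Counter(predictions[start:i + 1])
        smoothed.append(votes.most_common(1)[0][0])
    return smoothed
```

A larger window suppresses more isolated misclassifications but delays recognition of genuine movement transitions, which is the trade-off the abstract describes between MER and prediction delay.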

Keywords: hand movement recognition, recurrent neural network, movement error rate, intrasubject evaluation, intersubject evaluation

Procedia PDF Downloads 127
15147 Utilization of Long Acting Reversible Contraceptive Methods, and Associated Factors among Female College Students in Gondar Town, Northwest Ethiopia, 2018

Authors: Woledegebrieal Aregay

Abstract:

Introduction: Family planning is defined as the ability of individuals and couples to anticipate and attain their desired number of children and the spacing and timing of their births. It is part of a strategy to reduce poverty and maternal, infant, and child mortality, and it empowers women by lightening the burden of excessive childbearing. Family planning is achieved through different contraceptive methods, among which the most effective are modern methods such as Long-Acting Reversible Contraceptives (LARCs), i.e., the IUCD and the implant. These methods have multiple advantages over other reversible methods: once in place, they do not require maintenance, and their duration of action is long, ranging from 3 to 10 years. Methods: An institution-based cross-sectional study was conducted among female college students in Gondar town from April to May. A simple random sampling technique was employed to recruit a total of 1,166 study subjects. Descriptive statistics were computed for all predictor and dependent variables. Associations between covariates and LARC use were examined with two-way tables using the chi-square test. Bivariate logistic regression was conducted to identify possible factors affecting LARC utilization, and crude odds ratios with 95% confidence intervals (CI) and p-values were reported. A multivariable logistic regression model was developed to control for possible confounding variables; adjusted odds ratios (AOR) with 95% CI and p-values were computed to identify significantly associated factors (p < 0.05). Result: Utilization of LARCs was 20.4%; the most common method was the implant (86, 96.5%), followed by the Intra-Uterine Contraceptive Device (IUCD) (3, 3.5%).
The multivariate analysis revealed significant associations between LARC utilization and the respondent's marital status [AOR 3.965 (2.051-7.665)], discussion of LARC utilization with the husband or boyfriend [AOR 2.198 (1.191-4.058)], and the respondent's attitude toward the implant [AOR 0.365 (0.143-0.933)]. Conclusion: The level of knowledge and attitude in this study was not satisfactory. Utilization of long-acting reversible contraceptives among college students was relatively satisfactory, but the prevalence of LARC use could increase if participants' knowledge and attitudes improved.

Keywords: utilization, long-acting reversible contraceptive, Ethiopia, Gondar

Procedia PDF Downloads 217
15146 Possible Role of Fenofibrate and Clofibrate in Attenuated Cardioprotective Effect of Ischemic Preconditioning in Hyperlipidemic Rat Hearts

Authors: Gurfateh Singh, Mu Khan, Razia Khanam, Govind Mohan

Abstract:

Objective: The present study was designed to investigate the possible role of fenofibrate and clofibrate in the attenuated cardioprotective effect of ischemic preconditioning (IPC) in hyperlipidemic rat hearts. Materials & Methods: Experimental hyperlipidemia was produced by feeding rats a high-fat diet for 28 days. Isolated Langendorff-perfused normal and hyperlipidemic rat hearts were subjected to global ischemia for 30 min followed by reperfusion for 120 min. Myocardial infarct size was assessed macroscopically using triphenyltetrazolium chloride staining. Coronary effluent was analyzed for lactate dehydrogenase (LDH) and creatine kinase-MB (CK-MB) release to assess the extent of cardiac injury. Oxidative stress in the heart was assessed by measuring thiobarbituric acid reactive substances (TBARS), superoxide anion generation, and reduced glutathione. Results: Ischemia-reperfusion (I/R) induced oxidative stress, increasing TBARS and superoxide anion generation and decreasing reduced glutathione in normal and hyperlipidemic rat hearts. I/R produced myocardial injury, assessed in terms of increased myocardial infarct size, LDH and CK-MB release in the coronary effluent, and decreased coronary flow rate. Hyperlipidemic rat hearts showed enhanced I/R-induced myocardial injury with a high degree of oxidative stress compared with normal rat hearts subjected to I/R. Four episodes of IPC (5 min each) afforded cardioprotection against I/R-induced myocardial injury in normal rat hearts, assessed in terms of improved coronary flow rate and reduced myocardial infarct size, LDH, CK-MB, and oxidative stress. In contrast, IPC-mediated myocardial protection against I/R injury was abolished in hyperlipidemic rat hearts. Treatment with fenofibrate (100 mg/kg/day, i.p.) or clofibrate (300 mg/kg/day, i.p.),
agonists of PPAR-α, did not affect the cardioprotective effect of IPC in normal rat hearts, but markedly restored the cardioprotective potential of IPC in hyperlipidemic rat hearts. Conclusion: The high degree of oxidative stress produced in the hyperlipidemic rat heart during reperfusion and the consequent down-regulation of PPAR-α may be responsible for abolishing the cardioprotective potential of IPC.

Keywords: Hyperlipidemia, ischemia-reperfusion injury, ischemic preconditioning, PPAR-α

Procedia PDF Downloads 279
15145 Optimal Design of Substation Grounding Grid Based on Genetic Algorithm Technique

Authors: Ahmed Z. Gabr, Ahmed A. Helal, Hussein E. Said

Abstract:

With the incessant increase of power system capacity and voltage grade, the safety of the grounding grid becomes more and more prominent. In this paper, the design of a substation grounding grid by means of a genetic algorithm (GA) is presented. The approach aims to minimize the grounding cost of the power system by controlling the number of grounding rods and the conductor lengths under the same safety limitations. The proposed technique was used to design the substation grounding grid at Khalda Petroleum Company's “El-Qasr” power plant, and the design was simulated using CYMGRD software for verification. The resulting design fully complies with the requirements of IEEE Std 80-2000.
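The optimization the paper describes, minimizing grounding cost over rod count and conductor length subject to safety limits, can be sketched with a toy genetic algorithm. Everything below is illustrative: the cost coefficients, the stand-in "safety index", and the penalty weight are assumptions of this sketch, not values from the paper or from IEEE Std 80-2000.

```python
import random

random.seed(0)

# Toy cost model: cost grows with rod count and conductor length; a large
# penalty applies when a simplified safety index (a stand-in for touch/step
# voltage margins) falls below the required level. Coefficients are invented.
def fitness(rods, length):
    cost = 50 * rods + 2 * length
    safety = 0.1 * rods + 0.01 * length
    penalty = 0 if safety >= 1.0 else 10000 * (1.0 - safety)
    return cost + penalty

def genetic_search(generations=200, pop_size=30):
    """Minimize fitness over (rod count, conductor length) designs."""
    pop = [(random.randint(1, 40), random.uniform(10, 500))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind))
        survivors = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            (r1, _), (_, l2) = random.sample(survivors, 2)
            child = (r1, l2)                     # one-point crossover
            if random.random() < 0.3:            # mutation
                child = (max(1, child[0] + random.randint(-2, 2)),
                         max(10.0, child[1] + random.uniform(-20.0, 20.0)))
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda ind: fitness(*ind))

best = genetic_search()
```

A real design would replace `fitness` with grid resistance and touch/step-voltage calculations per IEEE Std 80, which is what the CYMGRD verification step checks.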

Keywords: genetic algorithm, optimum grounding grid design, power system analysis, power system protection, single layer model, substation

Procedia PDF Downloads 521
15144 Ultrasonic Treatment of Baker’s Yeast Effluent

Authors: Emine Yılmaz, Serap Fındık

Abstract:

The baker's yeast industry uses molasses, an end product of the sugar industry, as a raw material. Wastewater from molasses processing contains large amounts of coloured substances that give a dark brown colour and a high organic load to the effluent. The main coloured compounds are known as melanoidins, products of the Maillard reaction between amino acids and carbonyl groups in molasses. The dark colour prevents sunlight penetration and reduces photosynthetic activity and the dissolved oxygen level of surface waters. Various methods such as biological processes (aerobic and anaerobic), ozonation, wet air oxidation, and coagulation/flocculation are used to treat baker's yeast effluent. Adequate treatment before discharge is imperative, and increasingly stringent environmental regulations are forcing distilleries to improve existing treatment and to find alternative effluent management methods or combinations of treatment methods. Sonochemical oxidation, which employs ultrasound to produce cavitation phenomena, is one such alternative. In this study, decolorization of baker's yeast effluent by ultrasound was investigated. The effluent was supplied by a factory located in the north of Turkey. An ultrasonic homogenizer with an operating frequency of 20 kHz was used, with TiO2-ZnO as the sonocatalyst. The effects of the TiO2-ZnO molar proportion, calcination temperature and time, and catalyst amount on the decolorization of baker's yeast effluent were investigated. The results showed that the composite TiO2-ZnO prepared with a 4:1 molar proportion and treated at 700°C for 90 min gave the best result: the decolorization at 15 min was 3% without catalyst and 14.5% with this catalyst.
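Decolorization in studies like this is typically quantified from spectrophotometric absorbance readings before and after treatment. A minimal sketch; the absorbance values are illustrative, chosen only to reproduce the 14.5% figure reported above, and are not the study's measurements:

```python
def decolorization_pct(abs_initial, abs_final):
    """Percent colour removal from absorbance readings at a fixed wavelength."""
    return 100.0 * (abs_initial - abs_final) / abs_initial

# Illustrative absorbance values for effluent before and after 15 min
# of sonication with the best-performing catalyst:
removal = decolorization_pct(2.00, 1.71)
```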

Keywords: baker’s yeast effluent, decolorization, sonocatalyst, ultrasound

Procedia PDF Downloads 458