Search results for: Oslo manual
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 688

268 The Effects of an Exercise Program Integrated with the Transtheoretical Model on Pain and Trunk Muscle Endurance of Rice Farmers with Chronic Low Back Pain

Authors: Thanakorn Thanawat, Nomjit Nualnetr

Abstract:

Background and Purpose: In Thailand, rice farmers have the highest prevalence of low back pain among manual workers. Exercise has been suggested as a principal part of treatment programs for low back pain. However, such programs should be tailored to an individual’s readiness to change as categorized by a behavioral approach. This study aimed to evaluate the difference in responses between rice farmers with chronic low back pain who received an exercise program integrated with the transtheoretical model of behavior change (TTM) and those in a comparison group, regarding severity of pain and trunk muscle endurance. Materials and Methods: An 8-week exercise program was conducted for rice farmers with chronic low back pain who were randomized to either the TTM (n=62, 52 women and 10 men, mean age ± SD 45.0±5.4 years) or non-TTM (n=64, 53 women and 11 men, mean age ± SD 44.7±5.4 years) group. All participants were tested for severity of pain and trunk (abdominal and back) muscle endurance at baseline (week 0) and immediately after termination of the program (week 8). Data were analysed using descriptive statistics and Student’s t-tests. The results revealed that both the TTM and non-TTM groups decreased their severity of pain and improved trunk muscle endurance after participating in the 8-week exercise program. Compared with the non-TTM group, however, the TTM group showed a significantly greater increase in abdominal muscle endurance (P=0.004, 95% CI -12.4 to -2.3). Conclusions and Clinical Relevance: An exercise program integrated with the TTM could provide benefits to rice farmers with chronic low back pain. Future studies with a longitudinal design and more outcome measures, such as physical performance and quality of life, are suggested to reveal further benefits of the program.

Keywords: chronic low back pain, transtheoretical model, rice farmers, exercise program

Procedia PDF Downloads 369
267 Experimental Investigation of Mechanical Friction Influence in Semi-Hydraulic Clutch Actuation System Over Mileage

Authors: Abdul Azarrudin M. A., Pothiraj K., Kandasamy Satish

Abstract:

In the current automotive scenario, there is growing demand for more sophistication and driving comfort in passenger segments. Clutch pedal effort is one such customer touch point in manual-transmission vehicles, where the driver operates the clutch pedal continuously throughout driving maneuvers. Hence, optimum pedal effort must be ensured both in the green (new) condition and over mileage for fatigue-free driving. Friction is one of the predominant factors, and it tends to degrade system function over time. One such semi-hydraulic system shows a load efficiency of only about 70-75% over its lifetime due to the increase in friction, which raises pedal effort and fatigues the vehicle driver. This work studies friction at different interfaces and its influence at the fulcrum points over mileage, with the objective of understanding the trend over mileage and determining alternative ways of resolving it. One such approach is the reduction of friction, investigated experimentally using various friction-reducing interfaces, such as a metal-to-metal interface, which has been tried out and is detailed further. Specific attention has also been given to the fulcrum load and its contact interfaces in this study. The main experimental results for the influence of three different contact interfaces are presented, with the ultimate intention of achieving less fatigue and a consistently lower pedal effort over a longer period, thus smoothing operation for the end user. Experimental validation has been carried out through a rig-level test setup to depict performance under static conditions, and a vehicle-level test has been performed in parallel to record any additional influences.

Keywords: automobile, clutch, friction, fork

Procedia PDF Downloads 102
266 Water-Sensitive Landscaping in Desert-Located Egyptian Cities through Sheer Reductions of Turfgrass and Efficient Water Use

Authors: Sarah M. Asar, Nabeel M. Elhady

Abstract:

Egypt’s current per capita water share indicates that the country has been suffering from water poverty. The abundant utilization of turfgrass in Egypt’s new urban settlements, the reliance on freshwater for irrigation, and inadequate plant selection increase the water demand in such settlements. Decreasing the surface area of turfgrass by using alternative landscape features such as mulching, ornamental low-maintenance plants, additional pathways, etc., could significantly decrease the water demand of urban landscapes. The use of Ammochloa palaestina, Cenchrus orientalis (Oriental Fountain Grass), and Cistus parviflorus (with water demands of approximately 0.005 m³/m²/day) as alternatives to Cynodon dactylon (0.01 m³/m²/day), the most commonly used grass species in Egypt’s landscapes, could decrease an area’s water demand by approximately 40-50%. Moreover, creating hydro-zones of plants with similar water demands would enable targeted irrigation rather than the commonly used uniform irrigation. Such a practice could further reduce water consumption by 15-20%. These results are based on a case-study analysis of one of Egypt’s relatively new urban settlements, Al-Rehab. They emphasize the importance of utilizing native, drought-tolerant vegetation in the urban landscapes of Egypt to reduce irrigation demands. Furthermore, proper implementation, monitoring, and maintenance of automated irrigation systems could be an important factor in a space’s efficient water use. Although most new urban settlements in Egypt adopt sprinkler and drip irrigation systems, the lack of maintenance leads to manual operation of such systems and, thereby, excessive irrigation.
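For illustration, the reported savings can be sketched with a back-of-envelope calculation; the unit demands and percentage reductions are those quoted above, while the landscaped area is a hypothetical example.

```python
# Minimal sketch of the water-demand arithmetic described in the abstract.
# Unit demands and reduction percentages come from the text; the area is hypothetical.
turf_demand = 0.010          # m3 per m2 per day, Cynodon dactylon
alt_demand = 0.005           # m3 per m2 per day, e.g. Cenchrus orientalis
area_m2 = 10_000             # hypothetical landscaped area

baseline = turf_demand * area_m2                      # turf-only demand
after_substitution = alt_demand * area_m2             # ~50% reduction from species change
after_hydrozoning = after_substitution * (1 - 0.175)  # further 15-20% (midpoint used)

print(f"baseline: {baseline:.1f} m3/day")
print(f"species substitution: {after_substitution:.1f} m3/day")
print(f"plus hydro-zoned irrigation: {after_hydrozoning:.1f} m3/day")
```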

Keywords: alternative landscape, native plants, efficient irrigation, low water demand

Procedia PDF Downloads 55
265 Automated Fact-Checking by Incorporating Contextual Knowledge and Multi-Faceted Search

Authors: Wenbo Wang, Yi-Fang Brook Wu

Abstract:

The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study introduces a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context from general-purpose, comprehensive, and authoritative data; 2) developing a search function to automatically select relevant, new, and credible references; 3) focusing on the parts of the representations of a claim and its reference that are most relevant to the fact-checking task. The experimental results demonstrate that 1) augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, contributes to improved fact-checking performance; 2) SAC with auto-selected references outperforms existing fact-checking approaches with manually selected references. Future directions of this study include I) exploring knowledge graphs in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, and II) exploring semantic relations in claims and references to further enhance fact-checking.
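For illustration, a minimal PyTorch sketch of the multi-head attention fusion of a claim with a retrieved reference is given below; the dimensions, layer names, and classifier head are assumptions for the sketch, not the authors' implementation.

```python
# Minimal sketch, assuming PyTorch: a claim representation attends over a reference
# representation so the most claim-relevant parts of the reference are weighted highest.
import torch
import torch.nn as nn

class ClaimReferenceFusion(nn.Module):
    def __init__(self, dim=256, heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim=dim, num_heads=heads, batch_first=True)
        self.classifier = nn.Linear(dim, 2)  # veracity logits: true / false

    def forward(self, claim_tokens, reference_tokens):
        # Claim tokens are queries; reference tokens supply keys and values.
        fused, _ = self.attn(query=claim_tokens, key=reference_tokens, value=reference_tokens)
        pooled = fused.mean(dim=1)           # average over claim positions
        return self.classifier(pooled)

model = ClaimReferenceFusion()
claim = torch.randn(4, 32, 256)       # batch of 4 claims, 32 tokens, 256-d embeddings
reference = torch.randn(4, 128, 256)  # matching auto-selected references
print(model(claim, reference).shape)  # torch.Size([4, 2])
```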

Keywords: fact checking, claim verification, deep learning, natural language processing

Procedia PDF Downloads 43
264 Nonlinear Estimation Model for Rail Track Deterioration

Authors: M. Karimpour, L. Hitihamillage, N. Elkhoury, S. Moridpour, R. Hesami

Abstract:

Rail transport authorities around the world have long faced a significant challenge in predicting rail infrastructure maintenance work. Generally, maintenance monitoring and prediction are conducted manually. Under economic constraints, rail transport authorities are in pursuit of improved modern methods that can provide precise prediction of rail maintenance time and location. The expectation of such methods is to develop models that minimize the human error strongly associated with manual prediction. Such models will help authorities understand how track degradation occurs over time under changing conditions (e.g. rail load, rail type, rail profile). They need a well-structured technique to identify the precise time at which rail tracks fail in order to minimize maintenance cost/time and keep vehicles safe. The rail track characteristics that have been collected over the years will be used in developing rail track degradation prediction models. Since these data have been collected in large volumes, both electronically and manually, some errors are possible, and sometimes these errors make the data unusable in prediction model development; this is one of the major drawbacks in rail track degradation prediction. An accurate model can play a key role in estimating the long-term behavior of rail tracks: accurate models increase track safety and decrease maintenance costs in the long term. In this research, a short review of rail track degradation prediction models is presented before estimating rail track degradation for the curve sections of the Melbourne tram track system using an Adaptive Network-based Fuzzy Inference System (ANFIS) model.
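As an illustration of the fuzzy-inference idea (not the authors' ANFIS, which learns its membership functions and rule consequents from the track data), the sketch below shows a zeroth-order Sugeno inference mapping two hypothetical track-condition inputs to a degradation index.

```python
# Illustrative zeroth-order Sugeno fuzzy inference for a degradation index.
# Membership centres/widths and rule constants are hypothetical placeholders.
import numpy as np

def gauss(x, c, s):
    """Gaussian membership value of x for a fuzzy set centred at c with width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def predict_degradation(mgt, age_years):
    # Two fuzzy sets per input: "low" and "high".
    mu_mgt = {"low": gauss(mgt, 5.0, 3.0), "high": gauss(mgt, 20.0, 6.0)}
    mu_age = {"low": gauss(age_years, 3.0, 2.0), "high": gauss(age_years, 15.0, 5.0)}

    # Each rule pairs one MGT set with one age set and a constant output.
    rules = [
        ("low", "low", 0.1),    # light traffic, young track -> little degradation
        ("low", "high", 0.4),
        ("high", "low", 0.5),
        ("high", "high", 0.9),  # heavy traffic, old track -> severe degradation
    ]
    firing = np.array([mu_mgt[a] * mu_age[b] for a, b, _ in rules])
    outputs = np.array([c for _, _, c in rules])
    # Weighted-average defuzzification.
    return float((firing * outputs).sum() / firing.sum())

print(predict_degradation(mgt=18.0, age_years=12.0))
```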

Keywords: ANFIS, MGT, prediction modeling, rail track degradation

Procedia PDF Downloads 303
263 Pattern the Location and Area of Earth-Dumping Stations from Vehicle GPS Data in Taiwan

Authors: Chun-Yuan Chen, Ming-Chang Li, Xiu-Hui Wen, Yi-Ching Tu

Abstract:

This study explores the application of GPS (Global Positioning System) to trace construction vehicles such as trucks or cranes, helping to pattern the earth-dumping stations of traffic construction in Taiwan. Traffic construction in this research is defined as the engineering of high-speed railways, expressways, and similar projects extending over long distances. Auditing the location of earth-dumping stations and checking their compliance with regulations is one of the important tasks of the Taiwan EPA. Earth-dumping stations are known as a source of particulate matter air pollution during the construction process. Because GPS data can be analyzed quickly and used conveniently, this study tried to identify dumping stations by modeling vehicle tracks from GPS data over the construction work cycle. GPS data were obtained from 13 vehicles involved in an expressway construction project in central Taiwan. The GPS footprints were converted to Keyhole Markup Language (KML) files so that the tracks of trucks could be patterned by computer applications; the data were collected over about eight months, from February to October 2017. The dumping station identified and the earthwork areas outlined from the GPS footprints were passed to the Taiwan EPA for on-site inspection, and the Taiwan EPA issued advisory comments to the agency in charge of the construction to prevent air pollution. Compared with the common method of environmental inspection based on manual data collection, the GPS-with-KML patterning and modeling method consumes less time. Moreover, monitoring GPS data from construction vehicles could help the administration develop and implement environmental management strategies.
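As an illustration of the KML patterning step, the sketch below (not the authors' toolchain; coordinates and file names are hypothetical) converts a list of GPS fixes into a KML LineString that can be opened in mapping software.

```python
# Minimal sketch: write a vehicle's GPS fixes as a KML LineString placemark.
gps_fixes = [(120.6736, 24.1477), (120.6801, 24.1502), (120.6855, 24.1530)]  # (lon, lat), hypothetical

def track_to_kml(coords, name="truck_track"):
    coord_str = " ".join(f"{lon},{lat},0" for lon, lat in coords)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
        f'<Placemark><name>{name}</name>'
        f'<LineString><coordinates>{coord_str}</coordinates></LineString>'
        '</Placemark></Document></kml>'
    )

with open("truck_track.kml", "w") as f:
    f.write(track_to_kml(gps_fixes))
```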

Keywords: automatic management, earth-dumping station, environmental management, Global Positioning System (GPS), particulate matter, traffic construction

Procedia PDF Downloads 150
262 In-Service Training to Enhance Community Based Corrections

Authors: Varathagowry Vasudevan

Abstract:

This paper attempts to demonstrate the importance of capacity building for para-professionals in community-based corrections, using in-service training as a responsive methodology to enhance family and child welfare and to promote best practices. The Diploma programme in community-based corrections initiated by the National Institute of Social Development has been engaged in the task of training quality personnel who are knowledgeable in best practices and fieldwork skills in community-based corrections. To protect families and children and enhance best practices, the National Institute of Social Development, with support from the Department of Community-Based Corrections, initiated the Diploma programme to enhance and update the knowledge, skills, attitudes, and mindset of the work supervisors employed at the department. This study, based on reflective practice, illustrates the effectiveness of the curriculum of the in-service training programme as a tool to enhance the capacities of the relevant officers in Sri Lanka. The data for the study were obtained from participants and the coordinator through classroom discussions and key informant interviews. The study showed that the use of an appropriate tailor-made curriculum and field practice manual by the officers during the training was very much dependent on the provision of appropriate administrative facilities, passion, and a teaching methodology that promotes the capacity to adopt best practices. It further demonstrated that a professional social work response, strengthening families within the legal framework, was grounded in the adoption of proper skills imbibed through training in appropriate methodology practiced in the field under guided supervision.

Keywords: capacity building, community-based corrections, in-service training, paraprofessionals

Procedia PDF Downloads 140
261 Digitalization and High Audit Fees: An Empirical Study Applied to US Firms

Authors: Arpine Maghakyan

Abstract:

The purpose of this paper is to study the relationship between the level of industry digitalization and audit fees, especially the relationship between Big 4 auditor fees and industry digitalization level. On the one hand, automation of business processes decreases internal control weaknesses and manual mistakes and increases work effectiveness and integration. On the other hand, it may cause serious misstatements, high business risks, or even bankruptcy, typically in the early stages of automation. Incomplete automation can bring high audit risk, especially if the auditor does not fully understand the client’s business automation model. Higher audit risk will consequently lead to higher audit fees. Higher audit fees for clients with a high automation level are more pronounced in Big 4 auditors’ behavior. Using data on US firms from 2005-2015, we found that industry-level digitalization interacts with auditor quality in determining audit fees. Moreover, the choice of Big 4 or non-Big 4 is correlated with the client’s industry digitalization level. A Big 4 client with a higher digitalization level pays more than one with a low digitalization level. In addition, a highly digitalized firm with a Big 4 auditor pays a higher audit fee than a non-Big 4 client. We use audit fees and firm-specific variables from the Audit Analytics and Compustat databases. We analyze the collected data using fixed effects regression methods and Wald tests for sensitivity checks. We use firm fixed effects regression models to determine the connections between technology use in business and audit fees, controlling for firm size, complexity, inherent risk, profitability, and auditor quality. We chose the fixed effects model as it makes it possible to control for variables that have not been or cannot be measured.
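A hedged Python sketch of a firm fixed-effects audit-fee regression in the spirit of this design is shown below; the variable names, file name, and exact specification are assumptions for illustration, not the authors' model.

```python
# Minimal sketch: firm (and year) fixed effects via dummy variables in statsmodels,
# with an interaction between digitalization and Big 4 status, clustered by firm.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("audit_fees_panel.csv")  # hypothetical panel: firm, year, fees, controls

model = smf.ols(
    "log_audit_fee ~ digitalization * big4 + log_assets + leverage + roa + C(firm) + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["firm"]})

print(model.summary())
```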

Keywords: audit fees, auditor quality, digitalization, Big4

Procedia PDF Downloads 284
260 Performance Evaluation of the CareSTART S1 Analyzer for Quantitative Point-Of-Care Measurement of Glucose-6-Phosphate Dehydrogenase Activity

Authors: Haiyoung Jung, Mi Joung Leem, Sun Hwa Lee

Abstract:

Background & Objective: Glucose-6-phosphate dehydrogenase (G6PD) deficiency is a genetic abnormality that results in an inadequate amount of G6PD, leading to increased susceptibility of red blood cells to reactive oxygen species and hemolysis. The present study aimed to evaluate the careSTART™ S1 analyzer for measuring the ratio of G6PD activity to hemoglobin (Hb). Methods: Precision for G6PD activity and hemoglobin measurement was evaluated using control materials at two levels on five repeated runs per day for five days. The analytic performance of the careSTART™ S1 analyzer was compared with spectrophotometry in 40 patient samples. Reference ranges suggested by the manufacturer were validated in 20 healthy males and 20 healthy females. Results: The careSTART™ S1 analyzer demonstrated precision of 6.0% for the low-level (14~45 U/dL) and 2.7% for the high-level (60~90 U/dL) control in G6PD activity, and 1.4% in hemoglobin (7.9~16.3 u/g Hb). A comparison study of the G6PD to Hb ratio between the careSTART™ S1 analyzer and spectrophotometry showed an average difference of 29.1%, with a positive bias of the careSTART™ S1 analyzer. All normal samples from the healthy population validated the suggested reference range for males (≥2.19 U/g Hb) and females (≥5.83 U/g Hb). Conclusion: The careSTART™ S1 analyzer demonstrated good analytical performance and can replace the current spectrophotometric measurement of G6PD enzyme activity. From the perspective of clinical laboratory management, it is a reasonable option as a point-of-care analyzer, with minimal handling of samples and reagents and automatic calculation of the ratio of measured G6PD activity to Hb concentration, minimizing the clerical errors involved in manual calculation.
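The two derived quantities mentioned above, between-run precision (CV%) and the automatically calculated G6PD-to-Hb ratio, can be illustrated with the following sketch; the numbers are made-up examples rather than study data.

```python
# Minimal sketch of the precision (CV%) and ratio calculations; values are illustrative.
import statistics

low_control_runs = [14.8, 15.6, 14.2, 15.1, 14.9]   # U/dL, hypothetical repeated runs
cv_percent = statistics.stdev(low_control_runs) / statistics.mean(low_control_runs) * 100

g6pd_activity_u_dl = 85.0     # hypothetical patient result, U/dL
hemoglobin_g_dl = 13.2        # hypothetical patient result, g/dL
ratio_u_per_g_hb = g6pd_activity_u_dl / hemoglobin_g_dl  # U/dL divided by g/dL gives U/g Hb

print(f"CV: {cv_percent:.1f}%   G6PD/Hb ratio: {ratio_u_per_g_hb:.2f} U/g Hb")
```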

Keywords: POCT, G6PD, performance evaluation, careSTART

Procedia PDF Downloads 51
259 Seismic Assessment of Passive Control Steel Structure with Modified Parameter of Oil Damper

Authors: Ahmad Naqi

Abstract:

Today, passively controlled buildings are becoming increasingly popular due to their excellent lateral load resistance. Typically, these buildings are enhanced with damping devices that are in high market demand. Some manufacturers falsify the damping device parameters during production to meet this demand. Therefore, this paper evaluates the seismic performance of buildings equipped with damping devices whose parameters have been intentionally modified to simulate falsified devices. For this purpose, three benchmark buildings of 4, 10, and 20 stories were selected from the JSSI (Japan Society of Seismic Isolation) manual. The buildings are special moment-resisting steel frames with oil dampers in the longitudinal direction only. For each benchmark building, two types of structural elements were designed to resist the lateral load with and without damping devices (hereafter referred to as the Trimmed and Conventional Buildings). The target buildings were modeled using STERA-3D, finite-element-based software coded for research purposes. Using the software, one can develop either a three-dimensional model (3DM) or a lumped mass model (LMM). First, the seismic performance of the 3DM and LMM models was evaluated and found to coincide closely for the target buildings. The simplified LMM was then used in this study to produce 66 cases for both building types. The device parameters were modified by ±40% and ±20% to represent many possible conditions of falsification. It is verified that buildings designed to sustain the lateral load with the support of damping devices (Trimmed Buildings) are much more threatened by device falsification than buildings merely strengthened by damping devices (Conventional Buildings).

Keywords: passive control system, oil damper, seismic assessment, lumped mass model

Procedia PDF Downloads 100
258 A Cadaveric Study of Branching Pattern of Arch of Aorta and Its Clinical Significance in Nepalese Population

Authors: Gulam Anwer Khan, A. Gautam

Abstract:

Background: The arch of aorta is a large artery that arches over the root of the left lung and connects the ascending aorta and descending aorta. It is situated in the superior mediastinum behind the manubrium sterni. It gives off three major branches, i.e. the brachiocephalic trunk, left common carotid artery, and left subclavian artery, arising from the superior surface of the arch of aorta from right to left. Material and Methods: This was a descriptive study. It was carried out on 44 cadavers obtained during dissections for undergraduates of the Department of Anatomy, Chitwan Medical College, Bharatpur, Chitwan, between March 2015 and October 2016. Cadavers of both sexes were included in the present study. The arch of aorta was dissected and exposed according to the methods described by Romanes in Cunningham’s manual of practical anatomy. Results: Of the 44 dissected cadavers, 35 (79.54%) were male and 9 (20.46%) were female. The normal branching pattern of the arch of aorta was encountered in 28 (63.64%) cadavers, and the remaining 16 (36.36%) cadavers showed variations in the branching pattern. Two different types of variations in the branching pattern of the arch of aorta were noted in the present study, in which 12 (27.27%) cadavers had a common trunk of the arch of aorta. In 3 (5.00%) male cadavers, we found the origin of a thyroid ima artery; this variation was noted in 1 (1.66%) female cadaver. Conclusion: The present study, carried out on adult human cadavers, revealed wide variations in the branching pattern of the arch of aorta. These variations are of clinical significance and are very useful to anatomists, radiologists, anesthesiologists, and surgeons during angiography, instrumentation, and supra-aortic thoracic, head and neck surgery.

Keywords: arch of aorta, brachiocephalic trunk, left common carotid artery, left subclavian artery, Thyroidea ima artery

Procedia PDF Downloads 313
257 Occupational Exposure to Polycyclic Aromatic Hydrocarbons (PAH) among Asphalt and Road Paving Workers

Authors: Boularas El Alia, H. Rezk-Allah, S. Chaoui, A. Chama, B. Rezk-Allah

Abstract:

Aims: To assess the current exposure to PAH among various workers in the asphalt and road paving sector. Methods: The assessment of exposure to PAH was performed on workers (n=14) belonging to two companies, allocated to several activities such as road paving, manufacturing of warm coated bituminous mix, manufacturing of asphalt cut-back, and manufacturing of asphalt emulsion. A group of control subjects (n=18) was included. Internal exposure to PAH was investigated by measuring the urinary excretion of 2-naphthol, a urinary metabolite of naphthalene and one of the biomarkers of total PAH exposure. Urine samples were collected from the exposed workers at the beginning of the week at the beginning of the work shift (BWBS), and at the end of the work shift at the end of the week (ESEW). In the control subjects, single urine samples were collected after the end of the work shift. Every subject was invited to answer a questionnaire for the collection of technical and medical data as well as smoking habits and food intake. The concentration of 2-naphthol in the urine hydrolysate was determined spectrophotometrically after its reaction with Fast Blue BB salt (diazotized 4-benzoylamino-2,5-diethoxyaniline). Results: For all the workers included in the study, the urinary 2-naphthol concentrations were higher than those in the control subjects (Median = 9.55 µg/g creatinine), whether at BWBS (Md = 16.2 µg/g creatinine) or at ESEW (n=18, Median = 32.22 µg/g creatinine). Considerable differences are observed according to job category. The concentrations are also higher among smokers. Conclusion: The results show significant exposure, mainly during manual laying, revealing an important risk, particularly for the respiratory system. Considering the current criteria, the carcinogenic risk due to PAH appears far from negligible.

Keywords: PAH, asphalt, assessment, occupational, exposure

Procedia PDF Downloads 461
256 2D Convolutional Networks for Automatic Segmentation of Knee Cartilage in 3D MRI

Authors: Ananya Ananya, Karthik Rao

Abstract:

Accurate segmentation of knee cartilage in 3D magnetic resonance (MR) images for quantitative assessment of volume is crucial for studying and diagnosing osteoarthritis (OA) of the knee, one of the major causes of disability in elderly people. Radiologists generally perform this task in a slice-by-slice manner, taking 15-20 minutes per 3D image, which leads to high inter- and intra-observer variability. Hence automatic methods for knee cartilage segmentation are desirable and are an active field of research. This paper presents the design and experimental evaluation of fully automated methods for knee cartilage segmentation in 3D MRI based on 2D convolutional neural networks. The architectures are validated on 40 test images and 60 training images from the SKI10 dataset. The proposed methods segment 2D slices one by one, which are then combined to give the segmentation for the whole 3D image. The proposed methods are modified versions of U-net and dilated convolutions, consisting of a single step that segments the given image into 5 labels: background, femoral cartilage, tibial cartilage, femoral bone, and tibial bone, the cartilages being the primary components of interest. U-net consists of a contracting path and an expanding path, to capture context and localization respectively. Dilated convolutions lead to an exponential expansion of the receptive field with only a linear increase in the number of parameters. A combination of modified U-net and dilated convolutions has also been explored. These architectures segment one 3D image in 8-10 seconds, giving average volumetric Dice Score Coefficients (DSC) of 0.950-0.962 for femoral cartilage and 0.951-0.966 for tibial cartilage, with manual segmentation as the reference.
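A minimal PyTorch sketch of the dilated-convolution idea is shown below as an illustration rather than the authors' network: stacking convolutions with growing dilation expands the receptive field exponentially while the parameter count grows only linearly, and a 1x1 head produces the 5 per-pixel labels.

```python
# Illustrative dilated-convolution segmentation head for 2D MR slices (not the paper's model).
import torch
import torch.nn as nn

class DilatedSegNet(nn.Module):
    def __init__(self, in_ch=1, n_classes=5):   # background, 2 cartilages, 2 bones
        super().__init__()
        layers, ch = [], in_ch
        for d in (1, 2, 4, 8):                   # dilation doubles at each layer
            layers += [nn.Conv2d(ch, 64, kernel_size=3, padding=d, dilation=d),
                       nn.BatchNorm2d(64), nn.ReLU(inplace=True)]
            ch = 64
        self.features = nn.Sequential(*layers)
        self.head = nn.Conv2d(64, n_classes, kernel_size=1)  # per-pixel class scores

    def forward(self, x):
        return self.head(self.features(x))

net = DilatedSegNet()
mri_slice = torch.randn(1, 1, 256, 256)          # one 2D MR slice
print(net(mri_slice).shape)                      # torch.Size([1, 5, 256, 256])
```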

Keywords: convolutional neural networks, dilated convolutions, 3 dimensional, fully automated, knee cartilage, MRI, segmentation, U-net

Procedia PDF Downloads 242
255 Effects of Occupational Therapy on Children with Unilateral Cerebral Palsy

Authors: Sedef Şahin, Meral Huri

Abstract:

Cerebral palsy (CP) represents the most frequent cause of physical disability in children, with a rate of 2.9 per 1000 live births. Activity-focused intervention is known to improve function and reduce activity limitations and barriers to participation in children with disabilities. The aim of the study was to assess the effects of occupational therapy on the level of fatigue, activity performance, and satisfaction in children with unilateral cerebral palsy. Twenty-two children with hemiparetic cerebral palsy (mean age: 9.3 ± 2.1 years; Gross Motor Function Classification System (GMFCS) levels I to V: I = 54%, II = 23%, III = 14%, IV = 9%, V = 0%; Manual Ability Classification System (MACS) levels I to V: I = 40%, II = 32%, III = 14%, IV = 10%, V = 4%) were assigned to an occupational therapy program for 6 weeks. A Visual Analogue Scale (VAS) was used to rate the intensity of fatigue experienced at the time on a 10-point Likert scale (1-10). Activity performance and satisfaction were measured with the Canadian Occupational Performance Measure (COPM). A client-centered occupational therapy intervention was designed according to the results of the COPM. The results before and after the intervention were compared with the nonparametric Wilcoxon test. Thirteen of the children were right-handed, whereas nine were left-handed. Six weeks of intervention showed a statistically significant difference in the level of fatigue compared to the first assessment (p<0.05). The mean first and second activity performance scores were 4.51 ± 1.70 and 7.35 ± 2.51, respectively, a statistically significant difference (p<0.01). The mean first and second activity satisfaction scores were 2.30 ± 1.05 and 5.51 ± 2.26, respectively, also a statistically significant difference (p<0.01). Occupational therapy is an evidence-based approach, and occupational therapy interventions implemented by therapists were clinically effective on severity of fatigue, activity performance, and satisfaction when implemented individually over 6 weeks.

Keywords: activity performance, cerebral palsy, fatigue, occupational therapy

Procedia PDF Downloads 219
254 Prevalence and Risk Factors of Musculoskeletal Disorders among Physical Therapist's Seniors versus Internship Students

Authors: A. H. Bekhet, N. Helmy

Abstract:

Background: Physical therapists are knowledgeable in the treatment and prevention of musculoskeletal injuries; however, they themselves sustain occupational musculoskeletal injuries because the physical therapy profession requires physical effort that may lead to work-related musculoskeletal disorders. No previous studies among physical therapists have been reported in Egypt. We aim to assess the prevalence and risk factors of musculoskeletal disorders among senior physical therapists versus internship students. Method: We conducted a cross-sectional study in the Faculty of Physical Therapy, Cairo University. Prevalence and risk factors of musculoskeletal injuries were assessed using a self-administered questionnaire with closed-ended questions. A senior therapist was defined as a physical therapist with more than 5 years of work experience. Data were analyzed using SPSS 22.0 for Windows. Results: The study included 106 physical therapists (junior = 72; senior = 34); the mean age of senior therapists was 30.1 (SD 6.3) years and that of junior therapists was 22.8 (SD 2.4) years. Female subjects constituted 83.9% of the studied sample. The mean hours of contact with patients was higher among junior therapists, 6.4 (SD 2.6) vs. 5.7 (SD 2.1) among senior therapists. The prevalence of a musculoskeletal injury, once or more in their lifetime, was significantly higher among senior therapists (86% vs. 66.7%; p = 0.04). The risk factor most often reported to increase injury symptoms among junior therapists was maintaining a position for a prolonged period of time (28%), while performing manual therapy techniques was the most frequently reported risk factor among senior therapists (32%). 53% of senior therapists have limited their patient contact time as a result of their injury, compared to 25% of junior therapists (p = 0.09). Conclusion: The presented study shows that the prevalence of musculoskeletal injuries, once or more in a lifetime, is significantly higher among senior therapists.

Keywords: musculoskeletal injuries, occupational injuries, physical therapists, work related disorders

Procedia PDF Downloads 275
253 Evaluation of Model-Based Code Generation for Embedded Systems–Mature Approach for Development in Evolution

Authors: Nikolay P. Brayanov, Anna V. Stoynova

Abstract:

The model-based development approach is gaining more support and acceptance. Its higher abstraction level simplifies system description, allowing domain experts to do their best without particular knowledge of programming. Different levels of simulation support rapid prototyping, verification, and validation of the product even before it exists physically. Nowadays the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, bringing extra automation to the expensive device certification process and especially to software qualification. Using it, some companies report cost savings and quality improvements, while others claim no major changes or even cost increases. This publication examines the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using The MathWorks, Inc. tools. The model, created with Simulink, Stateflow and Matlab, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the automatically generated embedded code and a manually developed one. The measurements show that, in general, the code generated by the automatic approach is not worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.

Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development

Procedia PDF Downloads 224
252 Accessibility Analysis of Urban Green Space in Zadar Settlement, Croatia

Authors: Silvija Šiljeg, Ivan Marić, Ante Šiljeg

Abstract:

The accessibility of urban green spaces (UGS) is an integral element in the quality of life. Due to rapid urbanization, UGS studies have become a key element in urban planning. The potential benefits of space for its inhabitants are frequently analysed. A functional transport network system and the optimal spatial distribution of urban green surfaces are the prerequisites for maintaining the environmental equilibrium of the urban landscape. An accessibility analysis was conducted as part of the Urban Green Belts Project (UGB). The development of a GIS database for Zadar was the first step in generating the UGS accessibility indicator. Data were collected using the supervised classification method of multispectral LANDSAT images and manual vectorization of digital orthophoto images (DOF). An analysis of UGS accessibility according to the ANGst standard was conducted in the first phase of research. The accessibility indicator was generated on the basis of seven objective measurements, which included average UGS surface per capita and accessibility according to six functional levels of green surfaces. The generated indicator was compared with subjective measurements obtained by conducting a survey (718 respondents) within statistical units. The collected data reflected individual assessments and subjective evaluations of UGS accessibility. This study highlighted the importance of using objective and subjective measures in the process of understanding the accessibility of urban green surfaces. It may be concluded that when evaluating UGS accessibility, residents emphasize the immediate residential environment, ignoring higher UGS functional levels. It was also concluded that large areas of UGS within a city do not necessarily generate similar satisfaction with accessibility. The heterogeneity of output results may serve as guidelines for the further development of a functional UGS city network.

Keywords: urban green spaces (UGS), accessibility indicator, subjective and objective measurements, Zadar

Procedia PDF Downloads 234
251 Computing Machinery and Legal Intelligence: Towards a Reflexive Model for Computer Automated Decision Support in Public Administration

Authors: Jacob Livingston Slosser, Naja Holten Moller, Thomas Troels Hildebrandt, Henrik Palmer Olsen

Abstract:

In this paper, we propose a model for human-AI interaction in public administration that involves legal decision-making. Inspired by Alan Turing’s test for machine intelligence, we propose a way of institutionalizing a continuous working relationship between man and machine that aims at ensuring both good legal quality and higher efficiency in decision-making processes in public administration. We also suggest that our model enhances the legitimacy of using AI in public legal decision-making. We suggest that caseloads in public administration could be divided between a manual and an automated decision track. The automated decision track will be an algorithmic recommender system trained on former cases. To avoid unwanted feedback loops and biases, part of the caseload will be dealt with by both a human case worker and the automated recommender system. In those cases, an experienced human case worker will have the role of an evaluator, choosing between the two decisions. This model will ensure that the algorithmic recommender system is not compromising the quality of legal decision-making in the institution. It also enhances the legitimacy of using algorithmic decision support because it provides justification for its use by being seen as superior to human decisions when the algorithmic recommendations are preferred by experienced case workers. The paper outlines in some detail the process through which such a model could be implemented. It also addresses the important issue that legal decision-making is subject to legislative and judicial changes and that legal interpretation is context sensitive. Both of these issues require continuous supervision and adjustment of algorithmic recommender systems when used for legal decision-making purposes.

Keywords: administrative law, algorithmic decision-making, decision support, public law

Procedia PDF Downloads 197
250 Bridge Members Segmentation Algorithm of Terrestrial Laser Scanner Point Clouds Using Fuzzy Clustering Method

Authors: Donghwan Lee, Gichun Cha, Jooyoung Park, Junkyeong Kim, Seunghee Park

Abstract:

3D shape models of existing structures are required for many purposes such as safety and operation management. Traditional 3D modeling methods are based on manual or semi-automatic reconstruction from close-range images, which incurs great expense and is time-consuming. The Terrestrial Laser Scanner (TLS) is a common survey technique for measuring a 3D shape model quickly and accurately, and it is used on construction sites and in cultural heritage management. However, there are many limits to processing a TLS point cloud, because the raw point cloud is a massive volume of data, so the capability of carrying out useful analyses is also limited with unstructured 3D points. Thus, segmentation becomes an essential step whenever grouping of points with common attributes is required. In this paper, a member segmentation algorithm is presented to separate a raw point cloud that includes only 3D coordinates. The paper presents a clustering approach based on a fuzzy method for this objective: the Fuzzy C-Means (FCM) algorithm is reviewed and used in combination with a similarity-driven cluster merging method. It is applied to the point cloud acquired with a Leica ScanStation C10/C5 at the test bed. The test bed was a bridge connecting the 1st and 2nd engineering buildings at Sungkyunkwan University in Korea; it is about 32 m long and 2 m wide and is used as a pedestrian bridge between the two buildings. The 3D point cloud of the test bed was constructed from a TLS measurement, and the data were divided by the segmentation algorithm into individual members. Experimental analyses of the results from the proposed unsupervised segmentation process are shown to be promising. The segmented point cloud can then be processed to manage the configuration of each member.
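For illustration, a compact NumPy sketch of the Fuzzy C-Means step is given below; the similarity-driven cluster merging is omitted, and the point cloud is a random stand-in rather than the bridge data.

```python
# Illustrative Fuzzy C-Means on 3D points (not the authors' implementation).
import numpy as np

def fuzzy_c_means(points, n_clusters=4, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                 # each point's memberships sum to 1
    for _ in range(n_iter):
        w = u ** m
        centers = (w.T @ points) / w.sum(axis=0)[:, None]          # weighted centroids
        dist = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (dist ** (2 / (m - 1)))                           # closer -> larger membership
        u /= u.sum(axis=1, keepdims=True)
    return u, centers

xyz = np.random.rand(1000, 3) * [32.0, 2.0, 3.0]      # stand-in for bridge TLS points (metres)
memberships, centers = fuzzy_c_means(xyz)
labels = memberships.argmax(axis=1)                   # hard label per point for inspection
print(centers)
```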

Keywords: fuzzy c-means (FCM), point cloud, segmentation, terrestrial laser scanner (TLS)

Procedia PDF Downloads 215
249 Similar Script Character Recognition on Kannada and Telugu

Authors: Gurukiran Veerapur, Nytik Birudavolu, Seetharam U. N., Chandravva Hebbi, R. Praneeth Reddy

Abstract:

This work presents a robust approach for the recognition of characters in Telugu and Kannada, two South Indian scripts with structural similarities between characters. Exhaustive datasets are required to recognize the characters, but only a few are publicly available. As a result, we decided to create a dataset for one language (the source language), train the model with it, and then test it with the target language. Telugu is the target language in this work, whereas Kannada is the source language. The suggested method makes use of Canny edge features to increase character identification accuracy on images with noise and varying lighting. A dataset of 45,150 images containing printed Kannada characters was created. The Nudi software was used to automatically generate printed Kannada characters with different writing styles and variations. Manual labelling was employed to ensure the accuracy of the character labels. Deep learning models such as a CNN (Convolutional Neural Network) and a Visual Attention neural network (VAN) were used to experiment with the dataset. A Visual Attention neural network (VAN) architecture was adopted, incorporating additional channels for Canny edge features, as the results obtained with this approach were good. The model's accuracy on the combined Telugu and Kannada test dataset was an outstanding 97.3%. Performance was better with Canny edge features applied than with a model that solely used the original grayscale images. The accuracy of the model was found to be 80.11% for Telugu characters and 98.01% for Kannada words when tested on these languages separately. This model, which makes use of cutting-edge machine learning techniques, shows excellent accuracy when identifying and categorizing characters from these scripts.
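A minimal OpenCV sketch of the Canny-edge channel idea is shown below; the file name and thresholds are hypothetical, and the stacked array would feed the CNN/VAN input layer.

```python
# Minimal sketch: add a Canny edge map as an extra input channel for the recognizer.
import cv2
import numpy as np

gray = cv2.imread("kannada_char.png", cv2.IMREAD_GRAYSCALE)   # hypothetical sample image
edges = cv2.Canny(gray, threshold1=50, threshold2=150)        # binary edge map

# Stack grayscale + edges into (H, W, 2), scaled to [0, 1] for the network input.
x = np.stack([gray, edges], axis=-1).astype(np.float32) / 255.0
print(x.shape)
```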

Keywords: base characters, modifiers, guninthalu, aksharas, vattakshara, VAN

Procedia PDF Downloads 36
248 The UNESCO Management Plan for Urban Heritage Sites: A Critical Review of Olinda and Porto, in Brazil and Portugal

Authors: Francine Morales Tavares, Jose Alberto Rio Fernandes

Abstract:

The expanding concept of heritage and the increased relevance of how heritage places relate to their surroundings are associated with an important shift in public heritage policies and in how they consider the development of cities and communities, with an increasingly relevant role for management. Within the current discussions, management plans, mandatory since 2005 in areas classified by UNESCO as World Heritage, are a tool for reconciling cultural heritage demands with the needs of the multiple users of a given area, and they are especially critical in the case of urban areas under intense tourism pressure. Considering the transformations of the heritage policy management model, this paper discusses practices for integrating cultural heritage into urban policies through indicators selected from the resource manual 'Managing Cultural World Heritage (2013)' and analyses two case studies: the Management Plan of the Historic Centre of Porto (Portugal) and the Management Plan for the Historic Site of Olinda (Brazil). The empirical evidence shows that for the historic centre of Porto the growth of tourism is the main driver of the management plan, with positive and negative aspects from the heritage management point of view, unlike Olinda, where the plan for the development of local urban policies was identified as essential. The plans also differ in form, content, and process, but coincide in being unaligned with the agendas of committed local political leaders, with consequent misunderstandings between theory and practice and between planning and management, and a critical lack of integration with urban policies in the field. Therefore, more debate about management plans, more efficient tools, and appropriate methodologies to correlate cultural heritage and urban public policy are still needed.

Keywords: world heritage, management plan, planning, urban policies

Procedia PDF Downloads 139
247 Facilitating Written Biology Assessment in Large-Enrollment Courses Using Machine Learning

Authors: Luanna B. Prevost, Kelli Carter, Margaurete Romero, Kirsti Martinez

Abstract:

Writing is an essential scientific practice, yet in several countries increasing university science class sizes limit the use of written assessments. Written assessments allow students to demonstrate their learning in their own words and permit faculty to evaluate students’ understanding. However, the time and resources required to grade written assessments prohibit their use in large-enrollment science courses. This study examined the use of machine learning algorithms to automatically analyze student writing and provide timely feedback to faculty about students' writing in biology. Written responses to questions about matter and energy transformation were collected from large-enrollment undergraduate introductory biology classrooms. Responses were analyzed using the LightSide text mining and classification software. Cohen’s Kappa was used to measure agreement between the LightSide models and human raters. Predictive models achieved agreement with human coding of 0.7 Cohen’s Kappa or greater. The models captured that, when writing about matter-energy transformation at the ecosystem level, students focused primarily on the concepts of heat loss, recycling of matter, and conservation of matter and energy. Models were also produced to capture writing about processes such as decomposition and biochemical cycling. The models created in this study can be used to provide automatic feedback about students' understanding of these concepts to biology faculty who wish to use formative written assessments in large-enrollment biology classes but do not have the time or personnel for manual grading.
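For illustration, the agreement check described above can be computed in a few lines of Python; the labels below are made-up examples rather than study data.

```python
# Minimal sketch: Cohen's Kappa between human codes and model predictions for one concept.
from sklearn.metrics import cohen_kappa_score

human = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # human rater: concept present (1) / absent (0)
model = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]   # classifier predictions (illustrative)

kappa = cohen_kappa_score(human, model)
print(f"Cohen's Kappa = {kappa:.2f}")     # the study required >= 0.7 for acceptable models
```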

Keywords: machine learning, written assessment, biology education, text mining

Procedia PDF Downloads 260
246 Derivation of a Risk-Based Level of Service Index for Surface Street Network Using Reliability Analysis

Authors: Chang-Jen Lan

Abstract:

The current Level of Service (LOS) index adopted in the Highway Capacity Manual (HCM) for signalized intersections on surface streets is based on the intersection average delay. The delay thresholds defining LOS grades are subjective and unrelated to critical traffic conditions. For example, an intersection delay of 80 sec per vehicle for the failing LOS grade F does not necessarily correspond to the intersection capacity. Also, a specific measure of average delay may result from delay minimization, delay equality, or other meaningful optimization criteria. To that end, a reliability-based version of the intersection critical degree of saturation (v/c) is introduced as the LOS index. Traditionally, the level of saturation at a signalized intersection is defined as the ratio of the critical volume sum (per lane) to the average saturation flow (per lane) during all available effective green time within a cycle. The critical sum is the sum of the maximal conflicting movement-pair volumes in the northbound-southbound and eastbound-westbound rights of way. In this study, both movement volume and saturation flow are assumed to follow log-normal distributions. Because multiplication of independent, positive random variables tends to produce a log-normally distributed outcome in the limit when the conditions of the central limit theorem hold, the critical degree of saturation is expected to follow a log-normal distribution as well. Derivation of the risk index predictive limits is complex due to the maximum and absolute value operators, as well as the ratio of random variables. A fairly accurate functional form for the predictive limit at a user-specified significance level is derived. The predictive limit is then compared with the designated LOS thresholds for the intersection critical degree of saturation (denoted as X
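A hedged Monte Carlo sketch of this reasoning is given below: with log-normal volumes and saturation flows, the critical degree of saturation is approximately log-normal, and a predictive limit can be read off as an upper percentile. The parameter values are illustrative only, not the paper's derivation.

```python
# Illustrative Monte Carlo for the critical degree of saturation under log-normal inputs.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

ns_crit = rng.lognormal(mean=np.log(450), sigma=0.15, size=n)   # critical NS movement-pair volume
ew_crit = rng.lognormal(mean=np.log(380), sigma=0.15, size=n)   # critical EW movement-pair volume
critical_sum = ns_crit + ew_crit                                 # veh/h/lane

sat_flow = rng.lognormal(mean=np.log(1800), sigma=0.05, size=n)  # veh/h/lane of green
green_ratio = 0.45                                               # effective green / cycle

x = critical_sum / (sat_flow * green_ratio)      # critical degree of saturation (v/c)
limit_95 = np.percentile(x, 95)                  # risk-based predictive limit
print(f"95th-percentile critical v/c: {limit_95:.2f}")
```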

Keywords: reliability analysis, level of service, intersection critical degree of saturation, risk based index

Procedia PDF Downloads 118
245 Close-Range Remote Sensing Techniques for Analyzing Rock Discontinuity Properties

Authors: Sina Fatolahzadeh, Sergio A. Sepúlveda

Abstract:

This paper presents advanced developments in close-range, terrestrial remote sensing techniques to enhance the characterization of rock masses. The study integrates two state-of-the-art laser-scanning technologies, the HandySCAN and GeoSLAM laser scanners, to extract high-resolution geospatial data for rock mass analysis. These instruments offer high accuracy, precision, low acquisition time, and high efficiency in capturing intricate geological features in small to medium size outcrops and slope cuts. Using the HandySCAN and GeoSLAM laser scanners facilitates real-time, three-dimensional mapping of rock surfaces, enabling comprehensive assessments of rock mass characteristics. The collected data provide valuable insights into structural complexities, surface roughness, and discontinuity patterns, which are essential for geological and geotechnical analyses. The synergy of these advanced remote sensing technologies contributes to a more precise and straightforward understanding of rock mass behavior. In this case, the main parameters of RQD, joint spacing, persistence, aperture, roughness, infill, weathering, water condition, and joint orientation in a slope cut along the Sea-to-Sky Highway, BC, were remotely analyzed to calculate and evaluate the Rock Mass Rating (RMR) and Geological Strength Index (GSI) classification systems. Automatic and manual analyses of the acquired data are then compared with field measurements. The results show the usefulness of the proposed remote sensing methods and their appropriate conformity with the actual field data.

Keywords: remote sensing, rock mechanics, rock engineering, slope stability, discontinuity properties

Procedia PDF Downloads 45
244 The Omani Learner of English Corpus: Source and Tools

Authors: Anood Al-Shibli

Abstract:

Designing a learner corpus is not an easy task because learners’ language involves many variables that might affect the results of any study based on learners’ language production (spoken and written). It is also essential to design a learner corpus systematically, especially when it is intended as a reference for language research. Therefore, the design of the Omani Learner Corpus (OLEC) has undergone many explicit and systematic considerations. These criteria can be regarded as the foundation for designing any learner corpus so that it can be exploited effectively in studies of language use and language learning. In addition, OLEC is a manually error-annotated corpus. Error annotation in learner corpora is essential; however, it is time-consuming and prone to errors. Consequently, a navigation tool was designed to help the annotators insert error codes, making the error-annotation process more efficient and consistent. To ensure accuracy, an error annotation procedure was followed in annotating OLEC, and some preliminary findings are noted. One of the main results of this procedure is the creation of an error-annotation system based on Omani learners’ English language production. Because OLEC is still in its first stages, the preliminary findings relate to only one level of proficiency and one error type, namely verb-related errors. It was found that Omani learners in OLEC tend to make more errors in forming the verb, followed by problems in verb agreement. Comparing the results to other error-based studies indicates that Omani learners tend to make basic verb errors, which are typically found at lower levels of proficiency. To this end, it is essential to note that examining learners’ errors can give insights into language acquisition and language learning, and that most errors do not happen randomly but occur systematically among language learners.

Keywords: error-annotation system, error-annotation manual, learner corpora, verbs related errors

Procedia PDF Downloads 125
243 The Marketing Development of Cloth Products Woven in Krasaesin, Songkhla Province

Authors: Auntika Thipjumnong

Abstract:

This research study aimed to investigate the production process and target market of Krasaesin’s woven cloth, including customers’ behavior towards the local woven products. Suggestions for a better production process are also offered. This survey research was conducted using a questionnaire and interviews as the practical instruments for collecting data. The subjects were 200 Krasaesin weavers and consumers, selected by purposive sampling. Percentages, means, and standard deviations were used to analyze the data. The findings revealed that only 22 local weaving group members owned the 18 manual looms, producing cloth from raw materials such as cotton or fiber. The main products were floral woven cloth, e.g. pikul, puangchompoo, pakakrong and ban mai roo roiy, and the others were rainy, glass wall, dice glass ball, yok dok, etc. At present, all local woven products have been modernized, but the strong point of those products remains their quality standard and firm texture rather than thickness. The main objective of producing these local woven products was to earn extra income. Moreover, there were two dominant sales channels: first, the makers sold their products themselves in their community and in malls; second, they wove products to customers’ orders. Prices were set according to the difficulty of the production process. Local government officials and non-government officials were the usual customers. However, the drawback of producing this local product was a lack of raw material, which led to higher investment. Customers in the community are now losing interest in wearing these local products, even though the quality standard has been maintained. Regarding the marketing mix, the factors in customers’ purchasing decisions were product (M = 3.93), price (M = 3.74), distribution (M = 3.73) and promotion (M = 3.97). It is suggested that product pattern design should be matched to customers’ needs.

Keywords: marketing, consumer behavior, cloth products weaves, Songkhla Thailand

Procedia PDF Downloads 269
242 Developing an Automated Protocol for the Wristband Extraction Process Using Opentrons

Authors: Tei Kim, Brooklynn McNeil, Kathryn Dunn, Douglas I. Walker

Abstract:

To better characterize the relationship between complex chemical exposures and disease, our laboratory uses an approach that combines low-cost polydimethylsiloxane (silicone) wristband samplers, which absorb many of the chemicals we are exposed to, with untargeted high-resolution mass spectrometry (HRMS) to characterize thousands of chemicals at a time. In studies with human populations, these wristbands can provide an important measure of our environment; however, there is a need to use this approach in large cohorts to study exposures associated with disease. To facilitate the use of silicone samplers in large-scale population studies, the goal of this research project was to establish automated sample preparation methods that improve the throughput, robustness, and scalability of analytical methods for silicone wristbands. Using the Opentrons OT-2 automated liquid handling platform, which provides a low-cost and open-source framework for automated pipetting, we created two separate workflows that translate the manual wristband preparation method into a fully automated protocol requiring only minor intervention by the operator. These protocols include a sequence generation step, which defines the location of all plates and labware according to user-specified settings, and a transfer protocol that includes all necessary instrument parameters and instructions for automated solvent extraction of wristband samplers. These protocols were written in Python and uploaded to GitHub for use by others in the research community. Results from this project show it is possible to establish automated, open-source methods for the preparation of silicone wristband samplers to support profiling of many environmental exposures. Ongoing studies include deployment in longitudinal cohort studies to investigate the relationship between personal chemical exposure and disease.
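A hedged sketch of an Opentrons OT-2 protocol in the spirit of this workflow is shown below; the labware names, volumes, and deck slots are assumptions for illustration, not the published protocol on GitHub.

```python
# Illustrative OT-2 protocol: dispense extraction solvent onto wristband samplers
# held in a deep-well plate. Labware, slots, and volumes are hypothetical.
from opentrons import protocol_api

metadata = {"protocolName": "Wristband solvent extraction (sketch)", "apiLevel": "2.13"}

def run(protocol: protocol_api.ProtocolContext):
    tips = protocol.load_labware("opentrons_96_tiprack_1000ul", 1)
    plate = protocol.load_labware("nest_96_wellplate_2ml_deep", 2)   # wristbands in wells
    reservoir = protocol.load_labware("nest_12_reservoir_15ml", 3)   # extraction solvent
    pipette = protocol.load_instrument("p1000_single_gen2", "right", tip_racks=[tips])

    # Transfer 1 mL of solvent to each sampler well, one tip per column of samples.
    for column in plate.columns()[:6]:
        pipette.pick_up_tip()
        for well in column:
            pipette.transfer(1000, reservoir["A1"], well, new_tip="never")
        pipette.drop_tip()
```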

Keywords: bioinformatics, automation, opentrons, research

Procedia PDF Downloads 92
241 Evaluation of the Analytic for Hemodynamic Instability as a Prediction Tool for Early Identification of Patient Deterioration

Authors: Bryce Benson, Sooin Lee, Ashwin Belle

Abstract:

Unrecognized or delayed identification of patient deterioration is a key cause of in-hospital adverse events. Clinicians rely on vital signs monitoring to recognize patient deterioration. However, due to ever-increasing nursing workloads and the manual effort required, vital signs tend to be measured and recorded intermittently and inconsistently, causing large gaps in patient monitoring. Additionally, during deterioration, the body’s autonomic nervous system activates compensatory mechanisms, making the vital signs lagging indicators of underlying hemodynamic decline. This study analyzes the predictive efficacy of the Analytic for Hemodynamic Instability (AHI) system, an automated tool designed to help clinicians identify deteriorating patients early. The lead time analysis in this retrospective observational study assesses how far in advance AHI predicted deterioration before an episode of hemodynamic instability (HI) became evident through vital signs. The results indicate that of the 362 episodes of HI in this study, 308 episodes (85%) were correctly predicted by the AHI system, with a median lead time of 57 minutes and an average of 4 hours (240.5 minutes). Of the 54 episodes not predicted, AHI detected 45 while the episode of HI was ongoing. Of the 9 undetected episodes, 5 were missed by AHI due to missing or noisy input ECG data during the episode of HI. In total, AHI was able to either predict or detect 98.9% of all episodes of HI in this study. These results suggest that AHI could provide an additional ‘pair of eyes’ on patients, continuously filling the monitoring gaps and consequently giving the patient care team the ability to be far more proactive in patient monitoring and adverse event management.

Keywords: clinical deterioration prediction, decision support system, early warning system, hemodynamic status, physiologic monitoring

Procedia PDF Downloads 170
240 Modeling Average Paths Traveled by Ferry Vessels Using AIS Data

Authors: Devin Simmons

Abstract:

At the USDOT’s Bureau of Transportation Statistics, a biannual census of ferry operators in the U.S. is conducted, with results such as route mileage used to determine federal funding levels for operators. AIS data allows for the possibility of using GIS software and geographical methods to confirm operator-reported mileage for individual ferry routes. As part of the USDOT’s work on the ferry census, an algorithm was developed that uses AIS data for ferry vessels in conjunction with known ferry terminal locations to model the average route travelled for use as both a cartographic product and confirmation of operator-reported mileage. AIS data from each vessel is first analyzed to determine individual journeys based on the vessel’s velocity, and changes in velocity over time. These trips are then converted to geographic linestring objects. Using the terminal locations, the algorithm then determines whether the trip represented a known ferry route. Given a large enough dataset, routes will be represented by multiple trip linestrings, which are then filtered by DBSCAN spatial clustering to remove outliers. Finally, these remaining trips are ready to be averaged into one route. The algorithm interpolates the point on each trip linestring that represents the start point. From these start points, a centroid is calculated, and the first point of the average route is determined. Each trip is interpolated again to find the point that represents one percent of the journey’s completion, and the centroid of those points is used as the next point in the average route, and so on until 100 points have been calculated. Routes created using this algorithm have shown demonstrable improvement over previous methods, which included the implementation of a LOESS model. Additionally, the algorithm greatly reduces the amount of manual digitizing needed to visualize ferry activity.
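A condensed sketch of the averaging step is shown below; it is not the production code, the trip coordinates are hypothetical, and the DBSCAN outlier filter and terminal matching are omitted. Each trip linestring is sampled at the same fractional positions along its length, and the per-fraction centroids are strung together as the average route.

```python
# Illustrative averaging of several trip linestrings into one representative route.
import numpy as np
from shapely.geometry import LineString

trips = [
    LineString([(-122.39, 47.60), (-122.40, 47.63), (-122.41, 47.66)]),   # hypothetical trips
    LineString([(-122.39, 47.60), (-122.41, 47.64), (-122.41, 47.66)]),
    LineString([(-122.39, 47.61), (-122.40, 47.64), (-122.42, 47.66)]),
]

average_route = []
for frac in np.linspace(0.0, 1.0, 100):                      # 100 sample points, start to end
    pts = [trip.interpolate(frac, normalized=True) for trip in trips]   # same % along each trip
    average_route.append((np.mean([p.x for p in pts]), np.mean([p.y for p in pts])))

print(LineString(average_route).length)                      # degrees here; project for mileage
```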

Keywords: ferry vessels, transportation, modeling, AIS data

Procedia PDF Downloads 154
239 Digital Twin for Retail Store Security

Authors: Rishi Agarwal

Abstract:

Digital twins are emerging as a strong technology used to imitate and monitor physical objects digitally in real time across sectors. They deal not only with the digital space but also actuate responses in the physical space based on digital-space processing such as storage, modeling, learning, simulation, and prediction. This paper explores the application of digital twins to enhancing physical security in retail stores. The retail sector still relies on outdated physical security practices like manual monitoring and metal detectors, which are insufficient for modern needs. There is a lack of real-time data and system integration, leading to ineffective emergency response and preventative measures. As retail automation increases, new digital frameworks must manage safety without human intervention. To address this, the paper proposes implementing an intelligent digital twin framework. This framework collects diverse data streams from in-store sensors, surveillance, external sources, and customer devices; advanced analytics and simulations then enable real-time monitoring, incident prediction, automated emergency procedures, and stakeholder coordination. Overall, the digital twin improves physical security through automation, adaptability, and comprehensive data sharing. The paper also analyzes the pros and cons of implementing this technology through an Emerging Technology Analysis Canvas, which examines different aspects of the technology through both narrow and wide lenses to help decision-makers decide whether to adopt it. On a broader scale, this showcases the value of digital twins in transforming legacy systems across sectors and how data sharing can create a safer world for both retail store customers and owners.

Keywords: digital twin, retail store safety, digital twin in retail, digital twin for physical safety

Procedia PDF Downloads 55