Search results for: participatory error correction process

16754 A Comparative Study on the Dimensional Error of 3D CAD Model and SLS RP Model for Reconstruction of Cranial Defect

Authors: L. Siva Rama Krishna, Sriram Venkatesh, M. Sastish Kumar, M. Uma Maheswara Chary

Abstract:

Rapid Prototyping (RP) is a technology that produces models and prototype parts from 3D CAD model data, CT/MRI scan data, and model data created by 3D object digitizing systems. There are several RP processes, such as Stereolithography (SLA), Solid Ground Curing (SGC), Selective Laser Sintering (SLS), Fused Deposition Modelling (FDM), and 3D Printing (3DP); among them, the SLS and FDM RP processes are used to fabricate patterns for custom cranial implants. RP technology is useful in engineering and biomedical applications. In engineering, it supports product design, tooling, and manufacturing; biomedical applications include the design and development of medical devices, instruments, prosthetics, and implants, and RP is also helpful in planning complex surgical operations. The traditional approach limits the full appreciation of various bony structure movements, and with the custom implants it produces, it is difficult to measure the anatomy of the parts and to analyse changes in facial appearance accurately. Cranioplasty surgery is the surgical correction of a defect in the cranial bone by implanting a metal or plastic replacement to restore the missing part. This paper aims to carry out a comparative study on the dimensional error of CAD and SLS RP models for reconstruction of a cranial defect by comparing the virtual CAD model with the physical RP model of the defect.
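
As an illustration of the kind of comparison described above, the short sketch below computes per-feature dimensional error between CAD-model dimensions and those measured on the fabricated RP part; the feature names and values are hypothetical placeholders, not the study's data.

```python
# Minimal sketch: per-feature dimensional error between a 3D CAD model and an RP part.
# Feature names and values are illustrative placeholders, not data from the study.

cad_dimensions = {"defect_length_mm": 78.40, "defect_width_mm": 52.10, "implant_thickness_mm": 4.00}
rp_dimensions  = {"defect_length_mm": 78.95, "defect_width_mm": 51.72, "implant_thickness_mm": 4.12}

for feature, cad_value in cad_dimensions.items():
    rp_value = rp_dimensions[feature]
    absolute_error = rp_value - cad_value                  # signed deviation in mm
    percent_error = 100.0 * absolute_error / cad_value     # deviation relative to the CAD dimension
    print(f"{feature}: CAD={cad_value:.2f} mm, RP={rp_value:.2f} mm, "
          f"error={absolute_error:+.2f} mm ({percent_error:+.2f}%)")
```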

Keywords: rapid prototyping, selective laser sintering, cranial defect, dimensional error

Procedia PDF Downloads 299
16753 Rights-Based Approach to Artificial Intelligence Design: Addressing Harm through Participatory ex ante Impact Assessment

Authors: Vanja Skoric

Abstract:

The paper examines whether the impacts of artificial intelligence (AI) can be meaningfully addressed through a rights-based approach to AI design, investigating in particular how an inclusive, participatory process of assessing AI impact would make this viable. There is a significant gap between envisioning rights-based AI systems and their practical application. Plausibly, internalizing a human rights approach within the AI design process might be achieved by identifying and assessing the implications of AI features for human rights, especially considering the case of vulnerable individuals and communities. However, there is no clarity or consensus on how such an instrument should be operationalised to usefully identify the impact, mitigate harms, and meaningfully ensure relevant stakeholders’ participation. In practice, ensuring the meaningful inclusion of those individuals, groups, or entire communities who are affected by the use of the AI system is a prerequisite for a process seeking to assess human rights impacts and risks. Engagement in the entire impact assessment process should enable those affected and interested to access information and better understand the technology, product, or service and the resulting impacts, but also to learn about their rights and the respective obligations and responsibilities of developers and deployers to protect and/or respect these rights. This paper provides an overview of the study and practice of the participatory design process for AI, including inclusive impact assessment and its main elements, proposes a framework, and discusses the lessons learned from the existing theory. In addition, it explores pathways for enhancing and promoting individual and group rights through such engagement by discussing when, how, and whom to include, at which stage of the process, and what the prerequisites for meaningful engagement are. The overall aim is to ensure the use of technology that works for the benefit of society, individuals, and particular (historically marginalised) groups.

Keywords: rights-based design, AI impact assessment, inclusion, harm mitigation

Procedia PDF Downloads 116
16752 Determinants of Aggregate Electricity Consumption in Ghana: A Multivariate Time Series Analysis

Authors: Renata Konadu

Abstract:

In Ghana, electricity has become the main form of energy on which all sectors of the economy rely for their businesses. Therefore, as the economy grows, the demand for and consumption of electricity also grow due to this heavy dependence. However, since the supply of electricity has not increased to match demand, there have been frequent power outages and load shedding affecting business performance. To solve this problem and advance policies to secure electricity in Ghana, it is imperative that the factors that cause consumption to increase be analysed by considering the three classes of consumers: residential, industrial, and non-residential. The main argument, however, is that exports of electricity to neighbouring countries should be included in the electricity consumption model and considered as one of the significant factors that can decrease or increase consumption. The author made use of multivariate time series data from 1980-2010 and econometric models such as Ordinary Least Squares (OLS) and the Vector Error Correction Model (VECM). Findings show that GDP growth, urban population growth, electricity exports, and industry value added to GDP were cointegrated. The results also showed that there is unidirectional causality from electricity exports, GDP growth, and industry value added to GDP to electricity consumption in the long run. In the short run, however, causality was found between all the variables and electricity consumption. The results have useful implications for energy policy makers, especially with regard to electricity consumption, demand, and supply.
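
The workflow described above (unit-root testing, Johansen cointegration, and a vector error correction model) can be sketched in Python with statsmodels roughly as follows; the file name, column names, and lag choices are assumptions for illustration, not the study's specification.

```python
# Sketch of the ADF -> Johansen -> VECM workflow described above (statsmodels),
# with placeholder column names and lag orders rather than the study's actual specification.
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

df = pd.read_csv("ghana_electricity.csv", index_col="year")   # hypothetical file
cols = ["electricity_consumption", "gdp_growth", "urban_pop_growth",
        "electricity_exports", "industry_va_gdp"]

# 1. ADF unit-root test on each series (levels).
for c in cols:
    stat, pvalue = adfuller(df[c])[:2]
    print(f"ADF {c}: stat={stat:.3f}, p={pvalue:.3f}")

# 2. Johansen cointegration test (constant term, one lagged difference).
jres = coint_johansen(df[cols], det_order=0, k_ar_diff=1)
print("trace statistics:", jres.lr1)
print("5% critical values:", jres.cvt[:, 1])

# 3. VECM with one cointegrating relation.
vecm_res = VECM(df[cols], k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print(vecm_res.summary())
```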

Keywords: electricity consumption, energy policy, GDP growth, vector error correction model

Procedia PDF Downloads 409
16751 Role of Vision Centers in Eliminating Avoidable Blindness Caused Due to Uncorrected Refractive Error in Rural South India

Authors: Ranitha Guna Selvi D, Ramakrishnan R, Mohideen Abdul Kader

Abstract:

Purpose: To study the role of vision centers in managing preventable blindness through refractive error correction in rural South India. Methods: A retrospective analysis of patients attending 15 vision centers in rural South India from January 2021 to December 2021 was done. Medical records of 1,08,581 patients, both new and reviewed (79,562 newly registered patients and 29,019 review patients from the 15 vision centers), were included for data analysis. All patients registered at the vision centers underwent a basic eye examination, including visual acuity, IOP measurement, slit-lamp examination, retinoscopy, fundus examination, etc. Results: A total of 1,08,581 patients were included in the study. Of these, 79,562 were newly registered patients at a vision center and 29,019 were review patients. Among them, 52,201 (48.1%) were male and 56,308 (51.9%) were female. The mean age of all examined patients was 41.03 ± 20.9 years (standard deviation) and ranged from 1 to 113 years. Presenting mean visual acuity was 0.31 ± 0.5 in the right eye and 0.31 ± 0.4 in the left eye. Of the 1,08,581 patients, 22,770 had refractive error in the right eye and 22,721 had uncorrected refractive error in the left eye. Glass prescriptions were given to 17,178 (15.8%) patients, and 8,109 (7.5%) patients were referred to the base hospital for specialty clinic expert opinion or for cataract surgery. Conclusion: A vision center utilizing teleconsultation for comprehensive eye screening is a very effective tool in reducing the avoidable visual impairment caused by uncorrected refractive error. The vision center model is believed to be efficient as it facilitates early detection and management of uncorrected refractive errors.

Keywords: refractive error, uncorrected refractive error, vision center, vision technician, teleconsultation

Procedia PDF Downloads 107
16750 Design for Error-Proofing Assembly: A Systematic Approach to Prevent Assembly Issues since Early Design Stages, an Industrial Case Study

Authors: Gabriela Estrada, Joaquim Lloveras

Abstract:

Design for error-proofing assembly is a new DFX approach to prevent, from the early design stages, assembly issues that can happen during the life phases of a system: the production, installation, operation, and replacement phases. This prevention is possible by designing the product with poka-yoke, or error-proofing, characteristics. The approach guides designers to make decisions based on poka-yoke assembly design requirements. As a result of applying these requirements, designers are able to create solutions that prevent assembly issues for the product at the development stage. This paper integrates the need to design products in an error-proofing way into the systematic design process of Pahl and Beitz. A case study applying this approach is presented.

Keywords: poka-yoke, error-proofing, assembly issues, design process, life phases of a system

Procedia PDF Downloads 344
16749 Design for Error-Proofing Assembly: A Systematic Approach to Prevent Assembly Issues since Early Design Stages. An Industry Case Study

Authors: Gabriela Estrada, Joaquim Lloveras

Abstract:

Design for error-proofing assembly is a new DFX approach to prevent, from the early design stages, assembly issues that can happen during the life phases of a system: the production, installation, operation, and replacement phases. This prevention is possible by designing the product with poka-yoke, or error-proofing, characteristics. The approach guides designers to make decisions based on poka-yoke assembly design requirements. As a result of applying these requirements, designers are able to create solutions that prevent assembly issues for the product at the development stage. This paper integrates the need to design products in an error-proofing way into the systematic design process of Pahl and Beitz. A case study applying this approach is presented.

Keywords: poka-yoke, error-proofing, assembly issues, design process, life phases of a system

Procedia PDF Downloads 292
16748 Hardware Error Analysis and Severity Characterization in Linux-Based Server Systems

Authors: Nikolaos Georgoulopoulos, Alkis Hatzopoulos, Konstantinos Karamitsios, Konstantinos Kotrotsios, Alexandros I. Metsai

Abstract:

In modern server systems, business-critical applications run on different types of infrastructure, such as cloud systems, physical machines, and virtualized environments. Often, due to high load and over time, various hardware faults occur in servers that translate into errors, resulting in malfunction or even server breakdown. CPU, RAM, and the hard drive (HDD) are the hardware parts that concern server administrators the most with regard to errors. In this work, selected RAM, HDD, and CPU errors that have been observed, or can be simulated, in kernel ring buffer log files from two groups of Linux servers are investigated. Moreover, a severity characterization is given for each error type. A better understanding of such errors can lead to more efficient analysis of the kernel logs that are usually exploited for fault diagnosis and prediction. In addition, this work summarizes ways of simulating hardware errors in RAM and HDD in order to test the error detection and correction mechanisms of a Linux server.
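
A minimal illustration of scanning kernel ring buffer output for RAM, HDD, and CPU error signatures and attaching a rough severity label is sketched below; the regular expressions and severity labels are illustrative assumptions, not the classification scheme used in the paper.

```python
# Minimal sketch: scan kernel log lines (e.g. `dmesg` output saved to a file) for
# hardware-error signatures and attach a rough severity label.
# The patterns and severity mapping are illustrative assumptions, not the paper's scheme.
import re

PATTERNS = [
    (re.compile(r"EDAC .*([CU]E)", re.I), "RAM",
     {"CE": "correctable (warning)", "UE": "uncorrectable (critical)"}),
    (re.compile(r"I/O error, dev (\w+)", re.I), "HDD", None),
    (re.compile(r"Machine Check Exception|mce: \[Hardware Error\]", re.I), "CPU", None),
]

def classify(line):
    """Return (component, severity) if the line matches a known error signature."""
    for pattern, component, severity_map in PATTERNS:
        m = pattern.search(line)
        if m:
            if severity_map:
                severity = severity_map.get(m.group(1).upper(), "unknown")
            else:
                severity = "critical"
            return component, severity
    return None

with open("kern.log") as f:          # hypothetical log file
    for line in f:
        hit = classify(line)
        if hit:
            component, severity = hit
            print(f"{component:4s} | {severity:26s} | {line.strip()}")
```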

Keywords: hardware errors, Kernel logs, Linux servers, RAM, hard disk, CPU

Procedia PDF Downloads 123
16747 The Impact of Natural Resources on Financial Development: The Global Perspective

Authors: Remy Jonkam Oben

Abstract:

Using a time series approach, this study investigates how natural resources impact financial development from a global perspective over the 1980-2019 period. Some important determinants of financial development (economic growth, trade openness, population growth, and investment) have been added to the model as control variables. Unit root tests have revealed that all the variables are integrated of order one. Johansen's cointegration test has shown that the variables are in a long-run equilibrium relationship. The vector error correction model (VECM) has estimated the coefficient of the error correction term (ECT), which suggests that the short-run values of natural resources, economic growth, trade openness, population growth, and investment contribute to financial development converging to its long-run equilibrium level at a 23.63% annual speed of adjustment. The estimated coefficients suggest that global natural resource rent has a statistically significant negative impact on global financial development in the long run (thereby validating the financial resource curse) but not in the short run. Causality test results imply that neither global natural resource rent nor global financial development Granger-causes the other.
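
The speed-of-adjustment interpretation of the error correction term mentioned above can be read directly off a fitted VECM; a minimal sketch is shown below, with hypothetical file and variable names and lag choices that do not reproduce the study's specification.

```python
# Minimal sketch: extract the error-correction-term (ECT) loading for the dependent
# variable from a fitted VECM and read it as a speed of adjustment.
# File name, variable names, and lag choices are placeholders, not the study's specification.
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

df = pd.read_csv("global_findev.csv", index_col="year")       # hypothetical file
cols = ["financial_development", "natural_resource_rent",
        "gdp_growth", "trade_openness", "pop_growth", "investment"]

res = VECM(df[cols], k_ar_diff=1, coint_rank=1, deterministic="ci").fit()

# res.alpha holds the loading (adjustment) coefficients, one row per equation.
ect_loading = res.alpha[0, 0]          # first equation = financial development
print(f"ECT coefficient: {ect_loading:.4f}")
print(f"Implied annual speed of adjustment: {abs(ect_loading) * 100:.2f}%")
```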

Keywords: financial development, natural resources, resource curse hypothesis, time series analysis, Granger causality, global perspective

Procedia PDF Downloads 120
16746 Participatory Air Quality Monitoring in African Cities: Empowering Communities, Enhancing Accountability, and Ensuring Sustainable Environments

Authors: Wabinyai Fidel Raja, Gideon Lubisa

Abstract:

Air pollution is a growing concern in Africa due to rapid industrialization and urbanization, with implications for public health and the environment. Establishing a comprehensive air quality monitoring network is crucial to combat this issue. However, conventional monitoring methods are insufficient for African cities due to the high cost of setup and maintenance. To address this, low-cost sensors (LCS) can be deployed in various urban areas through participatory air quality network siting (PAQNS). PAQNS involves stakeholders from the community, local government, and the private sector working together to determine the most appropriate locations for air quality monitoring stations. This approach improves the accuracy and representativeness of air quality monitoring data, engages and empowers community members, and reflects the actual exposure of the population. Implementing PAQNS in African cities can build trust, promote accountability, and increase transparency in the air quality management process. However, the challenges to implementing this approach must be addressed. Nonetheless, improving air quality is essential for protecting public health and promoting a sustainable environment, and implementing participatory, data-informed air quality monitoring is a significant step toward achieving these important goals in African cities and beyond.

Keywords: low-cost sensors, participatory air quality network siting, air pollution, air quality management

Procedia PDF Downloads 56
16745 Relationship between Electricity Consumption and Economic Growth: Evidence from Nigeria (1971-2012)

Authors: N. E Okoligwe, Okezie A. Ihugba

Abstract:

Few scholars disagree that electricity consumption is an important supporting factor for economic growth. However, the relationship between electricity consumption and economic growth manifests differently in different countries, according to previous studies. This paper examines the causal relationship between electricity consumption and economic growth for Nigeria. In an attempt to do this, the paper tests the validity of the modernization or dependence hypothesis by employing various econometric tools, such as the Augmented Dickey-Fuller (ADF) and Johansen co-integration tests, the Error Correction Mechanism (ECM), and the Granger causality test, on time series data from 1971-2012. Granger causality is found not to run from electricity consumption to real GDP, nor from GDP to electricity consumption, during the period of study. The null hypothesis is accepted at the 5 per cent level of significance, since the probability values (0.2251 and 0.8251) are greater than 5 per cent. Both variables are probably determined by other factors, such as the increase in the urban population, the unemployment rate, and the number of Nigerians who benefit from the increase in GDP; likewise, the increase in electricity demand is not determined by the increase in GDP (income) over the period of study, because electricity demand has always been greater than consumption. Consequently, policy makers in Nigeria should, in the early stages of reconstruction, place priority on building capacity additions and on infrastructure development of the electric power sector, as this would foster sustainable economic growth in Nigeria.
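
The pairwise Granger causality tests underlying the p-values quoted above can be run with statsmodels as sketched below; the file name, column names, and lag length are illustrative assumptions, not the paper's exact setup.

```python
# Sketch of pairwise Granger causality tests between electricity consumption and real GDP.
# File name, column names, and lag length are illustrative, not the paper's exact setup.
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("nigeria_1971_2012.csv", index_col="year")   # hypothetical file

# Does electricity consumption Granger-cause real GDP?  (column order: effect, cause)
grangercausalitytests(df[["real_gdp", "electricity_consumption"]], maxlag=2)

# Does real GDP Granger-cause electricity consumption?
grangercausalitytests(df[["electricity_consumption", "real_gdp"]], maxlag=2)
```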

Keywords: economic growth, electricity consumption, error correction mechanism, granger causality test

Procedia PDF Downloads 279
16744 The Learning Loops in the Public Realm Project in South Verona: Air Quality and Noise Pollution Participatory Data Collection towards Co-Design, Planning and Construction of Mitigation Measures in Urban Areas

Authors: Massimiliano Condotta, Giovanni Borga, Chiara Scanagatta

Abstract:

Urban systems are places where the various actors involved interact and enter into conflict, in particular with reference to topics such as traffic congestion and security. Air and noise pollution, however, are topics of discussion, and often of clashes, because of their strong complexity. For air pollution, the complexity stems from the fact that atmospheric pollution is due to many factors, but above all from the difficulty of observing and measuring the amount of pollution in a transparent, mobile, and ethereal element like air. Often the condition perceived by the inhabitants does not coincide with the real conditions, because it is conditioned, sometimes positively and sometimes negatively, by many other factors, such as the presence or absence of natural elements like trees or rivers. The same problems are seen with noise pollution, which receives even less consideration as an issue even though it is just as problematic as air quality. Starting from these opposite positions, it is difficult to identify and implement valid, and at the same time shared, mitigation solutions for the problem of urban pollution (air and noise pollution). The LOOPER (Learning Loops in the Public Realm) project, described in this paper, aims to build and test a methodology and a platform for a participatory co-design, planning, and construction process inside a learning loop. The novelties of this approach are various; the three most relevant are the following. The first is that citizen participation starts from the identification of problems and the air quality analysis, through a participatory data collection, and continues through all process steps (design and construction). The second is that the methodology is characterized by a learning loop process: after a first cycle of (1) problem identification, (2) planning and definition of design solutions, and (3) construction and implementation of mitigation measures, the effectiveness of the implemented solutions is measured and verified through a new participatory data collection campaign. In this way, it is possible to understand whether the policies and design solutions had a positive impact on the territory. As a result of the learning produced by the first loop, it will be possible to improve the design of the mitigation measures and start a second loop with new and more effective measures. The third relevant aspect is that citizen participation is carried out via Urban Living Labs that involve all stakeholders of the city (citizens, public administrators, associations of all urban stakeholders, etc.) and that last for the entire cycle of the design, planning, and construction process. The paper describes in detail the LOOPER methodology and the technical solutions adopted for the participatory data collection and for the design and construction phases.

Keywords: air quality, co-design, learning loops, noise pollution, urban living labs

Procedia PDF Downloads 335
16743 High Performance of Direct Torque and Flux Control of a Double Stator Induction Motor Drive with a Fuzzy Stator Resistance Estimator

Authors: K. Kouzi

Abstract:

In order to obtain stable, high-performance direct torque and flux control (DTFC) of a double star induction motor drive (DSIM), proper on-line adaptation of the stator resistance is very important. This is due to the variation of the stator resistance under operating conditions, which introduces error in the estimated flux position and in the magnitude of the stator flux. Error in the estimated stator flux deteriorates the performance of the DTFC drive, and the effect of this estimation error is especially important at low speed. For this reason, our aim is to overcome the sensitivity of the DTFC to stator resistance variation by proposing an on-line fuzzy estimation of the stator resistance. The fuzzy estimation method is based on an on-line stator resistance correction driven by the stator current estimation error and its variation. The fuzzy logic controller gives the future stator resistance increment at its output. The main advantage of the suggested control algorithm is that it avoids the drive instability that may occur in certain situations and ensures tracking of the actual stator resistance. The validity of the technique and the improvement of the whole system performance are proved by the results.
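
As a rough illustration of the kind of rule base described above (stator-current estimation error and its variation in, resistance increment out), the sketch below implements a tiny Mamdani-style estimator with triangular membership functions; the universes, rule table, and gain are assumptions for illustration only, not the controller designed in the paper.

```python
# Tiny Mamdani-style sketch of the fuzzy stator-resistance estimator idea described above:
# inputs are the stator-current estimation error and its variation, output is a resistance increment.
# Membership functions, rule table, and scaling are illustrative assumptions only.

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Linguistic terms (Negative, Zero, Positive) on a normalized [-1, 1] universe.
SETS = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}

# Rule table: (error term, delta-error term) -> normalized resistance increment.
RULES = {("N", "N"): -1.0, ("N", "Z"): -0.5, ("N", "P"): 0.0,
         ("Z", "N"): -0.5, ("Z", "Z"):  0.0, ("Z", "P"): 0.5,
         ("P", "N"):  0.0, ("P", "Z"):  0.5, ("P", "P"): 1.0}

def fuzzy_rs_increment(error, delta_error, gain=0.01):
    """Weighted-average defuzzification of the rule outputs, scaled by `gain` (ohms)."""
    num = den = 0.0
    for (e_term, de_term), out in RULES.items():
        w = min(tri(error, *SETS[e_term]), tri(delta_error, *SETS[de_term]))
        num += w * out
        den += w
    return gain * (num / den if den > 0 else 0.0)

# Example: positive current-estimation error that is still growing -> raise the estimated Rs.
rs_estimate = 1.90                       # ohms, running estimate (illustrative)
rs_estimate += fuzzy_rs_increment(error=0.6, delta_error=0.3)
print(f"updated stator resistance estimate: {rs_estimate:.4f} ohms")
```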

Keywords: direct torque control, dual stator induction motor, Fuzzy Logic estimation, stator resistance adaptation

Procedia PDF Downloads 296
16742 Corrective Feedback and Uptake Patterns in English Speaking Lessons at Hanoi Law University

Authors: Nhac Thanh Huong

Abstract:

New teaching methods have led to changes in teachers’ roles in the English class, in which teachers’ error correction is an integral part. Language errors and corrective feedback have been of interest to many researchers in foreign language teaching. However, the techniques and the effectiveness of teachers’ feedback have been a matter of much controversy. The present case study was carried out with a view to finding out the patterns of teachers’ corrective feedback and their impact on students’ uptake in English speaking lessons of legal English major students at Hanoi Law University. In order to achieve those aims, the study makes use of classroom observations as the main method of data collection to seek answers to the two following questions: 1. What patterns of corrective feedback occur in English speaking lessons for second-year legal English major students at Hanoi Law University? 2. To what extent does that corrective feedback lead to students’ uptake? The study provided some important findings, among which was a close relationship between corrective feedback and uptake. In particular, recast was the most commonly used feedback type, yet it was the least effective in terms of students’ uptake and repair, while the most successful feedback types, namely meta-linguistic feedback, clarification requests, and elicitation, which led to student-generated repair, were used at a much lower rate by teachers. Furthermore, the study revealed that different types of errors needed different types of feedback and that the use of feedback depended on the students’ English proficiency level. In light of the findings, a number of pedagogical implications have been drawn in the hope of enhancing the effectiveness of teachers’ corrective feedback on students’ uptake in the foreign language acquisition process.

Keywords: corrective feedback, error, uptake, speaking English lesson

Procedia PDF Downloads 231
16741 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria

Authors: Isaac Kayode Ogunlade

Abstract:

Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller, communicating with a Personal Computer (PC) through USB (Universal Serial Bus). The research applied knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device that uses an LM35 sensor to measure weather parameters, together with an artificial intelligence approach (Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) to evaluate its performance. Both devices (standard and designed) were operated for 180 days under the same atmospheric conditions to collect data (temperature, relative humidity, and pressure). The acquired data were used in the MATLAB R2012b environment to train the ANN and ARIMA models to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), R-squared (R2), and Mean Percentage Error (MPE) were deployed as standardized evaluation metrics to assess the performance of the models in predicting precipitation. The results from the operation of the developed device show that the device has an efficiency of 96% and is compatible with personal computers (PCs) and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) had a disparity error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
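
The evaluation metrics named above (RMSE, MAE, R2, MPE) can be computed from observed and predicted rainfall as in the short sketch below; the sample numbers are placeholders, not the study's measurements.

```python
# Sketch of the evaluation metrics named above (RMSE, MAE, R-squared, MPE) for a
# rainfall forecast. The observed/predicted values are placeholders, not the study's data.
import numpy as np

observed  = np.array([12.0, 0.0, 3.5, 20.1, 7.2, 15.8])   # mm of rainfall (illustrative)
predicted = np.array([11.4, 0.6, 3.0, 21.5, 6.8, 14.9])

errors = observed - predicted
rmse = np.sqrt(np.mean(errors ** 2))
mae  = np.mean(np.abs(errors))
r2   = 1.0 - np.sum(errors ** 2) / np.sum((observed - observed.mean()) ** 2)
mpe  = 100.0 * np.mean(errors[observed != 0] / observed[observed != 0])  # skip zero-rain days

print(f"RMSE={rmse:.3f} mm  MAE={mae:.3f} mm  R2={r2:.3f}  MPE={mpe:.2f}%")
```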

Keywords: data acquisition system, device design, weather, precipitation prediction, FUTA standard device

Procedia PDF Downloads 63
16740 A Coevolutionary Framework of Business-IT Alignment through the Lens of Enterprise Architecture

Authors: Mengmeng Zhang, Honghui Chen, Kalle Lyytinen

Abstract:

The major challenges for sustainable business-IT alignment (BITA) in a company are rooted in its volatile external competitive environment, increasingly complex internal relationships, and subversive IT roles. Failure to adequately address BITA results in wasted organizational resources, lost competitive advantages, and inadequate returns on investments. Coevolution is better suited to describing the dynamic relationship between business and IT and has received some attention in recent years. Multiple mechanisms for achieving business-IT coevolution (BITC), e.g., sharing domain knowledge and modular design, have been identified. However, in the absence of a complete management process, BITC is still hard to achieve in practice. This study emphasizes what the BITC management process looks like and how to execute this coevolution step by step. A practical coevolutionary framework that combines the enterprise architecture (EA) method with misalignment analysis is proposed in this paper. It contains the steps of EA design, misalignment detection, misalignment correction, and EA management/misalignment prevention. The misalignment correction step in particular is discussed at length. This study also evaluates the proposed framework by comparing it with the characteristics, principles, and approaches of coevolution in the literature.

Keywords: business-IT alignment, business-IT coevolution, enterprise architecture, misalignment analysis, misalignment correction

Procedia PDF Downloads 116
16739 The Effect of the Acquisition and Reconstruction Parameters in Quality of Spect Tomographic Images with Attenuation and Scatter Correction

Authors: N. Boutaghane, F. Z. Tounsi

Abstract:

Many physical and technological factors degrade SPECT images, both qualitatively and quantitatively. For this reason, relying on technological advances alone to improve the performance of the tomographic gamma camera, in terms of detection, collimation, reconstruction, and correction methods for tomographic images, is not always sufficient. We must first master the choice of the various acquisition and reconstruction parameters accessible in clinical cases and use attenuation and scatter correction methods, so as to always optimize image quality while minimizing the dose received by the patient. In this work, a qualitative and quantitative evaluation of tomographic images is performed based on the acquisition parameters (counts per projection) and the reconstruction parameters (filter type and associated cutoff frequency). In addition, methods for correcting physical effects such as attenuation and scatter, which degrade image quality and prevent precise quantification of the reconstructed slices, are also presented. Two approaches to attenuation and scatter correction are implemented: attenuation correction by the Chang method with a filtered back-projection reconstruction algorithm, and scatter correction by the Jaszczak subtraction method. Our results serve as recommendations that make it possible to determine the origin of the different artifacts observed both in quality control tests and in clinical images.
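
For the first-order Chang attenuation correction mentioned above, the correction factor at each point is the reciprocal of the attenuation factor averaged over all projection angles; a minimal sketch for a uniform circular attenuator is given below, where the geometry, angle sampling, and attenuation coefficient are illustrative assumptions rather than the protocol used in the study.

```python
# Minimal sketch of the first-order Chang attenuation correction for a uniform circular
# attenuator: the correction factor at each point is the reciprocal of the attenuation
# factor averaged over all projection angles. Geometry and mu are illustrative only.
import numpy as np

MU = 0.15      # attenuation coefficient of water at 140 keV, cm^-1 (approximate)
R  = 10.0      # phantom radius in cm
ANGLES = np.deg2rad(np.arange(0, 360, 6))   # 60 projection angles

def chang_correction(x, y):
    """Correction factor C(x, y) = M / sum_m exp(-mu * l_m), l_m = path length to the boundary."""
    p = np.array([x, y])
    atten = []
    for theta in ANGLES:
        d = np.array([np.cos(theta), np.sin(theta)])
        pd = p @ d
        l = -pd + np.sqrt(pd ** 2 - p @ p + R ** 2)   # distance from (x, y) to the circle edge
        atten.append(np.exp(-MU * l))
    return len(ANGLES) / sum(atten)

print(f"centre:     C = {chang_correction(0.0, 0.0):.2f}")   # strongest correction
print(f"near edge:  C = {chang_correction(9.0, 0.0):.2f}")   # weaker correction
```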

Keywords: attenuation, scatter, reconstruction filter, image quality, acquisition and reconstruction parameters, SPECT

Procedia PDF Downloads 411
16738 Broadening the Public Sphere: Examining the Role of Community Radio in Fostering Participatory Democracy in Selected Communities in Ondo State, Nigeria

Authors: John Ibanga

Abstract:

Since May 1999, when Nigeria returned to uninterrupted democratic rule, there have been various attempts by successive governments to commit themselves to democratic ideals. Such efforts include a revision of communication policies, after repeated calls by civil society organisations, development partners, researchers, and academics, to allow not only the commencement of campus radio broadcasting but also the takeoff of community radio broadcasting. Thus, in 2015, operating licenses were granted to several communities spread across the six geopolitical zones in the country for the establishment of community radio stations, culminating in the establishment of the first community radio station in Nigeria on July 17, 2015. And since citizens’ involvement in policy matters and governance is one of the tenets of participatory democracy, it becomes imperative to investigate how the emerging community radio sector in Nigeria is facilitating participatory democracy among Nigerians, even in the face of attempts by the present government to silence all dissenting voices. This study, therefore, examines how residents in Ondo State, Southwest Nigeria, are utilising programmes on the Ejule Nen and Kakaaki community radio stations in Ondo State, Nigeria, to deepen participatory democracy. Much of the existing research on the role of community radio in participatory democracy and citizen engagement misses out on Nigeria because of the delayed implementation of community radio policy, even though Nigeria is Africa’s most populous nation and a major player in the affairs of the African continent. While the participatory communication and communication infrastructure theories were used as the framework, data were collected from in-depth interviews with staff of the community radio stations and community leaders, focus group discussions with community residents, and qualitative content analysis of programmes on the stations. The residents used the community radio stations as platforms for demanding accountability from government, mobilising resources for the execution of a number of community projects, promoting credible electoral practices, and influencing the implementation of free education policy in their communities. Hence, the community radio stations became the reliable and authoritative voices of residents for participating in the public sphere and, more generally, in the democratic process.

Keywords: community, community radio, democracy, participatory democracy

Procedia PDF Downloads 88
16737 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal

Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert

Abstract:

We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow for the parallel removal of multiple vertices we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of the calculation of the vertex errors, ordering of the error values and removal of vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach is used to speed up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification.
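
The per-vertex error value used for ordering, as described above, can be sketched as follows: each vertex accumulates the plane quadrics of its incident triangles and is scored by the cheapest half-edge collapse onto a neighbour. This is a simplified CPU illustration of the standard quadric error metric, not the authors' GPU kernel; the toy mesh is an assumption for demonstration.

```python
# Simplified CPU sketch of the per-vertex quadric error metric used for ordering vertices:
# each vertex accumulates the plane quadrics of its incident triangles and is scored by the
# cheapest half-edge collapse onto a neighbour. Illustrates the metric, not the GPU kernel.
import numpy as np

def plane_quadric(a, b, c):
    """Fundamental quadric of the plane through triangle (a, b, c)."""
    n = np.cross(b - a, c - a)
    n = n / np.linalg.norm(n)
    d = -n @ a
    p = np.append(n, d)            # plane as [nx, ny, nz, d]
    return np.outer(p, p)          # 4x4 quadric

def vertex_removal_costs(vertices, triangles):
    nv = len(vertices)
    quadrics = [np.zeros((4, 4)) for _ in range(nv)]
    neighbours = [set() for _ in range(nv)]
    for i, j, k in triangles:
        q = plane_quadric(vertices[i], vertices[j], vertices[k])
        for a, b, c in ((i, j, k), (j, k, i), (k, i, j)):
            quadrics[a] += q
            neighbours[a] |= {b, c}
    costs = np.empty(nv)
    for v in range(nv):
        # Half-edge collapse: cost of moving v onto its best neighbour, measured by v's quadric.
        cands = [np.append(vertices[n], 1.0) @ quadrics[v] @ np.append(vertices[n], 1.0)
                 for n in neighbours[v]]
        costs[v] = min(cands)
    return costs

# Toy mesh: a four-sided "tent" (four base corners and an apex, no base face).
verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0], [0.5, 0.5, 0.5]], dtype=float)
tris = [(0, 1, 4), (1, 2, 4), (2, 3, 4), (3, 0, 4)]
errors = vertex_removal_costs(verts, tris)
removal_order = np.argsort(errors)      # lowest-error vertices are removed first
print("vertex errors:", np.round(errors, 4), "removal order:", removal_order)
```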

Keywords: computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving

Procedia PDF Downloads 334
16736 Correction Factor to Enhance the Non-Standard Hammer Effect Used in Standard Penetration Test

Authors: Khaled R. Khater

Abstract:

The weight of the SPT hammer is standardized (0.623 kN). Locally manufactured drilling rigs use hammers that sometimes deviate from the standard weight. This affects the field-measured blow counts (Nf) and, consequently, most of the correlations previously obtained, as they were derived on the basis of the standard hammer weight. The literature presents an energy correction factor (η2) to be applied to the SPT total input energy. This research investigates the effect of hammer weight variation, as a single parameter, on the field-measured blow counts (Nf). The outcome is a correction factor (ηk), an equation, and a correction chart. They are recommended for adjusting the misleading measured (Nf) back to the standard value, as if the standard hammer had been used. This correction is very important whenever a non-standard hammer is used, because the bore logs in any geotechnical report should contain true and representative (Nf) values, not to mention the long record of correlations already in hand. The study herein is carried out using a laboratory physical model that simulates the SPT drop hammer mechanism. It is designed to allow different hammer weights to be used, and it is manufactured to avoid and eliminate sources of energy loss, producing a transmitted efficiency of up to 100%.
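
The sketch below shows only the generic energy-ratio form of a hammer-weight correction, not the ηk equation or chart derived in the paper; it assumes an equal drop height and transmitted efficiency, and the example numbers are hypothetical.

```python
# Generic energy-ratio sketch of a hammer-weight correction for SPT blow counts.
# This is NOT the paper's derived eta_k equation or chart; it only illustrates the usual
# assumption that, for the same drop height and efficiency, blow counts scale inversely
# with the energy delivered per blow.
W_STD = 0.623          # standard SPT hammer weight, kN

def corrected_blow_count(n_field, hammer_weight_kn):
    """Adjust a field blow count measured with a non-standard hammer back to the
    standard-hammer equivalent (equal drop height and transmitted efficiency assumed)."""
    eta_k = hammer_weight_kn / W_STD       # hammer-weight (energy) ratio
    return n_field * eta_k

# Example: 18 blows per 300 mm measured with a 0.70 kN hammer (hypothetical values).
print(f"N_std ~= {corrected_blow_count(18, 0.70):.1f}")
```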

Keywords: correction factors, hammer weight, physical model, standard penetration test

Procedia PDF Downloads 353
16735 Public Participation for an Effective Flood Risk Management: Building Social Capacities in Ribera Alta Del Ebro, Spain

Authors: Alba Ballester Ciuró, Marc Pares Franzi

Abstract:

While the coming decades are likely to see higher flood risk in Europe and greater socio-economic damage, traditional flood risk management has become inefficient. In response, new approaches such as capacity building and public participation have recently been incorporated into natural hazards mitigation policy (e.g., the Sendai Framework for Action, Intergovernmental Panel on Climate Change reports, and the EU Floods Directive). By integrating capacity building and public participation, we present research concerning the promotion of participatory social capacity building actions for flood risk mitigation at the local level. Social capacities have been defined as the resources and abilities available at the individual and collective level that can be used to anticipate, respond to, cope with, recover from, and adapt to external stressors. Social capacity building is understood as a process of identifying communities’ social capacities and of applying collaborative strategies to improve them. This paper presents a proposal for systematizing the participatory social capacity building process for flood risk mitigation, and its implementation in an area at high risk of flooding in the Ebro river basin: Ribera Alta del Ebro. To develop this process, we designed and tested a tool that allows measuring and building five types of social capacities: knowledge, motivation, networks, participation, and finance. The implementation of the tool has allowed us to assess social capacities in the area. Based on the results of the assessment, we developed a co-decision process with stakeholders and flood risk management authorities on which participatory activities could be employed to improve social capacities for flood risk mitigation. Building on the results of this process, and focusing on the weaker social capacities, we developed a set of participatory actions in the area oriented to the general public and stakeholders: informative sessions on the flood risk management plan and flood insurance, interpretative river descents on flood risk management (with journalists, teachers, and the general public), an interpretative visit to the floodplain, a workshop on agricultural insurance, a deliberative workshop on project funding, and deliberative workshops in schools on flood risk management (playing with a flood risk model). The combination of obtaining data through a mixed-methods approach of qualitative inquiry and quantitative surveys, together with action research through co-decision processes and pilot participatory activities, shows the significant impact of public participation on social capacity building for flood risk mitigation and contributes to the understanding of which main factors intervene in this process.

Keywords: flood risk management, public participation, risk reduction, social capacities, vulnerability assessment

Procedia PDF Downloads 180
16734 From Manipulation to Citizen Control: A Case Study Revealing the Level of Participation in the Citizen Participatory Audit

Authors: Mark Jason E. Arca, Jay Vee R. Linatoc, Rex Francis N. Lupango, Michael Joe A. Ramirez

Abstract:

Participation promises an avenue for citizens to take part in governance, but it does not necessarily mean effective participation. The integration of participants in the decision-making process should be properly addressed to ensure effectiveness. This study explores the integration of participants in the decision-making process to reveal the level of participation in the Solid Waste Management audit done by the Citizen Participatory Audit (CPA), a program under the supervision of the Commission on Audit. Specifically, this study uses the experience of participation to identify emerging themes that help reveal the level of participation through the integrated ladder of participation. The researchers used key informant interviews to gather the necessary data from the actors of the program. The findings revealed that the level of participation present in the CPA is at the Placation level, a level below the program’s targeted level of participation. The study also allowed the researchers to reveal facilitating factors in the program that contributed to a better understanding of the practice of participation.

Keywords: citizen participation, culture of participation, ladder of participation, level of participation

Procedia PDF Downloads 376
16733 Government Final Consumption Expenditure and Household Consumption Expenditure NPISHS in Nigeria

Authors: Usman A. Usman

Abstract:

Undeniably, unlike the Classical side, the Keynesian perspective of the aggregate demand side has a significant position in the policy, growth, and welfare of Nigeria, due to government involvement and the ineffective demand of a population living with poor per capita income. This study seeks to investigate the effect of Government Final Consumption Expenditure and Financial Deepening on Households and NPISHs Final consumption expenditure using data on Nigeria from 1981 to 2019. The study employed the ADF stationarity test, the Johansen cointegration test, and a Vector Error Correction Model. The results revealed that the coefficient of government final consumption expenditure has a positive effect on household consumption expenditure in the long run, and that there is a long-run and short-run relationship between gross fixed capital formation and household consumption expenditure. The coefficients of cpsgdp (financial deepening) and gross fixed capital formation indicate a negative impact on household final consumption expenditure. The coefficient of the money supply (lm2gdp), which is another proxy for financial deepening, and the coefficient of FDI have a positive effect on household final consumption expenditure in the long run. Since gross fixed capital formation stimulates household consumption expenditure, this study recommends a legal framework to support investment as a panacea for increasing household income and consumption and for reducing poverty in Nigeria; this should be a key central component of policy.

Keywords: government final consumption expenditure, household consumption expenditure, vector error correction model, cointegration

Procedia PDF Downloads 21
16732 Error Analysis of Students’ Freewriting: A Study of Adult English Learners’ Errors

Authors: Louella Nicole Gamao

Abstract:

Writing in English is regarded as a complex skill and process for foreign language learners, and the errors learners commit are an inevitable part of their writing. This study aims to explore and analyze the freewriting of learners of English as a Foreign Language (EFL) at a university in Taiwan by identifying the categories of mistakes that often appear in their freewriting activity and analyzing the learners' awareness of each error. Hopefully, the present study will provide further information about students' errors in their English writing and contribute to a further understanding of the benefits of the freewriting activity, which can be used in the future as a powerful tool in English writing courses for EFL classes. The study adopted the framework of error analysis proposed by Dulay, Burt, and Krashen (1982), which consists of compilation of data, identification of errors, classification of error types, calculation of the frequency of each error, and error interpretation. Survey questionnaires regarding students' awareness of errors were also analyzed and discussed. Using quantitative and qualitative approaches, this study provides a detailed description of the errors found in the students' freewriting output, explores the similarities and differences of the students' errors in academic writing and freewriting, and, lastly, analyzes the students' perception of their errors.

Keywords: error, EFL, freewriting, Taiwan, English

Procedia PDF Downloads 75
16731 Government Final Consumption Expenditure Financial Deepening and Household Consumption Expenditure NPISHs in Nigeria

Authors: Usman A. Usman

Abstract:

Undeniably, unlike the Classical side, the Keynesian perspective of the aggregate demand side has a significant position in the policy, growth, and welfare of Nigeria, due to government involvement and the ineffective demand of a population living with poor per capita income. This study seeks to investigate the effect of Government Final Consumption Expenditure, Financial Deepening on Households, and NPISHs Final consumption expenditure using data on Nigeria from 1981 to 2019. The study employed the ADF stationarity test, the Johansen cointegration test, and a Vector Error Correction Model. The results revealed that the coefficient of government final consumption expenditure has a positive effect on household consumption expenditure in the long run, and that there is a long-run and short-run relationship between gross fixed capital formation and household consumption expenditure. The coefficients of cpsgdp (financial deepening) and gross fixed capital formation indicate a negative impact on household final consumption expenditure. The coefficient of the money supply (lm2gdp), which is another proxy for financial deepening, and the coefficient of FDI have a positive effect on household final consumption expenditure in the long run. Since gross fixed capital formation stimulates household consumption expenditure, this study recommends a legal framework to support investment as a panacea for increasing household income and consumption and for reducing poverty in Nigeria; this should be a key central component of policy.

Keywords: household, government expenditures, vector error correction model, johansen test

Procedia PDF Downloads 26
16730 Relevancy Measures of Errors in Displacements of Finite Elements Analysis Results

Authors: A. B. Bolkhir, A. Elshafie, T. K. Yousif

Abstract:

This paper highlights methods of error estimation for finite element analysis (FEA) results. It indicates that the modeling error could be eliminated by performing finite element analysis with successively finer meshes or by extrapolating response predictions from an orderly sequence of relatively low-degree-of-freedom analysis results. In addition, the paper eliminates the round-off error by running the code at a higher precision. The paper provides an application to finite element analysis results and draws conclusions based on the results of applying these error estimation methods.
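
Richardson extrapolation of a displacement result from two successively refined meshes, as referenced above, can be sketched as follows; the displacement values, refinement ratio, and assumed convergence order p are illustrative, not the paper's data.

```python
# Sketch of Richardson extrapolation of a displacement result from two mesh refinements.
# Mesh sizes, displacements, and the assumed convergence order p are illustrative values.
def richardson_extrapolate(u_coarse, u_fine, refinement_ratio=2.0, p=2.0):
    """Estimate the mesh-independent value and the discretization error of the fine mesh,
    assuming monotonic convergence of order p."""
    u_exact = u_fine + (u_fine - u_coarse) / (refinement_ratio ** p - 1.0)
    error_fine = u_exact - u_fine
    return u_exact, error_fine

u_h  = 1.902e-3    # displacement on the coarse mesh, m (illustrative)
u_h2 = 1.948e-3    # displacement on the mesh refined by a factor of 2, m

u_est, err = richardson_extrapolate(u_h, u_h2)
print(f"extrapolated displacement ~ {u_est:.4e} m, estimated fine-mesh error ~ {err:.2e} m")
```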

Keywords: finite element analysis (FEA), discretization error, round-off error, mesh refinement, richardson extrapolation, monotonic convergence

Procedia PDF Downloads 457
16729 Lowering Error Floors by Concatenation of Low-Density Parity-Check and Array Code

Authors: Cinna Soltanpur, Mohammad Ghamari, Behzad Momahed Heravi, Fatemeh Zare

Abstract:

Low-density parity-check (LDPC) codes have been shown to deliver capacity-approaching performance; however, problematic graphical structures (e.g., trapping sets) in the Tanner graph of some LDPC codes can cause high error floors in bit-error-ratio (BER) performance under the conventional sum-product algorithm (SPA). This paper presents a serial concatenation scheme to avoid the trapping sets and to lower the error floors of the LDPC code. The outer code in the proposed concatenation is the LDPC code, and the inner code is a high-rate array code. The approach applies an iterative hybrid process between BCJR decoding for the array code and the SPA for the LDPC code, together with bit-pinning and bit-flipping techniques. The Margulis code of size (2640, 1320) has been used for the simulation, and it has been shown that the proposed concatenation and decoding scheme can considerably improve the error-floor performance with minimal rate loss.

Keywords: concatenated coding, low-density parity-check codes, array code, error floors

Procedia PDF Downloads 329
16728 Neural Networks and Genetic Algorithms Approach for Word Correction and Prediction

Authors: Rodrigo S. Fonseca, Antônio C. P. Veiga

Abstract:

Aiming to help people with movement limitations that make typing and communication difficult, there is a need to customize an assistive tool with a learning environment that helps the user optimize text input, identifying errors and providing corrections and possibilities of choice in the Portuguese language. The work presents an orthographic and grammatical system that can be incorporated into writing environments, improving and facilitating the use of an alphanumeric keyboard. The prototype was built using a genetic algorithm for correction and also carries out prediction, which can be based on the quantity and position of the inserted letters and even their placement in the sentence, ensuring the sequence of ideas using a Long Short-Term Memory (LSTM) neural network. The prototype optimizes data entry as a component of assistive technology for textual formulation, detecting errors, seeking solutions, and informing the user of accurate predictions quickly and effectively through machine learning.
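
The LSTM-based next-word prediction idea described above can be sketched as below with a toy Portuguese vocabulary and a single-word context; this is an illustrative assumption, not the authors' prototype or its language model.

```python
# Minimal PyTorch sketch of the LSTM-based next-word prediction idea described above
# (toy vocabulary and corpus; not the authors' prototype or its Portuguese language model).
import torch
import torch.nn as nn

corpus = "o gato preto dorme no sofá e o gato preto come no sofá".split()
vocab = sorted(set(corpus))
stoi = {w: i for i, w in enumerate(vocab)}

# Training pairs: previous word -> next word (a one-word context keeps the sketch tiny).
xs = torch.tensor([stoi[w] for w in corpus[:-1]])
ys = torch.tensor([stoi[w] for w in corpus[1:]])

class NextWordLSTM(nn.Module):
    def __init__(self, vocab_size, emb=16, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, tokens):                              # tokens: (batch,) word indices
        h, _ = self.lstm(self.embed(tokens).unsqueeze(1))   # sequence length 1
        return self.out(h[:, -1, :])                        # logits over the vocabulary

model = NextWordLSTM(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

for _ in range(200):                                 # tiny training loop
    opt.zero_grad()
    loss = loss_fn(model(xs), ys)
    loss.backward()
    opt.step()

with torch.no_grad():
    query = torch.tensor([stoi["gato"]])
    predicted = vocab[model(query).argmax(dim=-1).item()]
    print(f"after 'gato' the model predicts: '{predicted}'")   # expected: 'preto'
```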

Keywords: genetic algorithm, neural networks, word prediction, machine learning

Procedia PDF Downloads 164
16727 Community Based Participatory Research in Opioid Use: Design of an Informatics Solution

Authors: Sue S. Feldman, Bradley Tipper, Benjamin Schooley

Abstract:

Nearly every community in the US has been impacted by opioid-related addictions and deaths; it is a national problem that is threatening our social and economic welfare. Most believe that, by tackling this problem from a prevention perspective, advances can be made toward breaking the chain of addiction. One mechanism, community-based participatory research, involves the community in the prevention approach. This project combines that approach with a design science approach to develop an integrated solution. Findings suggested accountable care communities, transpersonal psychology, and social exchange theory as product kernel theories. Evaluation was conducted on a prototype.

Keywords: substance use and abuse recovery, community resource centers, accountable care communities, community based participatory research

Procedia PDF Downloads 117
16726 Study of Side Effects of Myopia Contact Correction by Soft Lenses and Orthokeratology Lenses among Medical Students

Authors: K. Iu. Hrizhymalska, O. Ol. Andrushkova, I. Iu. Pshenychna

Abstract:

Aim: To study and compare the side effects of myopia contact correction by soft lenses and orthokeratology lenses among medical students. Patients and methods: 34 students (68 eyes) with moderate and severe myopia, who had used contact correction of myopia for 2-4 years, were examined. Some of them used soft lenses, while others used orthokeratology lenses. The following methods were used: biomicroscopy of the ocular surface, Schirmer's test, Norn's test, and a survey regarding satisfaction with use. Results: Corneal vascularization along the limbus was noted in 4 (5%) eyes of the examined students. In 8 (11%) eyes, symptoms of mild dry eye disease were detected. 2 (3%) eyes showed signs of meibomitis. Allergic conjunctivitis was observed in 4 (5%) eyes, and a purulent corneal ulcer was present in 1 eye. The surveys showed that orthokeratology lenses, unlike soft lenses, do not limit everyday activity (sports, tourism, swimming, etc.); they also do not cause discomfort during temperature changes and they reduce existing symptoms of dry eye disease. Conclusion: Myopia contact correction is thus one of the optimal options among students, as it allows physical and mental activity to be expanded. However, taking into account the frequency of side effects in users of soft contact lenses, it is necessary to carry out prevention and treatment of myopia in medical students, to follow the recommendations for use, and to instill preservative-free tear substitutes with trehalose when symptoms of dry eye appear. Also, when side reactions occur, contact correction with soft lenses should be changed to orthokeratology lenses.

Keywords: correction, myopia, soft lenses, orthokeratology, spectacles, cornea, dry eye, side effects, refractive errors

Procedia PDF Downloads 25
16725 Calibration of the Radial Installation Limit Error of the Accelerometer in the Gravity Gradient Instrument

Authors: Danni Cong, Meiping Wu, Xiaofeng He, Junxiang Lian, Juliang Cao, Shaokun Cai, Hao Qin

Abstract:

The gravity gradient instrument (GGI) is the core of the gravity gradiometer, so the structural error of the sensor has a great impact on the measurement results. In order not to affect the intended measurement accuracy, a limit error is required for the installation of the accelerometer. In this paper, based on the established measuring principle model, the radial installation limit error is calibrated; this is taken as an example to provide a method for calculating the other installation limit errors under the premise of ensuring the accuracy of the measurement result. The method provides an approach for deriving the limit errors of the geometric structure of the sensor, laying the foundation for the mechanical precision design and the physical design.

Keywords: gravity gradient sensor, radial installation limit error, accelerometer, uniaxial rotational modulation

Procedia PDF Downloads 397