Search results for: computational assistance
1529 Biodiversity Conservation Practices Among Indigenous Peoples in Caraga Region, Mindanao, Philippines
Authors: Milagros S. Salibad, Levita B. Grana
Abstract:
The presence and role of Indigenous Peoples residing in key biodiversity, protected, and watershed areas within the ancestral domain in the Caraga Region hold immense significance. This study aimed to determine the level of biodiversity conservation practices among the Mamanwas, Manobos, and Higaonons, and to identify facilitating or hindering factors. Employing a mixed-method research design, 421 respondents participated through a researcher-made questionnaire. Focus group discussions, key informant interviews, researcher field notes, community immersions, and secondary sources were also used. The three groups demonstrated a high level of biodiversity conservation practices, manifesting their commitment to conserving their natural resources and ecosystems. These practices are evident in the selection and cutting of only mature trees for shelter and tribal use, and in the preservation of large trees believed to harbor ancestors’ spirits, which are honored through rituals (Mambabaja). Each group exhibited unique environmental practices shaped by their distinct cultures, traditions, customary knowledge, and access to information. The Mamanwa practiced traditional hunting and gathering using traps, the Manobo practiced shifting cultivation to maintain soil fertility and biodiversity, and the Higaonon managed forest resources through traditional forest management (the establishment of sacred forests and conservation areas). Various facilitating and hindering factors influenced their conservation efforts. Their traditional knowledge and practices, partnership and collaboration, legal recognition and support, access to information, and biodiversity monitoring systems facilitate these practices. Insufficient government assistance, political and social issues, scarce financial support, inadequate policy enforcement, lack of livelihood opportunities, and land use conflicts hinder them. Monitoring the sustainability of IPs' local biodiversity conservation practices is essential, as they contribute to conservation endeavors.
Keywords: biodiversity, conservation, indigenous peoples, traditional knowledge
Procedia PDF Downloads 78
1528 Application of Wavelet Based Approximation for the Solution of Partial Integro-Differential Equation Arising from Viscoelasticity
Authors: Somveer Singh, Vineet Kumar Singh
Abstract:
This work contributes a numerical method based on Legendre wavelet approximation for the treatment of a partial integro-differential equation (PIDE). Operational matrices of Legendre wavelets reduce the solution of the PIDE to a system of algebraic equations. Some useful results concerning the computational order of convergence and the error estimates associated with the suggested scheme are presented. Illustrative examples are provided to show the effectiveness and accuracy of the proposed numerical method.
Keywords: Legendre wavelets, operational matrices, partial integro-differential equation, viscoelasticity
Procedia PDF Downloads 450
1527 A Hebbian Neural Network Model of the Stroop Effect
Authors: Vadim Kulikov
Abstract:
The classical Stroop effect is the phenomenon that it takes more time to name the ink color of a printed word if the word denotes a conflicting color than if it denotes the same color. Over the last 80 years, there have been many variations of the experiment revealing various mechanisms behind semantic, attentional, behavioral and perceptual processing. The Stroop task is known to exhibit asymmetry. Reading the words out loud is hardly dependent on the ink color, but naming the ink color is significantly influenced by the incongruent words. This asymmetry is reversed, if instead of naming the color, one has to point at a corresponding color patch. Another debated aspects are the notions of automaticity and how much of the effect is due to semantic and how much due to response stage interference. Is automaticity a continuous or an all-or-none phenomenon? There are many models and theories in the literature tackling these questions which will be discussed in the presentation. None of them, however, seems to capture all the findings at once. A computational model is proposed which is based on the philosophical idea developed by the author that the mind operates as a collection of different information processing modalities such as different sensory and descriptive modalities, which produce emergent phenomena through mutual interaction and coherence. This is the framework theory where ‘framework’ attempts to generalize the concepts of modality, perspective and ‘point of view’. The architecture of this computational model consists of blocks of neurons, each block corresponding to one framework. In the simplest case there are four: visual color processing, text reading, speech production and attention selection modalities. In experiments where button pressing or pointing is required, a corresponding block is added. In the beginning, the weights of the neural connections are mostly set to zero. The network is trained using Hebbian learning to establish connections (corresponding to ‘coherence’ in framework theory) between these different modalities. The amount of data fed into the network is supposed to mimic the amount of practice a human encounters, in particular it is assumed that converting written text into spoken words is a more practiced skill than converting visually perceived colors to spoken color-names. After the training, the network performs the Stroop task. The RT’s are measured in a canonical way, as these are continuous time recurrent neural networks (CTRNN). The above-described aspects of the Stroop phenomenon along with many others are replicated. The model is similar to some existing connectionist models but as will be discussed in the presentation, has many advantages: it predicts more data, the architecture is simpler and biologically more plausible.Keywords: connectionism, Hebbian learning, artificial neural networks, philosophy of mind, Stroop
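A minimal sketch of the Hebbian learning step described in this abstract, assuming rate-coded units and fully connected modality blocks; the block names, sizes, and learning rate below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical activation vectors for two modality blocks ("frameworks"),
# e.g. a visual-color block and a speech-production block.
n_color, n_speech = 8, 8
W = np.zeros((n_speech, n_color))   # cross-modal weights start at zero
eta = 0.01                          # assumed learning rate

def hebbian_step(W, pre, post, eta):
    """Plain Hebbian update: strengthen weights between co-active units."""
    return W + eta * np.outer(post, pre)

# Simulated "practice": paired activations of the two blocks; more pairings
# for a mapping corresponds to a more practiced skill (e.g. word reading).
for _ in range(1000):
    color_act = rng.random(n_color)        # presynaptic activity
    speech_act = rng.random(n_speech)      # postsynaptic activity
    W = hebbian_step(W, color_act, speech_act, eta)
```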
Procedia PDF Downloads 270
1526 Cuckoo Search Optimization for Black Scholes Option Pricing
Authors: Manas Shah
Abstract:
The Black-Scholes option pricing model is one of the most important concepts in the modern world of computational finance. However, its practical use can be challenging, as one of the input parameters must be estimated: the implied volatility of the underlying security. The more precisely this value is estimated, the more accurate the corresponding estimates of theoretical option prices will be. Here, we present a novel model based on Cuckoo Search Optimization (CS), which finds more precise estimates of implied volatility than Particle Swarm Optimization (PSO) and the Genetic Algorithm (GA).
Keywords: black scholes model, cuckoo search optimization, particle swarm optimization, genetic algorithm
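A minimal sketch of recovering implied volatility by minimizing the squared Black-Scholes pricing error. The Lévy-flight random search below is a simplified stand-in for the full Cuckoo Search algorithm described in the abstract, and all market parameters are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def implied_vol_cuckoo(market_price, S, K, T, r, n_nests=15, iters=200, pa=0.25):
    """Simplified Cuckoo-Search-style search for the implied volatility."""
    rng = np.random.default_rng(1)
    err = lambda s: (bs_call(S, K, T, r, s) - market_price) ** 2
    nests = rng.uniform(0.01, 1.0, n_nests)            # candidate sigmas
    fitness = np.array([err(s) for s in nests])
    for _ in range(iters):
        # Levy-flight-style perturbation of each nest
        trial = np.clip(nests + 0.05 * rng.standard_cauchy(n_nests), 1e-4, 3.0)
        trial_fit = np.array([err(s) for s in trial])
        improve = trial_fit < fitness
        nests[improve], fitness[improve] = trial[improve], trial_fit[improve]
        # abandon a fraction pa of the worst nests
        worst = np.argsort(fitness)[-int(pa * n_nests):]
        nests[worst] = rng.uniform(0.01, 1.0, len(worst))
        fitness[worst] = np.array([err(s) for s in nests[worst]])
    return nests[np.argmin(fitness)]

# Invented example: observed call price 10.5 for S=100, K=100, T=1y, r=5%
print(implied_vol_cuckoo(10.5, S=100, K=100, T=1.0, r=0.05))
```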
Procedia PDF Downloads 453
1525 Effect of the Applied Bias on Miniband Structures in Dimer Fibonacci InAs/Ga1-xInxAs Superlattices
Authors: Z. Aziz, S. Terkhi, Y. Sefir, R. Djelti, S. Bentata
Abstract:
The effect of a uniform electric field across multibarrier systems (InAs/InxGa1-xAs) is exhaustively explored by a computational model using the exact Airy function formalism and the transfer-matrix technique. In the case of the biased DFHBSL structure, a strong reduction in transmission properties was observed, and the width of the miniband structure decreases linearly with increasing applied bias. This is due to the confinement of the states in the miniband structure, which becomes increasingly important (Wannier-Stark effect).
Keywords: dimer Fibonacci height barrier superlattices, singular extended state, exact Airy function, transfer matrix formalism
Procedia PDF Downloads 307
1524 Impact of Revenue Reform on Vulnerable Communities
Authors: Pauliasi Tony Fakahau
Abstract:
This paper provides an overview of the impact of the revenue reform programme on vulnerable communities in the Kingdom of Tonga. Economic turmoil and mismanagement during the late 1990s forced the government to seek technical and financial assistance from the Asian Development Bank to undertake a comprehensive Economic and Public Sector Reform (EPSR) programme. The EPSR is a Western model recommended by donor agencies as the solution to Tonga’s economic challenges. The EPSR programme included public sector reform, private sector growth, and revenue generation. Tax reform was the main tool for revenue generation, which set out to strengthen tax compliance and administration as well as implement a value-added consumption tax. The EPSR is based on Western values and ideology but failed to recognise that Tongan cultural values are important to the local community. Two participant groups were interviewed. Participant group one consisted of 51 people representing vulnerable communities. Participant group two consisted of six people from the government and business sector who were from the elite of Tongan society. The Kakala Research Methodology provided the framework for the research, and the Talanoa Research Method was used to conduct semi-structured interviews in the homes of the first group and in the workplaces of the second group. The research found a heavy burden of the consumption tax on the purchasing power of participant group one (vulnerable participants), having an impact on nearly every financial transaction they made. Participant group ones’ main financial priorities were kavenga fakalotu (obligations to the church), kavenga fakafāmili (obligations to the family) and kavenga fakafonua (obligations to cultural events for the village, nobility, and royalty). The findings identified inequalities of the revenue reform, especially from consumption tax, for vulnerable people and communities compared to the elite of society. The research concluded that government and donor agencies need ameliorating policies to reduce the burden of tax on vulnerable groups more susceptible to the impact of revenue reform.Keywords: tax reform, tonga vulnerable community revenue, revenue reform, public sector reform
Procedia PDF Downloads 132
1523 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection
Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye
Abstract:
Skew detection and correction form an important part of digital document analysis, because uncompensated skew can deteriorate document features and complicate further document image processing steps. Efficient text document analysis and digitization can rarely be achieved when a document is skewed even at a small angle. Once documents have been digitized through the scanning system and binarization has been achieved, document skew correction is required before further image analysis. Research efforts have been devoted to this area, with algorithms developed to eliminate document skew. Skew angle correction algorithms can be compared based on performance criteria, the most important being the accuracy of skew angle detection, the range of detectable skew angles, processing speed, computational complexity and, consequently, the memory space used. The standard Hough Transform has successfully been applied to text document skew angle estimation. However, the accuracy of the standard Hough Transform algorithm depends largely on how fine the angular step size is; increasing accuracy therefore consumes more time and memory space, especially where the number of pixels is considerably large. Whenever the Hough transform is used, there is always a trade-off between accuracy and speed, so a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough transform (HT) technique that optimizes space as well as time to robustly detect document skew is presented. The modified Hough Transform algorithm resolves the conflict between memory space, running time and accuracy. Our algorithm starts with an angle estimate accurate to zero decimal places using the standard Hough Transform, achieving minimal running time and space but limited accuracy. To increase accuracy, if the estimated angle found using the basic Hough algorithm is x degrees, we then run the basic algorithm again over the range around ±x degrees with an accuracy of one decimal place. The same process is iterated until the desired level of accuracy is achieved. Our skew estimation and correction procedure for text images is implemented in MATLAB. The memory space and processing time are also tabulated under the assumption of skew angles between 0° and 45°. The simulation results, demonstrated in MATLAB, show the high performance of our algorithm, with less computational time and memory space used in detecting document skew for a variety of documents with different levels of complexity.
Keywords: hough-transform, skew-detection, skew-angle, skew-correction, text-document
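A sketch of the coarse-then-fine angle search described above, using a plain NumPy Hough accumulator that only keeps the peak vote count. The ±1° refinement window, the 0.1° fine step and the 0–45° coarse range are illustrative choices, not the paper's exact parameters.

```python
import numpy as np

def hough_skew(binary_img, angles_deg):
    """Return the test angle whose Hough accumulator has the strongest peak."""
    ys, xs = np.nonzero(binary_img)                 # foreground (text) pixels
    diag = int(np.ceil(np.hypot(*binary_img.shape)))
    best_angle, best_peak = angles_deg[0], -1
    for a in angles_deg:
        t = np.deg2rad(a)
        rho = (xs * np.cos(t) + ys * np.sin(t)).astype(int) + diag  # offset keeps rho >= 0
        peak = np.bincount(rho, minlength=2 * diag + 1).max()
        if peak > best_peak:
            best_angle, best_peak = a, peak
    return best_angle

def detect_skew(binary_img, max_angle=45.0, fine_step=0.1):
    # Pass 1: whole-degree resolution over the assumed 0..45 degree range
    coarse = hough_skew(binary_img, np.arange(0.0, max_angle + 1.0, 1.0))
    # Pass 2: refine around the coarse estimate with a finer angular step
    fine_range = np.arange(coarse - 1.0, coarse + 1.0 + fine_step, fine_step)
    return hough_skew(binary_img, fine_range)
```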
Procedia PDF Downloads 159
1522 Ergonomic Adaptations in Visually Impaired Workers - A Literature Review
Authors: Kamila Troper, Pedro Mestre, Maria Lurdes Menano, Joana Mendonça, Maria João Costa, Sandra Demel
Abstract:
Introduction: Visual impairment is a problem that affects hundreds of thousands of people all over the world. Although it is possible for a visually impaired person to do most jobs, the right training, technological assistance, and emotional support are essential. Ergonomics can solve many of these problems through the relatively simple adjustment of positioning, lighting and workplace design; a little forethought can make a tremendous difference to the ease with which a person with an impairment functions. Objectives: To review the main ergonomic adaptation measures reported in the literature in order to promote better working conditions and safety measures for the visually impaired. Methodology: This was an exploratory-descriptive, qualitative systematic literature review. The main databases used were PubMed, BIREME and LILACS, with articles and studies published between 2000 and 2021. Results: Based on the theoretical principles of ergonomic analysis of work, the main adaptations of the physical space of the workstations were: accessibility facilities and assistive technologies; a screen reader that captures information from a computer and sends it in real time to a speech synthesizer or Braille terminal; installation of software with voice recognition; monitors with enlarged screens; magnification software; adequate lighting and magnifying lenses; in addition to recommendations regarding signage and clearance of the places through which the visually impaired pass. Conclusions: Employability rates for people with visual impairments (both those who are blind and those who have low vision) are low and continue to be a concern worldwide and for researchers as a topic of international interest. Although numerous authors have identified barriers to employment and proposed strategies to remediate or circumvent those barriers, people with visual impairments continue to experience high rates of unemployment.
Keywords: ergonomic adaptations, visual impairments, ergonomic analysis of work, systematic review
Procedia PDF Downloads 183
1521 Computational and Experimental Determination of Acoustic Impedance of Internal Combustion Engine Exhaust
Authors: A. O. Glazkov, A. S. Krylova, G. G. Nadareishvili, A. S. Terenchenko, S. I. Yudin
Abstract:
The topic of the presented materials concerns the design of the exhaust system for a certain internal combustion engine. The exhaust system can be divided into two parts. The first is the engine exhaust manifold, turbocharger, and catalytic converters, which are called “hot part.” The second part is the gas exhaust system, which contains elements exclusively for reducing exhaust noise (mufflers, resonators), the accepted designation of which is the "cold part." The design of the exhaust system from the point of view of acoustics, that is, reducing the exhaust noise to a predetermined level, consists of working on the second part. Modern computer technology and software make it possible to design "cold part" with high accuracy in a given frequency range but with the condition of accurately specifying the input parameters, namely, the amplitude spectrum of the input noise and the acoustic impedance of the noise source in the form of an engine with a "hot part". Getting this data is a difficult problem: high temperatures, high exhaust gas velocities (turbulent flows), and high sound pressure levels (non-linearity mode) do not allow the calculated results to be applied with sufficient accuracy. The aim of this work is to obtain the most reliable acoustic output parameters of an engine with a "hot part" based on a complex of computational and experimental studies. The presented methodology includes several parts. The first part is a finite element simulation of the "cold part" of the exhaust system (taking into account the acoustic impedance of radiation of outlet pipe into open space) with the result in the form of the input impedance of "cold part". The second part is a finite element simulation of the "hot part" of the exhaust system (taking into account acoustic characteristics of catalytic units and geometry of turbocharger) with the result in the form of the input impedance of the "hot part". The next third part of the technique consists of the mathematical processing of the results according to the proposed formula for the convergence of the mathematical series of summation of multiple reflections of the acoustic signal "cold part" - "hot part". This is followed by conducting a set of tests on an engine stand with two high-temperature pressure sensors measuring pulsations in the nozzle between "hot part" and "cold part" of the exhaust system and subsequent processing of test results according to a well-known technique in order to separate the "incident" and "reflected" waves. The final stage consists of the mathematical processing of all calculated and experimental data to obtain a result in the form of a spectrum of the amplitude of the engine noise and its acoustic impedance.Keywords: acoustic impedance, engine exhaust system, FEM model, test stand
Procedia PDF Downloads 59
1520 Characterization of the Music Admission Requirements and Evaluation of the Relationship among Motivation and Performance Achievement
Authors: Antonio M. Oliveira, Patricia Oliveira-Silva, Jose Matias Alves, Gary McPherson
Abstract:
The music teaching is oriented towards offering formal music training. Due to its specificities, this vocational program starts at a very young age. Although provided by the State, the offer is limited to 6 schools throughout the country, which means that the vacancies for prospective students are very limited every year. It is therefore crucial that these vacancies be taken by especially motivated children grown within households that offer the ideal setting for success. Some of the instruments used to evaluate musical performance are highly sensitive to specific previous training, what represents a severe validity problem for testing children who have had restricted opportunities for formal training. Moreover, these practices may be unfair because, for instance, they may not reflect the candidates’ music aptitudes. Based on what constitutes a prerequisite for making an excellent music student, researchers in this field have long argued that motivation, task commitment, and parents’ support are as important as ability. Thus, the aim of this study is: (1) to prepare an inventory of admission requirements in Australia, Portugal and Ireland; (2) to examine whether the candidates to music conservatories and parents’ level of motivation, assessed at three evaluation points (i.e., admission, at the end of the first year, and at the end of the second year), correlates positively with the candidates’ progress in learning a musical instrument (i.e., whether motivation at the admission may predict student musicianship); (3) an adaptation of an existing instrument to assess the motivation (i.e., to adapt the items to the music setting, focusing on the motivation for playing a musical instrument). The inclusion criteria are: only children registered in the administrative services to be evaluated for entrance to the conservatory will be accepted for this study. The expected number of participants is fifty (5-6 years old) in all the three frequency schemes: integrated, articulated and supplementary. Revisiting musical admission procedures is of particular importance and relevance to musical education because this debate may bring guidance and assistance about the needed improvement to make the process of admission fairer and more transparent.Keywords: music learning, music admission requirements, student’s motivation, parent’s motivation
Procedia PDF Downloads 168
1519 Monte Carlo Methods and Statistical Inference of Multitype Branching Processes
Authors: Ana Staneva, Vessela Stoimenova
Abstract:
A parametric estimation of the multitype branching process (MBP) with a power series offspring distribution family is considered in this paper. The MLE for the parameters is obtained in the case when the observable data are incomplete and consist only of the generation sizes of the family tree of the MBP. The parameter estimates are computed using the Monte Carlo EM algorithm. The estimates of the posterior distribution and of the offspring distribution parameters are calculated using the Bayesian approach and the Gibbs sampler. The article presents various examples with bivariate branching processes, together with computational results, simulations and an implementation in R.
Keywords: Bayesian, branching processes, EM algorithm, Gibbs sampler, Monte Carlo methods, statistical estimation
Procedia PDF Downloads 422
1518 The Experiences and Needs of Fathers’ of Children With Cancer in Coping With the Child's Illness
Authors: Karina Lõbus, Silver Muld, Kadri Kööp, Mare Tupits
Abstract:
Aim: The aim of the research is to describe the experiences and needs of fathers’ of children with cancer in coping with the child's disease. Background: Today, about 80% of children diagnosed with malignancy in developed countries survive. Despite the positive statistics, recovery is not always certain, treatment is often very intensive and long-term. Cancer is affecting an increasing number of the population, which is increasing the demand for quality care, but the nature of expected care is currently unclear. This topic is important for the development of professional practice, as nurses complain that their knowledge to deal with the relatives of a patient with a difficult diagnosis is limited and would therefore like additional information to deal with the situation. Design: Qualitative, empirical, descriptive research. Method: The data were collected through semi-structured interviews and analysed by inductive content analysis method. Interviews were conducted during Autumn 2020. 4 subjects participated in the research. Results and Conclusions: The thesis revealed that fathers had different experiences and needs in dealing with the child's illness. Fathers' experiences of coping with child's diseases encompassed experiences with information, social relationships, healthcare, changes in personal health and experiences regarding the child. Regarding information, the respondents pointed out bad experiences with the availability of information and the ability to convey the necessary information. Experiences regarding social relationships included experiences with relatives and strangers. Regarding healthcare, fathers mentioned experiences related to the child's health and healthcare professionals. In regards to personal health, fathers pointed out negative changes in their mental and physical health. In relation to the child, the subjects revealed experiences regarding changed values, way of life and raising the child. According to the research, fathers’ needs in relation to dealing with child's cancer included material, social, and spiritual needs. In regard to material needs, fathers pointed out the need for state assistance and the needs related to the surrounding environment. The needs concerning social belonging involved needs for a driving force and involvement in the treatment process. Regarding spiritual needs, fathers expressed mixed feelings towards the need for outside and professional help.Keywords: father, coping, cancer, child, experience, need
Procedia PDF Downloads 136
1517 Automated Manual Handling Risk Assessments: Practitioner Experienced Determinants of Automated Risk Analysis and Reporting Being a Benefit or Distraction
Authors: S. Cowley, M. Lawrance, D. Bick, R. McCord
Abstract:
Technology that automates manual handling (musculoskeletal disorder, or MSD) risk assessments is increasingly available to ergonomists, engineers and generalist health and safety practitioners alike. The risk assessment process is generally based on the use of wearable motion sensors that capture information about worker movements for real-time or post-hoc analysis. Traditionally, MSD risk assessment is undertaken with the assistance of a checklist, such as that from the SafeWork Australia code of practice, with the expert assessor observing the task and ideally engaging with the worker in a discussion about the detail. Automation enables the non-expert to complete assessments and does not always require the assessor to be present. This clearly has cost and time benefits for the practitioner, but is it an improvement on assessment by a human? Human risk assessments draw on the knowledge and expertise of the assessor but, like all risk assessments, are highly subjective. The complexity of the checklists and models used in the process can be off-putting and sometimes leads to the assessment becoming the focus and the end rather than a means to an end; the focus on risk control is lost. Automated risk assessment handles the complexity of the assessment for the assessor and delivers a simple risk score that enables decision-making regarding risk control. Being machine-based, it is objective and will deliver the same result each time it assesses an identical task. However, the WHS professional needs to know that this emergent technology asks the right questions and delivers the right answers, and whether it improves the risk assessment process and results or simply distances the professional from the task and the worker. They need clarity as to whether automation of manual task risk analysis and reporting leads to risk control or to a focus on the worker. Critically, they need evidence as to whether automation in this area of hazard management leads to better risk control or just a bigger collection of assessments. Practitioner-experienced determinants of this automated manual task risk analysis and reporting being a benefit or a distraction will build an understanding of emergent risk assessment technology, its use, and things to consider when making decisions about adopting and applying these technologies.
Keywords: automated, manual-handling, risk-assessment, machine-based
Procedia PDF Downloads 120
1516 CFD Effect of the Tidal Grating in Opposite Directions
Authors: N. M. Thao, I. Dolguntseva, M. Leijon
Abstract:
Flow blockages, which refer to devices that increase the flow, are considered vital equipment for marine current energy conversion. However, the shape of these devices affects the energy extracted during operation. The present work investigates the effect of two configurations of a grating located upstream, convergent and divergent, on the water flow velocity. Computational Fluid Dynamics simulations study the flow characteristics using the ANSYS Fluent solver for these specified arrangements of the grating. The results indicate distinct differences in flow velocity between the "convergent" and "divergent" grating placements under confined conditions. Furthermore, the velocity in the case of the convergent grating is higher than that of the divergent grating.
Keywords: marine current energy, converter, turbine grating, RANS simulation, water flow velocity
Procedia PDF Downloads 410
1515 Numerical Investigation of Flow Past in a Staggered Tube Bundle
Authors: Kerkouri Abdelkadir
Abstract:
Numerical calculation of turbulent flows is one of the most prominent modern interests in various engineering applications. Given the difficulty of predicting and studying such flows with computational fluid dynamics (CFD), in this paper we present a numerical study of the flow past a staggered tube bundle, using the CFD code ANSYS FLUENT with several turbulence models: the k-ε, k-ω and SST approaches. The flow is modeled based on experimental studies. The predictions of mean velocities are in very good agreement with detailed LDA (Laser Doppler Anemometry) measurements performed at 8 stations along the depth of the array. The sizes of the recirculation zones behind the cylinders are also predicted. The simulations are conducted for a Reynolds number of 12858, chosen to match the experimental results.
Keywords: flow, tube bundle, ANSYS Fluent, CFD, turbulence, LDA, RANS (k-ε, k-ω, SST)
Procedia PDF Downloads 166
1514 Fast and Accurate Finite-Difference Method Solving Multicomponent Smoluchowski Coagulation Equation
Authors: Alexander P. Smirnov, Sergey A. Matveev, Dmitry A. Zheltkov, Eugene E. Tyrtyshnikov
Abstract:
We propose a new computational technique for the multidimensional (multicomponent) Smoluchowski coagulation equation. Using low-rank approximations in Tensor Train format of both the solution and the coagulation kernel, we accelerate the classical finite-difference Runge-Kutta scheme while keeping its level of accuracy. The complexity of the underlying finite-difference scheme is reduced from O(N^(2d)) to O(d^2 N log N), where N is the number of grid nodes and d is the dimensionality of the problem. The efficiency and accuracy of the new method are demonstrated on a concrete problem with a known analytical solution.
Keywords: tensor train decomposition, multicomponent Smoluchowski equation, runge-kutta scheme, convolution
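For context, a single-component illustration of the direct coagulation sum whose multicomponent analogue has the O(N^(2d)) cost that the low-rank kernel approximation is meant to avoid. The constant kernel, the grid and the explicit Euler time step are invented for the demonstration; the Tensor Train acceleration itself is not shown.

```python
import numpy as np

def smoluchowski_rhs(n, kernel):
    """Direct evaluation of the coagulation operator on a uniform size grid.

    n[i] is the concentration of particles of size i+1; the double sum below
    is the quadratic cost that low-rank kernel approximations remove.
    """
    N = len(n)
    dndt = np.zeros(N)
    for i in range(N):
        # gain: pairs of sizes (j+1) + (i-j) forming size i+1
        gain = 0.5 * sum(kernel[j, i - j - 1] * n[j] * n[i - j - 1]
                         for j in range(i))
        # loss: particles of size i+1 colliding with anything
        loss = n[i] * np.dot(kernel[i], n)
        dndt[i] = gain - loss
    return dndt

N = 64
K = np.ones((N, N))             # constant kernel, an invented test case
n0 = np.zeros(N); n0[0] = 1.0   # monodisperse initial condition
dt = 1e-3
n1 = n0 + dt * smoluchowski_rhs(n0, K)   # one explicit Euler step (for brevity)
```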
Procedia PDF Downloads 433
1513 The Effect of Tax Evasion and Avoidance on Somalia’s Economy
Authors: Mohamed Salad Ahmed
Abstract:
This study explores the impact of tax evasion and avoidance on the economy of Somalia. Somalia's economy is largely informal and cash-based, making it challenging to accurately assess the extent of tax evasion and avoidance. However, it is widely recognized that these practices have significant negative effects on the economy, including reduced government revenue, an uneven playing field for businesses, corruption, and a lack of access to international aid and investment. The study focuses on identifying strategies and solutions to reduce tax evasion and avoidance and increase revenue collection. This includes improving the government's capacity to enforce tax laws and regulations, creating a more transparent and accountable tax system, and increasing public awareness of the importance of paying taxes. By addressing these issues, Somalia can improve its economic stability and enhance its ability to provide essential public services, reduce poverty, and promote growth and development. Tax evasion and avoidance have a significant negative impact on the economy of Somalia. The informal nature of the country's economy and the difficulty in accurately assessing the extent of tax evasion and avoidance make it challenging to address these issues effectively. The lack of government revenue resulting from tax evasion and avoidance makes it difficult for the government to fund essential services, leading to a decline in the quality of public services and hindering economic growth. Tax evasion and avoidance also create an uneven playing field for businesses, discourage investment, contribute to corruption, and undermine the rule of law. Additionally, tax evasion and avoidance can make it more difficult for Somalia to access international aid and investment. Addressing these issues will require a concerted effort by the government to strengthen tax collection and enforcement, as well as by the international community to provide technical assistance and support. This abstract highlights the importance of addressing tax evasion and avoidance in Somalia and the potential benefits of doing so.Keywords: tax evasion, tax avoidance, Somalia economy, revenue collection, informal economy, corruption economic growth, investment, tax policy, tax administration, governance, private sector
Procedia PDF Downloads 18
1512 Nitrogen Effects on Ignition Delay Time in Supersonic Premixed and Diffusion Flames
Authors: A. M. Tahsini
Abstract:
A computational study of two-dimensional supersonic reacting hydrogen-air flows is performed to investigate the effects of nitrogen on the ignition delay time of premixed and diffusion flames. The chemical reaction is treated using detailed kinetics, and the advection upstream splitting method is used to calculate the numerical inviscid fluxes. The results show that only in the stoichiometric condition, for both premixed and diffusion flames, is there a monotone dependence of the ignition delay time on nitrogen addition. In other situations, the optimal condition from the ignition viewpoint should be found using numerical investigations.
Keywords: diffusion flame, ignition delay time, mixing layer, numerical simulation, premixed flame, supersonic flow
Procedia PDF Downloads 463
1511 Environmental Impacts on the British Era Structures of Faisalabad-a Detailed Study of the Clock Tower of Faisalabad
Authors: Bazla Manzoor, Aqsa Yasin
Abstract:
Pakistan is a country that is progressing by leaps and bounds through agricultural and industrial growth. The area that generates the largest income through industrial activities is Faisalabad, in the Province of Punjab. Faisalabad's main occupations are agriculture and industry. As these sectors develop day by day, they earn considerable income for the country and generate thousands of jobs. On one hand, the city of Faisalabad is on the way to development through industrial growth, while on the other hand this industrial growth is producing adverse impacts on the environment. In turn, the damaged environment affects the people and the built environment. This research is chiefly based on one of the above-mentioned factors, i.e., adverse environmental impacts on built structures. Faisalabad is an old city and therefore has many old structures, especially from the British Era. Many of those structures are still surviving and function as government, private and public buildings. However, these structures are deteriorating with the passage of time due to poor maintenance and adverse environmental impacts. Poor maintenance is a factor that can be controlled by financial assistance and management. The factor that needs to be seriously considered is the other one, i.e., the adverse environmental impacts on the British Era structures of the city, because this factor requires controlled and refined human activities and actions. For this reason, research was required to conserve the British Era structures of Faisalabad so that these structures can function well. The other reason to conserve them is that these structures are historically important and are the heritage of the city. For this research, the literature available in the libraries of the city was reviewed. The Department of Environment, the Town Municipal Administration, the Faisalabad Development Authority and the Lyallpur Heritage Foundation were visited to collect the existing data. Various British Era structures were also visited to record the environmental impacts on them. Of all the structures, the Clock Tower was studied in depth, as it is one of the oldest and most important heritage structures of the city, because the earlier settlements of the city were planned by the British Government based on its location. Architectural and environmental analyses were carried out for the Clock Tower. This research study identified the deterioration factors of the tower, according to which suggestions have been made.
Keywords: lyallpur, heritage, architecture, environment
Procedia PDF Downloads 303
1510 Evaluation of Public Library Adult Programs: Use of Servqual and Nippa Assessment Standards
Authors: Anna Ching-Yu Wong
Abstract:
This study aims to identify the quality and effectiveness of the adult programs provided by the public library using the ServQUAL Method and the National Library Public Programs Assessment guidelines (NIPPA, June 2019). ServQUAl covers several variables, namely: tangible, reliability, responsiveness, assurance, and empathy. NIPPA guidelines focus on program characteristics, particularly on the outcomes – the level of satisfaction from program participants. The reached populations were adults who participated in library adult programs at a small-town public library in Kansas. This study was designed as quantitative evaluative research which analyzed the quality and effectiveness of the library adult programs by analyzing the role of each factor based on ServQUAL and the NIPPA's library program assessment guidelines. Data were collected from November 2019 to January 2020 using a questionnaire with a Likert Scale. The data obtained were analyzed in a descriptive quantitative manner. The impact of this research can provide information about the quality and effectiveness of existing programs and can be used as input to develop strategies for developing future adult programs. Overall the result of ServQUAL measurement is in very good quality, but still, areas need improvement and emphasis in each variable: Tangible Variables still need improvement in indicators of the temperature and space of the meeting room. Reliability Variable still needs improvement in the timely delivery of the programs. Responsiveness Variable still needs improvement in terms of the ability of the presenters to convey trust and confidence from participants. Assurance Variables still need improvement in the indicator of knowledge and skills of program presenters. Empathy Variable still needs improvement in terms of the presenters' willingness to provide extra assistance. The result of program outcomes measurement based on NIPPA guidelines is very positive. Over 96% of participants indicated that the programs were informative and fun. They learned new knowledge and new skills and would recommend the programs to their friends and families. They believed that together, the library and participants build stronger and healthier communities.Keywords: ServQual model, ServQual in public libraries, library program assessment, NIPPA library programs assessment
Procedia PDF Downloads 97
1509 Reliability Based Topology Optimization: An Efficient Method for Material Uncertainty
Authors: Mehdi Jalalpour, Mazdak Tootkaboni
Abstract:
We present a computationally efficient method for reliability-based topology optimization under material property uncertainty, which is assumed to be lognormally distributed and correlated within the domain. Computational efficiency is achieved by estimating the response statistics with second-order stochastic perturbation, using these statistics to fit an appropriate distribution that follows the empirical distribution of the response, and employing an efficient gradient-based optimizer. The proposed algorithm is utilized for the design of new structures, and the changes in the optimized topology are discussed for various levels of target reliability and correlation strength. Predictions were verified through comparison with results obtained using Monte Carlo simulation.
Keywords: material uncertainty, stochastic perturbation, structural reliability, topology optimization
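A minimal sketch of the Monte Carlo verification step mentioned at the end of the abstract, assuming an uncorrelated lognormal material property and a generic limit-state function; the property values, the toy limit state and its constants are invented, and the perturbation-based estimator itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented lognormal material property (e.g. Young's modulus) with
# mean 200 GPa and a 10% coefficient of variation.
mean_E, cov_E = 200e9, 0.10
sigma_ln = np.sqrt(np.log(1.0 + cov_E**2))
mu_ln = np.log(mean_E) - 0.5 * sigma_ln**2

def limit_state(E, load=1.0e6, allowable_disp=2.6e-3):
    """Hypothetical limit state g = allowable - response; g < 0 means failure."""
    stiffness = E * 2.5e-3          # invented proportionality for a toy structure
    return allowable_disp - load / stiffness

n_samples = 200_000
E_samples = rng.lognormal(mu_ln, sigma_ln, n_samples)
pf = np.mean(limit_state(E_samples) < 0.0)    # Monte Carlo failure probability
print(f"estimated failure probability: {pf:.4e}")
```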
Procedia PDF Downloads 606
1508 Aerodynamic Analysis of a Frontal Deflector for Vehicles
Authors: C. Malça, N. Alves, A. Mateus
Abstract:
This work was one of the tasks of the Manufacturing2Client project, whose objective was to develop a frontal deflector to be commercialized in the automotive industry, using new design and manufacturing methods. In this task, in particular, it was proposed to develop the ability to computationally predict the aerodynamic influence of the flow around vehicles, in an effort to reduce fuel consumption in vehicles from class 3 to 8. With this aim, two deflector models were developed and their aerodynamic performance analyzed. The aerodynamic study was carried out using the Computational Fluid Dynamics (CFD) software Ansys CFX and allowed the calculation of the drag coefficient caused by the vehicle motion for the different configurations considered. Moreover, the reduction of diesel consumption and carbon dioxide (CO2) emissions associated with the optimized deflector geometry could be assessed.
Keywords: aerodynamic analysis, CFD, CO2 emissions, drag coefficient, frontal deflector, fuel consumption
Procedia PDF Downloads 407
1507 Groundwater Potential Delineation Using Geodetector Based Convolutional Neural Network in the Gunabay Watershed of Ethiopia
Authors: Asnakew Mulualem Tegegne, Tarun Kumar Lohani, Abunu Atlabachew Eshete
Abstract:
Groundwater potential delineation is essential for efficient water resource utilization and long-term development. The scarcity of potable and irrigation water has become a critical issue, as natural and anthropogenic activities strain the demands of human survival and productivity. Under these constraints, groundwater resources are now being used extensively in Ethiopia. Therefore, an innovative convolutional neural network (CNN) is successfully applied in the Gunabay watershed to delineate groundwater potential based on the selected major influencing factors. Groundwater recharge, lithology, drainage density, lineament density, transmissivity, and geomorphology were selected as the major influencing factors for the groundwater potential of the study area. Of the total 128 samples, 70% were selected for dataset training and 30% were used for testing. The spatial distribution of groundwater potential has been classified into five groups: very low (10.72%), low (25.67%), moderate (31.62%), high (19.93%), and very high (12.06%). The area receives high rainfall but has a very low amount of recharge due to a lack of proper soil and water conservation structures. The major outcome of the study showed that moderate and low potential are dominant. Geodetector results revealed that the magnitude of influence on groundwater potential is ranked as transmissivity (0.48), recharge (0.26), lineament density (0.26), lithology (0.13), drainage density (0.12), and geomorphology (0.06). The model results showed that, using a convolutional neural network (CNN), groundwater potentiality can be delineated with higher predictive capability and accuracy. The CNN-based AUC validation showed accuracies of 81.58% and 86.84% for training and testing, respectively. Based on the findings, the local government can receive technical assistance for groundwater exploration and sustainable water resource development in the Gunabay watershed. Finally, the use of a Geodetector-based deep learning algorithm can provide a new platform for industrial sectors, groundwater experts, scholars, and decision-makers.
Keywords: CNN, Geodetector, groundwater influencing factors, groundwater potential, Gunabay watershed
Procedia PDF Downloads 23
1506 Facial Emotion Recognition Using Deep Learning
Authors: Ashutosh Mishra, Nikhil Goyal
Abstract:
A 3D facial emotion recognition model based on deep learning is proposed in this paper. Two convolution layers and a pooling layer are employed in the deep learning architecture; pooling is applied after the convolution process. The probabilities for the various classes of human facial expressions are calculated using the sigmoid activation function. To verify the efficiency of deep learning-based systems, a set of face images from the Kaggle dataset is used to verify the accuracy of the deep learning-based face recognition model. The model's accuracy is about 65 percent, which is lower than that of other facial expression recognition techniques, despite significant gains in representation precision due to the nonlinearity of deep image representations.
Keywords: facial recognition, computational intelligence, convolutional neural network, depth map
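A minimal sketch of the architecture described above (two convolution layers, one pooling layer, sigmoid output). The 48x48 grayscale input size, filter counts, and the seven emotion classes are assumptions based on common Kaggle facial-expression data, not values stated by the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

num_classes = 7          # assumed emotion classes (FER-style labels)
model = models.Sequential([
    layers.Input(shape=(48, 48, 1)),               # assumed grayscale input size
    layers.Conv2D(32, (3, 3), activation="relu"),  # first convolution layer
    layers.Conv2D(64, (3, 3), activation="relu"),  # second convolution layer
    layers.MaxPooling2D(pool_size=(2, 2)),         # pooling after convolution
    layers.Flatten(),
    layers.Dense(num_classes, activation="sigmoid")  # per-class probabilities
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```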
Procedia PDF Downloads 231
1505 CFD Simulations to Study the Cooling Effects of Different Greening Modifications
Authors: An-Shik Yang, Chih-Yung Wen, Chiang-Ho Cheng, Yu-Hsuan Juan
Abstract:
The objective of this study is to conduct computational fluid dynamics (CFD) simulations to evaluate the cooling efficacy of vegetation planted in a public park in Taipei, Taiwan. Probing the impacts on urban microclimates of park renewal, by means of adding three pavilions and supplementary green areas, the simulated results reveal that a park with a higher green coverage ratio (GCR) tends to experience a better cooling effect. These findings can be used to explore the effects of different greening modifications on urban environments for achieving effective thermal comfort in urban public spaces.
Keywords: CFD simulations, green coverage ratio, urban heat island, urban public park
Procedia PDF Downloads 494
1504 A Method for Improving the Embedded Runge Kutta Fehlberg 4(5)
Authors: Sunyoung Bu, Wonkyu Chung, Philsu Kim
Abstract:
In this paper, we introduce a method for improving the embedded Runge-Kutta-Fehlberg 4(5) method. At each integration step, the proposed method comprises two equations, one for the solution and one for the error. The solution and error are obtained by solving an initial value problem whose solution carries the error information at each integration step. The constructed algorithm controls both the error and the time step size simultaneously and performs well in terms of computational cost compared to the original method. To assess its effectiveness, the EULR problem is solved numerically.
Keywords: embedded Runge-Kutta-Fehlberg method, initial value problem, EULR problem, integration step
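For context, a sketch of the accept/reject and step-size control mechanism that embedded pairs such as RKF 4(5) rely on, shown with a low-order Euler/Heun 1(2) pair to keep the tableau short. It does not include the paper's auxiliary error equation, and the tolerance and safety factor are conventional choices, not the authors'.

```python
import numpy as np

def embedded_step(f, t, y, h):
    """One step of the embedded Euler/Heun 1(2) pair: returns the 2nd-order
    solution and a local error estimate (difference between the two orders)."""
    k1 = f(t, y)
    k2 = f(t + h, y + h * k1)
    y_low = y + h * k1                    # 1st-order (Euler) solution
    y_high = y + 0.5 * h * (k1 + k2)      # 2nd-order (Heun) solution
    return y_high, np.abs(y_high - y_low)

def adaptive_solve(f, t0, y0, t_end, h=0.1, tol=1e-6, safety=0.9):
    """Accept/reject steps and rescale h from the error estimate, as in RKF 4(5)."""
    t, y = t0, np.asarray(y0, dtype=float)
    while t < t_end:
        h = min(h, t_end - t)
        y_new, err = embedded_step(f, t, y, h)
        if np.max(err) <= tol:            # accept the step
            t, y = t + h, y_new
        # rescale the step from the error estimate (exponent 1/2 for this pair)
        h *= safety * (tol / max(np.max(err), 1e-16)) ** 0.5
    return t, y

# Example: y' = -2y, y(0) = 1; exact solution exp(-2t)
print(adaptive_solve(lambda t, y: -2.0 * y, 0.0, [1.0], 1.0))
```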
Procedia PDF Downloads 465
1503 Simplifying Writing Composition to Assist Students in Rural Areas: An Experimental Study for the Comparison of Guided and Unguided Instruction
Authors: Neha Toppo
Abstract:
Methods and strategies of teaching instruction strongly influence students' learning. In second language teaching, a number of approaches and methods have been suggested by different scholars and researchers over time. The present article deals with the role of teaching instruction in developing students' ability to compose in writing. It focuses on secondary-level students in rural areas, whose exposure to English is limited and who face challenges even in simple compositions. Students up to high school struggle to write formal letters, applications, essays, paragraphs, etc. They have problems with note making and with writing examination answers in their own words, and they depend fully on rote learning. It is difficult for them to give language to their own ideas. Teaching writing composition deserves special attention, as writing is an integral part of language learning, and students at this level are expected to have sound compositional ability, which is useful in numerous domains. Effective instruction could help students learn self-expression, correct selection of vocabulary and grammar, contextual writing, and the composition of formal and informal texts. Writing is not limited to school but continues to be important in various other fields, such as newspapers and magazines, official work, legislative work, material writing, academic writing, and personal writing. The study is based on an experimental method, which hypothesizes that guided instruction will be more effective in teaching writing composition than the usual instruction in which students are left to compose on their own without any help. In the test, students of one section are asked to write an essay on a given topic without guidance, and students of another section are asked to write on the same topic with the assistance of guided instruction, in which they are provided with some vocabulary and sentence structures. This process is repeated in a few more schools to obtain generalizable data. The study shows the difference in students' performance under both types of instruction, guided and unguided. The study concludes with the finding that the students' writing skills are quite poor, but with the help of guided instruction they perform better. The students need better teaching instruction to develop their writing skills.
Keywords: composition, essay, guided instruction, writing skill
Procedia PDF Downloads 281
1502 Environmental Monitoring by Using Unmanned Aerial Vehicle (UAV) Images and Spatial Data: A Case Study of Mineral Exploitation in Brazilian Federal District, Brazil
Authors: Maria De Albuquerque Bercot, Caio Gustavo Mesquita Angelo, Daniela Maria Moreira Siqueira, Augusto Assucena De Vasconcellos, Rodrigo Studart Correa
Abstract:
Mining is an important socioeconomic activity in Brazil, although it negatively impacts the environment. Mineral operations cause irreversible changes in topography, removal of vegetation and topsoil, habitat destruction, displacement of fauna, loss of biodiversity, soil erosion and siltation of watercourses, and have the potential to contribute to climate change. Due to these impacts and its pollution potential, mining activity in Brazil is legally subject to environmental licensing. Unlicensed mining operations, or operations that do not abide by the terms of an obtained license, are treated as environmental crimes in the country. This work reports a case analyzed in the Forensic Institute of the Brazilian Federal District Civil Police. The case consisted of detecting illegal aspects of sand exploitation from a licensed mine in the Federal District, near the city of Brasilia. The fieldwork covered an area of roughly 6 ha, which was surveyed with an unmanned aerial vehicle (UAV) (PHANTOM 3 ADVANCED). The overflight with the UAV took about 20 min, with a maximum flight height of 100 m. 592 georeferenced UAV images were obtained and processed in photogrammetric software (AGISOFT PHOTOSCAN 1.1.4), which generated a mosaic of georeferenced images and a 3D model in less than six working hours. The 3D model was analyzed in forensic software for accurate modeling and volumetric analysis (MAPTEK I-SITE FORENSIC 2.2). To ensure the 3D model was a true representation of the mine site, coordinates of ten control points and reference measures were taken during fieldwork and compared to the respective spatial data in the model. Finally, these spatial data were used for measuring the mining area, excavation depth and volume of exploited sand. Results showed that the mine holder had not complied with some terms and conditions stated in the granted license, such as sand exploration beyond the authorized extent, depth and volume. The ease, accuracy and speed of the procedures used in this case highlight the use of UAV imagery and computational photogrammetry as efficient tools for outdoor forensic exams, especially on environmental issues.
Keywords: computational photogrammetry, environmental monitoring, mining, UAV
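A small sketch of the kind of volumetric check that closes this workflow, computed from a gridded surface model. The cell size, reference elevation and synthetic pit below are invented, and the actual case used MAPTEK I-SITE FORENSIC rather than this calculation.

```python
import numpy as np

def excavation_volume(dem, reference_level, cell_size):
    """Volume removed below a reference surface, from a gridded DEM (cut only)."""
    depth = np.clip(reference_level - dem, 0.0, None)   # positive where excavated
    return depth.sum() * cell_size**2                    # m^3 if inputs are in metres

# Synthetic 100 x 100 m pit on a 1 m grid, up to 4 m deep (invented numbers)
x, y = np.meshgrid(np.linspace(-1, 1, 100), np.linspace(-1, 1, 100))
dem = 100.0 - 4.0 * np.exp(-5.0 * (x**2 + y**2))   # pre-mining ground at 100 m
volume = excavation_volume(dem, reference_level=100.0, cell_size=1.0)
print(f"excavated volume: {volume:.0f} m^3")
```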
Procedia PDF Downloads 319
1501 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering
Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi
Abstract:
In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA. Accordingly, identifying the tissue origin of these DNA fragments from the plasma can result in more accurate and fast disease diagnosis and precise treatment protocols. Open chromatin regions are important epigenetic features of DNA that reflect cell types of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. There have been several studies in the area of cancer liquid biopsy that integrate distinct genomic and epigenomic features for early cancer detection along with tissue of origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which will lead to a huge amount of cost and time. To overcome these limitations, the idea of predicting OCRs from WGS is of particular importance. In this regard, we proposed a computational approach to target the prediction of open chromatin regions as an important epigenetic feature from cell-free DNA whole genome sequence data. To fulfill this objective, local sequencing depth will be fed to our proposed algorithm and the prediction of the most probable open chromatin regions from whole genome sequencing data can be carried out. Our method integrates the signal processing method with sequencing depth data and includes count normalization, Discrete Fourie Transform conversion, graph construction, graph cut optimization by linear programming, and clustering. To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions related to human blood samples of the ATAC-DB database. The percentage of overlap between predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. As it is evident, OCRs are mostly located in the transcription start sites (TSS) of the genes. In this regard, we compared the concordance between the predicted OCRs and the human genes TSS regions obtained from refTSS and it showed proper accordance around 52.04% and ~78% with all and the housekeeping genes, respectively. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to the existence of several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, which leads to a tool named OCRDetector with some restrictions like the need for highly depth cfDNA WGS data, prior information about OCRs distribution, and considering multiple features. However, we implemented a graph signal clustering based on a single depth feature in an unsupervised learning manner that resulted in faster performance and decent accuracy. 
Overall, we tried to investigate the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole-genome sequencing dataset for efficient liquid biopsy-related analysis.
Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering
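A toy fragment of the kind of depth-signal pipeline described above: bin the per-base coverage, normalize it, low-pass it with a Fourier transform, and use neighbouring-window correlation with a simple threshold as a crude stand-in for the paper's graph construction and graph-cut step. All bin sizes, thresholds and the random input are illustrative; this is not the authors' algorithm.

```python
import numpy as np

def ocr_candidate_mask(depth, bin_size=200, keep_frac=0.05, corr_thresh=0.6):
    """Toy OCR+/OCR- labelling from a per-base sequencing-depth vector."""
    # 1. bin and normalize the coverage signal
    n_bins = len(depth) // bin_size
    binned = depth[:n_bins * bin_size].reshape(n_bins, bin_size).mean(axis=1)
    binned = (binned - binned.mean()) / (binned.std() + 1e-9)

    # 2. smooth by keeping only the lowest-frequency Fourier components
    spec = np.fft.rfft(binned)
    spec[int(keep_frac * len(spec)):] = 0.0
    smooth = np.fft.irfft(spec, n=n_bins)

    # 3. correlate adjacent windows as a crude local-similarity measure, then
    #    flag low-coverage, locally consistent bins (OCRs are nucleosome-depleted,
    #    hence less protected and lower in cfDNA coverage)
    win = np.lib.stride_tricks.sliding_window_view(smooth, 5)
    adj_corr = np.array([np.corrcoef(win[i], win[i + 1])[0, 1]
                         for i in range(len(win) - 1)])
    center = smooth[2:2 + len(adj_corr)]
    return (center < smooth.mean()) & (adj_corr > corr_thresh)

depth = np.random.default_rng(3).poisson(30, size=200_000).astype(float)
print(ocr_candidate_mask(depth).sum(), "candidate OCR bins")
```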
Procedia PDF Downloads 152
1500 Optimization of Structures Subjected to Earthquake
Authors: Alireza Lavaei, Alireza Lohrasbi, Mohammadali M. Shahlaei
Abstract:
To reduce the overall time of structural optimization for earthquake loads, two strategies are adopted. In the first strategy, a neural system consisting of self-organizing map and radial basis function neural networks is utilized to predict the time history responses. In this case, the input space is classified by employing a self-organizing map neural network, and then a distinct RBF neural network is trained in each class. In the second strategy, an improved genetic algorithm is employed to find the optimum design. A 72-bar space truss is designed for optimal weight using exact and approximate analysis for the El Centro (S-E 1940) earthquake loading. The numerical results demonstrate the computational advantages and effectiveness of the proposed method.
Keywords: optimization, genetic algorithm, neural networks, self-organizing map
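A sketch of the surrogate idea in the first strategy: partition the training inputs into classes (here with a tiny k-means routine standing in for the self-organizing map), fit one Gaussian-RBF regressor per class, and route new designs to the matching class for prediction. The truss model, data and the genetic-algorithm loop are not included, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def kmeans(X, k, iters=50):
    """Tiny k-means used here as a stand-in for the self-organizing map."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return centers, labels

class GaussianRBF:
    """One RBF network per class, fitted by linear least squares."""
    def __init__(self, centers, width=1.0):
        self.c, self.w = centers, width
    def _phi(self, X):
        d2 = ((X[:, None] - self.c[None]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * self.w ** 2))
    def fit(self, X, y):
        self.coef, *_ = np.linalg.lstsq(self._phi(X), y, rcond=None)
        return self
    def predict(self, X):
        return self._phi(X) @ self.coef

# Invented surrogate problem: design variables -> peak structural response
X = rng.uniform(0, 1, (300, 4))
y = np.sin(X.sum(1)) + 0.05 * rng.standard_normal(300)

centers, labels = kmeans(X, k=3)
models = [GaussianRBF(X[labels == j]).fit(X[labels == j], y[labels == j])
          for j in range(3)]

x_new = rng.uniform(0, 1, (1, 4))
j = np.argmin(((x_new - centers) ** 2).sum(-1))   # route to its class
print(models[j].predict(x_new))
```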
Procedia PDF Downloads 314