Search results for: panel data analysis
36909 Aeroelastic Analysis of Engine Nacelle Strake Considering Geometric Nonlinear Behavior
Authors: N. Manoj
Abstract:
The aeroelastic behavior of an engine nacelle strake subjected to unsteady aerodynamic flows is investigated in this paper. Geometric nonlinear characteristics and modal parameters of the nacelle strake are studied under dynamic loading conditions. Here, a Navier-Stokes (N-S) based Finite Volume solver is coupled with a Finite Element (FE) based nonlinear structural solver to investigate the nonlinear characteristics of the nacelle strake over a range of dynamic pressures at various phases of flight such as takeoff, climb, and cruise. The combination of high-fidelity models for both aerodynamics and structural dynamics is used to predict the nonlinearities of the strake (chine). The methodology adopted for the present aeroelastic analysis is a partitioned, time-domain coupling of CFD and CSD solvers, and it is validated against experimental and numerical aeroelastic data for a cropped delta wing model with a proven record. The present strake geometry is derived from a theoretical formulation. The amplitude and frequency obtained from the coupled solver at various dynamic pressures are discussed, which gives a better understanding of their impact on the aerodynamic design-sizing of the strake.
Keywords: aeroelasticity, finite volume, geometric nonlinearity, limit cycle oscillations, strake
Procedia PDF Downloads 284
36908 Method Validation for Heavy Metal Determination in Spring Water and Sediments
Authors: Habtamu Abdisa
Abstract:
Spring water is particularly valuable due to its high mineral content, which is beneficial for human health. However, anthropogenic activities usually imbalance the natural levels of its composition, which can cause adverse health effects. Regular monitoring of a naturally given environmental resource is of great concern in the world today. Spectrophotometric methods are among the best for qualifying and quantifying the mineral contents of environmental water samples. This research was conducted to evaluate the quality of spring water with respect to its heavy metal composition. A grab sampling technique was employed to collect representative samples, including duplicates. The samples were then treated with concentrated HNO3 to a pH level below 2 and stored at 4 °C. The samples were digested and analyzed for cadmium (Cd), chromium (Cr), manganese (Mn), copper (Cu), iron (Fe), and zinc (Zn) following method validation. Atomic Absorption Spectrometry (AAS) was utilized for the sample analysis. Quality control measures, including blanks, duplicates, and certified reference materials (CRMs), were implemented to ensure the accuracy and precision of the analytical results. Of the metals analyzed in the water samples, Cd and Cr were found to be below the detection limit. However, the mean concentrations of Mn, Cu, Fe, and Zn ranged from 0.119-0.227 mg/L, 0.142-0.166 mg/L, 0.183-0.267 mg/L, and 0.074-0.181 mg/L, respectively. Sediment analysis revealed mean concentration ranges of 348.31-429.21 mg/kg, 0.23-0.28 mg/kg, 18.73-22.84 mg/kg, 2.76-3.15 mg/kg, 941.84-1128.56 mg/kg, and 42.39-66.53 mg/kg for Mn, Cd, Cu, Cr, Fe, and Zn, respectively. The study results established that the evaluated spring water and its associated sediment met the regulatory standards and guidelines for heavy metal concentrations. Furthermore, this research can enhance the quality assurance and control processes for environmental sample analysis, ensuring the generation of reliable data.
Keywords: method validation, heavy metal, spring water, sediment, method detection limit
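As a rough illustration of the kind of calculation behind the method detection limit mentioned in the keywords, the sketch below estimates an MDL from replicate low-level measurements using a Student's t approach; the replicate values and the choice of seven replicates are assumptions for illustration, not data from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical concentrations (mg/L) from seven replicate low-level spiked
# blanks measured by AAS; values are illustrative only.
replicates = np.array([0.021, 0.018, 0.024, 0.020, 0.019, 0.023, 0.022])

n = len(replicates)
s = replicates.std(ddof=1)            # standard deviation of the replicates
t_99 = stats.t.ppf(0.99, df=n - 1)    # one-sided 99% Student's t value

mdl = t_99 * s                        # EPA-style method detection limit estimate
print(f"MDL ≈ {mdl:.4f} mg/L")
```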
Procedia PDF Downloads 68
36907 Strategic Redesign of Public Spaces with a Sustainable Approach: Case Study of Parque Huancavilca, Guayaquil
Authors: Juan Carlos Briones Macias
Abstract:
Currently, Huancavilca City Park in Guayaquil is an abandoned public space facing a growing problem of insecurity, where various problems have been observed, such as the lack of green areas, deteriorating furniture, insufficient lighting, the use of inadequate cladding materials, and very sunny areas due to the lack of planning in the design of green areas. The objective of this scientific article is to redesign Huancavilca Park through public space design strategies for more attractive and comfortable areas, turning it into a point of interaction in a safe and accessible way. A mixed methodology (qualitative and quantitative) was applied, obtaining information based on surveys, interviews, and field observations, and systematizing the data through a traditional weighting of the structuring aspects of the park. The results were obtained from the methodological design scheme of iterative analysis of public spaces by Jan Güell. It is concluded that the use of urban strategies on the structuring elements of the park, such as vegetation, furniture, the generation of new activities, and security interventions, will specifically address the problems of Huancavilca Park identified in a Pareto 80/20 diagram.
Keywords: public space, green areas, vegetation, street furniture, urban analysis
Procedia PDF Downloads 146
36906 Designing Presentational Writing Assessments for the Advanced Placement World Language and Culture Exams
Authors: Mette Pedersen
Abstract:
This paper outlines the criteria that assessment specialists use when they design the 'Persuasive Essay' task for the four Advanced Placement World Language and Culture Exams (AP French, German, Italian, and Spanish). The 'Persuasive Essay' is a free-response, source-based, standardized measure of presentational writing. Each 'Persuasive Essay' item consists of three sources (an article, a chart, and an audio) and a prompt, which is a statement of the topic phrased as an interrogative sentence. Due to its richness of source materials and the amount of time that test takers are given to prepare for and write their responses (a total of 55 minutes), the 'Persuasive Essay' is the free-response task on the AP World Language and Culture Exams that goes to the greatest lengths to unleash the test takers' proficiency potential. The author focuses on the work that goes into designing the 'Persuasive Essay' task, outlining best practices for the selection of topics and sources, the interplay that needs to be present among the sources, and the thinking behind the articulation of prompts for the 'Persuasive Essay' task. Using released 'Persuasive Essay' items from the AP World Language and Culture Exams and accompanying data on test taker performance, the author shows how different passages, and features of passages, have succeeded (and sometimes not succeeded) in eliciting writing proficiency among test takers over time. Data from approximately 215,000 test takers per year from 2014 to 2017 and approximately 35,000 test takers per year from 2012 to 2013 form the basis of this analysis. The conclusion of the study is that test taker performance improves significantly when the sources that test takers are presented with express directly opposing viewpoints. Test taker performance also improves when the interrogative prompt that the test takers respond to is phrased as a yes/no question. Finally, an analysis of the linguistic difficulty and complexity levels of the printed sources reveals that test taker performance does not decrease when the complexity level of the article of the 'Persuasive Essay' increases. This last text complexity analysis is performed with the help of the 'ETS TextEvaluator' tool and the 'Complexity Scale for Information Texts (Scale)', two tools which, in combination, provide a rubric and a fully automated technology for evaluating nonfiction and informational texts in English translation.
Keywords: advanced placement world language and culture exams, designing presentational writing assessments, large-scale standardized assessments of written language proficiency, source-based language testing
Procedia PDF Downloads 145
36905 Beyond Personal Evidence: Using Learning Analytics and Student Feedback to Improve Learning Experiences
Authors: Shawndra Bowers, Allie Brandriet, Betsy Gilbertson
Abstract:
This paper will highlight how Auburn Online’s instructional designers leveraged student and faculty data to update and improve online course design and instructional materials. When designing and revising online courses, it can be difficult for faculty to know what strategies are most likely to engage learners and improve educational outcomes in a specific discipline. It can also be difficult to identify which metrics are most useful for understanding and improving teaching, learning, and course design. At Auburn Online, the instructional designers use a suite of data based on students’ performance, participation, satisfaction, and engagement, as well as faculty perceptions, to inform sound learning and design principles that guide growth-mindset consultations with faculty. The consultations allow the instructional designer, along with the faculty member, to co-create an actionable course improvement plan. Auburn Online gathers learning analytics from a variety of sources that any instructor or instructional design team may have access to at their own institutions. Participation and performance data, such as page views, assignment submissions, and aggregate grade distributions, are collected from the learning management system. Engagement data is pulled from the video hosting platform, which includes unique viewers, views and downloads, the minutes delivered, and the average duration each video is viewed. Student satisfaction is also obtained through a short survey that is embedded at the end of each instructional module. This survey is included in each course every time it is taught. The survey data is then analyzed by an instructional designer for trends and pain points in order to identify areas that can be modified, such as course content and instructional strategies, to better support student learning. This analysis, along with the instructional designer’s recommendations, is presented in a comprehensive report to instructors in an hour-long consultation where instructional designers collaborate with the faculty member on how and when to implement improvements. Auburn Online has developed a triage strategy of priority 1 or 2 level changes that will be implemented in future course iterations. This data-informed decision-making process helps instructors focus on what will work best in their teaching environment while addressing which areas need additional attention. As a student-centered process, it has created improved learning environments for students and has been well received by faculty. It has also been shown to be effective in addressing the need for improvement while removing the feeling that the faculty member’s teaching is being personally attacked. The process that Auburn Online uses is laid out, along with the three-tier maintenance and revision guide that will be used over a three-year implementation plan. This information can help others determine which components of the maintenance and revision plan they want to utilize, as well as guide them on how to create a similar approach. The data will be used to analyze, revise, and improve courses by providing recommendations and models of good practices through determining and disseminating best practices that demonstrate an impact on student success.
Keywords: data-driven, improvement, online courses, faculty development, analytics, course design
Procedia PDF Downloads 61
36904 Information Communication Technology Based Road Traffic Accidents’ Identification, and Related Smart Solution Utilizing Big Data
Authors: Ghulam Haider Haidaree, Nsenda Lukumwena
Abstract:
Today the world of research enjoys abundant data, available in virtually any field: technology, science, business, politics, etc. This is commonly referred to as big data. It offers a great deal of precision and accuracy, supportive of an in-depth look at any decision-making process. When and if well used, big data affords its users the opportunity to produce substantially well-supported and good results. This paper leans extensively on big data to investigate possible smart solutions to urban mobility and related issues, namely road traffic accidents, their casualties, and fatalities based on multiple factors, including age, gender, location of accident occurrences, etc. Multiple technologies were used in combination to produce an Information Communication Technology (ICT) based solution with embedded technology. Those technologies include principally Geographic Information System (GIS), Orange Data Mining Software, and Bayesian statistics, to name a few. The study uses the Leeds 2016 accident data to illustrate the thinking process and extracts from it a model that can be tested, evaluated, and replicated. The authors optimistically believe that the proposed model will significantly and smartly help to flatten the curve of road traffic accidents in fast-growing population centers, where motor-based mobility is increasing considerably.
Keywords: accident factors, geographic information system, information communication technology, mobility
Procedia PDF Downloads 208
36903 Error Analysis of Students’ Freewriting: A Study of Adult English Learners’ Errors
Authors: Louella Nicole Gamao
Abstract:
Writing in English is regarded as a complex skill and process for foreign language learners, and the errors that learners commit are an inevitable part of their writing. This study aims to explore and analyze the freewriting of English-as-a-Foreign-Language (EFL) learners at a university in Taiwan by identifying the categories of mistakes that often appear in their freewriting activity and analyzing the learners' awareness of each error. Hopefully, this present study will be able to gain further information about students' errors in their English writing that may contribute to further understanding of the benefits of freewriting activity, which can be used for future purposes as a powerful tool in English writing courses for EFL classes. The present study adopted the framework of error analysis proposed by Dulay, Burt, and Krashen (1982), which consists of a compilation of data, identification of errors, classification of error types, calculation of the frequency of each error, and error interpretation. Survey questionnaires regarding students' awareness of errors were also analyzed and discussed. Using quantitative and qualitative approaches, this study provides a detailed description of the errors found in the students' freewriting output, explores the similarities and differences of the students' errors in both academic writing and freewriting, and lastly, analyzes the students' perception of their errors.
Keywords: error, EFL, freewriting, Taiwan, English
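To make the "calculation of the frequency of each error" step concrete, the sketch below tabulates hypothetical coded errors with pandas; the student IDs and error categories are illustrative assumptions, not data from the study.

```python
import pandas as pd

# Hypothetical coded errors extracted from students' freewriting samples;
# category labels follow the identification/classification steps of the
# Dulay, Burt and Krashen framework (labels are illustrative only).
errors = pd.DataFrame({
    "student": [1, 1, 2, 2, 2, 3, 4, 4, 5],
    "category": ["verb tense", "article", "article", "spelling",
                 "verb tense", "preposition", "article", "spelling", "verb tense"],
})

freq = errors["category"].value_counts()
pct = (freq / freq.sum() * 100).round(1)
summary = pd.DataFrame({"count": freq, "percent": pct})
print(summary)  # frequency and share of each error type across the corpus
```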
Procedia PDF Downloads 108
36902 A Case Study on the Estimation of Design Discharge for Flood Management in Lower Damodar Region, India
Authors: Susmita Ghosh
Abstract:
The catchment area of the Damodar River, India, experiences seasonal rains due to the south-west monsoon every year, and depending upon the intensity of the storms, floods occur. During the monsoon season, the rainfall in the area is mainly due to active monsoon conditions. The upstream reach of the Damodar river system has five dams that store water for various purposes, viz. irrigation, hydro-power generation, municipal supplies and, last but not least, flood moderation. The downstream reach of the Damodar River, known as the Lower Damodar region, however, suffers severely and frequently from floods due to heavy monsoon rainfall and releases from upstream reservoirs. Therefore, an effective flood management study is required to understand in depth the nature and extent of flood, water logging, and erosion related problems, the affected area, and damages in the Lower Damodar region by conducting mathematical model studies. The design flood or discharge is needed to set up the respective model and obtain several scenarios from the simulation runs. The ultimate aim is to achieve a sustainable flood management scheme from the several alternatives. There are various methods for estimating the flood discharges to be carried through the rivers and their tributaries for quick drainage from inundated areas due to drainage congestion and excess rainfall. In the present study, flood frequency analysis is performed to decide the design flood discharge of the study area. This, on the other hand, is limited by the availability of a long peak-flood record for correctly determining the appropriate probability density function. If sufficient past records are available, the maximum flood on a river with a given frequency can safely be determined. The floods of different frequencies for the Damodar have been calculated using five candidate distributions, i.e., generalized extreme value, extreme value-I, Pearson type III, Log-Pearson, and normal. Annual peak discharge series are available at Durgapur barrage for the period 1979 to 2013 (35 years). The available series are subjected to frequency analysis. The primary objective of the flood frequency analysis is to relate the magnitude of extreme events to their frequencies of occurrence through the use of probability distributions. The design floods for return periods of 10, 15, and 25 years at Durgapur barrage are estimated by the flood frequency method. It is necessary to develop flood hydrographs for the above floods to facilitate the mathematical model studies to find the depth and extent of inundation, etc. The null hypothesis that the distributions fit the data is checked at 95% confidence with a goodness-of-fit test, i.e., the chi-square test. The goodness-of-fit test reveals that all five distributions show a good fit to the sample population and are therefore accepted. However, there is considerable variation in the estimated frequency floods. It is therefore considered prudent to average the results of these five distributions for the required frequencies. The inundated area from past data is well matched using this design flood.
Keywords: design discharge, flood frequency, goodness of fit, sustainable flood management
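A minimal sketch of the distribution-fitting step described above is given below, assuming a synthetic 35-value annual peak series in place of the actual Durgapur barrage record; the scipy distributions are standard stand-ins for the five candidates (genextreme for GEV, gumbel_r for extreme value-I, pearson3 for Pearson type III, a log-transformed pearson3 for Log-Pearson, and norm for normal), and a chi-square test on binned counts could be added to mirror the goodness-of-fit step.

```python
import numpy as np
from scipy import stats

# Hypothetical annual peak discharges (m^3/s) standing in for the 35-year
# Durgapur barrage series; values are illustrative only.
peaks = np.array([4200, 5100, 3900, 6100, 4800, 5600, 7200, 4500, 5300,
                  6800, 4100, 5900, 6400, 4700, 5500, 7100, 5000, 6200,
                  4400, 5800, 6600, 4900, 5200, 7000, 4600, 6000, 6300,
                  5400, 6900, 4300, 5700, 6500, 7300, 4950, 5150], float)

candidates = {
    "GEV": stats.genextreme,
    "Gumbel (EV-I)": stats.gumbel_r,
    "Pearson III": stats.pearson3,
    "Normal": stats.norm,
}

return_periods = np.array([10, 15, 25])
prob_non_exceed = 1 - 1 / return_periods   # non-exceedance probability

for name, dist in candidates.items():
    params = dist.fit(peaks)
    q = dist.ppf(prob_non_exceed, *params)  # design flood for each return period
    print(name, np.round(q, 0))

# Log-Pearson III: fit Pearson III to log-transformed peaks, then back-transform.
lp_params = stats.pearson3.fit(np.log10(peaks))
q_lp = 10 ** stats.pearson3.ppf(prob_non_exceed, *lp_params)
print("Log-Pearson III", np.round(q_lp, 0))
```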
Procedia PDF Downloads 201
36901 Improved Classification Procedure for Imbalanced and Overlapped Situations
Authors: Hankyu Lee, Seoung Bum Kim
Abstract:
The issue of imbalance and overlap in the class distribution is becoming important in various applications of data mining. An imbalanced dataset is a special case in classification problems in which the number of observations of one class (i.e., the major class) heavily exceeds the number of observations of the other class (i.e., the minor class). An overlapped dataset is the case where many observations are shared between the two classes. Imbalanced and overlapped data can frequently be found in many real examples, including fraud and abuse in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The class imbalance and overlap problem is a challenging issue because this situation degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts: non-overlapping, lightly overlapping, and severely overlapping, and applying the classification algorithm in each part. These three parts were determined based on the Hausdorff distance and the margin of the modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
Keywords: classification, imbalanced data with class overlap, split data space, support vector machine
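One rough way to picture the space-splitting idea is shown below: an SVM decision function separates observations that sit deep in the overlap region from those near the margin and those far from it. This is only an illustrative simplification; the thresholds and the synthetic data are assumptions, and the actual procedure in the paper combines the Hausdorff distance with the margin of a modified support vector machine.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Synthetic imbalanced, overlapping two-class data (illustrative only).
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1],
                           class_sep=0.8, flip_y=0.05, random_state=0)

svm = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
score = np.abs(svm.decision_function(X))   # distance-like margin value

# Thresholds standing in for the Hausdorff/margin criteria of the paper.
severe = score < 0.5                       # deep inside the overlap region
light = (score >= 0.5) & (score < 1.0)     # near the margin
clean = score >= 1.0                       # essentially non-overlapping

for name, mask in [("severe overlap", severe), ("light overlap", light),
                   ("non-overlapping", clean)]:
    print(f"{name}: {mask.sum()} observations")
# A separate classifier would then be trained within each of the three parts.
```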
Procedia PDF Downloads 308
36900 An Investigation of Environmental Education Knowledge for Sustainable Development in High School Sectors in UK
Authors: Abolaji Mayowa Akinyele
Abstract:
The purpose of this study was to investigate students' awareness, knowledge, and understanding of environmental issues for sustainable development. Findings revealed that, despite the positive attitude shown by students towards environmental education, a relatively low level of understanding of environmental concepts was recorded in school settings, regardless of efforts by government and other environmental agencies at creating awareness about environment-related issues. This prompted the investigation of students' environmental education knowledge in high school settings. About 205 students were randomly selected for data collection using validated instruments titled the Students' Knowledge and Attitude Questionnaire, as well as students' responses to interview questions concerning global warming. T-test statistics, chi-square, and simple percentages were the major statistical tools employed in data analysis. This study revealed that environment-based education (the school curriculum) as well as efforts by government and environmental agencies (mass media) play a major role in promoting students' understanding of environmental concepts, awareness of major environmental issues, and positive attitudes towards the natural environment.
Keywords: environmental issues, sustainable development, students' attitude, students' knowledge
Procedia PDF Downloads 458
36899 Poster: Incident Signals Estimation Based on a Modified MCA Learning Algorithm
Authors: Rashid Ahmed , John N. Avaritsiotis
Abstract:
Many signal subspace-based approaches have already been proposed for determining the fixed Direction of Arrival (DOA) of plane waves impinging on an array of sensors. Two procedures for DOA estimation based on neural networks are presented. First, Principal Component Analysis (PCA) is employed to extract the maximum eigenvalue and its eigenvector from the signal subspace to estimate the DOA. Second, Minor Component Analysis (MCA) is a statistical method of extracting the eigenvector associated with the smallest eigenvalue of the covariance matrix. In this paper, we modify a Minor Component Analysis (MCA(R)) learning algorithm to enhance convergence, since convergence is essential for the MCA algorithm in practical applications. The learning rate parameter is also presented, which ensures fast convergence of the algorithm because it has a direct effect on the convergence of the weight vector, and the error level is affected by this value. MCA is performed to determine the estimated DOA. Preliminary results are furnished to illustrate the convergence results achieved.
Keywords: Direction of Arrival, neural networks, Principal Component Analysis, Minor Component Analysis
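As a minimal sketch of the core idea (not the modified MCA(R) rule of the paper), the snippet below extracts the minor component of a covariance matrix by gradient descent on the Rayleigh quotient with re-normalization at each step; the covariance matrix, learning rate, and iteration count are illustrative assumptions. The learning rate eta plays the role described in the abstract: too large a value destabilizes the weight vector, too small a value slows convergence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensor covariance matrix built from random snapshots
# (stands in for the array covariance used in DOA estimation).
X = rng.standard_normal((500, 6))
C = X.T @ X / X.shape[0]

def minor_component(C, eta=0.05, iters=5000):
    """Estimate the eigenvector of the smallest eigenvalue of C by gradient
    descent on the Rayleigh quotient (a simple MCA-type learning rule)."""
    w = rng.standard_normal(C.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        grad = C @ w - (w @ C @ w) * w   # Rayleigh-quotient gradient on the sphere
        w -= eta * grad                  # learning rate eta controls convergence
        w /= np.linalg.norm(w)           # keep the weight vector on the unit sphere
    return w

w = minor_component(C)
vals, _ = np.linalg.eigh(C)              # direct eigen-decomposition as a check
print("estimated minor eigenvalue:", w @ C @ w)
print("true smallest eigenvalue:  ", vals[0])
```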
Procedia PDF Downloads 451
36898 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System
Authors: Dong Seop Lee, Byung Sik Kim
Abstract:
In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields. These artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management. Furthermore, historical disaster information can be obtained using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle, from its occurrence to progress, response, and planning. However, information about status control, response, recovery from natural and social disaster events, etc., is mainly managed in structured and unstructured reports. These exist as handouts or hard copies of reports. Such unstructured data are often lost or destroyed due to inefficient management. It is necessary to manage unstructured data for disaster information. In this paper, the Optical Character Recognition approach is used to convert handouts, hard copies, images, or reports, which are printed or generated by scanners, etc., into electronic documents. Following that, the converted disaster data are organized into the disaster code system as disaster information. Those data are stored in the disaster database system. Gathering and creating disaster information based on Optical Character Recognition for unstructured data is an important element of smart disaster management. In this paper, Korean character recognition was improved to over 90% by using an upgraded OCR. In the case of character recognition, the recognition rate depends on the font, size, and special symbols of the characters. We improved it through a machine learning algorithm. The converted structured data are managed in a standardized disaster information form connected with the disaster code system. The disaster code system ensures that the structured information is stored and retrieved across the entire disaster cycle, such as historical disaster progress, damages, response, and recovery. The expected effect of this research is that it can be applied to smart disaster management and decision making by combining artificial intelligence technologies and historical big data.
Keywords: disaster information management, unstructured data, optical character recognition, machine learning
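A minimal sketch of the conversion step is given below using the open-source Tesseract engine via pytesseract; this is only an assumed stand-in for the upgraded OCR described in the paper, and the file name and disaster-code pattern are hypothetical.

```python
import re
import pytesseract          # wrapper around the Tesseract OCR engine
from PIL import Image

# Hypothetical scanned page from a disaster-response report; the path is illustrative.
page = Image.open("scanned_disaster_report.png")

# Korean ('kor') plus English ('eng') language models; requires the
# corresponding Tesseract traineddata files to be installed.
text = pytesseract.image_to_string(page, lang="kor+eng")

# Rough post-processing: pull out lines matching a hypothetical disaster-code
# pattern (e.g. "D-2017-013") so they can be filed into a disaster-code table.
codes = re.findall(r"D-\d{4}-\d{3}", text)
print(text[:200])
print("extracted disaster codes:", codes)
```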
Procedia PDF Downloads 129
36897 Field-observed Thermal Fractures during Reinjection and Its Numerical Simulation
Authors: Wen Luo, Phil J. Vardon, Anne-Catherine Dieudonne
Abstract:
One key process that partly controls the success of geothermal projects is fluid reinjection, which helps in dealing with waste water, maintaining reservoir pressure, and supplying heat-exchange media. Thus, sustaining the injectivity is of great importance for the efficiency and sustainability of geothermal production. However, the injectivity is sensitive to the reinjection process. Field experience has illustrated that the injectivity can be damaged or improved. In this paper, the focus is on how the injectivity is improved. Since the injection pressure is far below the formation fracture pressure, hydraulic fracturing cannot be the mechanism contributing to the increase in injectivity. Instead, thermal stimulation has been identified as the main contributor to improving the injectivity. For low-enthalpy geothermal reservoirs, which are not fracture-controlled, thermal fracturing, rather than thermal shearing, is expected to be the mechanism for increasing injectivity. In this paper, field data from the sedimentary low-enthalpy geothermal reservoirs in the Netherlands were analysed to show the occurrence of thermal fracturing due to the cooling shock during reinjection. Injection data were collected and compared to show the effects of the thermal fractures on injectivity. Then, a thermo-hydro-mechanical (THM) model for the near-field formation was developed and solved by the finite element method to simulate the observed thermal fractures. It was then compared with the HM model, decomposed from the THM model, to illustrate the thermal effects on thermal fracturing. Finally, the effects of operational parameters, i.e., injection temperature and pressure, on the changes in injectivity were studied on the basis of the THM model. The field data analysis and simulation results illustrate that thermal fracturing occurred during reinjection and contributed to the increase in injectivity. The injection temperature was identified as a key parameter that contributes to thermal fracturing.
Keywords: injectivity, reinjection, thermal fracturing, thermo-hydro-mechanical model
Procedia PDF Downloads 217
36896 Prediction of the Performance of a Bar-Type Piezoelectric Vibration Actuator Depending on the Frequency Using an Equivalent Circuit Analysis
Authors: J. H. Kim, J. H. Kwon, J. S. Park, K. J. Lim
Abstract:
This paper investigates a technique that predicts the frequency-dependent performance of a bar-type unimorph piezoelectric vibration actuator. An equivalent circuit that can be easily analyzed is proposed for the bar-type unimorph piezoelectric vibration actuator. In the dynamic analysis, rigidity and resonance frequency, which are important mechanical elements, were derived using basic beam theory. In the equivalent circuit analysis, the displacement and bandwidth of the piezoelectric vibration actuator were predicted as functions of frequency. Also, to verify the reliability of the derived equations, the predicted performance for different shapes was compared with the results of a finite element analysis program.
Keywords: actuator, piezoelectric, performance, unimorph
Procedia PDF Downloads 464
36895 Weighing the Economic Cost of Illness Due to Dysentery and Cholera Triggered by Poor Sanitation in Rural Faisalabad, Pakistan
Authors: Syed Asif Ali Naqvi, Muhammad Azeem Tufail
Abstract:
Inadequate sanitation causes direct costs of treating illnesses and loss of income through reduced productivity. This study estimated the economic cost of health (ECH) due to poor sanitation and the factors determining the lack of access to latrines in the rural, backward hamlets and slums of district Faisalabad, Pakistan. Cross-sectional data were collected and analyzed for the study. As the population under study was homogeneous in nature, a simple random sampling technique was used for the collection of data. Data from 440 households in 4 tehsils were gathered. The ordinary least squares (OLS) model was used for the health cost analysis, and the Probit regression model was employed to determine the factors responsible for lack of access to toilets. The results of the study showed that the condition of toilets, the state of the sewerage system, access to adequate sanitation, cholera, diarrhea and dysentery, Water and Sanitation Agency (WASA) maintenance, and the source of medical treatment can plausibly have a significant connection with the dependent variable. Outcomes of the second model showed that the variables of education, family system, age, and type of dwelling have a positive and significant association with the dependent variable. The variable of age depicted an insignificant association with access to toilets. The variable of monetary expenses negatively influences the dependent variable. Findings revealed that health risks are often exacerbated by inadequate sanitation and that, ultimately, the cost of health also surges. Public and community toilets for youths and social campaigning are suggested for public policy.
Keywords: sanitation, toilet, economic cost of health, water, Punjab
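A minimal sketch of the two-model setup (OLS for health cost, Probit for latrine access) is given below with statsmodels; all variable names and the simulated survey values are assumptions standing in for the actual questionnaire data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 440  # same number of households as the survey; the data themselves are simulated

# Hypothetical survey variables (all illustrative stand-ins).
df = pd.DataFrame({
    "health_cost": rng.gamma(2.0, 1500, n),       # economic cost of illness
    "toilet_condition": rng.integers(1, 4, n),    # 1 = poor ... 3 = good
    "sewerage_ok": rng.integers(0, 2, n),
    "education_years": rng.integers(0, 16, n),
    "joint_family": rng.integers(0, 2, n),
    "age": rng.integers(18, 70, n),
    "latrine_access": rng.integers(0, 2, n),
})

# OLS model for the economic cost of health
X_ols = sm.add_constant(df[["toilet_condition", "sewerage_ok", "age"]])
ols_res = sm.OLS(df["health_cost"], X_ols).fit()
print(ols_res.summary())

# Probit model for access to a latrine
X_pro = sm.add_constant(df[["education_years", "joint_family", "age"]])
probit_res = sm.Probit(df["latrine_access"], X_pro).fit()
print(probit_res.summary())
```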
Procedia PDF Downloads 121
36894 Geographic Mapping of Tourism in Rural Areas: A Case Study of Cumbria, United Kingdom
Authors: Emma Pope, Demos Parapanos
Abstract:
Rural tourism has become more visible and prevalent, with tourists increasingly seeking authentic experiences. This movement accelerated post-Covid, putting destinations in danger of reaching levels of saturation called ‘overtourism’. Whereas the phenomenon of overtourism has been frequently discussed in the urban context by academics and practitioners over recent years, it has hardly been addressed in the context of rural tourism, where it is perhaps even more difficult to manage. Rural tourism was historically considered small-scale, marked by its traditional character and by having little impact on nature and rural society. The increasing number of rural areas experiencing overtourism, however, demonstrates the need for new approaches, especially as the impacts and enablers of overtourism are context specific. Cumbria, with approximately 47 million visitors each year and 23,000 operational enterprises, is one of these rural areas experiencing overtourism in the UK. Using the county of Cumbria as an example, this paper aims to explore better planning and management in rural destinations by clustering the area into rural and ‘urban-rural’ tourism zones. To achieve this aim, this study uses secondary data from a variety of sources to identify variables relating to visitor economy development and demand. These data include census data relating to population and employment, tourism industry-specific data including tourism revenue, visitor activities, and accommodation stock, and big data sources such as Trip Advisor and All Trails. The combination of these data sources provides a breadth of tourism-related variables. The subsequent analysis of these data draws upon various validated models, for example, tourism and hospitality employment density, territorial tourism pressure, and accommodation density. In addition to these statistical calculations, other data are utilized to further understand the context of these zones, for example, tourist services, attractions, and activities. The data were imported into ArcGIS, where the density of the different variables is visualized on maps. This study aims to provide an understanding of the geographical context of visitor economy development and tourist behavior in rural areas. The findings contribute to an understanding of the spatial dynamics of tourism within the region of Cumbria through the creation of thematized maps. Different zones of tourism industry clusters are identified, which include elements relating to attractions, enterprises, infrastructure, tourism employment, and economic impact. These maps visualize hot and cold spots relating to a variety of tourism contexts. It is believed that the strategy used to provide a visual overview of tourism development and demand in Cumbria could provide a strategic tool for rural areas to better plan marketing opportunities and avoid overtourism. These findings can inform future sustainability policy and destination management strategies within the areas through an understanding of the processes behind the emergence of both hot and cold spots. It may mean that ‘attract and disperse’ needs to be reviewed as a strategic option; in other words, sector or zonal policies could be used for the individual hot or cold areas, with transitional zones dependent upon local economic, social, and environmental factors.
Keywords: overtourism, rural tourism, sustainable tourism, tourism planning, tourism zones
Procedia PDF Downloads 74
36893 Removal of Hexavalent Chromium from Aqueous Solutions by Biosorption Using Macadamia Nutshells: Effect of Different Treatment Methods
Authors: Vusumzi E. Pakade, Themba D. Ntuli, Augustine E. Ofomaja
Abstract:
Macadamia nutshell biosorbents prepared by three different treatment methods (raw Macadamia nutshell powder (RMN), acid-treated Macadamia nutshell (ATMN), and base-treated Macadamia nutshell (BTMN)) were investigated for the adsorption of Cr(VI) from aqueous solutions. Fourier transform infrared spectroscopy (FT-IR) spectra of free and Cr(VI)-loaded sorbents as well as thermogravimetric analysis (TGA) revealed that the acid and base treatments modified the surface properties of the sorbents. The optimum conditions for the adsorption of Cr(VI) by the sorbents were pH 2, contact time 10 h, adsorbent dosage 0.2 g L-1, and concentration 100 mg L-1. The different treatment methods altered the surface characteristics of the sorbents and produced different maximum binding capacities of 42.5, 40.6, and 37.5 mg g-1 for RMN, ATMN, and BTMN, respectively. The data were fitted to the Langmuir, Freundlich, Redlich-Peterson, and Sips isotherms. No single model could clearly explain the data, perhaps due to the complexity of the processes taking place. The kinetic modeling results showed that Cr(VI) biosorption on the Macadamia sorbents was better described as chemical sorption following pseudo-second-order kinetics. These results showed that the three treatment methods yielded different surface properties, which then influenced the adsorption of Cr(VI) differently.
Keywords: biosorption, chromium(VI), isotherms, Macadamia, reduction, treatment
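A minimal sketch of the isotherm-fitting step is shown below for the Langmuir and Freundlich models using nonlinear least squares; the equilibrium data points are hypothetical values chosen only to illustrate the fitting procedure, not measurements from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical equilibrium data for Cr(VI) on a nutshell sorbent:
# Ce = equilibrium concentration (mg/L), qe = amount adsorbed (mg/g).
Ce = np.array([5, 10, 20, 40, 60, 80, 100], float)
qe = np.array([12.1, 19.8, 27.5, 34.0, 38.2, 40.1, 41.3])

def langmuir(Ce, qmax, KL):
    # qe = qmax * KL * Ce / (1 + KL * Ce)
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    # qe = KF * Ce^(1/n)
    return KF * Ce ** (1.0 / n)

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[40, 0.05])
(KF, n_f), _ = curve_fit(freundlich, Ce, qe, p0=[5, 2])

print(f"Langmuir:   qmax = {qmax:.1f} mg/g, KL = {KL:.3f} L/mg")
print(f"Freundlich: KF = {KF:.2f}, n = {n_f:.2f}")
```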
Procedia PDF Downloads 267
36892 Multi-Agent Railway Control System: Requirements Definitions of Multi-Agent System Using the Behavioral Patterns Analysis (BPA) Approach
Authors: Assem I. El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing a Multi-Agent Railway Control System (MARCS). The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.
Keywords: analysis, multi-agent, railway control, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases
Procedia PDF Downloads 545
36891 Predicting Seoul Bus Ridership Using Artificial Neural Network Algorithm with Smartcard Data
Authors: Hosuk Shin, Young-Hyun Seo, Eunhak Lee, Seung-Young Kho
Abstract:
Currently, in Seoul, with the installation of the Bus Information System (BIS), users have the privilege of avoiding crowded buses. The BIS provides three levels of on-board bus ridership information (spacious, normal, and crowded). However, there are flaws in the system due to it being real-time only, which can provide incomplete information to the user. For example, a bus arrives at a station and the BIS shows that the bus is crowded, but many people get off at the stop where the user is waiting, which means the information at this station should show normal or spacious. To fix this problem, this study predicts the bus ridership level using smart card data to provide more accurate information about the passenger ridership level on the bus. An Artificial Neural Network (ANN) is an interconnected group of nodes that was created based on the human brain. Forecasting has been one of the major applications of ANNs due to the data-driven, self-adaptive nature of the algorithm itself. According to the results, the ANN algorithm was stable and robust with a somewhat small error ratio, so the results were rational and reasonable.
Keywords: smartcard data, ANN, bus, ridership
Procedia PDF Downloads 167
36890 Combination of Artificial Neural Network Model and Geographic Information System for Prediction Water Quality
Authors: Sirilak Areerachakul
Abstract:
Water quality has initiated serious management efforts in many countries. Artificial Neural Network (ANN) models are developed as forecasting tools for predicting water quality trends based on historical data. This study endeavors to automatically classify water quality. The water quality classes are evaluated using six factor indices. These factors are pH value (pH), Dissolved Oxygen (DO), Biochemical Oxygen Demand (BOD), Nitrate Nitrogen (NO3N), Ammonia Nitrogen (NH3N), and Total Coliform (T-Coliform). The methodology involves applying data mining techniques using multilayer perceptron (MLP) neural network models. The data consist of 11 sites along the Saen Saep canal in Bangkok, Thailand, obtained from the Department of Drainage and Sewerage, Bangkok Metropolitan Administration, during 2007-2011. The multilayer perceptron neural network achieved a high classification accuracy of 94.23% for the water quality of the Saen Saep canal in Bangkok. Subsequently, combining this encouraging result with GIS data could improve the classification accuracy significantly.
Keywords: artificial neural network, geographic information system, water quality, computer science
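A minimal sketch of an MLP classifier over the six named indices is given below with scikit-learn; the simulated feature values, class labels, and network size are assumptions for illustration only and are not the monitoring data or the architecture used in the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 600

# Hypothetical monitoring records with the six factor indices from the abstract.
X = np.column_stack([
    rng.normal(7.0, 0.6, n),      # pH
    rng.normal(4.5, 1.5, n),      # DO (mg/L)
    rng.normal(8.0, 3.0, n),      # BOD (mg/L)
    rng.normal(2.0, 1.0, n),      # NO3N (mg/L)
    rng.normal(1.5, 0.8, n),      # NH3N (mg/L)
    rng.normal(9000, 3000, n),    # T-Coliform (MPN/100 mL)
])
y = rng.integers(1, 5, n)         # illustrative water-quality class labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                                    random_state=0))
model.fit(X_tr, y_tr)
print("classification accuracy:", model.score(X_te, y_te))
```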
Procedia PDF Downloads 343
36889 Knowledge, Attitude and Practice of Anemia among Females Attending Bolan Medical Complex Quetta, Balochistan
Authors: A. Abdullah, N. ul Haq, A. Nasim
Abstract:
Objectives: This study aimed to assess the knowledge, attitude, and practice of anemia among females attending Bolan Medical Complex Quetta, Balochistan. Methods: A quantitative cross-sectional study was conducted by adopting a questionnaire containing three dimensions: knowledge (15 questions), attitude (5 questions), and practice (4 questions) for the assessment of knowledge, attitude, and practice of anemia among females. All females attending Bolan Medical Complex Quetta, Balochistan were approached for the study. Descriptive statistics were used to describe the demographic and KAP-related characteristics of the females regarding anemia. All data were analyzed using the SPSS (Statistical Package for the Social Sciences) software program, version 20.0. Results: Data were collected from six hundred and thirteen (613) participants. The majority of the respondents (n=180, 29.4%) were in the age group of 29-33 years. Participants had knowledge regarding anemia (n=564, 91.9%) and a positive attitude (n=516, 84.0%), whereas practice was adequate in n=437 (71.3%). Multivariate analysis revealed a negative correlation between attitude and practice (-0.040), and a significant value (0.001) was present between knowledge and attitude. Occupation and reason for diagnosis were not predictive of better KAP. Conclusions: Knowledge, attitude, and practice of anemia show a satisfactory level in this study. Furthermore, the study findings indicate the need for health promotion among females. Improving nutritional knowledge and information related to anemia can result in better control and management.
Keywords: anemia, knowledge attitude and practice, females, college
Procedia PDF Downloads 193
36888 Rheumatoid Arthritis, Periodontitis and the Subgingival Microbiome: A Circular Relationship
Authors: Isabel Lopez-Oliva, Akshay Paropkari, Shweta Saraswat, Stefan Serban, Paola de Pablo, Karim Raza, Andrew Filer, Iain Chapple, Thomas Dietrich, Melissa Grant, Purnima Kumar
Abstract:
Objective: We aimed to explicate the role of the subgingival microbiome in the causal link between rheumatoid arthritis (RA) and periodontitis (PD). Methods: Subjects with/without RA and with/without PD were randomized for treatment with scaling and root planing (SRP) or oral hygiene instructions. Subgingival biofilm, gingival crevicular fluid, and serum were collected at baseline and at 3 and 6 months post-operatively. Correlations were generated between 72 million 16S rDNA sequences, immuno-inflammatory mediators, circulating antibodies to oral microbial antigens, serum inflammatory molecules, and clinical metrics of RA. The dynamics of inter-microbial and host-microbial interactions were modeled using differential network analysis. Results: RA superseded periodontitis as a determinant of microbial composition, and DAS28 score superseded the severity of periodontitis as a driver of microbial assemblages (p=0.001, ANOSIM). RA subjects evidenced higher serum anti-PPAD (p=0.0013), anti-Pg-enolase (p=0.0031), anti-RPP3, anti-Pg-OMP, and anti-Pi-OMP (p=0.001) antibodies than non-RA controls (with and without periodontitis). Following SRP, bacterial networks anchored by IL-1b, IL-4, IL-6, IL-10, IL-13, MIP-1b, and PDGF-b underwent ≥5-fold higher rewiring, and serum antibodies to microbial antigens decreased significantly. Conclusions: Our data suggest a circular relationship between RA and PD, beginning with an RA-influenced dysbiosis within the healthy subgingival microbiome that leads to exaggerated local inflammation in periodontitis, circulating antibodies to periodontal pathogens, and a positive correlation between the severity of periodontitis and RA activity. Periodontal therapy restores host-microbial homeostasis, reduces local inflammation, and decreases circulating microbial antigens. Our data highlight the importance of integrating periodontal care into the management of RA patients.
Keywords: rheumatoid arthritis, periodontal, subgingival, DNA sequence analysis, oral microbiome
Procedia PDF Downloads 108
36887 Trends in Incisional and Ventral Hernia Repair: A Population Analysis from 2001 to 2021
Authors: Lakmali Anthony, Madeline Gillies
Abstract:
Background: Incisional and ventral hernias are highly prevalent, with primary ventral hernias occurring in approximately 20% of adults and incisional hernias developing in up to 30% of midline abdominal incisions. Recent data from the United States have shown an increasing incidence of elective incisional and ventral hernia repair (IVHR) and emergency repair of complicated hernias. This study examines Australian population trends in IVHR over a two-decade study period. Methods: This retrospective study was performed using procedure data from the Australian Institute of Health and Welfare and population data from the Australian Bureau of Statistics captured between 2000 and 2021 to calculate incidence rates per 100,000 population by age and sex for selected subcategories of IVHR operations. Trends over time were evaluated using simple linear regression. Results: There were 809,308 IVHR operations performed in Australia during the study period. The cumulative incidence adjusted for the population was 182 per 100,000; this increased by 9.578 per year during the study period (95% CI = 8.431-10.726, p<.001). IVHR for primary umbilical hernias experienced the most significant increase in population-adjusted incidence, 1.177 per year (95% CI = 0.654-1.701, p<.001). Emergency IVHR for incarcerated, obstructed, and strangulated hernias increased by 0.576 per year (95% CI = 0.510-0.642, p<.001). Only 20.2% of IVHR procedures were performed as day surgery. Conclusions: Australia has seen a significant increase in IVHR operations performed in the last 20 years, particularly those for primary ventral hernias. IVHR for hernias complicated by incarceration, obstruction, and strangulation also increased significantly. The proportion of IVHR operations performed as day surgery is well below the target set by the Royal Australasian College of Surgeons. With the increasing incidence of IVHR operations and an increasing proportion of these being emergent, elective IVHR should be performed as day surgery when it is safe.
Keywords: ventral, incisional, hernia, trends
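A minimal sketch of the rate calculation and trend test described above is given below; the yearly operation counts and population figures are simulated placeholders rather than the AIHW or ABS data, and the normal-approximation confidence interval is only an illustration of the reported style of result.

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(3)

# Hypothetical annual counts of IVHR operations and mid-year population.
years = np.arange(2001, 2022)
operations = np.linspace(25000, 48000, years.size) + rng.normal(0, 800, years.size)
population = np.linspace(19.4e6, 25.7e6, years.size)

incidence = operations / population * 100_000   # rate per 100,000 population

trend = linregress(years, incidence)
ci_half = 1.96 * trend.stderr                   # approximate 95% CI for the slope
print(f"annual change ≈ {trend.slope:.3f} per 100,000 "
      f"(95% CI {trend.slope - ci_half:.3f} to {trend.slope + ci_half:.3f}, "
      f"p = {trend.pvalue:.3g})")
```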
Procedia PDF Downloads 75
36886 Improving Temporal Correlations in Empirical Orthogonal Function Expansions for Data Interpolating Empirical Orthogonal Function Algorithm
Authors: Ping Bo, Meng Yunshan
Abstract:
Satellite-derived sea surface temperature (SST) is a key parameter for many operational and scientific applications. However, the disadvantage of SST data is a high percentage of missing data, which is mainly caused by cloud coverage. The Data Interpolating Empirical Orthogonal Function (DINEOF) algorithm is an EOF-based technique for reconstructing the missing data and has been widely used in the oceanographic field. The reconstruction of SST images within a long time series using DINEOF can cause large discontinuities, and one solution for this problem is to filter the temporal covariance matrix to reduce the spurious variability. Building on previous research, an algorithm is presented in this paper to improve the temporal correlations in the EOF expansion. As in previous research, a filter, such as a Laplacian filter, is applied to the temporal covariance matrix, but the temporal relationship between two consecutive images used in the filter is taken into account in the presented algorithm; for example, two images in the same season are more likely to be correlated than those in different seasons, hence the latter are weighted less in the filter. The presented approach is tested on the monthly nighttime 4-km Advanced Very High Resolution Radiometer (AVHRR) Pathfinder SST for the long-term period spanning from 1989 to 2006. The results obtained from the presented algorithm are compared to those from the original DINEOF algorithm without filtering and from the DINEOF algorithm with filtering but without taking the temporal relationship into account.
Keywords: data interpolating empirical orthogonal function, image reconstruction, sea surface temperature, temporal filter
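The sketch below is one possible reading of the season-aware weighting idea, not the exact formulation of the paper: neighbouring months that fall in the same season keep full weight in a Laplacian-type temporal filter, while cross-season neighbours are down-weighted before the filter is applied to a (here randomly generated) temporal covariance matrix. The weights, filter strength, and final smoothing step are all assumptions.

```python
import numpy as np

# Hypothetical monthly time axis: 1989-2006 gives 216 months.
n_t = 216
months = np.arange(n_t) % 12
season = months // 3

# Weight between consecutive images: full weight within a season,
# reduced weight across season boundaries (illustrative values).
w = np.ones(n_t - 1)
w[season[:-1] != season[1:]] = 0.3

# Laplacian-type temporal operator with the season-aware weights.
lap = np.zeros((n_t, n_t))
for t in range(n_t - 1):
    lap[t, t] += w[t]
    lap[t + 1, t + 1] += w[t]
    lap[t, t + 1] -= w[t]
    lap[t + 1, t] -= w[t]

alpha = 0.1                                  # filter strength (assumed)
rng = np.random.default_rng(0)
A = rng.standard_normal((n_t, 20))           # stand-in for temporal EOF modes
cov = A @ A.T / A.shape[1]                   # temporal covariance matrix

# Crude symmetric diffusion-style smoothing of the covariance (illustrative).
cov_filtered = cov - alpha * (lap @ cov + cov @ lap) / 2
```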
Procedia PDF Downloads 325
36885 Expression of PGC-1 Alpha Isoforms in Response to Eccentric and Concentric Resistance Training in Healthy Subjects
Authors: Pejman Taghibeikzadehbadr
Abstract:
Background and Aim: PGC-1 alpha is a transcription factor that was first detected in brown adipose tissue. Since its discovery, PGC-1 alpha has been known to facilitate beneficial adaptations such as mitochondrial biogenesis and increased angiogenesis in skeletal muscle following aerobic exercise. Therefore, the purpose of this study was to investigate the expression of PGC-1 alpha isoforms in response to eccentric and concentric resistance training in healthy subjects. Materials and Methods: Ten healthy men were randomly divided into two groups (5 in the eccentric group and 5 in the concentric group). Isokinetic contraction protocols included eccentric and concentric knee extension with maximum power at an angular velocity of 60 degrees per second. The torques assigned to each subject were chosen to match the workload in both protocols, with a rotational speed of 60 degrees per second. Contractions consisted of a maximum of 12 sets of 10 repetitions for the right leg, with a rest time of 30 seconds between sets. At the beginning and end of the study, a biopsy of the lateral broad muscle tissue was performed. Biopsies were taken in both the distal and proximal directions of the lateral flank. To evaluate the expression of the PGC1α-1 and PGC1α-4 genes, tissue analysis was performed in each group using the Real-Time PCR technique. Data were analyzed using the dependent t-test and analysis of covariance. SPSS 21 and Excel 2013 were used for data analysis. Results: Intra-group changes of PGC1α-1 after one session of activity were not significant in the eccentric (p = 0.168) or concentric (p = 0.959) group, and inter-group changes showed no difference between the two groups (p = 0.681). Intra-group changes of PGC1α-4 after one session of activity were significant in the eccentric group (p = 0.012) and the concentric group (p = 0.02), while inter-group changes showed no difference between the two groups (p = 0.362). Conclusion: It seems that the lack of significant changes in the variables of interest reflects an exercise stimulus that was not sufficient to stimulate an increase in PGC1α-1 and PGC1α-4. With regard to the adaptation response, the results appear mixed and need to be addressed in further work.
Keywords: eccentric contraction, concentric contraction, PGC1α-1 and PGC1α-4, human subject
Procedia PDF Downloads 79
36884 Analysis of the Effects of Institutions on the Sub-National Distribution of Aid Using Geo-Referenced AidData
Authors: Savas Yildiz
Abstract:
The article assesses the performance of international aid donors by examining how the sub-national distribution of their aid projects depends on recipient countries' governance. The present paper extends the scope from a cross-country perspective to a more detailed analysis by looking at the effects of institutional quality on the sub-national distribution of foreign aid. The analysis examines geo-referenced aid projects in 37 countries and 404 regions at the first administrative division level in Sub-Saharan Africa from the World Bank (WB) and the African Development Bank (ADB) that were approved between the years 2000 and 2011. To measure the influence of institutional quality on the distribution of aid, the following measures are used: control of corruption, government effectiveness, regulatory quality, and rule of law from the World Governance Indicators (WGI), and the corruption perception index from Transparency International. Furthermore, to assess the importance of ethnic heterogeneity for the sub-national distribution of aid projects, the study also includes interaction terms measuring ethnic fragmentation. The regression results indicate a general skew of aid projects towards regions that hold capital cities; however, being the incumbent president's birth region does not increase the allocation of aid projects significantly. Nevertheless, with increasing institutional quality, aid projects are less skewed towards capital regions, and the previously estimated coefficients lose significance in most cases. Higher ethnic fragmentation also seems to impede the concentration of aid projects in capital city regions and presidents' birth places. Additionally, to assess the performance of the WB against its own proclaimed goal of targeting the poor in a country, the study also includes sub-national wealth data from the Demographic and Health Surveys (DHS) and finds that, even with better institutional quality, regions with a larger share of the richest quintile receive significantly more aid than regions with a larger share of poor people. With increasing ethnic diversity, the allocation of aid projects towards regions where the richest citizens reside diminishes but still remains high and significant. However, regions with a larger share of poor people still do not receive significantly more aid. This might imply that the sub-national distribution of aid projects becomes more dispersed with higher ethnic fragmentation, independent of the diverse regional needs. The results provide evidence that institutional quality matters in limiting the influence of incumbent presidents on the allocation of aid projects towards their birth regions and capital regions. Moreover, even for countries with better institutional quality, the WB and the ADB do not seem to be able to target the poor in a country with their aid projects. Even if one considers need-based variables, such as infant mortality and child mortality rates, aid projects do not seem to be allocated to districts with a larger share of people in need. Therefore, the study provides further evidence, using more detailed information on the sub-national distribution of aid projects, that aid is not being allocated effectively towards regions with a larger share of poor people to alleviate poverty in recipient countries directly. Institutions do not have any significant influence on the sub-national distribution of aid towards the poor.
Keywords: aid allocation, georeferenced data, institutions, spatial analysis
Procedia PDF Downloads 119
36883 High-Performance Liquid Chromatographic Method with Diode Array Detection (HPLC-DAD) Analysis of Naproxen and Omeprazole Active Isomers
Authors: Marwa Ragab, Eman El-Kimary
Abstract:
Chiral separation and analysis of omeprazole and naproxen enantiomers in tablets were achieved using a high-performance liquid chromatographic method with diode array detection (HPLC-DAD). A Kromasil Cellucoat chiral column was used as the stationary phase for the separation, and the eluting solvent consisted of hexane, isopropanol, and trifluoroacetic acid in a ratio of 90:9.9:0.1, respectively. The chromatographic system was suitable for the enantiomeric separation and analysis of the active isomers of the drugs. Resolution values of 2.17 and 3.84 were obtained after optimization of the chromatographic conditions for the omeprazole and naproxen isomers, respectively. The determination of the S-isomer of each drug in its dosage form was fully validated.
Keywords: chiral analysis, esomeprazole, S-Naproxen, HPLC-DAD
Procedia PDF Downloads 301
36882 Discourse Analysis: Where Cognition Meets Communication
Authors: Iryna Biskub
Abstract:
The interdisciplinary approach to modern linguistic studies is exemplified by the merging of various research methods, which sometimes causes complications related to the verification of the research results. This methodological confusion can be resolved by creating new techniques of linguistic analysis that combine several scientific paradigms. Modern linguistics has developed truly productive and efficient methods for the investigation of the cognitive and communicative phenomena of which language is the central issue. In the field of discourse studies, one of the best examples of such research methods is Critical Discourse Analysis (CDA). CDA can be viewed both as a method of investigation and as a critical multidisciplinary perspective. In CDA, the position of the scholar is crucial, as it exemplifies his or her social and political convictions. The generally accepted approach to obtaining scientifically reliable results is to use a special, well-defined scientific method for researching specific types of language phenomena: cognitive methods applied to the exploration of cognitive aspects of language, whereas communicative methods are thought to be relevant only for the investigation of the communicative nature of language. In recent decades, discourse as a sociocultural phenomenon has been the focus of careful linguistic research. The very concept of discourse represents an integral unity of the cognitive and communicative aspects of human verbal activity. Since a human being is never able to discriminate between the cognitive and communicative planes of discourse communication, it does not make much sense to apply cognitive and communicative methods of research in isolation. It is possible to modify the classical CDA procedure by mapping human cognitive procedures onto the strategic communicative planning of discourse communication. The analysis of the electronic petition 'Block Donald J Trump from UK entry. The signatories believe Donald J Trump should be banned from UK entry' (584,459 signatures) and the parliamentary debates on it has demonstrated the ability to map cognitive and communicative levels in the following way: the strategy of discourse modeling (communicative level) overlaps with the extraction of semantic macrostructures (cognitive level); the strategy of discourse management overlaps with the analysis of local meanings in discourse communication; and the strategy of cognitive monitoring of the discourse overlaps with the formation of attitudes and ideologies at the cognitive level. Thus, the experimental data have shown that it is possible to develop a new complex methodology of discourse analysis, where cognition would meet communication, both metaphorically and literally. The same approach may prove productive for the creation of computational models of human-computer interaction, where the automatic generation of a particular type of discourse could be based on the rules of strategic planning involving the cognitive models of CDA.
Keywords: cognition, communication, discourse, strategy
Procedia PDF Downloads 254
36881 Shared Beliefs and Behavioral Labels in Bullying among Middle Schoolers: Qualitative Analysis of Peer Group Dynamics
Authors: Malgorzata Wojcik
Abstract:
Groups are a powerful and significant part of human development. They serve as major emergent microsocial structures in children’s and youth’s ecological system. During middle and secondary school, peer groups become a particularly salient influence. While they promote a range of prosocial and positive emotional and behavioral attributes, they can also elicit negative or antisocial attributes, effectively “bringing out the worst” in some individuals. The grounded theory approach was employed to guide data collection and analysis, as it allows for a deeper understanding of the group processes and students’ perspectives on complex intragroup relations. Students’ perspectives on bullying cases were investigated by observing daily interactions among those involved and interviewing 47 students. The results complement theories of labeling in bullying by showing that all students self-label themselves and find it difficult to break patterns of behaviors related to bullying, such as supporting the bully or not defending the victim. In terms of the practical implications, the findings indicate that it could be beneficial to use non-punitive, restorative anti-bullying interventions that implement peer influence to transform bullying relations by removing behavioral labels.
Keywords: bullying, peer group, victimization, class reputation
Procedia PDF Downloads 117
36880 Sparse Unmixing of Hyperspectral Data by Exploiting Joint-Sparsity and Rank-Deficiency
Authors: Fanqiang Kong, Chending Bian
Abstract:
In this work, we exploit two assumed properties of the abundances of the observed signatures (endmembers) in order to reconstruct the abundances from hyperspectral data. Joint-sparsity is the first property of the abundances, which assumes that adjacent pixels can be expressed as different linear combinations of the same materials. The second property is rank-deficiency, where the number of endmembers participating in the hyperspectral data is very small compared with the dimensionality of the spectral library, which means that the abundance matrix of the endmembers is a low-rank matrix. These assumptions lead to an optimization problem for the sparse unmixing model that requires minimizing a combined l2,p-norm and nuclear norm. We propose a variable splitting and augmented Lagrangian algorithm to solve the optimization problem. Experimental evaluation carried out on synthetic and real hyperspectral data shows that the proposed method outperforms state-of-the-art algorithms with better spectral unmixing accuracy.
Keywords: hyperspectral unmixing, joint-sparse, low-rank representation, abundance estimation
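Two building blocks that typically appear inside a variable-splitting/augmented Lagrangian solver for this kind of objective are sketched below: singular value thresholding (the proximal operator of the nuclear norm, encouraging rank-deficiency) and row-wise soft-thresholding (the proximal operator of the l2,1 norm, i.e., the p = 1 case of the l2,p term, encouraging joint sparsity). The matrix sizes and thresholds are illustrative assumptions, and the full data-fidelity update of the paper's algorithm is not reproduced here.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def row_soft_threshold(X, tau):
    """Row-wise shrinkage: proximal operator of the l2,1 norm (p = 1 case)."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return X * scale

# Tiny illustrative abundance matrix (spectral-library members x pixels).
rng = np.random.default_rng(0)
A = rng.standard_normal((240, 50))

A_low_rank = svt(A, tau=5.0)                       # encourages rank-deficiency
A_joint_sparse = row_soft_threshold(A, tau=1.0)    # encourages joint sparsity
# Inside a variable-splitting / augmented Lagrangian loop, these two steps
# would be alternated with a data-fidelity update for the linear mixing model.
```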
Procedia PDF Downloads 261