Search results for: sample average approximation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10519

9829 Experimental Investigation on Effect of the Zirconium + Magnesium Coating of the Piston and Valve of the Single-Cylinder Diesel Engine to the Engine Performance and Emission

Authors: Erdinç Vural, Bülent Özdalyan, Serkan Özel

Abstract:

In this study, a four-stroke single-cylinder diesel engine was used. Its pistons and valves were coated, by the plasma spray method, with a powder of zirconium oxide (ZrO2) and magnesium oxide (MgO) to which aluminum oxide (Al2O3) was added in different ratios. The pistons and valves of the combustion chamber were coated with five different samples: (ZrO2 + MgO), (ZrO2 + MgO + 25% Al2O3), (ZrO2 + MgO + 50% Al2O3), (ZrO2 + MgO + 75% Al2O3) and (Al2O3). Material tests were performed on each of the coated engine parts by scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDX) and X-ray diffraction (XRD) using Cu Kα radiation. The engine tests were repeated for each sample on an electric dynamometer at full load at engine speeds of 1600 rpm, 2000 rpm, 2400 rpm and 2800 rpm. The material analyses and engine tests showed that the best performance was obtained with (ZrO2 + MgO + 50% Al2O3), sample A3. In the tests with sample A3, the engine's power, torque, specific fuel consumption and CO emissions improved; HC and smoke emissions showed no significant change, but NOx emissions increased.

Keywords: ceramic coating, material characterization, engine performance, exhaust emissions

Procedia PDF Downloads 358
9828 Searching the Efficient Frontier for the Coherent Covering Location Problem

Authors: Felipe Azocar Simonet, Luis Acosta Espejo

Abstract:

In this article, we seek an approximation of the efficient frontier for the bi-objective coherent covering location problem with two levels of hierarchy (CCLP). We present the mathematical formulation of the model used. Supported and unsupported efficient solutions are obtained by solving the bi-objective combinatorial problem through the weighting method using a Lagrangian heuristic. Subsequently, the results are validated through DEA analysis with the GEM index (global efficiency measurement).

Keywords: coherent covering location problem, efficient frontier, Lagrangian relaxation, data envelopment analysis

Procedia PDF Downloads 318
9827 Comparison Analysis of Multi-Channel Echo Cancellation Using Adaptive Filters

Authors: Sahar Mobeen, Anam Rafique, Irum Baig

Abstract:

Acoustic echo cancellation in multiple channels is a system-identification application. In a real-time environment, the signal changes very rapidly, which requires adaptive algorithms, such as Least Mean Square (LMS), Leaky Least Mean Square (LLMS), Normalized Least Mean Square (NLMS) and adaptive filtering with averaging (AFA), that have a high convergence rate and are stable. LMS and NLMS are widely used adaptive algorithms because of their low computational complexity, and AFA is used because of its high convergence rate. This research compares the cancellation of acoustic echo (generated in a room) through the LMS, LLMS, NLMS, AFA and newly proposed Average Normalized Leaky Least Mean Square (ANLLMS) adaptive filters.
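As a rough illustration of this family of algorithms, the NLMS filter (one of those compared) can be sketched as follows; the tap count, step size and signal model are illustrative assumptions, not the configuration used in the paper.

```python
import numpy as np

def nlms_echo_canceller(x, d, num_taps=32, mu=0.5, eps=1e-8):
    """Normalized LMS: adapt w so that the filtered far-end signal x
    tracks the echo contained in the microphone signal d.

    Returns the error signal e = d - y, i.e. the echo-cancelled output.
    """
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps, len(x)):
        x_win = x[n - num_taps:n][::-1]                  # most recent sample first
        y = w @ x_win                                    # echo estimate
        e[n] = d[n] - y                                  # residual after cancellation
        w += mu * e[n] * x_win / (x_win @ x_win + eps)   # normalized weight update
    return e

# Toy check: the echo is a delayed, scaled copy of x; once the filter has
# converged, the residual should be much smaller than at the start.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
d = np.concatenate([np.zeros(5), 0.6 * x[:-5]])          # simulated room echo
e = nlms_echo_canceller(x, d)
print(np.mean(e[:500] ** 2) > np.mean(e[-500:] ** 2))
```

The normalization by the input-window energy is what distinguishes NLMS from plain LMS and gives it a step size that adapts to the signal level.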

Keywords: LMS, LLMS, NLMS, AFA, ANLLMS

Procedia PDF Downloads 543
9826 Analysis of Real Time Seismic Signal Dataset Using Machine Learning

Authors: Sujata Kulkarni, Udhav Bhosle, Vijaykumar T.

Abstract:

Because seismic and non-seismic signals are so similar, it is difficult to detect earthquakes using conventional methods. To distinguish between seismic and non-seismic events based on their amplitude, our study processes data that come from seismic sensors. The authors propose a robust noise-suppression technique that makes use of a bandpass filter and an IIR Wiener filter, together with recursive short-term average/long-term average (STA/LTA) and Carl STA/LTA for event identification. The trigger ratio used to differentiate between seismic and non-seismic activity is determined. The proposed work focuses on extracting significant features for machine-learning-based seismic event detection, which motivated compiling a dataset of all features for the identification and forecasting of seismic signals. Because of the temporal complexity, we place a focus on techniques for reducing the dimension of the feature vector. The proposed features were tested experimentally using a machine learning model, and the results on unseen data are good. Finally, an evaluation on a hybrid dataset (captured by different sensors) demonstrates that the model can also be employed in a real-time setting while lowering false alarm rates. The study is based on the examination of seismic signals obtained from both individual sensors and sensor networks (SN). The experimental dataset comprises wideband seismic signals from the BSVK and CUKG station sensors, located near Basavakalyan, Karnataka, and at the Central University of Karnataka, respectively.
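The recursive STA/LTA trigger ratio mentioned above can be sketched as follows; the exponential window lengths, the synthetic trace and the comparison threshold are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np

def recursive_sta_lta(signal, sta_len=40, lta_len=400):
    """Recursive short-term / long-term average ratio of signal energy.

    sta_len and lta_len are illustrative window lengths in samples; a
    detector flags an event when the ratio exceeds a chosen threshold.
    """
    a_s, a_l = 1.0 / sta_len, 1.0 / lta_len    # exponential smoothing factors
    sta = lta = 1e-12
    ratio = np.zeros(len(signal))
    for i, v in enumerate(signal):
        e = v * v                               # instantaneous energy
        sta = a_s * e + (1 - a_s) * sta         # fast-reacting average
        lta = a_l * e + (1 - a_l) * lta         # slow background average
        ratio[i] = sta / lta
    return ratio

# Toy trace: background noise with a higher-amplitude burst in the middle.
rng = np.random.default_rng(1)
trace = rng.normal(0.0, 1.0, 3000)
trace[1500:1700] += rng.normal(0.0, 8.0, 200)   # simulated seismic event
r = recursive_sta_lta(trace)
print(r[1500:1700].max() > 3 * np.median(r))
```

The ratio spikes at the onset of the burst because the short-term average reacts before the long-term background estimate catches up.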

Keywords: Carl STA/LTA, features extraction, real time, dataset, machine learning, seismic detection

Procedia PDF Downloads 102
9825 Treatment of Poultry Slaughterhouse Wastewater by Mesophilic Static Granular Bed Reactor (SGBR) Coupled with UF Membrane

Authors: Moses Basitere, Marshal Sherene Sheldon, Seteno Karabo Obed Ntwampe, Debbie Dejager

Abstract:

In South Africa, poultry slaughterhouses consume a large amount of freshwater and discharge high-strength wastewater, which can be treated successfully at low cost using anaerobic digesters. In this study, the performance of a bench-scale mesophilic Static Granular Bed Reactor (SGBR) containing fully anaerobic granules, coupled with an ultrafiltration (UF) membrane as post-treatment, was investigated for poultry slaughterhouse wastewater. The wastewater was characterized by a chemical oxygen demand (COD) between 2000 and 6000 mg/l, an average biological oxygen demand (BOD) of 2375 mg/l and average fats, oil and grease (FOG) of 554 mg/l. A continuous SGBR anaerobic reactor was operated for 6 weeks at different hydraulic retention times (HRT) and organic loading rates. The results showed that average COD removal was greater than 90% for both the SGBR anaerobic digester and the ultrafiltration membrane. Removal of total suspended solids and FOG was greater than 95%. The SGBR reactor coupled with a UF membrane showed great potential to treat poultry slaughterhouse wastewater.

Keywords: chemical oxygen demand, poultry slaughterhouse wastewater, static granular bed reactor, ultrafiltration, wastewater

Procedia PDF Downloads 369
9824 Piezoelectric Micro-generator Characterization for Energy Harvesting Application

Authors: José E. Q. Souza, Marcio Fontana, Antonio C. C. Lima

Abstract:

This paper presents the analysis and characterization of a piezoelectric micro-generator for energy harvesting applications. A low-cost experimental prototype was designed to operate as a piezoelectric micro-generator in the laboratory. An input acceleration of 9.8 m/s² using a sine signal (peak-to-peak voltage: 1 V, offset voltage: 0 V) at frequencies ranging from 10 Hz to 160 Hz generated a maximum average power of 432.4 μW (linear mass position = 25 mm) and an average power of 543.3 μW (angular mass position = 35°). These promising results show that the prototype can be considered as an energy-harvesting micro-generator for low-consumption load applications.

Keywords: piezoelectric, micro-generator, energy harvesting, cantilever beam

Procedia PDF Downloads 449
9823 Dominant Correlation Effects in Atomic Spectra

Authors: Hubert Klar

Abstract:

High double excitation of two-electron atoms has been investigated using hyperspherical coordinates within a modified adiabatic expansion technique. This modification creates a novel fictitious force that leads to spontaneous exchange-symmetry breaking at high double excitation. The Pauli principle must therefore be regarded as an approximation valid only at low excitation energy. Threshold electron scattering from high Rydberg states shows an unexpected time-reversal-symmetry breaking. At the threshold for double escape, we discover a broad (few eV) Cooper pair.

Keywords: correlation, resonances, threshold ionization, Cooper pair

Procedia PDF Downloads 334
9822 A Hedonic Valuation Approach to Valuing Combined Sewer Overflow Reductions

Authors: Matt S. Van Deren, Michael Papenfus

Abstract:

Seattle is one of hundreds of cities in the United States that rely on a combined sewer system to collect and convey municipal wastewater. By design, these systems convey all wastewater, including industrial and commercial wastewater, human sewage, and stormwater runoff, through a single network of pipes. Serious problems arise for combined sewer systems during heavy precipitation events, when treatment plants and storage facilities are unable to accommodate the influx of wastewater needing treatment, causing the sewer system to overflow into local waterways through sewer outfalls. Combined sewer overflows (CSOs) pose a serious threat to human and environmental health. Principal pollutants found in CSO discharge include microbial pathogens (bacteria, viruses and parasites), oxygen-depleting substances, suspended solids, chemicals or chemical mixtures, and excess nutrients, primarily nitrogen and phosphorus. While concentrations of these pollutants vary between overflow events, CSOs have the potential to spread disease and waterborne illnesses, contaminate drinking water supplies, disrupt aquatic life, and affect a waterbody's designated use. This paper estimates the economic impact of CSOs on residential property values. Using residential property sales data from Seattle, Washington, this paper employs a hedonic valuation model that controls for housing and neighborhood characteristics, as well as spatial and temporal effects, to predict a consumer's willingness to pay for improved water quality near their home. Initial results indicate that a 100,000-gallon decrease in the average annual overflow discharged from a sewer outfall within 300 meters of a home is associated with a 0.053% increase in the property's sale price. For the average home in the sample, the price increase is estimated to be $18,860.23. These findings reveal some of the important economic benefits of improving water quality by reducing the frequency and severity of combined sewer overflows.
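A hedonic model of this kind is typically estimated by regressing (log) sale price on property attributes plus the overflow exposure measure. The following sketch uses simulated data and invented variable names and coefficients purely to show the shape of the estimation, not the paper's specification.

```python
import numpy as np

# Schematic hedonic regression: log(price) on a structural attribute plus a
# CSO-overflow exposure measure. All data and coefficients are invented.
rng = np.random.default_rng(0)
n = 2000
sqft = rng.uniform(800.0, 3500.0, n)            # structural attribute
overflow = rng.uniform(0.0, 10.0, n)            # annual overflow nearby (100k gal)
log_price = (
    11.0 + 0.0004 * sqft - 0.0005 * overflow + rng.normal(0.0, 0.01, n)
)

# OLS via least squares with an intercept column.
X = np.column_stack([np.ones(n), sqft, overflow])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)

# beta[2] is a semi-elasticity: roughly the fractional change in sale price
# per extra unit of overflow, so -100 * beta[2] is the % gain per unit cut.
print(round(float(beta[2]), 4))
```

In the actual study, the attribute list would include many more housing and neighborhood controls, plus spatial and temporal fixed effects.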

Keywords: benefits, hedonic, Seattle, sewer

Procedia PDF Downloads 164
9821 Human Health Risks Assessment of Particulate Air Pollution in Romania

Authors: Katalin Bodor, Zsolt Bodor, Robert Szep

Abstract:

Particulate matter (PM) smaller than 2.5 μm is less studied because of the limited availability of PM₂.₅ measurements, and less information is available on the health effects attributable to PM₁₀ in Central-Eastern Europe. The objective of the current study was to assess the human health risk and to characterize the spatial and temporal variation of PM₂.₅ and PM₁₀ in eight Romanian regions between 2009 and 2018. The PM concentrations showed high variability in time and space. The highest concentration was detected in the Bucharest region in the winter period, and the lowest was detected in the West. The relative risk for all-cause mortality caused by PM₁₀ varied between 1.017 (B) and 1.025 (W), with an average of 1.020. The results demonstrate a positive relative risk of cardiopulmonary disease and lung cancer due to exposure to PM₂.₅, on national average 1.26 (±0.023) and 1.42 (±0.037), respectively.

Keywords: PM₂.₅, PM₁₀, relative risk, health effect

Procedia PDF Downloads 151
9820 Different Approaches to Teaching a Database Course to Undergraduate and Graduate Students

Authors: Samah Senbel

Abstract:

Database design is a fundamental part of the computer science and information technology curricula of any school, as well as of the study of management, business administration, and data analytics. In this study, we compare the performance of two groups of students studying the same database design and implementation course at Sacred Heart University in the fall of 2018. Both courses used the same textbook and were taught by the same professor, one to seven graduate students and one to 26 undergraduate students (juniors). The undergraduate students were around 20 years old with little work experience, while the graduate students averaged 35 years old and were all employed in computer-related or management-related jobs. The textbook used was 'Database Systems: Design, Implementation, and Management' by Coronel and Morris, and the course was designed to follow the textbook at roughly a chapter per week. The first six weeks covered the design aspect of a database, followed by a paper exam. The next six weeks covered the implementation aspect of the database using SQL, followed by a lab exam. Since the undergraduate students are on a 16-week semester, we spent the last three weeks of their course covering NoSQL; this part of the course was not included in this study. After the course was over, we analyzed the results of the two groups of students. An interesting discrepancy was observed: in the database design part of the course, the average grade of the graduate students was 92%, while that of the undergraduate students was 77% on the same exam. In the implementation part of the course, we observed the opposite: the average grade of the graduate students was 65%, while that of the undergraduate students was 73%. The overall grades were quite similar: the graduate average was 78% and the undergraduate average was 75%. Based on these results, we concluded that having both classes follow the same time schedule was not beneficial and that an adjustment was needed: the graduates could spend less time on design, and the undergraduates would benefit from more design time. In the fall of 2019, 30 students registered for the undergraduate course and 15 students registered for the graduate course. To test our conclusion, the undergraduates spent about 67% of the time (eight classes) on the design part of the course and 33% (four classes) on the implementation part, using the same exams as the previous year. This resulted in an improvement in their average grade on the design part from 77% to 83%, and also in their implementation average grade from 73% to 79%. In conclusion, we recommend using two separate schedules for teaching the database design course. For undergraduate students, it is important to spend more time on the design part rather than the implementation part of the course, while for the older graduate students, we recommend spending more time on the implementation part, as that seems to be the part they struggle with, even though they have a higher understanding of the design component of databases.

Keywords: computer science education, database design, graduate and undergraduate students, pedagogy

Procedia PDF Downloads 108
9819 Modeling Average Paths Traveled by Ferry Vessels Using AIS Data

Authors: Devin Simmons

Abstract:

At the USDOT’s Bureau of Transportation Statistics, a biannual census of ferry operators in the U.S. is conducted, with results such as route mileage used to determine federal funding levels for operators. AIS data allows for the possibility of using GIS software and geographical methods to confirm operator-reported mileage for individual ferry routes. As part of the USDOT’s work on the ferry census, an algorithm was developed that uses AIS data for ferry vessels in conjunction with known ferry terminal locations to model the average route travelled for use as both a cartographic product and confirmation of operator-reported mileage. AIS data from each vessel is first analyzed to determine individual journeys based on the vessel’s velocity, and changes in velocity over time. These trips are then converted to geographic linestring objects. Using the terminal locations, the algorithm then determines whether the trip represented a known ferry route. Given a large enough dataset, routes will be represented by multiple trip linestrings, which are then filtered by DBSCAN spatial clustering to remove outliers. Finally, these remaining trips are ready to be averaged into one route. The algorithm interpolates the point on each trip linestring that represents the start point. From these start points, a centroid is calculated, and the first point of the average route is determined. Each trip is interpolated again to find the point that represents one percent of the journey’s completion, and the centroid of those points is used as the next point in the average route, and so on until 100 points have been calculated. Routes created using this algorithm have shown demonstrable improvement over previous methods, which included the implementation of a LOESS model. Additionally, the algorithm greatly reduces the amount of manual digitizing needed to visualize ferry activity.
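The averaging step described above can be sketched as follows; this simplified version works on plain coordinate polylines and omits the trip-detection and DBSCAN filtering stages.

```python
import numpy as np

def average_route(trips, n_points=101):
    """Average several trip polylines into one route.

    Each trip is a list of (x, y) points. Every trip is resampled at the
    same fractions of its cumulative length (0%, 1%, ..., 100%), and the
    centroid of the resampled points at each fraction becomes one point
    of the average route -- a simplified version of the algorithm above.
    """
    fractions = np.linspace(0.0, 1.0, n_points)
    resampled = []
    for trip in trips:
        pts = np.asarray(trip, dtype=float)
        seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)   # segment lengths
        cum = np.concatenate([[0.0], np.cumsum(seg)])
        t = cum / cum[-1]                                    # position in [0, 1]
        xs = np.interp(fractions, t, pts[:, 0])
        ys = np.interp(fractions, t, pts[:, 1])
        resampled.append(np.column_stack([xs, ys]))
    return np.mean(resampled, axis=0)                        # centroid per fraction

# Two noisy straight trips between the same terminals: the average route
# should run between them, anchored at the shared endpoints.
trip_a = [(0, 0.2), (5, 0.3), (10, 0.1)]
trip_b = [(0, -0.2), (5, -0.1), (10, -0.3)]
route = average_route([trip_a, trip_b])
print(route[0], route[-1])
```

A production version would use geographic linestrings and great-circle lengths rather than planar coordinates, as in the GIS pipeline described.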

Keywords: ferry vessels, transportation, modeling, AIS data

Procedia PDF Downloads 157
9818 Investigation into Relationship between Spaced Repetitions and Problems Solving Efficiency

Authors: Sidharth Talan, Rajlakshmi G. Majumdar

Abstract:

Problem-solving is one of the few skills that professionals and academics around the world constantly endeavor to improve in order to sustain themselves in an ever-growing competitive environment. This paper evaluates a hypothesized relationship between an individual's problem-solving efficiency and spaced repetitions, conducted at an interval of one day over a period of two weeks. The paper uses univariate regression analysis to assess the best-fit curve that can explain the relationship between the two variables. Anagram solving was adopted as the testing process for the analysis: since it involves rearranging a jumbled word to form a correct word, it is an efficient way to observe an individual's attention span, visual-motor coordination and verbal ability. Based on the analysis of a sample population of 30, it was observed that an individual's problem-solving efficiency, measured as the score in each test, was significantly correlated with the time period measured in days.

Keywords: Anagrams, histogram plot, moving average curve, spacing effect

Procedia PDF Downloads 148
9817 Batteryless DCM Boost Converter for Kinetic Energy Harvesting Applications

Authors: Andrés Gomez-Casseres, Rubén Contreras

Abstract:

In this paper, a bidirectional boost converter operated in Discontinuous Conduction Mode (DCM) is presented as a suitable power-conditioning circuit for tuning kinetic energy harvesters without the need for a battery. A nonlinear control scheme, composed of two linear controllers, is used to control the average value of the input current, enabling the synthesis of complex loads. The converter, along with the control system, is validated through SPICE simulations using the LTspice tool. The converter model and the controller transfer functions are derived. From the simulation results, it was found that the input current distortion increases with the introduced phase shift and that such distortion is almost entirely concentrated at the zero-crossing point of the input voltage.

Keywords: average current control, boost converter, electrical tuning, energy harvesting

Procedia PDF Downloads 744
9816 The Effect of Affirmative Action in Private Schools on Education Expenditure in India: A Quasi-Experimental Approach

Authors: Athira Vinod

Abstract:

Under the Right to Education Act (2009), the Indian government introduced an affirmative action policy reserving seats in private schools at the entry level and providing free primary education for children from lower socio-economic backgrounds. Using exogenous variation in the status of belonging to a lower social category (disadvantaged groups) and in the year of starting school, this study investigates the effect of exposure to the policy on expenditure on private education. It employs a difference-in-differences strategy with the help of repeated cross-sectional household data from the National Sample Survey (NSS) of India. It also exploits regional variation in exposure by combining the household data with administrative data on schools from the District Information System for Education (DISE). The study compares the outcome across two age cohorts of disadvantaged groups starting school at different times, that is, before and after the policy. Regional variation in exposure is proxied with a measure of the enrolment rate under the policy, calculated at the district level. The study finds that exposure to the policy led to an average reduction in annual private school fees of ₹223. Similarly, a 5% increase in the rate of enrolment under the policy in a district was associated with a reduction in annual private school fees of ₹240. Furthermore, the effect of the policy was larger among households with a higher demand for private education. However, the effect is not due to fees waived through direct enrolment under the policy but rather to an increase in the supply of low-fee private schools in India. The study finds that after the policy, 79,870 more private schools entered the market in response to increased demand for private education. The new schools, on average, charged a lower fee than existing schools and had a higher enrolment of children exposed to the policy. Additionally, the district-level variation in enrolment under the policy was very strongly correlated with the entry of new schools, which not only charged a low fee but also had higher enrolment under the policy. The results suggest that few disadvantaged children were admitted directly under the policy, but many were attending private schools, which were largely low-fee. This implies that disadvantaged households were willing to pay a lower fee to secure a place in a private school even if they did not receive a free place under the policy.
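A difference-in-differences estimate of this kind can be computed from four group means: the before/after change in fees for disadvantaged-group cohorts minus the same change for other groups. The sketch below simulates data with an invented effect size and group structure; it illustrates the estimator, not the study's actual results.

```python
import numpy as np

# Schematic difference-in-differences on simulated data. The group sizes,
# fee levels and effect size are all invented for illustration.
rng = np.random.default_rng(0)
n = 4000
disadvantaged = rng.integers(0, 2, n)        # lower social category indicator
post = rng.integers(0, 2, n)                 # started school after the policy
effect = -223.0                              # assumed policy effect on fees
fees = (
    3000.0
    - 400.0 * disadvantaged                  # fixed group difference
    + 150.0 * post                           # common time trend
    + effect * disadvantaged * post          # the treatment effect of interest
    + rng.normal(0.0, 100.0, n)
)

def cell_mean(d, p):
    return fees[(disadvantaged == d) & (post == p)].mean()

# DiD cancels the group difference and the common trend, leaving the effect.
did = (cell_mean(1, 1) - cell_mean(1, 0)) - (cell_mean(0, 1) - cell_mean(0, 0))
print(round(did))
```

The identifying assumption is the usual parallel-trends condition: absent the policy, fees for both groups would have moved by the same amount.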

Keywords: affirmative action, disadvantaged groups, private schools, right to education act, school fees

Procedia PDF Downloads 100
9815 Quantifying Mobility of Urban Inhabitant Based on Social Media Data

Authors: Yuyun, Fritz Akhmad Nuzir, Bart Julien Dewancker

Abstract:

Check-in locations on social media provide information about an individual's location, and the millions of units of data generated by these sites provide knowledge of human activity. In this research, we used the geolocation service and users' texts posted on the social medium Twitter to analyze human mobility. Our research answers two questions: what are the movement patterns of a citizen, and how far do people travel in the city? We explore the trajectories of 201,118 check-ins by 22,318 users over a period of one month in Makassar city, Indonesia. To capture individual mobility, the authors only analyze users with more than 30 check-ins. We used a systematic sampling approach to assign the research sample. The study found that individual movement shows a high degree of regularity and intensity in certain places. Another finding is that the average distance an urban inhabitant travels per day is 9.6 km.
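A user's daily travel distance is typically obtained by summing great-circle distances between consecutive check-ins; a minimal haversine-based sketch follows (the coordinates are invented points near Makassar, not actual check-in data).

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def daily_distance_km(checkins):
    """Sum of distances between consecutive check-ins of one user-day."""
    return sum(haversine_km(*a, *b) for a, b in zip(checkins, checkins[1:]))

# Invented check-ins roughly around Makassar (approx. 5.15 S, 119.43 E).
day = [(-5.15, 119.43), (-5.13, 119.45), (-5.16, 119.41)]
print(round(daily_distance_km(day), 1))
```

Averaging this quantity over all user-days yields the kind of per-day travel figure reported above.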

Keywords: mobility, check-in, distance, Twitter

Procedia PDF Downloads 152
9814 Block Implicit Adams Type Algorithms for Solution of First Order Differential Equation

Authors: Asabe Ahmad Tijani, Y. A. Yahaya

Abstract:

This paper considers the derivation of implicit Adams-Moulton-type methods with k = 4 and 5. We adopted interpolation and collocation of a power series approximation to generate a continuous formula, which was evaluated at off-grid and some grid points within the step length to generate the proposed block schemes. The schemes were investigated and found to be consistent and zero-stable. Finally, the methods were tested with numerical experiments to ascertain their level of accuracy.
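For context, the classical four-step implicit Adams-Moulton formula (order five) that such schemes build on can be sketched as follows; this is a plain corrector iterated by fixed-point iteration, not the authors' block formulation, and the starter values are taken from the known solution for illustration.

```python
import math

def adams_moulton_4(f, t0, h, n_steps, starter):
    """Classical 4-step implicit Adams-Moulton corrector (order 5):
    y_{n+1} = y_n + h/720 (251 f_{n+1} + 646 f_n - 264 f_{n-1}
                           + 106 f_{n-2} - 19 f_{n-3}),
    solved at each step by fixed-point iteration.

    starter(t) supplies the first four values (here from the exact
    solution; a self-starting block method would generate them itself).
    """
    ts = [t0 + i * h for i in range(n_steps + 1)]
    ys = [starter(ts[i]) for i in range(4)]
    for n in range(3, n_steps):
        fs = [f(ts[n - j], ys[n - j]) for j in range(4)]   # f_n .. f_{n-3}
        y_next = ys[n]                                     # initial guess
        for _ in range(10):                                # fixed-point solve
            y_next = ys[n] + h / 720.0 * (
                251.0 * f(ts[n + 1], y_next)
                + 646.0 * fs[0] - 264.0 * fs[1]
                + 106.0 * fs[2] - 19.0 * fs[3]
            )
        ys.append(y_next)
    return ts, ys

# Test problem y' = -y, y(0) = 1, exact solution exp(-t).
ts, ys = adams_moulton_4(lambda t, y: -y, 0.0, 0.1, 20, lambda t: math.exp(-t))
print(abs(ys[-1] - math.exp(-2.0)) < 1e-5)  # → True
```

The block approach of the paper instead evaluates a continuous collocation formula at several points per block, which removes the need for a separate starter.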

Keywords: Adams-Moulton Type (AMT), off-grid, block method, consistent and zero stable

Procedia PDF Downloads 470
9813 Sensory and Microbial Properties of Fresh and Canned Calocybe indica

Authors: Apotiola Z. O., Anyakorah C. I., Kuforiji O. O.

Abstract:

The sensory and microbial properties of fresh and canned Calocybe indica (milky mushroom) were evaluated. The mushroom was grown under a controlled environment on hardwood (Cola nitida) and rice bran substrate (4:1) and canned in a brine solution of salt and citric acid. Analyses were carried out using standard methods. The overall acceptability ranged between 5.62 and 6.50, with sample S30 judged the best. Overall, significant differences (p < 0.01) exist in the panelists' judgment. Thus, the incorporation of salt and citric acid at 3.5 g and 1.5 g, respectively, improved sensory attributes such as texture, aroma, color, and overall acceptability. There was no coliform or fungal growth on the samples throughout the storage period. Bacterial counts, on the other hand, were observed only in the fifth and sixth weeks of the storage period and varied between 0.2 and 0.9 × 10³ cfu/g; the highest value was observed in sample S20 in the sixth week of storage, while the lowest value was recorded in sample S30 in the sixth week of storage. Based on 16S rRNA gene sequencing, the bacterial species was taxonomically confirmed as Bacillus thuringiensis, with a sequence identity of 90%.

Keywords: bacterial count, microbial property, sensory, sawdust, texture

Procedia PDF Downloads 46
9812 A Performance Analysis of Different Scheduling Schemes in WiMAX

Authors: A. Youseef

Abstract:

One of the main aims of IEEE 802.16 (WiMAX) is to provide high-speed wireless access with wide coverage. The base station (BS) and the subscriber station (SS) are the main parts of WiMAX. WiMAX uses either point-to-multipoint (PMP) or mesh topologies. In the PMP mode, the SSs connect to the BS to gain access to the network; in the mesh mode, the SSs connect to each other to reach the BS. The main components of QoS management in the 802.16 standard are admission control, buffer management, and packet scheduling. Several efficient packet scheduling schemes have been proposed in the literature. We therefore use QualNet 5.0.2 to study the performance of different scheduling schemes, such as WFQ, SCFQ, RR, and SP, as the number of SSs increases. We find that as the number of SSs increases, the average jitter and average end-to-end delay increase and the throughput is reduced.
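Among the compared disciplines, WFQ can be illustrated with virtual finish tags: each packet of a flow receives tag = previous_tag + size/weight, and packets are served in tag order. The toy ranking sketch below assumes a fully backlogged queue (so the system virtual time can be ignored); the flow names and weights are invented.

```python
def wfq_order(flows):
    """Toy weighted fair queuing: rank queued packets by virtual finish
    tag, where each flow's tag advances by size / weight per packet.

    flows maps flow id -> (weight, [packet sizes]). A real scheduler also
    advances a system virtual time as packets are served; this sketch
    only ranks an already-queued backlog.
    """
    tagged = []
    for fid, (weight, sizes) in flows.items():
        finish = 0.0
        for i, size in enumerate(sizes):
            finish += size / weight            # per-flow finish tag
            tagged.append((finish, fid, i))
    return [(fid, i) for _, fid, i in sorted(tagged)]

# Two flows with equal packet sizes: the flow with twice the weight is
# served twice as often.
order = wfq_order({"a": (2.0, [1, 1, 1, 1]), "b": (1.0, [1, 1])})
print(order)  # → [('a', 0), ('a', 1), ('b', 0), ('a', 2), ('a', 3), ('b', 1)]
```

RR would instead alternate strictly between flows, and SP would drain the higher-priority queue completely first; the weight-proportional interleaving is what distinguishes WFQ and SCFQ.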

Keywords: WiMAX, scheduling scheme, QoS, QualNet

Procedia PDF Downloads 441
9811 Shear Strength and Consolidation Behavior of Clayey Soil with Vertical and Radial Drainage

Authors: R. Pillai Aparna, S. R. Gandhi

Abstract:

Soft clay deposits having low strength and high compressibility are found all over the world. Preloading with vertical drains is a widely used method for improving such soils. The coefficient of consolidation, irrespective of the drainage type, plays an important role in the design of vertical drains, and it controls the accurate prediction of the rate of consolidation of the soil. The increase in the shear strength of soil with consolidation is another important factor considered in preloading or staged construction. To the best of our knowledge, no clear guidelines are available to estimate the increase in shear strength for a particular degree of consolidation (U) at various stages of construction. Various methods are available for determining the consolidation coefficient. This study mainly focuses on the variation of the consolidation coefficient, determined by different methods, and of shear strength with pressure intensity; the variation of shear strength with the degree of consolidation was also studied. Consolidation tests were done on two types of highly compressible clays with vertical, radial and, in a few cases, combined drainage. The tests were carried out at different pressure intensities and, for each pressure intensity, once the target degree of consolidation was achieved, a vane shear test was done at different locations in the sample to determine the shear strength. The shear strength of the clayey soils under vertical stress with vertical and radial drainage was studied for target U values of 70% and 90%. It was found that there is not much variation in the cv or cr value beyond a pressure intensity of 80 kPa. Correlations were developed between the shear strength ratio and the consolidation pressure based on laboratory testing under controlled conditions. It was observed that the shear strength of samples with a target U value of 90% is about 1.4 to 2 times that of the 70%-consolidated samples. Settlement analysis was done using Asaoka's method and the hyperbolic method. The variation of strength with the depth of the sample was also studied using a large-scale consolidation test. It was found, based on the present study, that the gain in strength is greater in the top half of the clay layer, and that the shear strength of samples with radial drainage is slightly higher than that of samples with vertical drainage.
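Asaoka's settlement analysis mentioned above fits successive settlement readings, taken at equal time intervals, to the recurrence s_i = b0 + b1·s_(i-1); the predicted final settlement is b0/(1 - b1). A sketch on a synthetic consolidation curve (the curve parameters are invented):

```python
import numpy as np

def asaoka_final_settlement(s):
    """Asaoka's method: fit s_i = b0 + b1 * s_{i-1} to settlement
    readings s taken at equal time intervals; the fixed point of the
    recurrence, b0 / (1 - b1), is the predicted final settlement."""
    x, y = np.asarray(s[:-1]), np.asarray(s[1:])
    b1, b0 = np.polyfit(x, y, 1)               # straight-line fit
    return b0 / (1.0 - b1)

# Synthetic consolidation curve approaching 120 mm exponentially; for an
# exponential curve the Asaoka plot is exactly linear, so the method
# recovers the asymptote.
t = np.arange(0, 20)
s = 120.0 * (1 - np.exp(-0.25 * t))
print(round(asaoka_final_settlement(s), 1))  # → 120.0
```

With real field data the early readings are usually discarded and the fit is applied to the later, near-linear portion of the Asaoka plot.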

Keywords: consolidation coefficient, degree of consolidation, PVDs, shear strength

Procedia PDF Downloads 217
9810 ISSR-PCR Based Genetic Diversity Analysis on Copper Tolerant versus Wild Type Strains of Unicellular alga Chlorella Vulgaris

Authors: Abdullah M. Alzahrani

Abstract:

The unicellular alga Chlorella vulgaris was isolated from Al-Asfar Lake, which is located in the Al-Ahsa province of Saudi Arabia. Two different isolates were sub-cultured under laboratory conditions. The wild type was grown under a regular concentration of copper, whereas the other isolate was grown under a progressively increasing copper concentration. An inter-simple sequence repeat (ISSR) analysis was performed using DNA isolated from the wild-type and tolerant strains. The sum of the scored bands of the wild type was 155, with 100 (64.5%) considered polymorphic, whereas the resistant strain displayed 147 bands, with 92 (62.6%) considered polymorphic. The sum of the scored bands of a mixed sample was 117, of which only 4 (3.4%) were considered polymorphic. The average Nei's genetic diversity (h) and Shannon-Wiener diversity index (I) were 0.3891 and 0.5394, respectively. These results clearly indicate that the adaptation to a high level of copper in Chlorella vulgaris is not merely physiological but rather driven by modifications at the genomic level.
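For dominant markers such as ISSR bands, Nei's gene diversity and the Shannon index are computed per locus from band frequencies and then averaged. The sketch below uses an invented 0/1 band matrix, not the study's data.

```python
import numpy as np

def diversity_indices(bands):
    """Nei's gene diversity (h) and Shannon's index (I) per ISSR locus.

    bands: 0/1 matrix (individuals x loci) of band presence. For a locus
    with band frequency p, h = 2p(1-p) and
    I = -(p ln p + (1-p) ln(1-p)); both are averaged over loci.
    """
    p = np.asarray(bands, dtype=float).mean(axis=0)
    p = np.clip(p, 1e-12, 1 - 1e-12)           # avoid log(0) at fixed loci
    h = 2 * p * (1 - p)
    i = -(p * np.log(p) + (1 - p) * np.log(1 - p))
    return float(h.mean()), float(i.mean())

# Invented band matrix: 4 individuals scored at 4 loci.
bands = [
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 1, 1],
    [1, 0, 0, 1],
]
h, i = diversity_indices(bands)
print(round(h, 3), round(i, 3))  # → 0.344 0.487
```

Monomorphic loci (like the last column) contribute essentially zero to both indices, which is why polymorphism percentages drive the diversity values reported above.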

Keywords: chlorella vulgaris, copper tolerance, genetic diversity, green algae

Procedia PDF Downloads 419
9809 High-Temperature X-Ray Powder Diffraction of Secondary Gypsum

Authors: D. Gazdič, I. Hájková, M. Fridrichová

Abstract:

This paper presents a high-temperature X-ray powder diffraction (XRD) analysis of a sample of chemical gypsum generated in the production of titanium white; this gypsum originates from neutralizing highly acidic water with a limestone suspension. Specifically, it was gypsum formed in the first stage of neutralization, when the resulting material contains, apart from gypsum, a number of waste products resulting from the decomposition of ilmenite by sulphuric acid, so it can be described as red titanogypsum. By conducting the experiment in a Bruker D8 Advance XRD apparatus with a Cu anode (λKα = 1.54184 Å) equipped with an Anton Paar HTK 16 high-temperature chamber, it was possible to clearly identify each phase transition in the CaSO4·xH2O system.

Keywords: anhydrite, gypsum, bassanite, hematite, XRD, powder, high-temperature

Procedia PDF Downloads 330
9808 Organizational Culture of a Public and a Private Hospital in Brazil

Authors: Fernanda Ludmilla Rossi Rocha, Thamiris Cavazzani Vegro, Silvia Helena Henriques Camelo, Carmen Silvia Gabriel, Andrea Bernardes

Abstract:

Introduction: Organizations are cultural, symbolic and imaginary systems composed of values and norms. These values and norms represent the organizational culture, which determines the behavior of the workers, guides the work practices and impacts the quality of care and the safety culture of health services worldwide. Objective: To analyze the organizational culture of a public and a private hospital in Brazil. Method: Descriptive study with a quantitative approach developed in a public and in a private hospital in Brazil. The sample comprised 281 nursing workers: 73 nurses and 208 nursing auxiliaries and technicians. The data collection instrument was the Brazilian Instrument for Assessing Organizational Culture. Data were collected from March to December 2013. Results: At the public hospital, the results showed an average score of 2.85 for values concerning cooperative professionalism (CP); 3.02 for values related to hierarchical rigidity and the centralization of power (HR); 2.23 for individualistic professionalism and competition at work (IP); 2.22 for values related to satisfaction, well-being and motivation of workers (SW); 3.47 for external integration (EI); 2.03 for rewarding and training practices (RT); and 2.75 for practices related to the promotion of interpersonal relationships (IR). At the private hospital, the results showed average scores of 3.24 for CP; 2.83 for HR; 2.69 for IP; 2.71 for SW; 3.73 for EI; 2.56 for RT; and 2.83 for IR.
Discussion: The analysis of the organizational values of the studied hospitals shows that workers perceive hierarchical rigidity and the centralization of power in the institutions; they believed there was cooperation at the workplace, though they also perceived individualism and competition; they believed that values associated with workers' well-being, satisfaction and motivation were seldom acknowledged by the hospital; and they believed in the adoption of strategic planning actions within the institution, but considered the promotion of interpersonal relationships, continuing education and the rewarding of workers to be little valued by the institution. Conclusion: This work context can lead to professional dissatisfaction, compromising the quality of care and contributing to the occurrence of occupational diseases.

Keywords: nursing management, organizational culture, quality of care, interpersonal relationships

Procedia PDF Downloads 420
9807 Analyzing Irbid’s Food Waste as Feedstock for Anaerobic Digestion

Authors: Assal E. Haddad

Abstract:

Food waste samples from Irbid were collected from 5 different sources for 12 weeks to characterize their composition in terms of four food categories: rice, meat, fruits and vegetables, and bread. The average composition was 39% rice, 6% meat, 34% fruits and vegetables, and 23% bread. Methane yield was also measured for each food type and was found to be 362, 499, 352, and 375 mL/g VS for rice, meat, fruits and vegetables, and bread, respectively. A representative food waste sample was created to test the actual methane yield and compare it to the calculated one. The actual methane yield (414 mL/g VS) was greater than the value calculated from the food type proportions and their specific methane yields (377 mL/g VS). This study emphasizes the effect of the types of food and their proportions in food waste on the final biogas production. The findings provide representative methane emission factors for Irbid's food waste, which represents as much as 68% of the total Municipal Solid Waste (MSW) in Irbid, and also indicate the energy and economic value within the solid waste stream.
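The calculated yield quoted above is simply the composition-weighted average of the per-category yields, which can be checked directly from the figures in the abstract:

```python
# Composition-weighted methane yield for Irbid's food waste, reproducing the
# abstract's calculated value. Proportions and per-category yields
# (mL CH4 per g VS) are taken directly from the abstract.
composition = {"rice": 0.39, "meat": 0.06, "fruits_vegetables": 0.34, "bread": 0.23}
methane_yield = {"rice": 362, "meat": 499, "fruits_vegetables": 352, "bread": 375}

calculated = sum(composition[k] * methane_yield[k] for k in composition)
print(round(calculated))  # 377 mL/g VS, versus the measured 414 mL/g VS
```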

Keywords: food waste, solid waste management, anaerobic digestion, methane yield

Procedia PDF Downloads 190
9806 Macroeconomic Effects and Dynamics of Natural Disaster Damages: Evidence from SETX on the Resiliency Hypothesis

Authors: Agim Kukelii, Gevorg Sargsyan

Abstract:

This study, focusing on the base regional area (county level), estimates the effect of natural disaster damages on aggregate personal income, aggregate wages, wages per worker, aggregate employment, and aggregate income transfers. The study further estimates the dynamics of personal income, employment, and wages under natural disaster shocks. Southeast Texas, located at the center of the Gulf Coast, is hit by meteorologically and hydrologically caused natural disasters every year. On average, there are more than four natural disasters per year, causing estimated damage averaging 2.2% of real personal income. The study uses the panel data method to estimate the average effect of natural disasters on the area's economy (personal income, wages, employment, and income transfers). It also uses a Panel Vector Autoregressive (PVAR) model to study the dynamics of macroeconomic variables under natural disaster shocks. The study finds that the average effect of natural disasters is positive for personal income and income transfers and negative for wages and employment. The PVAR and impulse response function estimates reveal that natural disaster shocks cause a decrease in personal income, employment, and wages; however, the economy's variables bounce back after three years. The novelty of this study rests on several aspects. First, this is the first study to investigate the effects of natural disasters on macroeconomic variables at a regional level. Second, the study uses direct measures of natural disaster damages. Third, the study estimates that the local economy takes three years to absorb natural disaster damage shocks. This is a relatively quick recovery for the local economy, therefore adding to the "resiliency" hypothesis. The study has several implications for policymakers, businesses, and households.
First, this study serves to increase the awareness of local stakeholders that natural disaster damages worsen macroeconomic variables such as personal income, employment, and wages, beyond the immediate damage to residential and commercial properties, physical infrastructure, and the disruption of daily life. Second, the study estimates that these effects linger in the economy for three years on average, which requires policymakers to factor in the time the area needs to recover.
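The bounce-back finding rests on impulse response functions; for a single VAR(1), the machinery reduces to powers of the coefficient matrix applied to the shock vector. A minimal sketch with hypothetical, stable coefficients (not the study's PVAR estimates):

```python
import numpy as np

def impulse_responses(A: np.ndarray, shock: np.ndarray, horizon: int):
    """For a VAR(1) y_t = A @ y_{t-1} + e_t, the response h periods after a
    one-time shock e_0 is A^h @ e_0. A stable A (spectral radius < 1) means
    the shock's effect dies out, i.e. the economy 'bounces back'."""
    responses, y = [], shock.astype(float)
    for _ in range(horizon + 1):
        responses.append(y.copy())
        y = A @ y
    return responses

# Hypothetical coefficients for (income, employment, wages); stable by construction.
A = np.array([[0.5, 0.1, 0.0],
              [0.1, 0.4, 0.1],
              [0.0, 0.1, 0.5]])
shock = np.array([-1.0, -1.0, -1.0])  # an adverse disaster shock to all three series
irf = impulse_responses(A, shock, horizon=3)
print(float(np.abs(irf[3]).max()))  # shock largely absorbed after 3 periods
```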

Keywords: natural disaster damages, macroeconomics effects, PVAR, panel data

Procedia PDF Downloads 75
9805 Study of Climate Change Process on Hyrcanian Forests Using Dendroclimatology Indicators (Case Study of Guilan Province)

Authors: Farzad Shirzad, Bohlol Alijani, Mehry Akbary, Mohammad Saligheh

Abstract:

Climate change and global warming are very important issues today. The process of climate change, especially changes in temperature and precipitation, is a central issue in the environmental sciences; climate change means a change in long-run averages. Iran lies in arid and semi-arid regions due to its proximity to the equator and its location in the subtropical high-pressure zone. In this respect, the Hyrcanian forest is a green necklace between the Caspian Sea and the Alborz mountain range to the south. At the forty-third session of UNESCO, it was registered as the second natural heritage site of Iran. Beech is one of the most important tree species and the most industrially significant species of the Hyrcanian forests. In this research, dendroclimatology was applied using tree-ring widths together with temperature and precipitation data from the Shanderman meteorological station located in the study area. The non-parametric Mann-Kendall test was used to investigate the trend of climate change over a 202-year time series of growth rings, and Pearson correlation was used to relate the growth-ring widths of beech trees to climatic variables in the region. The results obtained from the time series of beech growth rings showed that ring-width changes had a downward, negative trend, significant at the 5% level, and that climate change had occurred. The average minimum, mean, and maximum temperatures and evaporation in the growing season had an increasing trend, and annual precipitation had a decreasing trend.
Fitting Pearson correlations between ring width and climate showed that the correlation with mean temperature in July, August, and September was negative; the correlation with the average maximum temperature in February was positive and significant at the 95% level; and the correlation with June precipitation was positive and significant at the 95% level.
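The Mann-Kendall test named above is simple to state: it sums the signs of all pairwise differences in the series and normalizes by the variance of that sum. A minimal pure-Python sketch with an illustrative ring-width series, not the study's 202-year record:

```python
import math

def mann_kendall(series):
    """Non-parametric Mann-Kendall trend test: returns the S statistic and the
    normal-approximation z-score (negative z indicates a downward trend).
    Uses the no-ties variance formula for simplicity."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18  # variance ignoring tie corrections
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Example: a steadily narrowing ring-width series shows a significant negative trend.
widths = [2.0, 1.9, 1.8, 1.9, 1.7, 1.6, 1.5, 1.6, 1.4, 1.3]
s, z = mann_kendall(widths)
print(s, round(z, 2))
```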

Keywords: climate change, dendroclimatology, hyrcanian forest, beech

Procedia PDF Downloads 88
9804 Studies on the Effect of Dehydration Techniques, Treatments, Packaging Material and Methods on the Quality of Buffalo Meat during Ambient Temperature Storage

Authors: Tariq Ahmad Safapuri, Saghir Ahmad, Farhana Allai

Abstract:

The present study was conducted to evaluate the effect of dehydration techniques (polyhouse and tray drying), different treatments (SHMP, SHMP + salt, salt + turmeric), different packaging materials (HDPE, combination film), and different packaging methods (air, vacuum, CO2 flush) on the quality of dehydrated buffalo meat during ambient temperature storage. The quality parameters included physico-chemical characteristics, i.e., pH, rehydration ratio, and moisture content, and microbiological characteristics, viz. total plate count (TPC). It was found that the treatments (SHMP, SHMP + salt, salt + turmeric) increased the pH. The moisture content of the dehydrated meat samples was between 5.54% and 7.20%. The rehydration ratio was highest for the salt + turmeric treated sample and lowest for the control meat sample. The bacterial count (log TPC/g) of the salt + turmeric treated, tray-dried sample was the lowest, i.e., 1.80. During ambient temperature storage, there was no considerable change in the pH of the dehydrated samples up to 150 days; however, the moisture content of the samples increased to different degrees in the different packaging systems. The highest moisture rise was found for the control meat sample packed in HDPE with air, while the lowest increase was reported for the SHMP + salt treated sample vacuum-packed in combination film. The rehydration ratio was considerably affected for the polyhouse-dried samples packed in HDPE with air after 150 days of ambient storage, while there was very little change in the rehydration ratio of meat samples packed in combination film with CO2 flush. The TPC was within safe limits even after 150 days of storage, and the microbial count was lowest for the salt + turmeric treated samples after 150 days.
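The two physico-chemical measures above have standard definitions in drying studies, though the abstract does not spell out the exact formulas used. A minimal sketch assuming the usual wet-basis moisture content and mass-ratio rehydration ratio, with hypothetical sample masses:

```python
def moisture_content_wb(wet_mass_g: float, dry_mass_g: float) -> float:
    """Moisture content on a wet basis, in percent: 100 * (wet - dry) / wet."""
    return 100.0 * (wet_mass_g - dry_mass_g) / wet_mass_g

def rehydration_ratio(rehydrated_mass_g: float, dehydrated_mass_g: float) -> float:
    """Rehydration ratio: mass after soaking divided by the dehydrated mass."""
    return rehydrated_mass_g / dehydrated_mass_g

# Hypothetical figures for one dried meat sample.
print(round(moisture_content_wb(10.0, 9.35), 2))  # percent moisture
print(round(rehydration_ratio(23.1, 10.0), 2))    # dimensionless ratio
```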

Keywords: ambient temperature, dehydration technique, rehydration ratio, SHMP (sodium hexametaphosphate), HDPE (high-density polyethylene)

Procedia PDF Downloads 403
9803 Normalized Compression Distance Based Scene Alteration Analysis of a Video

Authors: Lakshay Kharbanda, Aabhas Chauhan

Abstract:

In this paper, an application of Normalized Compression Distance (NCD) to detect notable scene alterations occurring in videos is presented. Several research groups have been developing methods to perform image classification using NCD, a computable approximation to the Normalized Information Distance (NID), by studying the degree of similarity between images. The timeframes where significant aberrations between the frames of a video occur are identified by thresholding the NCD value, computed using two compressors (LZMA and BZIP2), and scene alterations are defined using Pixel Difference Percentage metrics.
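NCD has the standard computable form NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)), where C(·) is the compressed length. A minimal sketch using BZIP2 (one of the two compressors named) on illustrative byte strings rather than real video frames:

```python
import bz2

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance with BZIP2 as the compressor:
    (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), C(.) = compressed length."""
    cx = len(bz2.compress(x))
    cy = len(bz2.compress(y))
    cxy = len(bz2.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Near-identical "frames" yield a small NCD; unrelated content yields a larger one.
frame_a = b"sky lawn tree road car " * 50
frame_b = b"sky lawn tree road bus " * 50
frame_c = str(list(range(500))).encode()
print(ncd(frame_a, frame_b) < ncd(frame_a, frame_c))
```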

Keywords: image compression, Kolmogorov complexity, normalized compression distance, root mean square error

Procedia PDF Downloads 325
9802 Living with a Partner with Depression: The Role of Dispositional Empathy in Psychological Resilience

Authors: Elizabeth O'Brien, Raegan Murphy

Abstract:

Research suggests that high levels of empathy in individuals with partners with mental health difficulties can lead to improved outcomes for their partner while compromising their own mental health. Specifically, it is proposed that the affective dimension of empathy diminishes resilience to the distress of a partner, whereas cognitive empathy (CE) enhances it. The relationship between different empathy dimensions and psychological resilience measures has not been investigated in partners of people with depression. Psychological inflexibility (PI) is a construct that can be understood as distress intolerance and is suggested to be an important feature of psychological resilience. The current study, therefore, aimed to investigate the differential role of dispositional empathy dimensions in PI for people living with a partner with depression. A cross-sectional design was employed in which 148 participants living with a partner with depression and 45 participants for a comparison sample were recruited using online platforms. Participants completed online surveys with measures relating to demographics, empathy, and PI. Scores were compared between the study and comparison samples. The study sample scored significantly lower for CE and affective empathy (AE) and significantly higher for PI than the comparison sample. Exploratory and regression analyses were run to examine associations between variables within the study sample. Analyses revealed that CE predicted the resilience outcome whilst AE did not. These results suggest that interventions for partners of people with depression that bolster the CE dimension alone may improve mental health outcomes for both members of the couple relationship.

Keywords: affective empathy, cognitive empathy, depression, partners, psychological inflexibility

Procedia PDF Downloads 118
9801 Application of Regularized Low-Rank Matrix Factorization in Personalized Targeting

Authors: Kourosh Modarresi

Abstract:

The Netflix problem has brought the topic of "Recommendation Systems" into the mainstream of computer science, mathematics, and statistics. Though much progress has been made, the available algorithms do not obtain satisfactory results; their success rate is rarely above 5%. This work is based on the belief that the main challenge is to come up with "scalable personalization" models. This paper uses an adaptive regularization of inverse singular value decomposition (SVD) that applies adaptive penalization to the singular vectors. The results show far better matching for recommender systems when compared to state-of-the-art models in the industry.
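The abstract does not spell out its penalization scheme; one common way to penalize singular components adaptively is per-value soft-thresholding of the singular values, sketched below on a toy ratings matrix. The penalty values and data are illustrative, not the author's method:

```python
import numpy as np

def regularized_low_rank(ratings: np.ndarray, penalties) -> np.ndarray:
    """Low-rank reconstruction via SVD with per-singular-value penalization:
    each singular value s_k is soft-thresholded by its own penalty, so weaker
    (noisier) components are shrunk more aggressively or removed entirely."""
    u, s, vt = np.linalg.svd(ratings, full_matrices=False)
    s_shrunk = np.maximum(s - np.asarray(penalties, dtype=float), 0.0)
    return (u * s_shrunk) @ vt  # scale columns of u by shrunken singular values

# Toy user-item matrix: a strong rank-1 preference pattern plus small noise.
rng = np.random.default_rng(0)
base = np.outer([5.0, 4.0, 1.0, 2.0], [1.0, 1.0, 0.2, 0.8])
noisy = base + 0.1 * rng.standard_normal(base.shape)
# Penalize trailing (noise-dominated) components more heavily than the leading one.
approx = regularized_low_rank(noisy, penalties=[0.0, 1.0, 1.0, 1.0])
print(np.abs(approx - base).mean() < np.abs(noisy - base).mean())
```

Denoising the trailing components brings the reconstruction closer to the underlying preference pattern than the raw noisy matrix.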

Keywords: convex optimization, LASSO, regression, recommender systems, singular value decomposition, low rank approximation

Procedia PDF Downloads 435
9800 Evaluation of Aquifer Protective Capacity and Soil Corrosivity Using Geoelectrical Method

Authors: M. T. Tsepav, Y. Adamu, M. A. Umar

Abstract:

A geoelectric survey was carried out in parts of Angwan Gwari, on the outskirts of Lapai Local Government Area of Niger State, which belongs to the Nigerian Basement Complex, with the aim of evaluating the soil corrosivity, aquifer transmissivity, and protective capacity of the area, from which aquifer characterisation was made. A G41 Resistivity Meter was employed to obtain fifteen Schlumberger Vertical Electrical Sounding (VES) data sets along profiles in a square grid network. The data were processed using Interpex 1-D sounding inversion software, which gives vertical electrical sounding curves with a layered model comprising the apparent resistivities, overburden thicknesses, and depths. This information was used to evaluate the longitudinal conductance and transmissivity of the layers. The results show generally low resistivities across the survey area, with the average longitudinal conductance varying from 0.0237 Siemens in VES 6 to 0.1261 Siemens in VES 15, and almost the entire area giving values of less than 1.0 Siemens. The average transmissivity values range from 96.45 Ω.m2 in VES 4 to 299070 Ω.m2 in VES 1. All but VES 4 and VES 14 had average values greater than 400 Ω.m2. These results suggest that the aquifers are highly permeable to fluid movement, leading to the possibility of enhanced migration and circulation of contaminants in the groundwater system, and that the area is generally corrosive.
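The longitudinal conductance quoted above, and the layer quantity reported in Ω.m2, are the classical Dar Zarrouk parameters of a layered resistivity model. A minimal sketch with hypothetical layer values, not the survey's inverted models:

```python
def dar_zarrouk(resistivities, thicknesses):
    """Dar Zarrouk parameters for a layered earth model:
    longitudinal conductance S = sum(h_i / rho_i)  [Siemens]
    transverse resistance   T = sum(h_i * rho_i)   [ohm.m^2]"""
    S = sum(h / rho for rho, h in zip(resistivities, thicknesses))
    T = sum(h * rho for rho, h in zip(resistivities, thicknesses))
    return S, T

# Hypothetical 3-layer VES model (resistivities in ohm.m, thicknesses in m).
S, T = dar_zarrouk([120.0, 45.0, 300.0], [2.0, 8.0, 15.0])
print(round(S, 4), round(T, 1))
```

A total conductance well below 1.0 Siemens, as found across this survey area, is conventionally read as a weak protective capacity over the aquifer.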

Keywords: geoelectric survey, corrosivity, protective capacity, transmissivity

Procedia PDF Downloads 324