Search results for: finite rate of innovation

3354 Microbial Load of Fecal Material of Broiler Birds Administered with Lagenaria Breviflora Extract

Authors: Adeleye O. O., T. M. Obuotor, A. O. Kolawole, I. O. Opowoye, M. I. Olasoju, L. T. Egbeyale, R. A. Ajadi

Abstract:

This study investigated the effect of Lagenaria breviflora on broiler poultry birds, including its effect on the microbial count of the poultry droppings. A total of 240 day-old broiler chicks were randomly assigned to six groups, with four replicates per group. The first group served as the control, while the remaining groups were given drinking water containing Lagenaria breviflora extract at concentrations of 300 g/L and 500 g/L, administered twice or thrice daily. The microbial load was determined using the plate count method. The results showed that the administration of Lagenaria breviflora in the water of broiler birds significantly improved their growth performance, with average weight gains ranging from 1.845 g to 2.241 g, and the mortality rate was 0%. The study also found that Lagenaria breviflora had a significant effect on the microbial count of the poultry droppings, with colony count values ranging from 3.5 × 10⁻⁷ to 9.9 × 10⁻⁷ CFU/ml; the total coliform count (Escherichia coli and Salmonella sp.) was 1 × 10⁻⁵ CFU/ml. The reduction in microbial counts of the poultry droppings could be attributed to the antimicrobial properties of Lagenaria breviflora, which contains phytochemicals reported to possess antimicrobial activity. Therefore, the inclusion of Lagenaria breviflora in the diets of broiler poultry could be an effective strategy for improving growth performance and immune function and reducing the microbial load of poultry droppings, which can help to mitigate the risk of disease transmission to humans and other animals.
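
As context for the plate count figures above, a minimal sketch of the standard colony-count arithmetic (CFU/ml = colonies × dilution factor / plated volume) follows; the colony number, dilution, and plated volume below are hypothetical and are not taken from the study.

```python
# Minimal sketch of standard plate-count arithmetic (hypothetical numbers,
# not data from the study): CFU/ml = colonies * dilution_factor / plated_volume_ml.
def cfu_per_ml(colonies: int, dilution_factor: float, plated_volume_ml: float) -> float:
    """Convert a colony count on one plate to CFU per ml of the original sample."""
    return colonies * dilution_factor / plated_volume_ml

if __name__ == "__main__":
    # e.g. 35 colonies counted on a plate inoculated with 0.1 ml of a 10^6-fold dilution
    print(f"{cfu_per_ml(colonies=35, dilution_factor=1e6, plated_volume_ml=0.1):.2e} CFU/ml")
```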

Keywords: gut microbes, bacterial count, lagenaria breviflora, coliforms

Procedia PDF Downloads 84
3353 Porous Bluff-Body Disc on Improving the Gas-Mixing Efficiency

Authors: Shun-Chang Yen, You-Lun Peng, Kuo-Ching San

Abstract:

A numerical study of a bluff-body structure with multiple holes was conducted using ANSYS Fluent computational fluid dynamics analysis. The effects of the hole number and jet inclination angle were considered under a fixed gas flow rate with non-reactive gas. The bluff body with multiple holes can transform axial momentum into radial and tangential momentum as well as increase the swirl number (S). The concentration distribution in the mixing of a central carbon dioxide (CO2) jet and an annular air jet was used to analyze the mixing efficiency. Three bluff bodies with differing hole numbers (H = 3, 6, and 12) and three jet inclination angles (θ = 45°, 60°, and 90°) were designed for analysis. The Reynolds normal stress increases with the inclination angle, while the Reynolds shear stress, average turbulence intensity, and average swirl number decrease with the inclination angle. For an unsymmetrical hole configuration (i.e., H = 3), the streamline patterns exhibited an unsymmetrical flow field. The highest mixing efficiency (i.e., the lowest integral gas fraction of CO2) occurred at H = 3. Furthermore, the highest swirl number coincided with the strongest effect on the mass fraction of CO2. Therefore, an unsymmetrical hole arrangement induces a high swirl flow behind the porous disc.

Keywords: bluff body with multiple holes, computational fluid dynamics, swirl-jet flow, mixing efficiency

Procedia PDF Downloads 346
3352 Crowdsourced Economic Valuation of the Recreational Benefits of Constructed Wetlands

Authors: Andrea Ghermandi

Abstract:

Constructed wetlands have long been recognized as sources of ancillary benefits such as support for recreational activities. To date, there is a lack of quantitative understanding of the extent and welfare impact of such benefits. Here, it is shown how geotagged, passively crowdsourced data from online social networks (e.g., Flickr and Panoramio) and Geographic Information Systems (GIS) techniques can: (1) be used to infer annual recreational visits to 273 engineered wetlands worldwide; and (2) be integrated with non-market economic valuation techniques (e.g., travel cost method) to infer the monetary value of recreation in these systems. Counts of social media photo-user-days are highly correlated with the number of observed visits in 62 engineered wetlands worldwide (Pearson’s r = 0.811; p-value < 0.001). The estimated, mean willingness to pay for access to 115 wetlands ranges between $5.3 and $374. In 50% of the investigated wetlands providing polishing treatment to advanced municipal wastewater, the present value of such benefits exceeds that of the capital, operation and maintenance costs (lifetime = 45 years; discount rate = 6%), indicating that such systems are sources of net societal benefits even before factoring in benefits derived from water quality improvement and storage. Based on the above results, it is argued that recreational benefits should be taken into account in the design and management of constructed wetlands, as well as when such green infrastructure systems are compared with conventional wastewater treatment solutions.
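
To illustrate the correlation reported above (Pearson's r = 0.811 between photo-user-days and observed visits), a minimal sketch using hypothetical counts follows; the arrays are illustrative and are not the study's data, and scipy is assumed to be available.

```python
# Minimal sketch: correlating social-media photo-user-days with observed visits
# (hypothetical numbers, not the study's data).
import numpy as np
from scipy.stats import pearsonr

photo_user_days = np.array([12, 45, 3, 88, 20, 150, 7, 60])                  # crowdsourced proxy
observed_visits = np.array([900, 3200, 250, 7100, 1600, 11800, 540, 4300])   # on-site counts

r, p_value = pearsonr(photo_user_days, observed_visits)
print(f"Pearson r = {r:.3f}, p = {p_value:.4f}")
```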

Keywords: constructed wetlands, cultural ecosystem services, ecological engineering, social media

Procedia PDF Downloads 120
3351 Effect of Shrinkage on Heat and Mass Transfer Parameters of Solar Dried Potato Samples of Variable Diameter

Authors: Kshanaprava Dhalsamant, Punyadarshini P. Tripathy, Shanker L. Shrivastava

Abstract:

Potato was chosen as the food product for the natural convection mixed-mode solar drying experiments since it is easily available and globally consumed. The convective heat and mass transfer coefficients, along with the effective diffusivity, were calculated both with and without considering shrinkage for potato cylinders of different geometries (8, 10 and 13 mm diameters and a constant length of 50 mm). The convective heat transfer coefficients (hc) without considering the shrinkage effect were 24.28, 18.69, and 15.89 W/m²·°C, and with the shrinkage effect were 37.81, 29.21, and 25.72 W/m²·°C for the 8, 10 and 13 mm diameter samples, respectively. Similarly, the effective diffusivities (Deff) without considering the shrinkage effect were 3.20 × 10⁻⁹, 4.82 × 10⁻⁹, and 2.48 × 10⁻⁸ m²/s, and with the shrinkage effect were 1.68 × 10⁻⁹, 2.56 × 10⁻⁹, and 1.34 × 10⁻⁸ m²/s; the mass transfer coefficients (hm) without considering the shrinkage effect were 5.16 × 10⁻⁷, 2.93 × 10⁻⁷, and 2.59 × 10⁻⁷ m/s, and with the shrinkage effect were 3.71 × 10⁻⁷, 2.04 × 10⁻⁷, and 1.80 × 10⁻⁷ m/s for the 8, 10 and 13 mm diameter samples, respectively. Higher values of hc were obtained when the shrinkage effect was considered for all sample diameters, because shrinkage reduces the diameter with time, resulting in an enhanced rate of water loss. The average values of Deff determined without considering the shrinkage effect were found to be almost double those obtained with the shrinkage effect. The reduction in hm values with increasing sample diameter is due to the fact that the exposed surface area per unit mass decreases, resulting in slower moisture removal. It is worth noting that considering the shrinkage effect increased the hc values by 55.72-61.86%, while neglecting the shrinkage effect in the mass transfer analysis overestimated the values of Deff and hm by 85.02-90.27% and 39.11-45.11%, respectively, for the range of sample diameters investigated in the present study.
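
The percentage changes quoted at the end of the abstract can be approximately reproduced directly from the coefficients reported there; the short sketch below does exactly that for the 8, 10 and 13 mm samples (the numerical values are copied from the abstract, and small differences from the quoted ranges come from rounding in the reported figures).

```python
# Approximately reproducing the percentage differences quoted in the abstract
# from the reported with/without-shrinkage coefficients (values copied from the text).
import numpy as np

hc_no_shrink = np.array([24.28, 18.69, 15.89])        # W/m^2.C, 8/10/13 mm samples
hc_shrink    = np.array([37.81, 29.21, 25.72])
Deff_no      = np.array([3.20e-9, 4.82e-9, 2.48e-8])  # m^2/s
Deff_shrink  = np.array([1.68e-9, 2.56e-9, 1.34e-8])
hm_no        = np.array([5.16e-7, 2.93e-7, 2.59e-7])  # m/s
hm_shrink    = np.array([3.71e-7, 2.04e-7, 1.80e-7])

print("hc increase with shrinkage (%):", (hc_shrink / hc_no_shrink - 1) * 100)
print("Deff overestimation without shrinkage (%):", (Deff_no / Deff_shrink - 1) * 100)
print("hm overestimation without shrinkage (%):", (hm_no / hm_shrink - 1) * 100)
```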

Keywords: shrinkage, convective heat transfer coefficient, effective diffusivity, convective mass transfer coefficient

Procedia PDF Downloads 245
3350 Continuous Plug Flow and Discrete Particle Phase Coupling Using Triangular Parcels

Authors: Anders Schou Simonsen, Thomas Condra, Kim Sørensen

Abstract:

Various processes are modelled using a discrete phase, in which particles are seeded from a source. Such particles can represent liquid water droplets, which affect the continuous phase by exchanging thermal energy, momentum, species, etc. Discrete phases are typically modelled using parcels, each of which represents a collection of particles sharing properties such as temperature, velocity, etc. When coupling the phases, the exchange rates are integrated over the cell in which the parcel is located, which can cause spikes and fluctuating exchange rates. This paper presents an alternative method of coupling a discrete phase and a continuous plug flow phase. This is done using triangular parcels, which span between nodes following the dynamics of single droplets; the triangular parcels are thus propagated using their corner nodes. At each time step, the exchange rates are spatially integrated over the surface of the triangular parcels, which yields a smooth, continuous exchange rate to the continuous phase. The results show that the method is more stable, converges slightly faster, and yields smoother exchange rates than the stream tube approach. However, the computational requirements are about five times greater, so the applicability of the alternative method should be limited to processes where the exchange rates are important. The overall balances of the exchanged properties did not change significantly with the new approach.
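
As an illustration of the spatial integration step described above, a minimal sketch follows that integrates a scalar exchange-rate field over one triangular parcel using the three-point edge-midpoint quadrature rule (exact for fields up to quadratic); the parcel coordinates and the field are hypothetical, and this is not the authors' implementation.

```python
# Minimal sketch: integrate a scalar exchange-rate field over a triangular parcel
# using the 3-point edge-midpoint quadrature rule (exact for quadratic fields).
# The corner coordinates and the field below are hypothetical.
import numpy as np

def integrate_over_triangle(corners, field):
    """corners: (3, 2) array of parcel node coordinates; field: f(x, y) -> local exchange rate."""
    a, b, c = corners
    # Triangle area from the 2-D determinant of the edge vectors.
    area = 0.5 * abs((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))
    midpoints = [(a + b) / 2, (b + c) / 2, (c + a) / 2]
    return area * sum(field(*m) for m in midpoints) / 3.0

corners = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
rate = lambda x, y: 2.0 + 3.0 * x + 4.0 * y   # e.g. a linearly varying evaporation rate
print(integrate_over_triangle(corners, rate))  # exact result for this field: 2.1667
```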

Keywords: CFD, coupling, discrete phase, parcel

Procedia PDF Downloads 255
3349 Using ANN in Emergency Reconstruction Projects Post Disaster

Authors: Rasha Waheeb, Bjorn Andersen, Rafa Shakir

Abstract:

Purpose: The purpose of this study is to help avoid the delays that occur in emergency reconstruction projects, especially in post-disaster circumstances, whether natural or man-made, given their particular national and humanitarian importance. We present theoretical and practical concepts for project management in the construction industry that draw on a range of global and local experiences. The study aims to identify the factors that effectively cause delay in construction projects in Iraq, affecting time, cost, and quality, and to find the best solutions to address these delays by setting parameters that restore balance. Thirty projects from different areas of construction were selected as the sample for this study. Design/methodology/approach: This study discusses reconstruction strategies and the delays in time and cost caused by different delay factors in selected projects in Iraq, with Baghdad as a case study. A case study approach was adopted, with thirty construction projects of different types and sizes selected from the Baghdad region. Participants from the case projects provided data through a survey instrument. A mixed-methods approach was applied. Mathematical data analysis was used to construct models that predict delay in the time and cost of projects before they start, with artificial neural network (ANN) analysis selected as the mathematical approach. These models are mainly intended to help decision makers in construction projects find solutions to delays before they cause any inefficiency in the project being implemented, and to remove obstacles thoroughly so as to develop this industry in Iraq. The models were built using the data collected through the survey and questionnaire. Findings: The most important delay factors identified as leading to schedule overruns were contractor failure, redesign of plans and change orders, security issues, selection of low-price bids, weather factors, and owner failures. Some of these are quite in line with findings from similar studies in other countries and regions, but some, such as security issues and low-price bid selection, are unique to the Iraqi project sample. Originality/value: ANN analysis was selected because ANNs have rarely been used in project management and have never been used in Iraq to find solutions to problems in the construction industry. This methodology can also be used for complicated problems for which there is no straightforward interpretation or solution; in some cases statistical analysis was conducted, and where the problem did not follow a linear equation or the correlation was weak, ANNs were used to find the nonlinear relationship between the input and output data, which proved very supportive.
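
As a minimal illustration of the ANN approach described above (not the authors' actual model, factor set, or survey data), the sketch below trains a small multilayer-perceptron regressor on synthetic project records to predict a time-overrun percentage from numeric delay-factor scores; scikit-learn is assumed, and all variable names and weights are illustrative.

```python
# Minimal sketch of an ANN delay-prediction model (synthetic data, not the study's
# survey records): a small MLP maps delay-factor scores to a time-overrun percentage.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 200
# Hypothetical factor scores (1-5): contractor failure, redesign/change orders,
# security issues, low-price bid selection, weather, owner failure.
X = rng.integers(1, 6, size=(n, 6)).astype(float)
# Synthetic ground truth: overrun grows with the factor scores, plus noise.
y = X @ np.array([8.0, 6.0, 10.0, 5.0, 3.0, 4.0]) + rng.normal(0, 5, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=3000, random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out projects:", round(model.score(X_test, y_test), 3))
```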

Keywords: construction projects, delay factors, emergency reconstruction, innovation ANN, post disasters, project management

Procedia PDF Downloads 154
3348 Application of Industrial Ecology to the INSPIRA Zone: Territory Planification and New Activities

Authors: Mary Hanhoun, Jilla Bamarni, Anne-Sophie Bougard

Abstract:

INSPIR’ECO is an 18-month research and innovation project that aims to specify and develop a tool offering new services to industrial operators and territorial planners/managers based on industrial ecology principles. The project is carried out on the territory of Salaise-Sablons, and the services are designed to be deployable on other territories. The Salaise-Sablons area is located at the boundary of five departments on a major European economic axis with multimodal traffic (river, rail, and road). The 330 ha perimeter includes 90 hectares occupied by 20 companies, with a total of 900 jobs, and represents a significant potential basin of development. The project involves five multidisciplinary partners (Syndicat Mixte INSPIRA, ENGIE, IDEEL, IDEAs Laboratory, and TREDI). INSPIR’ECO is based on the premise that local stakeholders need services to pool and share their activities, equipment, purchases, and materials. These services aim to (1) initiate and promote exchanges between existing companies and (2) identify synergies between pre-existing industries and future companies that could be established in INSPIRA. These eco-industrial synergies can relate to: the recovery/exchange of industrial flows (industrial wastewater, waste, by-products, etc.); the pooling of business services (collective waste management, stormwater collection and reuse, transport, etc.); the sharing of equipment (boilers, steam production, wastewater treatment units, etc.) or resources (splitting job costs, etc.); and the creation of new activities (interface activities necessary for by-product recovery, development of products or services from a newly identified resource, etc.). These services rely on an IT tool used by the interested local stakeholders and intended to support their decisions. The IT tool includes an economic and environmental assessment of each implantation or pooling/sharing scenario for existing or future industries; it is meant for industrial and territorial managers/planners; and it is designed to be used for each new industrial project. The specification of the IT tool is made through an agile process throughout the INSPIR’ECO project, fed with users' expectations gathered in workshop sessions where mock-up interfaces are displayed, and with data availability based on a local and industrial data inventory. These inputs allow the tool to be specified not only with technical and methodological constraints (notably those of the economic and environmental assessments) but also with data availability and users' expectations. A review of innovative resource management initiatives in port areas was carried out at the beginning of the project to inform the service design step.

Keywords: development opportunities, INSPIR’ECO, INSPIRA, industrial ecology, planification, synergy identification

Procedia PDF Downloads 153
3347 From Design, Experience and Play Framework to Common Design Thinking Tools: Using Serious Modern Board Games

Authors: Micael Sousa

Abstract:

Board games (BGs) are thriving as new designs emerge from the hobby community and reach greater audiences all around the world. Although digital games gather most of the attention in game studies and serious games research, the post-digital movement helps to explain why, in a world dominated by digital technologies, analog experiences remain unique and irreplaceable to users, allowing innovation in new hybrid environments. The new BG designs are part of these post-digital and hybrid movements because they result from the use of powerful digital tools that enable production and knowledge sharing about BGs and their unique face-to-face social experiences. These new BGs, defined as modern by many authors, provide innovative designs and unique game mechanics that are not yet fully explored by the main serious games (SG) approaches. Even the most established frameworks settled to address SGs, as fun games implemented to achieve predefined goals, need more development, especially when considering modern BGs. Despite many anecdotal perceptions, researchers are only now starting to rediscover BGs and demonstrate their potential. They are proving that BGs are easy to adapt and to grasp by non-expert players in experimental approaches, with the possibility of easy-going adaptation to players' profiles and serious objectives even during gameplay. Although there are many design thinking (DT) models and practices, their relations with SG frameworks are also underdeveloped, mostly because this is a new research field, lacking theoretical development and the systematization of experimental practices. Using BGs as case studies promises to help develop these frameworks. Departing from the Design, Experience, and Play (DPE) framework and considering the Common Design Thinking Tools (CDST), this paper proposes a new experimental framework for the adaptation and development of modern BG design for DT: the Design, Experience, and Play for Think (DPET) experimental framework. This is done through the systematization of the DPE and CDST approaches applied in two case studies, where two different sequences of adapted BGs were employed to establish a collaborative DT process. The two sessions occurred with different participants, in different contexts, and with different sequences of games for the same DT approach. The first session took place at the Faculty of Economics at the University of Coimbra in a training session on serious games for project development. The second session took place in the Casa do Impacto through The Great Village Design Jam light. Both sessions had the same duration and were designed to progressively achieve DT goals, using BGs as SGs in a collaborative process. The results from the sessions show that a sequence of BGs, when properly adapted to address the DPET framework, can generate a viable and innovative process of collaborative DT that is productive, fun, and engaging. The proposed DPET framework intends to help establish how new SG solutions could be defined for new goals through flexible DT. Applications in other areas of research and development can also benefit from these findings.

Keywords: board games, design thinking, methodology, serious games

Procedia PDF Downloads 99
3346 Performance Evaluation of Wideband Code Division Multiple Access Network

Authors: Osama Abdallah Mohammed Enan, Amin Babiker A/Nabi Mustafa

Abstract:

The aim of this study is to evaluate and analyze different parameters of WCDMA (Wideband Code Division Multiple Access). The study also incorporates a brief yet thorough analysis of WCDMA's components as well as its internal architecture, and examines the different power controls: open loop power control, closed (inner) loop power control, and outer loop power control. The different handover techniques of WCDMA are also illustrated, including hard handover, inter-system handover, and soft and softer handover, and the different duplexing techniques are described. The study also presents the WCDMA parameters that lead the system towards QoS issues, which may help the operator in designing and developing an adequate network configuration. In addition, the study investigates various parameters including the bit energy to noise spectral density ratio (Eb/N0), noise rise, and bit error rate (BER). After simulating these parameters in the MATLAB environment, it was found that, for a given Eb/N0 value, the system capacity increases with increasing reuse factor. It was also observed that the noise rise decreases for lower data rates and for lower interference levels. Finally, it was found that the BER obtained with one type of modulation technique is higher than that obtained with another.
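
As a small generic illustration of the BER behaviour discussed above (not the authors' MATLAB simulation), the sketch below evaluates the textbook AWGN bit-error-rate expressions for BPSK/QPSK and an approximate Gray-coded 16-QAM expression over a range of Eb/N0 values.

```python
# Minimal sketch (not the study's simulation): textbook AWGN bit-error-rate curves
# for BPSK/QPSK and an approximate Gray-coded 16-QAM expression.
import numpy as np
from scipy.special import erfc

ebno_db = np.arange(0, 11, 2)                 # Eb/N0 in dB
g = 10 ** (ebno_db / 10)                      # linear Eb/N0

ber_bpsk_qpsk = 0.5 * erfc(np.sqrt(g))        # exact for BPSK and Gray-coded QPSK
ber_16qam = 0.375 * erfc(np.sqrt(0.4 * g))    # standard approximation for 16-QAM

for db, b1, b2 in zip(ebno_db, ber_bpsk_qpsk, ber_16qam):
    print(f"Eb/N0 = {db:2d} dB   BPSK/QPSK BER = {b1:.2e}   16-QAM BER = {b2:.2e}")
```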

Keywords: duplexing, handover, loop power control, WCDMA

Procedia PDF Downloads 203
3345 Establishing Econometric Modeling Equations for Lumpy Skin Disease Outbreaks in the Nile Delta of Egypt under Current Climate Conditions

Authors: Abdelgawad, Salah El-Tahawy

Abstract:

This paper aimed to establish econometric equation models for the Nile Delta region in Egypt, which will provide a basis for future predictions of lumpy skin disease (LSD) outbreaks and their pathway in relation to climate change. Data on LSD outbreaks were collected from cattle farms located in the provinces representing the Nile Delta region from 1 January 2015 to December 2015. The results indicated that there was a significant association between the degree of LSD outbreaks and the investigated climate factors (temperature, wind speed, and humidity); the outbreaks peaked during June, July, and August and gradually decreased to the lowest rates in January, February, and December. The model obtained showed that increases in these climate factors were associated with an evident increase in LSD outbreaks in the Nile Delta of Egypt. The model validation was carried out using the root mean square error (RMSE) and mean bias (MB), which compare the number of LSD outbreaks expected with the number of observed outbreaks and estimate the confidence level of the model. The RMSE was 1.38% and the MB was 99.50%, confirming that the established model describes the current association between LSD outbreaks and changes in climate factors and can also be used as a basis for predicting LSD outbreaks under future climate change.
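
The validation statistics mentioned above (RMSE and mean bias between observed and predicted outbreak counts) can be computed as in the minimal sketch below; the monthly counts shown are hypothetical, and the paper's exact MB definition may differ from the simple form used here.

```python
# Minimal sketch of the validation statistics (hypothetical monthly outbreak counts;
# the paper's exact MB definition may differ from the simple forms used here).
import numpy as np

observed  = np.array([2, 1, 3, 8, 14, 22, 27, 25, 15, 6, 3, 1])   # Jan..Dec, hypothetical
predicted = np.array([3, 2, 4, 7, 13, 23, 26, 24, 14, 7, 2, 2])

rmse = np.sqrt(np.mean((predicted - observed) ** 2))
mean_bias = np.mean(predicted - observed)

print(f"RMSE = {rmse:.2f} outbreaks, mean bias = {mean_bias:.2f} outbreaks")
```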

Keywords: LSD, climate factors, Nile delta, modeling

Procedia PDF Downloads 279
3344 Photocatalytic Degradation of Phenolic Compounds in Wastewater Using Magnetically Recoverable Catalyst

Authors: Ahmed K. Sharaby, Ahmed S. El-Gendy

Abstract:

Phenolic compounds (PCs) exist in the wastewater effluents of some industries such as oil refining, pharmaceuticals, and cosmetics. Phenolic compounds are extremely hazardous pollutants that can cause severe problems to aquatic life and human beings if disposed of without treatment. One of the most efficient treatment methods for PCs is photocatalytic degradation. The current work studies the performance of a composite nanomaterial of titanium dioxide and magnetite as a photocatalyst in the degradation of PCs. It aims at optimizing the synthesized photocatalyst dosage and contact time as operational parameters at different initial PC concentrations and pH values in the wastewater. The study was performed in a lab-scale batch reactor under fixed conditions of light intensity and aeration rate. The initial PC concentrations and pH values were in the ranges of 10-200 mg/L and 3-9, respectively. The results indicate that the catalyst dosage and contact time required for total mineralization are proportional to the initial PC concentration, while the optimum pH for highly efficient degradation is pH 3. Increasing the catalyst concentration beyond certain limits leads to a decrease in the degradation efficiency due to the dissipation of light. The performance of the catalyst was also compared with that of pure TiO2 Degussa (P-25): the dosage of the synthesized catalyst required for photocatalytic degradation was approximately 1.5 times that needed for the pure titania.

Keywords: industrial, optimization, phenolic compounds, photocatalysis, wastewater

Procedia PDF Downloads 308
3343 Long-Term Structural Behavior of Resilient Materials for Reduction of Floor Impact Sound

Authors: Jung-Yoon Lee, Jongmun Kim, Hyo-Jun Chang, Jung-Min Kim

Abstract:

The tendency of people to live in apartment houses is increasing in densely populated countries. However, some residents of apartment houses are bothered by noise coming from the units above. In order to reduce noise pollution, communities are increasingly imposing bylaws covering the limitation of floor impact sound, the minimum thickness of floors, and floor soundproofing solutions. This research focused on the long-time deflection of resilient materials in the floor sound insulation systems of apartment houses. The experimental program consisted of testing nine floor sound insulation specimens subjected to a sustained load for 45 days. Two main parameters were considered in the experimental investigation: three types of resilient materials and the magnitude of the load. The test results indicated that the structural behavior of the floor sound insulation systems under long-time load was quite different from that of the systems under short-time load. The loading period increased the deflection of the floor sound insulation systems, and the rate of increase of the long-time deflection of the systems with ethylene vinyl acetate was smaller than that of the systems with low-density ethylene polystyrene.

Keywords: resilient materials, floor sound insulation systems, long-time deflection, sustained load, noise pollution

Procedia PDF Downloads 259
3342 Creation and Evaluation of an Academic Blog of Tools for the Self-Correction of Written Production in English

Authors: Brady, Imelda Katherine, Da Cunha Fanego, Iria

Abstract:

Today's university students are considered digital natives, and the use of Information Technologies (ITs) forms a large part of their study and learning. In the context of language studies, applications that help with the revision of grammar or vocabulary are particularly useful, especially if they are open access. Studies show the effectiveness of this type of application in the learning of English as a foreign language and that using IT can help learners become more autonomous in foreign language acquisition, given that these applications can enhance awareness of the learning process; this means that learners are less dependent on the teacher for corrective feedback. We also propose that the exploitation of these technologies enhances the work of the language instructor wishing to incorporate IT into his or her practice. In this context, the aim of this paper is to present the creation of a repository of tools that provide support in the writing and correction of texts in English and the assessment of their usefulness by university students enrolled in the English Studies degree. The project seeks to encourage the development of autonomous learning through the acquisition of skills linked to the self-correction of written work in English. To this end, our methodology follows five phases. First, a selection is made of the main open-access online applications available for the correction of written texts in English: AutoCrit, Hemingway, Grammarly, LanguageTool, OutWrite, PaperRater, ProWritingAid, Reverso, Slick Write, Spell Check Plus and Virtual Writing Tutor. Second, the functionalities of each of these tools (spelling, grammar, style correction, etc.) are analyzed. Third, explanatory materials (texts and video tutorials) are prepared for each tool. Fourth, these materials are uploaded to a repository of our university in the form of an institutional blog, which is made available to students and the general public. Finally, a survey was designed to collect students' feedback on the usefulness of the blog, the quality of the explanatory materials, and the degree of usefulness that students assigned to each of the tools offered. In this paper, we present the results of the analysis of data received from 33 students in the first semester of the 2021-22 academic year. One result we highlight is that the students rated this resource very highly and provided very valuable information on the perceived usefulness of the applications made available to them. Our work, carried out within the framework of a teaching innovation project funded by our university, emphasizes that teachers need to design methodological strategies that help their students improve the quality of their written production in English and, by extension, their linguistic competence.

Keywords: academic blog, open access tools, online self-correction, written production in English, university learning

Procedia PDF Downloads 91
3341 Efficient Liquid Desiccant Regeneration for Fresh Air Dehumidification Application

Authors: M. V. Rane, Tareke Tekia

Abstract:

A fresh air dehumidifier with a capacity of 1 TR has been developed by the Heat Pump Laboratory at IITB. This fresh air dehumidifier is based on a potassium formate liquid desiccant. The regeneration of the liquid desiccant can be done in two stages: the first stage involves boiling the liquid desiccant inside evacuated glass type solar thermal collectors, and further regeneration can be achieved using a low temperature regenerator (LTR). The coefficient of performance of the fresh air dehumidifier depends greatly on the performance of the major components, namely the high temperature regenerator, low temperature regenerator, fresh air dehumidifier, and solution heat exchangers. A high-effectiveness solution heat exchanger has been developed and tested. The solution heat exchanger is based on a patented aluminium extrusion with a special passage geometry to enhance the heat transfer rate, and effectiveness of up to 90% was achieved. Before the final testing of the dehumidifier, the major components were tested individually. Testing of the solar thermal collector as a hot water and steam generator revealed that efficiencies of up to 55% can be achieved. In this paper, the development of the 1 TR fresh air dehumidifier is presented, with special focus on the performance of the solution heat exchangers and the solar thermal collector.
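
For reference, heat exchanger effectiveness of the kind quoted above is conventionally defined as the actual heat duty divided by the thermodynamic maximum; a minimal sketch with hypothetical inlet/outlet temperatures and capacity rates follows (not the laboratory's measured data).

```python
# Minimal sketch of heat-exchanger effectiveness, epsilon = Q_actual / Q_max
# (hypothetical temperatures and capacity rates, not the measured IITB data).
def effectiveness(C_hot, C_cold, T_hot_in, T_hot_out, T_cold_in):
    """C_* are capacity rates (mass flow * specific heat, W/K); temperatures in deg C."""
    q_actual = C_hot * (T_hot_in - T_hot_out)
    q_max = min(C_hot, C_cold) * (T_hot_in - T_cold_in)
    return q_actual / q_max

# e.g. hot desiccant cooled from 90 to 49 deg C against cold desiccant entering at 45 deg C
print(round(effectiveness(C_hot=500.0, C_cold=520.0,
                          T_hot_in=90.0, T_hot_out=49.0, T_cold_in=45.0), 3))
```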

Keywords: solar, liquid desiccant, dehumidification, air conditioning, regeneration, coefficient of performance

Procedia PDF Downloads 181
3340 Balanced Scorecard as a Tool to Improve NAAC Accreditation – A Case Study in Indian Higher Education

Authors: CA Kishore S. Peshori

Abstract:

Introduction: India, a country with vast diversity and a huge population, is set to have the largest young population in the world by 2020. Higher education has always been, and will always be, the basic requirement for turning a developing nation into a developed one. To improve any system, it needs to be benchmarked, and various tools exist for benchmarking systems. Education in India is delivered by universities, which are mainly funded by the government; these universities in turn set up colleges, which are again mainly government funded. Recently, however, autonomy has also been given to universities and colleges, and foreign universities are waiting to enter India. With a large number of universities and colleges, it has become more and more necessary to measure these institutes for benchmarking. In India, college assessment has been made compulsory by the UGC, and NAAC has been officially recognised as the accreditation body. The NAAC assessment is based on seven criteria, namely: 1. curricular aspects; 2. teaching, learning and evaluation; 3. research, consultancy and extension; 4. infrastructure and learning resources; 5. student support and progression; 6. governance, leadership and management; 7. innovation and best practices. NAAC tries to benchmark the institution for the identification, sustainability, dissemination and adaptation of best practices. It grades the institution according to these seven criteria, and the funding of the institution is based on these grades. Many colleges are struggling to obtain the best grades but have not come across a systematic tool for achieving this result. The Balanced Scorecard (BSC), developed by Kaplan, has been a successful tool for corporates to develop best practices so as to increase their financial performance and also to retain and increase their customers, thereby growing the organization to the next level. It is time to test this tool for an educational institute. Methodology: The paper develops a prototype for a college based on secondary data. Once the prototype is developed, the researcher will, through a questionnaire, test this tool for successful implementation. The success of this research will depend on the implementation of the BSC in an institute and on its grading improving as a result. The limitation of time is a major constraint in this research, as the NAAC cycle takes a minimum of four years for accreditation and reaccreditation; the methodology will therefore limit itself to secondary data and to a questionnaire circulated to colleges along with the prototype BSC model. Conclusion: The BSC is a successful tool for enhancing the growth of an organization, and educational institutes are no exception; the BSC only has to be realigned to suit the NAAC criteria. Once this prototype is developed, its success can be tested only through implementation, but this research paper is the first step towards developing the tool, and it also initiates the process by developing a questionnaire and evaluating the responses for moving to the next level of actual implementation.

Keywords: balanced scorecard, benchmarking, NAAC, UGC

Procedia PDF Downloads 263
3339 The Occurrence of Depression with Chronic Liver Disease

Authors: Roop Kiran, Muhammad Shoaib Zafar, Nazish Idrees Chaudhary

Abstract:

Depression is known to be the second most frequently occurring comorbid mental illness among patients suffering from chronic physical conditions. Worldwide, depression is one of the dominant comorbidities associated with chronic liver disease (CLD). This evidence draws attention to research on the various predictors of short life expectancy and poor quality of life in patients suffering from depression comorbid with CLD. The objectives of this study were to (i) measure the occurrence rate of comorbid depression among patients with CLD and (ii) determine the frequency of risk factors in patients with and without depression comorbid with CLD. This is a quantitative study with a cross-sectional design. The research data were collected using the Hamilton Depression Rating Scale (HDRS) together with a demographic proforma from 100 patients with CLD diagnosed within the last four years who visited the Department of Psychiatry at Mayo Hospital Lahore for consultation. Of the patients with CLD, 42% had comorbid depression. Between the depressed and non-depressed patients, significant differences (p < 0.05) were found for unemployment (25, 59.5% vs. 20, 34.5%), comorbidity (25, 59.5% vs. 18, 31.0%), illiteracy (18, 42.9% vs. 13, 22.4%), a history of CLD of more than two years (41, 97.6% vs. 47, 81.0%), and severity of CLD (26, 61.9% vs. 20, 34.5%). This shows that depression frequently occurs among patients with CLD. The study recommends considerable attention to planning preventative measures in the future and developing intervention protocols that address the risk factors that significantly influence depression comorbid with CLD.

Keywords: psychiatry, comorbid, health, quality of life

Procedia PDF Downloads 186
3338 Causes Analysis of Vacuum Consolidation Failure to Soft Foundation Filled by Newly Dredged Mud

Authors: Bao Shu-Feng, Lou Yan, Dong Zhi-Liang, Mo Hai-Hong, Chen Ping-Shan

Abstract:

For a soft foundation filled with newly dredged mud and improved by the vacuum preloading technology (VPT), the soil strength increased only slightly, the effective improvement depth was small, and the ground bearing capacity remained low. To analyze the causes in depth, several comparative single-well model experiments of VPT were conducted in the laboratory. It was concluded that: (1) serious clogging and poor drainage performance in the vertical drains were mainly caused by the high content of fine soil particles and strongly hydrophilic minerals in the dredged mud, by too fast a loading rate at the early stage of vacuum preloading (namely, rapidly reaching -80 kPa), and by the too-small characteristic opening size of the filter of the existing vertical drains; (2) the drainage efficiency of the drainage system was commonly reduced, which in turn weakened the vacuum pressure in the soil and the soil improvement effect, by the large partial loss and friction loss of vacuum pressure caused by the large curvature of the vertical drains and the large transfer resistance of vacuum pressure in the horizontal drain.

Keywords: newly dredged mud, single well model experiments of vacuum preloading technology, poor drainage performance of vertical drains, poor soil improvement effect, causes analysis

Procedia PDF Downloads 274
3337 Hydrogen Sulfide Removal from Biogas Using Biofilm on Packed Bed of Salak Fruit Seeds

Authors: Retno A. S. Lestari, Wahyudi B. Sediawan, Siti Syamsiah, Sarto

Abstract:

Sulfur-oxidizing bacteria were isolated and then grown on snakefruit seeds, forming a biofilm, and their performance in sulfide removal was observed experimentally. The snakefruit seeds were then used as the packing material in a cylindrical tube, and the biological treatment of hydrogen sulfide from biogas was investigated using the biofilm on this packed bed. Biogas containing 27.9512 ppm of hydrogen sulfide was passed through the bed, and the hydrogen sulfide concentrations in the outlet at various times were analyzed. A set of simple kinetic models for the rate of sulfide removal and for bacterial growth was proposed. The axial sulfide concentration gradient in the flowing liquid is assumed to be at steady state, while the biofilm grows on the surface of the seeds and the oxidation takes place within the biofilm. Since the biofilm is very thin, the sulfide concentration in the biofilm is assumed to be uniform. The simultaneous ordinary differential equations obtained were then solved numerically using the Runge-Kutta method. The accuracy of the proposed model was tested by comparing the calculated results with the experimental data, and it turned out that the model can be applied to describe the removal of sulfide using a biofilter in a packed bed. The values of the parameters were also obtained by curve fitting. The biofilter removed 89.83% of the inlet hydrogen sulfide from the biogas over 2.5 h, at an optimum loading of 8.33 ml/h.
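
To illustrate the numerical treatment mentioned above (simultaneous ordinary differential equations solved with a Runge-Kutta scheme), the sketch below integrates a generic substrate-biomass pair with Monod-type kinetics using scipy's RK45 integrator; the equations and parameter values are illustrative placeholders, not the model or parameters fitted in the study.

```python
# Illustrative only: a generic sulfide/biofilm-biomass ODE pair with Monod-type
# kinetics, integrated with an explicit Runge-Kutta scheme (RK45). The equations
# and parameter values are placeholders, not the model fitted in the study.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, Ks, Y, kd = 0.45, 5.0, 0.6, 0.02   # 1/h, mg/L, biomass yield, decay rate 1/h

def rates(t, state):
    S, X = state                            # sulfide concentration, biofilm biomass
    growth = mu_max * S / (Ks + S) * X      # Monod growth rate
    dS = -growth / Y                        # sulfide consumed by oxidation
    dX = growth - kd * X                    # biomass growth minus decay
    return [dS, dX]

sol = solve_ivp(rates, t_span=(0.0, 2.5), y0=[27.95, 1.0], method="RK45",
                t_eval=np.linspace(0.0, 2.5, 6))
removal = 1 - sol.y[0, -1] / sol.y[0, 0]
print(f"sulfide removal after 2.5 h (illustrative parameters): {removal:.1%}")
```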

Keywords: sulfur-oxidizing bacteria, snakefruit seeds, biofilm, packing material, biogas

Procedia PDF Downloads 398
3336 Optimisation of Energy Harvesting for a Composite Aircraft Wing Structure Bonded with Discrete Macro Fibre Composite Sensors

Authors: Ali H. Daraji, Ye Jianqiao

Abstract:

The microelectronic devices of wireless sensor networks are continuously being developed and have become very small and compact, with low electric power requirements supplied by conventional batteries of limited life. The low power requirements of these devices and the cost of conventional batteries and their replacement have encouraged researchers to find an alternative power supply in the form of an energy harvesting system that can provide electric power over an unlimited lifetime. In the last few years, research on energy harvesting for structural health monitoring has increasingly addressed powering wireless sensor networks by converting waste mechanical vibration into electricity using piezoelectric sensors. Optimisation of energy harvesting is an important research topic for ensuring an efficient flow of electric power from structural vibration. The harvested power mainly depends on the properties of the piezoelectric material, the dimensions of the piezoelectric sensor, its position on the structure, and the value of the external electric load connected between the sensor electrodes. A larger sensor surface area does not guarantee larger harvested power when the sensor area covers positive and negative mechanical strain at the same time, as this leads to a reduction or cancellation of the piezoelectric output power. Optimisation of energy harvesting is therefore achieved by locating these sensors precisely and efficiently on the structure. Limited published work has investigated energy harvesting for aircraft wings, and most published studies have simplified the aircraft wing structure to a cantilever flat plate or beam. In these studies, the optimisation of energy harvesting was investigated by determining the optimal value of an external electric load connected between the sensor electrode terminals, by an external electric circuit, or by randomly splitting the piezoelectric sensor into two segments. However, aircraft wing structures are more complex than beams or flat plates, being mostly constructed from flat and curved skins stiffened by stringers and ribs, with more complex mechanical strain induced on the wing surfaces. In this work, an aircraft wing structure bonded with discrete macro fibre composite sensors was modelled using multiphysics finite elements to optimise the energy harvesting by determining the optimal number of sensors, their locations, and the output resistance load. The optimal number and locations of macro fibre composite sensors were determined based on the maximisation of the open- and closed-loop sensor output voltage using frequency response analysis. Different optimal distributions, locations, and numbers of sensors were found for the top and bottom surfaces of the aircraft wing.

Keywords: energy harvesting, optimisation, sensor, wing

Procedia PDF Downloads 293
3335 Myanmar Character Recognition Using Eight Direction Chain Code Frequency Features

Authors: Kyi Pyar Zaw, Zin Mar Kyu

Abstract:

Character recognition is the process of converting a text image file into an editable and searchable text file. Feature extraction is the heart of any character recognition system, and the recognition rate may be low or high depending on the extracted features. In this paper, 25 features per character are used for recognition. Basically, there are three steps in character recognition: character segmentation, feature extraction, and classification. In the segmentation step, a horizontal cropping method is used for line segmentation and a vertical cropping method is used for character segmentation. In the feature extraction step, features are extracted in two ways. In the first, 8 features are extracted from the entire input character using eight-direction chain code frequency extraction. In the second, the input character is divided into 16 blocks; for each block, 8 feature values are obtained through the eight-direction chain code frequency extraction method, and the sum of these 8 values is defined as one feature for that block, so that 16 features are extracted from the 16 blocks. The number-of-holes feature is used to cluster similar characters. Almost all common Myanmar characters with various font sizes can be recognised using these features. All 25 features are used in both the training part and the testing part. In the classification step, characters are classified by matching all the features of the input character with the already trained features of the characters.
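
A minimal sketch of the eight-direction chain code frequency idea described above follows; it is simplified (it codes an already ordered boundary point sequence rather than performing full contour tracing or the 16-block division), and the sample boundary is hypothetical.

```python
# Minimal sketch of 8-direction chain-code frequency features: given an ordered list of
# boundary pixels, code each step as one of 8 directions and build a frequency histogram.
# (Simplified: no contour tracing or 16-block division; the sample boundary is hypothetical.)
import numpy as np

# Direction codes: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE (image row axis points down).
DIRS = {(0, 1): 0, (-1, 1): 1, (-1, 0): 2, (-1, -1): 3,
        (0, -1): 4, (1, -1): 5, (1, 0): 6, (1, 1): 7}

def chain_code_histogram(boundary):
    """boundary: ordered (row, col) pixels of a closed contour -> normalized 8-bin histogram."""
    counts = np.zeros(8)
    for (r0, c0), (r1, c1) in zip(boundary, boundary[1:] + boundary[:1]):
        counts[DIRS[(r1 - r0, c1 - c0)]] += 1
    return counts / counts.sum()

square = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]  # 3x3 square outline
print(chain_code_histogram(square))   # equal E/N/W/S frequencies for a square outline
```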

Keywords: chain code frequency, character recognition, feature extraction, features matching, segmentation

Procedia PDF Downloads 305
3334 Analysis of Digital Transformation in Banking: The Hungarian Case

Authors: Éva Pintér, Péter Bagó, Nikolett Deutsch, Miklós Hetényi

Abstract:

The process of digital transformation has a profound influence on all sectors of the worldwide economy and the business environment. The influence of blockchain technology can be observed in the digital economy and e-government, rendering it an essential element of a nation's growth strategy. The banking industry is experiencing significant expansion and development of financial technology firms. Utilizing developing technologies such as artificial intelligence (AI), machine learning (ML), and big data (BD), these entrants are offering more streamlined financial solutions, promptly addressing client demands, and presenting a challenge to incumbent institutions. The advantages of digital transformation are evident in the corporate realm, and firms that resist its adoption put their survival at risk. The advent of digital technologies has revolutionized the business environment, streamlining processes and creating opportunities for enhanced communication and collaboration. Thanks to the aid of digital technologies, businesses can now swiftly and effortlessly retrieve vast quantities of information, all the while accelerating the process of creating new and improved products and services. Big data analytics is generally recognized as a transformative force in business, considered the fourth paradigm of science, and seen as the next frontier for innovation, competition, and productivity. Big data, an emerging technology that is shaping the future of the banking sector, offers numerous advantages to banks. It enables them to effectively track consumer behavior and make informed decisions, thereby enhancing their operational efficiency. Banks may embrace big data technologies to promptly and efficiently identify fraud, as well as gain insights into client preferences, which can then be leveraged to create better-tailored products and services. Moreover, the utilization of big data technology empowers banks to develop more intelligent and streamlined models for accurately recognizing and focusing on the suitable clientele with pertinent offers. There is a scarcity of research on big data analytics in the banking industry, with the majority of existing studies only examining the advantages and prospects associated with big data. Although big data technologies are crucial, there is a dearth of empirical evidence about the role of big data analytics (BDA) capabilities in bank performance. This research addresses a gap in the existing literature by introducing a model that combines the resource-based view (RBV), the technical organization environment framework (TOE), and dynamic capability theory (DC). This study investigates the influence of Big Data Analytics (BDA) utilization on the performance of market and risk management. This is supported by a comparative examination of Hungarian mobile banking services.

Keywords: big data, digital transformation, dynamic capabilities, mobile banking

Procedia PDF Downloads 43
3333 Radial Basis Surrogate Model Integrated to Evolutionary Algorithm for Solving Computation Intensive Black-Box Problems

Authors: Abdulbaset Saad, Adel Younis, Zuomin Dong

Abstract:

For design optimization with high-dimensional, expensive problems, an effective and efficient optimization methodology is desired. This work proposes a series of modifications to the differential evolution (DE) algorithm for solving computation-intensive black-box problems. The proposed methodology, called radial basis meta-model assisted differential evolution (RBF-DE), is a global optimization algorithm based on meta-modeling techniques. A meta-modeling-assisted DE is proposed to solve computationally expensive optimization problems: the radial basis function (RBF) model is used as a surrogate to approximate the expensive objective function, while DE employs a mechanism to dynamically select the best-performing combination of parameters such as the differential rate, crossover probability, and population size. The proposed algorithm is tested on benchmark functions and on real-life practical applications and problems. The test results demonstrate that the proposed algorithm is promising and performs well compared to other optimization algorithms; it is capable of converging to acceptable and good solutions in terms of accuracy, number of evaluations, and time needed to converge.
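
A minimal sketch of the surrogate-assisted idea follows (illustrative only; it is not the authors' RBF-DE algorithm and omits their adaptive parameter-selection mechanism): an RBF interpolant is fitted to the points evaluated so far, differential evolution searches the cheap surrogate, and the proposed point is then evaluated on the true expensive function and added to the data set.

```python
# Minimal sketch of RBF-surrogate-assisted differential evolution (illustrative only;
# not the paper's RBF-DE algorithm or its adaptive parameter selection).
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

def expensive_objective(x):                     # stand-in for a costly simulation
    return np.sum(x**2) + 10 * np.sin(x[0])

bounds = [(-5.0, 5.0)] * 2
rng = np.random.default_rng(1)
X = rng.uniform(-5, 5, size=(12, 2))            # initial space-filling sample
y = np.array([expensive_objective(x) for x in X])

for _ in range(15):                             # main surrogate-assisted loop
    surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline", smoothing=1e-10)
    res = differential_evolution(lambda x: float(surrogate(x.reshape(1, -1))[0]),
                                 bounds, seed=1, maxiter=50, tol=1e-6)
    x_new = res.x
    y_new = expensive_objective(x_new)          # one true (expensive) evaluation per cycle
    X = np.vstack([X, x_new])
    y = np.append(y, y_new)

best = X[np.argmin(y)]
print("best point found:", best, "objective:", y.min())
```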

Keywords: differential evolution, engineering design, expensive computations, meta-modeling, radial basis function, optimization

Procedia PDF Downloads 385
3332 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is if the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data, e.g., the area under the curve for some models is higher than 80% when trained on the data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
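
A small synthetic illustration of the phenomenon described above (a model trained on imperfect labels exceeding the accuracy of its own training labels) follows; it is a simplified sketch with scikit-learn and generated data, not the study's clinical corpus or models.

```python
# Small synthetic illustration (not the clinical data): a linear SVM trained on labels
# with a 40% error rate can still exceed the accuracy of its own training labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

X, y_true = make_classification(n_samples=4000, n_features=20, n_informative=10,
                                random_state=0)
X_tr, X_te, y_tr_true, y_te_true = train_test_split(X, y_true, test_size=0.5,
                                                    random_state=0)

rng = np.random.default_rng(0)
flip = rng.random(len(y_tr_true)) < 0.40         # corrupt 40% of the training labels
y_tr_noisy = np.where(flip, 1 - y_tr_true, y_tr_true)

model = LinearSVC(C=1.0, max_iter=5000).fit(X_tr, y_tr_noisy)
print("training-label accuracy vs truth:", accuracy_score(y_tr_true, y_tr_noisy))
print("model accuracy on clean test set:", accuracy_score(y_te_true, model.predict(X_te)))
```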

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 183
3331 Investigating the Post-Liver Transplant Complications and Their Management in Children Referred to the Children’s Medical Center

Authors: Hosein Alimadadi, Fatemeh Farahmand, Ali Jafarian, Nasir Fakhar, Mohammad Hassan Sohouli, Neda Raeesi

Abstract:

Background: Given the important role of liver transplantation as the only treatment in many cases of end-stage liver disease in children, the aim of this study is to investigate the complications of liver transplantation and their management in children referred to the Children's Medical Center. Methods: This is a cross-sectional study of pediatric patients who underwent liver transplantation between 2016 and 2021. The indication for liver transplantation in this population was confirmed by a pediatric gastroenterologist, and the liver transplant was performed by a transplant surgeon. Information about each patient before and after transplantation was collected and recorded. Results: A total of 53 patients participated in this study, including 25 (47.2%) boys and 28 (52.8%) girls. The most common indications for liver transplantation were cholestatic and metabolic diseases. The most common early complications of liver transplantation in children were acute cellular rejection (ACR) and anastomotic biliary stricture. The most common late complication in these patients was infection, which was observed in 56.6% of patients. Among the drug side effects, neurotoxicity (convulsions) was seen most often; 15.1% of the transplanted patients died. Conclusion: In this study, the most common early complications of liver transplantation in children were ACR and biliary stricture, and the most common late complication was infection. Neurotoxicity (convulsions) was the most common drug side effect.

Keywords: liver transplantation, complication, infection, survival rate

Procedia PDF Downloads 68
3330 Comparison Study of Machine Learning Classifiers for Speech Emotion Recognition

Authors: Aishwarya Ravindra Fursule, Shruti Kshirsagar

Abstract:

In the intersection of artificial intelligence and human-centered computing, this paper delves into speech emotion recognition (SER). It presents a comparative analysis of machine learning models such as K-Nearest Neighbors (KNN), logistic regression, support vector machines (SVM), decision trees, ensemble classifiers, and random forests applied to SER. The research employs four datasets: Crema D, SAVEE, TESS, and RAVDESS. It focuses on extracting salient audio signal features such as the zero crossing rate (ZCR), chroma_stft, Mel frequency cepstral coefficients (MFCC), root mean square (RMS) value, and Mel spectrogram. These features are used to train and evaluate the models' ability to recognize eight types of emotions from speech: happy, sad, neutral, angry, calm, disgust, fear, and surprise. Among the models, the random forest algorithm demonstrated superior performance, achieving approximately 79% accuracy, which suggests its suitability for SER within the parameters of this study. The research contributes to SER by showcasing the effectiveness of various machine learning algorithms and feature extraction techniques, and the findings hold promise for the development of more precise emotion recognition systems in the future. This abstract provides a succinct overview of the paper's content, methods, and results.
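
A compact sketch of the feature-extraction and classification pipeline described above follows (librosa for ZCR, chroma, MFCC, RMS, and Mel spectrogram features, then a random forest); the file paths and emotion labels are placeholders and do not reflect the actual Crema D / SAVEE / TESS / RAVDESS loading code.

```python
# Compact sketch of the described pipeline (placeholder paths/labels, not the actual
# Crema D / SAVEE / TESS / RAVDESS loaders): librosa features -> random forest.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def extract_features(path):
    y, sr = librosa.load(path, sr=22050)
    feats = [
        librosa.feature.zero_crossing_rate(y=y),
        librosa.feature.chroma_stft(y=y, sr=sr),
        librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13),
        librosa.feature.rms(y=y),
        librosa.feature.melspectrogram(y=y, sr=sr),
    ]
    # Average each feature matrix over time and concatenate into one vector.
    return np.concatenate([f.mean(axis=1) for f in feats])

# Hypothetical file list and emotion labels standing in for the four datasets.
files = ["audio/happy_01.wav", "audio/sad_01.wav", "audio/angry_01.wav"]
labels = ["happy", "sad", "angry"]

X = np.vstack([extract_features(f) for f in files])
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, labels)
print(clf.predict(X[:1]))
```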

Keywords: comparison, ML classifiers, KNN, decision tree, SVM, random forest, logistic regression, ensemble classifiers

Procedia PDF Downloads 32
3329 Pibid and Experimentation: A High School Case Study

Authors: Chahad P. Alexandre

Abstract:

PIBID (Institutional Program of Scholarships to Encourage Teaching) is a Brazilian government program that today counts 48,000 students. Its goal is to motivate students to stay in teaching undergraduate programs and to help fill the gap of 100,000 teachers needed today in basic education schools. The greatest lack of teachers today is in physics, chemistry, mathematics, and biology. At IFSP-Itapetininga, we built our physics PIBID around practical activities. Our students are divided between two São Paulo state high schools in the same city. The project proposes class activities based on experimentation, observation, and understanding of physical phenomena. The didactic experiments are always related to the content the teacher is working on; the teacher is the supervisor of the program in the school. Before each experiment, a short questionnaire is applied to learn about the students' preconceptions, and another is filled in afterwards to evaluate whether new concepts have been formed. This procedure makes it possible to compare their previous knowledge with how it changed after the experiment. The primary goal of our project is to make physics classes more attractive, to develop in high school students an interest in learning physics, and to show the relation of physics to day-to-day life and the technological world. The objective of the experimental activities is to facilitate the understanding of the concepts worked on in class, because during experimentation the PIBID scholarship student stimulates the curiosity of the high school students, who can thus develop the capacity to understand and identify physical phenomena through concrete examples. Knowing how to identify these phenomena and where they are present in their own lives makes the learning process more significant and pleasant. This proposal makes it possible for students to practice science and to appropriate concepts that are complex in traditional classes, overcoming the common preconception that physics is something distant and present only in books, a preconception that is extremely harmful to the process of scientific knowledge construction. This kind of learning through experimentation makes students not only accumulate knowledge but also appropriate it, along with experimental procedures and even the space provided by the school. The PIBID scholarship students, as future teachers, also have the opportunity to try experimentation classes, to intervene in classes, and to have contact with their future career. This opportunity allows them to reflect on the practices carried out and, consequently, on learning methods. Through this project, we found that high school students stay focused on the experiment for longer than during a traditional lecture class, and, as a participative activity, the students were more involved and engaged. We also found that the dropout percentage among physics undergraduate students is smaller in our institute than it was before the PIBID program started.

Keywords: innovation, projects, PIBID, physics, pre-service teacher experiences

Procedia PDF Downloads 333
3328 A Feasibility Study of Producing Biofuels from Textile Sludge by Torrefaction Technology

Authors: Hua-Shan Tai, Yu-Ting Zeng

Abstract:

In modern industrial society, enormous amounts of sludge from various industries are constantly produced; currently, most of this sludge is treated by landfill or incineration. However, neither treatment is ideal, because land for landfill is limited and incineration causes secondary pollution. Consequently, treating industrial sludge appropriately has become an urgent environmental issue. To address this problem, this study uses textile sludge, the major source of waste sludge in Taiwan, as the raw material for torrefaction treatment. To investigate the feasibility of producing biofuels from textile sludge by torrefaction, experiments were conducted at temperatures of 150, 200, 250, 300, and 350°C, with heating rates of 15, 20, 25, and 30°C/min, and with residence times of 30 and 60 minutes. The results revealed that the mass yields after torrefaction were approximately in the range of 54.9 to 93.4%. The energy densification ratios were approximately in the range of 0.84 to 1.10, and the energy yields were approximately in the range of 45.9 to 98.3%. The volumetric densities were approximately in the range of 0.78 to 1.14, and the volumetric energy densities were approximately in the range of 0.65 to 1.18. In summary, the optimum energy yield (98.3%) was reached at a terminal temperature of 150°C, a heating rate of 20°C/min, and a residence time of 30 minutes; the corresponding mass yield, energy densification ratio, and volumetric energy density were 92.2%, 1.07, and 1.15, respectively. These results indicate that the torrefied solid products are easy to store, which not only enhances product quality but also achieves the purpose of converting the material into fuel.
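
The reported figures follow the solid-fuel metrics conventionally used in torrefaction studies; the sketch below encodes those conventional definitions (mass yield, energy densification ratio, energy yield, and the volumetric ratios) with illustrative numbers, since the paper's exact formulas and raw measurements are not given in the abstract.

```python
# Conventional torrefaction solid-fuel metrics (assumed definitions, not the paper's own code).
def torrefaction_metrics(m_raw, m_torr, hhv_raw, hhv_torr, rho_raw, rho_torr):
    """Yields and densification ratios from raw vs. torrefied solid properties."""
    mass_yield = m_torr / m_raw                       # fraction of solid mass retained
    energy_densification = hhv_torr / hhv_raw         # ratio of higher heating values
    energy_yield = mass_yield * energy_densification  # fraction of feed energy retained
    volumetric_density = rho_torr / rho_raw           # bulk density ratio
    volumetric_energy_density = volumetric_density * energy_densification
    return {
        "mass_yield_%": 100 * mass_yield,
        "energy_densification_ratio": energy_densification,
        "energy_yield_%": 100 * energy_yield,
        "volumetric_density_ratio": volumetric_density,
        "volumetric_energy_density_ratio": volumetric_energy_density,
    }

# Illustrative numbers only (not measurements from the study): retaining 92.2% of the
# mass at a 1.07 energy densification ratio gives an energy yield of roughly 98.7%.
print(torrefaction_metrics(100.0, 92.2, 20.0, 21.4, 1.00, 1.07))
```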

Keywords: biofuel, biomass energy, textile sludge, torrefaction

Procedia PDF Downloads 313
3327 Therapeutic Efficacy and Safety Profile of Tolvaptan Administered in Hyponatremia Patients

Authors: Sree Vennela P., V. Samyuktha Bhardwaj

Abstract:

Hyponatremia is an electrolyte disturbance in which the sodium ion concentration in the serum is lower than normal. Sodium is the dominant extracellular cation (positive ion) and cannot freely cross from the interstitial space through the cell membrane into the cell. Its homeostasis (stability of concentration) inside the cell is vital to the normal function of any cell. Normal serum sodium levels are between 135 and 145 mEq/L. Hyponatremia is defined as a serum level of less than 135 mEq/L and is considered severe when the serum level is below 125 mEq/L. In the vast majority of cases, hyponatremia occurs as a result of excess body water diluting the serum sodium (the salt level in the blood). Hyponatremia is often a complication of other medical illnesses in which excess water accumulates in the body at a higher rate than it can be excreted (for example, congestive heart failure, the syndrome of inappropriate antidiuretic hormone (SIADH), or polydipsia). Sometimes it may result from over-hydration (drinking too much water). Lack of sodium (salt) is very rarely the cause of hyponatremia, although it can promote hyponatremia indirectly. In particular, sodium loss can lead to a state of volume depletion (loss of blood volume in the body), with volume depletion serving as a signal for the release of ADH (antidiuretic hormone). As a result of ADH-stimulated water retention (too much water in the body), blood sodium becomes diluted and hyponatremia results.
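
As a purely illustrative aside, the serum-sodium thresholds quoted above can be encoded as a simple classification rule; the sketch below reflects only those cut-offs and is not part of the study.

```python
def classify_serum_sodium(na_meq_per_l: float) -> str:
    """Classify a serum sodium value (mEq/L) using the thresholds quoted above."""
    if na_meq_per_l < 125:
        return "severe hyponatremia"
    if na_meq_per_l < 135:
        return "hyponatremia"
    if na_meq_per_l <= 145:
        return "normal"
    return "above normal range"

print(classify_serum_sodium(128))  # -> "hyponatremia"
```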

Keywords: tolvaptan, hyponatremia, syndrome of inappropriate antidiuretic hormone (SIADH), euvolemic hyponatremia

Procedia PDF Downloads 252
3326 Analysis of the Learners’ Responses of the Adjusted Rorschach Comprehensive System: Critical Psychological Perspective

Authors: Mokgadi Moletsane-Kekae, Robert Kananga Mukuna

Abstract:

The study focuses on the analysis of learners' responses to the Adjusted Rorschach Comprehensive System. The purpose of the study is to analyse the participants' response rates on the Adjusted Rorschach Comprehensive System from a critical psychology perspective. The use of critical psychology theory in this study was crucial because it responds to the inadequacy of current Western theory and practice in the field of psychology. The participants were learners at a previously disadvantaged school in the Western Cape, South Africa. The study adopted a qualitative approach and a case study design and was grounded in an interpretivist paradigm. The sample comprised six learners (three boys and three girls, aged 14 years) from a historically disadvantaged school. The Adjusted Rorschach Comprehensive System (ARCS) administration procedure, biographical information, semi-structured interviews, and observation were used to collect data. Data were analysed using a thematic framework. The study found that the factors that increased response rates during the administration of the ARCS were language, seating arrangement, drawing, viewing, and describing. The study recommends that psychological test designers take into consideration the philosophy and worldviews of the local people for whom a test is designed, in order to minimize low response rates.

Keywords: adjusted rorschach comprehensive system, critical psychology, learners, responses

Procedia PDF Downloads 369
3325 Tempo-Spatial Pattern of Progress and Disparity in Child Health in Uttar Pradesh, India

Authors: Gudakesh Yadav

Abstract:

Uttar Pradesh is one of the poorest-performing states of India in terms of child health. Using data from three rounds of the NFHS and two rounds of the DLHS, this paper examines tempo-spatial change in child health and care practices in Uttar Pradesh and its regions. Rate-ratio, CI, multivariate, and decomposition analyses were used for the study. Findings demonstrate that child health care practices have improved over time in all regions of the state; however, the western and southern regions registered the lowest progress in child immunization. Nevertheless, the prevalence of diarrhoea and ARI did not decline over the period, remaining critically high in the western and southern regions. These regions also performed poorly in the provision of ORS and in the treatment of diarrhoea and ARI. Public health services are the least preferred for diarrhoea and ARI treatment. Results from the decomposition analysis reveal that rural residence, mother's illiteracy, and household wealth contributed the most to the low utilization of child health care services consistently over the period. The study calls for targeted interventions for vulnerable children to accelerate utilization of child health care services; poorly performing regions should be targeted and routinely monitored on key child health indicators.

Keywords: Acute Respiratory Infection (ARI), decomposition, diarrhea, inequality, immunization

Procedia PDF Downloads 288