Search results for: quantity allocation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1604

284 Occurrence and Levels of Mycotoxins in On-Farm Stored Sesame in Major-Growing Districts of Ethiopia

Authors: S. Alemayehu, F. A. Abera, K. M. Ayimut, R. Mahroof, J. Harvey, B. Subramanyam

Abstract:

The occurrence of mycotoxins in sesame seeds poses a significant threat to food safety and the economy in Ethiopia. This study aimed to determine the levels and occurrence of mycotoxins in on-farm stored sesame seeds in major growing districts of Ethiopia. A total of 470 sesame seed samples were collected from randomly selected farmers' storage structures in five major growing districts using purposive sampling techniques. An enzyme-linked immunosorbent assay (ELISA) was used to analyze the collected samples for four mycotoxins: total aflatoxins (AFT), ochratoxin A (OTA), total fumonisins (FUM), and deoxynivalenol (DON). All samples contained varying levels of mycotoxins, with AFT and DON being the most prevalent. AFT concentrations in samples with detectable levels ranged from 2.5 to 27.8 parts per billion (ppb), with a mean of 13.8 ppb. OTA levels ranged from 5.0 to 9.7 ppb, with a mean of 7.1 ppb. Total fumonisin concentrations ranged from 300 to 1300 ppb, with a mean of 800 ppb. DON concentrations ranged from 560 to 700 ppb in the analyzed samples. The majority (96.8%) of the samples had AFT, FUM, and DON levels below the U.S. Food and Drug Administration maximum limits. Mycotoxin co-occurrence rates were 44.0, 38.3, 33.8, 30.2, 29.8 and 26.0% for AFT-OTA, DON-OTA, AFT-FUM, FUM-DON, and FUM-OTA, respectively. On average, 37.2% of the sesame samples had fungal infection, and seed germination rates ranged from 66.8% to 91.1%. The Limmu district had higher levels of total aflatoxins and kernel infection, and lower germination rates, than other districts. The Wollega variety of sesame had higher kernel infection and total aflatoxin concentrations, and lower germination rates, than other varieties. Grain age had a statistically significant (p<0.05) effect on both kernel infection and germination. The storage methods used for sesame in the major growing districts of Ethiopia favor mycotoxin-producing fungi. Since the mycotoxin levels in sesame are of public health significance, stakeholders should work together to identify secure and suitable storage technologies that maintain the quantity and quality of sesame at the smallholder-farmer level and reduce the risk of mycotoxin contamination.
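
As an aside for readers who want to reproduce this kind of summary, the sketch below shows one way pairwise co-occurrence rates could be computed from a sample-by-toxin detection table; the data, detection thresholds and column names are illustrative assumptions, not the study's dataset.

```python
import pandas as pd
from itertools import combinations

# Illustrative detection table: one row per sample, one column per mycotoxin,
# values are measured concentrations in ppb (synthetic numbers, not study data).
samples = pd.DataFrame({
    "AFT": [3.1, 0.0, 12.4, 27.0, 0.0, 8.9],
    "OTA": [5.2, 0.0,  7.7,  0.0, 6.1, 9.0],
    "FUM": [310, 450,  0.0, 1250, 0.0, 800],
    "DON": [560, 0.0,  690,  600, 0.0, 0.0],
})

# A toxin counts as "detected" when its concentration reaches an assumed
# limit of detection (hypothetical thresholds per toxin).
lod = {"AFT": 2.5, "OTA": 5.0, "FUM": 300, "DON": 560}
detected = samples.ge(pd.Series(lod))

# Co-occurrence rate of a toxin pair = share of samples where both are detected.
for a, b in combinations(detected.columns, 2):
    rate = (detected[a] & detected[b]).mean() * 100
    print(f"{a}-{b}: {rate:.1f}%")
```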

Keywords: districts, seed germination, kernel infection, moisture content, relative humidity, temperature

Procedia PDF Downloads 100
283 A Critical Discourse Analysis of Protesters in the Debates of Al Jazeera Channel of the Yemeni Revolution

Authors: Raya Sulaiman

Abstract:

Critical discourse analysis investigates how discourse is used to enact and legitimise abuses of power. Political debates constitute discourses which mirror aspects of ideologies. The Arab world has been one of the most unsettled regions in the world and has dominated global politics due to the Arab revolutions which started in 2010. This study aimed at uncovering the ideological intentions behind the formulation and circulation of hegemonic political ideology in the TV political debates of the 2011-2012 Yemeni revolution, and how ideology was used as a tool of hegemony. The study specifically examined the ideologies associated with the use of the protesters as a social actor. The data consisted of four debates (17,350 words) from four live debate programs, The Opposite Direction, In Depth, Behind the News and the Revolution Talk, staged on the Al Jazeera TV channel between 2011 and 2012. The debates had already been transcribed by Al Jazeera online. Al Jazeera was selected for the study because it is the most popular TV network in the Arab world and has a strong presence, especially during the Arab revolutions; it has also been accused of inciting protests across the Arab region. Two debate sides were identified in the data: government and anti-government. The government side represented president Ali Abdullah Saleh and his regime, while the anti-government side represented the protest squares, who demanded that the president ‘step down’. The study analysed the verbal discourse of the debates using critical discourse analysis, drawing on van Leeuwen's Social Actor Network model. This framework provides a step-by-step model of analysis, moving from specific grammatical processes to broader semantic issues, and yields representative findings since it treats discourse as representative of, and reconstructed in, social practice. The findings indicated that Al Jazeera and the anti-government side had similar ideological intentions towards the protesters: Al Jazeera victimized and incited the protesters, in line with the anti-government side, and used assimilation, nominalization, and active role allocation as the linguistic means of realising these ideological intentions. The government speakers did not share these ideological intentions with Al Jazeera. The findings also indicated that Al Jazeera had excluded the government from its debates, violating its slogan, ‘the opinion and the other opinion’. This study demonstrates the powerful role of discourse in shaping the ideological intentions of the media and influencing the media audience.

Keywords: Al Jazeera network, critical discourse analysis, ideology, Yemeni revolution

Procedia PDF Downloads 211
282 Adaptive Strategies to Nutrient Deficiency of Doubled Diploid Citrumelo 4475: A Prospective Study Based on Structural, Ultrastructural, Physiological and Biochemical Parameters

Authors: J. Oustric, L. Berti, J. Santini

Abstract:

Nowadays, an objective of sustainable agriculture, and in particular organic agriculture, is to reduce the level of fertilizer inputs used in crops. Limiting the quantity of fertilizer inputs would optimize the economic result while minimizing the environmental impact. Nutrient deficiency, particularly of a major nutrient (N, P, and K), can seriously affect fruit production and quality. In citrus crops, scion/rootstock combinations are frequently used to improve tolerance to various abiotic stresses. New rootstocks are needed to respond to these constraints, and the use of new tetraploid rootstocks better adapted to lower nutrient intake could offer a promising way forward. The aim of this work was to determine whether a better tolerance to nutrient deficiency could be observed in a doubled diploid seedling and whether this tolerance is conferred to a common clementine scion when such seedlings are used as rootstocks. We selected diploid (CM2x) and doubled diploid (CM4x) Citrumelo 4475 seedlings and common clementine (C) grafted onto Citrumelo 4475 diploid (C/CM2x) and doubled diploid (C/CM4x) rootstocks. The effects of nutrient deficiency on the seedlings and scion/rootstock combinations were analyzed by studying anatomical, structural and ultrastructural determinants (chlorosis, stomata, ostioles, cells and their organelles), photosynthetic properties (leaf net photosynthetic rate (Pₙₑₜ), stomatal conductance (gₛ), chlorophyll a fluorescence (Fᵥ/Fₘ)) and an oxidative marker (malondialdehyde). Nutrient deficiency affected foliar tissues, physiological parameters, and oxidative metabolism differently in the leaves of seedlings depending on their ploidy level, and in the common clementine scion depending on the ploidy level of its rootstock. Both CM4x and C/CM4x presented less foliar damage (chlorosis, and alterations of chloroplasts, mitochondria, and plastoglobuli), less alteration of photosynthetic processes (Pₙₑₜ, gₛ, and Fᵥ/Fₘ), and lower malondialdehyde accumulation than CM2x and C/CM2x under nutrient deficiency. Doubled diploid Citrumelo 4475 can improve nutrient deficiency tolerance, and its use as a rootstock confers this tolerance to the common clementine scion.

Keywords: nutrient deficiency, oxidative stress, photosynthesis, polyploid rootstocks

Procedia PDF Downloads 111
281 Improving the Digestibility of Agro-Industrial Co-Products by Treatment with Isolated Fungi in the Meknes-Morocco Region

Authors: Mohamed Benaddou, Mohammed Diouri

Abstract:

A country such as Morocco generates a high quantity of agricultural and food-industry residues. A large portion of these residues is disposed of by burning or landfilling. The valorization of this waste biomass as feed is an interesting alternative because it is considered among the best sources of cheap carbohydrates. However, its nutritional yield without any pre-treatment is very low because lignin protects cellulose, the carbohydrate used as a source of energy by ruminants. Fungal treatment is an environmentally friendly, easy and inexpensive method. This study investigated the treatment of wheat straw (WS), cedar sawdust (CS) and olive pomace (OP) with fungi selected according to their carbon source, in order to improve digestibility. Two fungi were selected on a culture medium in which cellulose was the only carbon source, Cosmospora viridescens (C. vir) and Penicillium crustosum (P. crus); two were selected on a medium in which lignin was the only carbon source, Fusarium oxysporum (F. oxy) and Fusarium sp. (F. sp); and two on a medium in which cellulose and lignin were both carbon sources, Fusarium solani (F. solani) and Penicillium chrysogenum (P. chryso). P. chryso degraded CS cellulose the most. Notably, delignification by F. solani reached 70% after 12 weeks of treatment of wheat straw. Ligninase activity was detected in F. solani, F. sp and F. oxysporum, which made it possible to delignify the treated substrates. Delignification by C. vir was negligible in all three substrates after 12 weeks of treatment. P. crus and P. chryso degraded lignin only slightly in WC (not exceeding 12% after 12 weeks of treatment), whereas in OP delignification reached 25% and 13% for P. chryso and P. crus, respectively. P. chryso reached 30% lignin degradation from 4 weeks of treatment. For most fungi, lignin degradation reached its maximum within 8 weeks of treatment, except for F. solani, which continued to delignify beyond this period. The variation in in vitro true digestibility (IVTD variation) differed highly significantly between fungi, treatment durations, substrates and their interactions (P < 0.001). Indeed, all fungi increased digestibility after 12 weeks of treatment, although to different degrees. F. solani and F. oxy increased digestibility more than the others; this increase exceeded 50% in CS and OP but did not exceed 20% for WS after treatment with F. oxy. IVTD variation did not exceed 20% in cedar sawdust treated with P. chryso but reached 45% in wheat straw after 8 weeks of treatment.
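
For readers who want to see how the reported fungus x substrate x duration significance test could be set up, the following is a minimal sketch of a factorial ANOVA on in vitro true digestibility; the data frame, factor levels and values are synthetic placeholders, not the study's measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)

# Synthetic IVTD variation (%) for every combination of fungus, substrate and
# treatment duration, with 3 replicates per cell (placeholder values only).
rows = []
for fungus in ["F.solani", "F.oxy", "P.chryso"]:
    for substrate in ["WS", "CS", "OP"]:
        for weeks in [4, 8, 12]:
            base = 30 + 10 * (fungus != "P.chryso") + 5 * (substrate != "WS") + 0.5 * weeks
            for _ in range(3):  # replicates
                rows.append({"fungus": fungus, "substrate": substrate, "weeks": weeks,
                             "ivtd_var": base + rng.normal(0, 2)})
data = pd.DataFrame(rows)

# Three-way factorial ANOVA: main effects and interactions of fungus,
# substrate and duration on digestibility variation.
model = ols("ivtd_var ~ C(fungus) * C(substrate) * C(weeks)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))
```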

Keywords: lignin, cellulose, digestibility, fungi, treatment, lignocellulosic biomass

Procedia PDF Downloads 193
280 Promoting 'One Health' Surveillance and Response Approach Implementation Capabilities against Emerging Threats and Epidemics Crisis Impact in African Countries

Authors: Ernest Tambo, Ghislaine Madjou, Jeanne Y. Ngogang, Shenglan Tang, Zhou XiaoNong

Abstract:

Implementing a national to community-based 'One Health' surveillance approach for the mitigation of human, animal and environmental consequences offers great opportunities and added value for sustainable development and wellbeing. Global partnerships, policy commitment and financial investment in the 'One Health' surveillance approach are much needed to address evolving threats and epidemic crises in African countries. The paper provides insights into how China-Africa health development cooperation can promote the 'One Health' surveillance approach in response advocacy and mitigation. China-Africa health development initiatives provide new prospects for guiding and moving forward appropriate, evidence-based advocacy and mitigation management approaches and strategies towards attaining Universal Health Coverage (UHC) and the Sustainable Development Goals (SDGs). Early, continuous, quality and timely surveillance data collection and coordinated information-sharing practices in malaria and other diseases are demonstrated in Comoros, Zanzibar, Ghana and Cameroon. Improvements in access to a variety of contextual sources and networks of data-sharing platforms are needed to guide evidence-based and tailored detection of, and response to, unusual hazardous events. Moreover, understanding threat and disease trends and frontline or point-of-care response delivery is crucial to promote the implementation of an integrated, sustainable and targeted local and national 'One Health' surveillance and response approach. Importantly, operational guidelines are vital for increasing coherent financing and national workforce capacity development mechanisms, and for strengthening participatory partnerships, collaboration and monitoring strategies to achieve global health agenda effectiveness in Africa. At the same time, enhanced reporting and dissemination of surveillance data streams are useful in informing policy decisions, health systems programming, and financial mobilization and prioritized allocation before, during and after threat and epidemic crises, as well as in assessing programme strengths and weaknesses. Thus, capitalizing on 'One Health' surveillance and response advocacy and mitigation implementation is timely in consolidating the African Union Agenda 2063 and Africa's renaissance capabilities and expectations.

Keywords: Africa, one health approach, surveillance, response

Procedia PDF Downloads 403
279 Development of Biodegradable Wound Healing Patch of Curcumin

Authors: Abhay Asthana, Shally Toshkhani, Gyati Shilakari

Abstract:

The objective of the present research work was to develop a topical, biodegradable dermal patch formulation to aid accelerated wound healing. Reducing the frequency of dressings while improving drug delivery and overall therapeutic efficacy is always better for patient compliance. In the present study, an optimized formulation using biodegradable components was obtained by evaluating polymers and excipients (HPMC K4M, ethylcellulose, povidone, polyethylene glycol and gelatin) to impart significant folding endurance, elasticity, and strength. Molten gelatin was used to prepare a mixture using ethylene glycol. Chitosan dissolved in acidic medium was added with stirring to the gelatin mixture. With continued stirring, curcumin was added to the mixture with the aid of DCM and methanol in an optimized ratio of 60:40 to obtain a homogeneous dispersion. The polymers were dispersed with stirring into the final formulation. The mixture was sonicated and cast to obtain a film. All steps were carried out under strict aseptic conditions. The final formulation was a thin, uniformly smooth-textured film with a dark brown-yellow color. The folding endurance of the optimized formulation was around 20 to 21 folds without a crack at room temperature (23°C). The drug content was in the range of 96 to 102%, and the film passed the content uniformity test. The final moisture content of the optimized formulation film was not more than 9.0%. The films passed stability studies conducted under refrigerated conditions (4±0.2°C) and at room temperature (23±2°C) for 30 days. Furthermore, the drug content and texture remained unchanged in a stability study conducted at 23±2°C for 45 and 90 days. The cumulative drug release was found to be 80% in 12 h and matched the biodegradation rate tested in vivo, with a correlation factor R²>0.9. In the in vivo study, one dose of equivalent quantity was applied topically every 2 days. The data demonstrated a significant improvement in percentage wound contraction compared with the control and the plain drug over the given period. The film-based formulation developed shows promising results in terms of stability and in vivo performance.

Keywords: wound healing, biodegradable, polymers, patch

Procedia PDF Downloads 459
278 Real Estate Trend Prediction with Artificial Intelligence Techniques

Authors: Sophia Liang Zhou

Abstract:

For investors, businesses, consumers, and governments, an accurate assessment of future housing prices is crucial to critical decisions in resource allocation, policy formation, and investment strategies. Previous studies are contradictory about the macroeconomic determinants of housing prices and largely focused on one or two areas using point prediction. This study aims to develop data-driven models to accurately predict future housing market trends in different markets. This work studied five different metropolitan areas representing different market trends and compared three time-lag situations: no lag, 6-month lag, and 12-month lag. Linear regression (LR), random forest (RF), and artificial neural network (ANN) models were employed to model real estate prices using datasets with the S&P/Case-Shiller home price index and 12 demographic and macroeconomic features, such as gross domestic product (GDP), resident population, personal income, etc., in five metropolitan areas: Boston, Dallas, New York, Chicago, and San Francisco. The data from March 2005 to December 2018 were collected from the Federal Reserve Bank, FBI, and Freddie Mac. In the original data, some factors are monthly, some quarterly, and some yearly. Thus, two methods to compensate for missing values, backfill and interpolation, were compared. The models were evaluated by accuracy, mean absolute error, and root mean square error. The LR and ANN models outperformed the RF model due to RF's inherent limitations. Both ANN and LR methods generated predictive models with high accuracy (>95%). It was found that personal income, GDP, population, and measures of debt consistently appeared as the most important factors. It was also shown that the technique used to compensate for missing values in the dataset and the implementation of a time lag can have a significant influence on model performance and require further investigation. The best performing models varied for each area, but the backfilled 12-month lag LR models and the interpolated no-lag ANN models showed the best stable performance overall, with accuracies >95% for each city. This study reveals the influence of input variables in different markets. It also provides evidence to support future studies to identify the optimal time lag and data imputation methods for establishing accurate predictive models.
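
To make the modelling workflow concrete, here is a minimal sketch of the kind of lagged-feature pipeline the abstract describes, comparing backfill and interpolation imputation across the three model families; the synthetic data, column names and hyperparameters are illustrative assumptions, not the authors' actual dataset or code.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(42)

# Synthetic monthly panel for one metro area (stand-in for Case-Shiller index + macro features).
n = 166  # March 2005 .. December 2018
df = pd.DataFrame({
    "price_index": np.cumsum(rng.normal(0.3, 1.0, n)) + 150,
    "gdp": np.where(np.arange(n) % 3 == 0, np.cumsum(rng.normal(0.2, 0.5, n)) + 100, np.nan),  # quarterly
    "population": np.where(np.arange(n) % 12 == 0, np.linspace(4.0, 4.6, n), np.nan),          # yearly
    "personal_income": np.cumsum(rng.normal(0.1, 0.4, n)) + 50,
})

def evaluate(impute: str, lag: int) -> None:
    """Impute mixed-frequency features, build lagged predictors, then fit and score three models."""
    filled = df.bfill() if impute == "backfill" else df.interpolate()
    X = filled.drop(columns="price_index").shift(lag).dropna()
    y = filled.loc[X.index, "price_index"]
    split = int(len(X) * 0.8)  # simple chronological train/test split
    models = {
        "LR": LinearRegression(),
        "RF": RandomForestRegressor(n_estimators=200, random_state=0),
        "ANN": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=0),
    }
    for name, model in models.items():
        model.fit(X[:split], y[:split])
        pred = model.predict(X[split:])
        mae = mean_absolute_error(y[split:], pred)
        rmse = np.sqrt(mean_squared_error(y[split:], pred))
        print(f"{impute:11s} lag={lag:2d} {name:3s}  MAE={mae:6.2f}  RMSE={rmse:6.2f}")

for impute in ("backfill", "interpolate"):
    for lag in (0, 6, 12):
        evaluate(impute, lag)
```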

Keywords: linear regression, random forest, artificial neural network, real estate price prediction

Procedia PDF Downloads 86
277 Detection and Molecular Identification of Bacteria Forming Polyhydroxyalkanoate and Polyhydroxybutyrate Isolated from Soil in Saudi Arabia

Authors: Ali Bahkali, Rayan Yousef Booq, Mohammad Khiyami

Abstract:

Soil samples were collected from five different regions in the Kingdom of Saudi Arabia. Microbiological methods, including dilution and pour-plate techniques, were used to isolate and purify bacteria from the soil. The ability of the isolates to produce biopolymers was investigated on Petri dishes containing elements and substrate concentrations that stimulate biopolymer production. The fluorescent stains Nile red and Nile blue were used to stain bacterial cells producing biopolymers; in addition, Sudan black was used to detect biopolymers in bacterial cells. The isolates that produced biopolymers were identified based on their 16S rRNA gene sequences and their ability to grow and synthesize PHAs on mineral medium supplemented with 1% date molasses as the only carbon source under nitrogen limitation. During the study, 293 bacterial isolates were obtained and screened. In the initial survey on Petri dishes, 84 isolates showed the ability to produce biopolymers; these bacterial colonies developed a pink color due to the accumulation of biopolymers in the cells. Twenty-three isolates were able to grow on date molasses, three strains of which showed the ability to accumulate biopolymers. These strains included Bacillus sp., Ralstonia sp. and Microbacterium sp., and they were detected by Nile blue A staining with fluorescence microscopy (OLYMPUS IX 51). Among the isolated strains, Ralstonia sp. was selected after its ability to grow on date molasses in the presence of a limited nitrogen source was confirmed. The optimum conditions for biopolymer formation by the isolated strains were investigated; the best incubation duration was 2 days, the best temperature 30°C and the best pH 7-8. When date molasses concentrations of 1, 2, 3, 4 and 5% in MSM were compared, the maximum PHB production was obtained at 1% (v/v), with the best results using a 1% inoculum (OD = 1). The most effective extraction method for PHA and PHB proved to be a 0.4% sodium hypochlorite solution, recovering a polymer quantity of 98.79% of the cell dry weight. The maximum PHB production was 1.79 g/L, recorded for Ralstonia sp. after 48 h, while it was 1.40 g/L for R. eutropha ATCC 17697 after 48 h.

Keywords: bacteria forming polyhydroxyalkanoate, detection, molecular, Saudi Arabia

Procedia PDF Downloads 328
276 Evaluation of a Driver Training Intervention for People on the Autism Spectrum: A Multi-Site Randomized Control Trial

Authors: P. Vindin, R. Cordier, N. J. Wilson, H. Lee

Abstract:

Engagement in community-based activities such as education, employment, and social relationships can improve the quality of life for individuals with Autism Spectrum Disorder (ASD). Community mobility is vital to attaining independence for individuals with ASD. Learning to drive and gaining a driver’s license is a critical link to community mobility; however, for individuals with ASD, acquiring safe driving skills can be a challenging process. Issues related to anxiety, executive function, and social communication may affect driving behaviours. Driving training and education aimed at addressing barriers faced by learner drivers with ASD can help them improve their driving performance. A multi-site randomized controlled trial (RCT) was conducted to evaluate the effectiveness of an autism-specific driving training intervention for improving the on-road driving performance of learner drivers with ASD. The intervention was delivered via a training manual and an interactive website consisting of five modules covering varying driving environments, starting with a focus on off-road preparations and progressing through basic to complex driving skill mastery. Seventy-two learner drivers with ASD aged 16 to 35 were randomized using a blinded group allocation procedure into either the intervention or the control group. The intervention group received 10 driving lessons with instructors trained in the use of an autism-specific driving training protocol, whereas the control group received 10 driving lessons as usual. Learner drivers completed a pre- and post-observation drive using a standardized driving route to measure driving performance with the Driving Performance Checklist (DPC). They also completed anxiety, executive function, and social responsiveness measures. The findings showed that there were significant improvements in driving performance for both the intervention (d = 1.02) and the control group (d = 1.15). However, the differences were not significant between groups (p = 0.614) or study sites (p = 0.842). None of the potential moderator variables (anxiety, cognition, social responsiveness, and driving instructor experience) influenced driving performance. This study is an important step toward improving community mobility for individuals with ASD, showing that an autism-specific driving training intervention can improve the driving performance of learner drivers with ASD. It also highlighted the complexity of conducting a multi-site trial even when sites were matched according to geography and traffic conditions. Driving instructors also need more and clearer information on how to communicate with learner drivers with restricted verbal expression.
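
As a side note on how effect sizes like d = 1.02 and d = 1.15 are typically derived, the sketch below computes a within-group Cohen's d on pre/post driving scores and a between-group comparison of change scores; the numbers are simulated placeholders, not trial data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulated pre/post Driving Performance Checklist scores (placeholder values, 36 per arm).
pre_int = rng.normal(60, 10, 36); post_int = pre_int + rng.normal(10, 8, 36)  # intervention arm
pre_ctl = rng.normal(60, 10, 36); post_ctl = pre_ctl + rng.normal(11, 8, 36)  # control arm

def cohens_d(x, y):
    """Cohen's d using a pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(((nx - 1) * x.std(ddof=1) ** 2 + (ny - 1) * y.std(ddof=1) ** 2) / (nx + ny - 2))
    return (x.mean() - y.mean()) / pooled_sd

# Within-group effect of training (post vs. pre) for each arm.
print("d intervention:", round(cohens_d(post_int, pre_int), 2))
print("d control:     ", round(cohens_d(post_ctl, pre_ctl), 2))

# Between-group comparison of improvement (change scores), analogous to the reported p-value.
t_stat, p_value = stats.ttest_ind(post_int - pre_int, post_ctl - pre_ctl)
print("between-group p:", round(p_value, 3))
```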

Keywords: autism spectrum disorder, community mobility, driving training, transportation

Procedia PDF Downloads 112
275 Development of Antioxidant Rich Bakery Products by Applying Lysine and Maillard Reaction Products

Authors: Attila Kiss, Erzsébet Némedi, Zoltán Naár

Abstract:

Due to the rapidly growing number of conscious customers in recent years, more and more people look for products with positive physiological effects which may contribute to the preservation of their health. In response to these demands, the Food Science Research Institute of Budapest develops and introduces to the market new functional foods of guaranteed positive effect that contain bioactive agents. New, efficient technologies are also elaborated in order to preserve the maximum biological effect of the produced foods. The main objective of our work was the development of new functional biscuits fortified with physiologically beneficial ingredients. Bakery products constitute the base of the food nutrition pyramid; thus, they might be regarded as the foodstuffs consumed in the largest quantity. In addition to the well-known and certified physiological benefits of lysine as an essential amino acid, a series of antioxidant-type compounds is formed as a consequence of the occurring Maillard reaction. The progress of the evoked Maillard reaction was studied by applying diverse sugars (glucose, fructose, saccharose, isosugar) and lysine at several temperatures (120-170°C). The interval of thermal treatment was also varied (10-30 min). The composition and production technologies were tailored in order to reach the maximum possible biological benefit, that is, the highest antioxidant capacity in the biscuits. Of the examined sugar components, the extent of the Maillard-reaction-driven transformation of glucose was the most pronounced at both applied temperatures. For the precise assessment of the antioxidant activity of the products, FRAP and DPPH methods were adapted and optimised. To establish an authentic and extensive mechanism of the occurring transformations, Maillard reaction products were identified, and relevant reaction pathways were revealed. GC-MS and HPLC-MS techniques were applied for the analysis of the 60 generated MRPs and the characterisation of the actual transformation processes. Three plausible major transformation routes were suggested based on the analytical results and the deduced sequence of possible conversions between lysine and the sugars.

Keywords: Maillard-reaction, lysine, antioxidant activity, GC-MS and HPLC-MS techniques

Procedia PDF Downloads 461
274 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation

Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke

Abstract:

Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems. Difficulty in finding a robust approach for model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g. MIKE URBAN. This is partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for the automatic calibration of a hydrologic model was developed in the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters, namely initial loss, reduction factor, time of concentration and time lag, were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments in the Gold Coast. For comparison, simulation outcomes for the same three catchments obtained using the commercial modelling software MIKE URBAN were used. The graphical comparison shows strong agreement of the MIKE URBAN results with the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE) and maximum error (ME) was found to be reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff flow prediction, and therefore the associated uncertainty in predictions can be obtained, whereas MIKE URBAN just provides a point estimate. Based on the results of the analysis, the developed ABC framework appears to perform well for automatic calibration.
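
To illustrate the idea of likelihood-free calibration described here (the paper's framework was written in R; the sketch below uses Python for consistency with the other examples in this listing), here is a minimal rejection-ABC loop over the four named parameters; the runoff simulator, priors and observed hydrograph are placeholder assumptions, not the authors' time-area model or data.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_runoff(initial_loss, reduction_factor, time_of_concentration, time_lag, rain):
    """Placeholder rainfall-runoff model (not the paper's time-area model):
    subtract an initial loss, scale by a runoff coefficient, then smooth and delay."""
    effective = np.clip(rain - initial_loss, 0, None) * reduction_factor
    kernel = np.ones(max(int(time_of_concentration), 1))
    routed = np.convolve(effective, kernel / kernel.sum())[: len(rain)]
    return np.roll(routed, int(time_lag))

# Synthetic "observed" hydrograph generated from known parameters plus noise.
rain = rng.gamma(2.0, 2.0, 120)
observed = simulate_runoff(1.5, 0.6, 5, 2, rain) + rng.normal(0, 0.05, 120)

# Rejection ABC: draw parameters from uniform priors, simulate, and keep the
# draws whose simulated hydrograph is closest to the observation (best 1%).
n_draws = 20000
draws = np.column_stack([
    rng.uniform(0, 5, n_draws),      # initial loss (mm)
    rng.uniform(0.1, 1.0, n_draws),  # reduction factor (-)
    rng.uniform(1, 15, n_draws),     # time of concentration (time steps)
    rng.uniform(0, 6, n_draws),      # time lag (time steps)
])
distances = np.array([
    np.sqrt(np.mean((simulate_runoff(*theta, rain) - observed) ** 2)) for theta in draws
])
posterior = draws[distances <= np.quantile(distances, 0.01)]

print("posterior sample size:", len(posterior))
print("posterior means:", posterior.mean(axis=0).round(2))
print("95% credible intervals:")
print(np.percentile(posterior, [2.5, 97.5], axis=0).round(2))
```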

Keywords: automatic calibration framework, approximate bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform

Procedia PDF Downloads 282
273 Transition towards a Market Society: Commodification of Public Health in India and Pakistan

Authors: Mayank Mishra

Abstract:

A market economy can be broadly defined as an economic system where supply and demand regulate the economy and in which decisions pertaining to production, consumption, allocation of resources, price and competition are made by the collective actions of individuals or organisations with limited government intervention. A market society, on the other hand, is one where, instead of the economy being embedded in social relations, social relations are embedded in the economy. A market economy becomes a market society when land, labour and capital are all commodified. This transition also affects people's attitudes and values, and begins to impact the non-material aspects of life such as public education and public health. The inception of neoliberal policies in non-market norms altered the nature of social goods like public health and raised the following questions. What impact would the transition to a market society make on people in terms of accessibility to public health? Is healthcare a commodity that can be subjected to a competitive marketplace? What kinds of private investments are being made in public health, and how do private investments alter the nature of a public good like healthcare? This research problem will employ an empirical-analytical approach that includes deductive reasoning, using the existing concepts of market economy and market society as a foundation for the analytical framework and the hypotheses to be examined. The research also intends to incorporate the naturalistic elements of qualitative methodology, which refers to studying real-world situations as they unfold. The research will analyse the existing literature available on the subject. Concomitantly, the research intends to access primary literature, which includes reports from the World Bank, the World Health Organisation (WHO) and the different departments of the respective ministries of the countries, for the analysis. This paper endeavours to highlight how the commodification of public health would lead to a perpetual increase in its inaccessibility, leading to a stratification of healthcare services in which one can avail oneself of better services depending on the extent of one's ability to pay. Since the fundamental maxim of private investment is to churn out profits, these trends would have a detrimental effect on society at large, perpetuating the lacuna between the haves and the have-nots. The increasing private investments, both domestic and foreign, in the public health sector are leading to increasing inaccessibility of public health services. Despite the increase in various public health schemes, the quality and impact of government public health services are in continuous decline.

Keywords: commodity, India and Pakistan, market society, public health

Procedia PDF Downloads 291
272 Environmental Performance Improvement of Additive Manufacturing Processes with Part Quality Point of View

Authors: Mazyar Yosofi, Olivier Kerbrat, Pascal Mognol

Abstract:

Life cycle assessment of additive manufacturing processes has evolved significantly over the past years. Many existing studies mainly focused on energy consumption. Nowadays, new methodologies for life cycle inventory acquisition have appeared in the literature and help manufacturers take into account all the input and output flows during the manufacturing step of the life cycle of products. Indeed, the environmental analysis of the phenomena that occur during the manufacturing step of additive manufacturing processes is becoming well known, and it is now possible to count and measure accurately all the inventory data during the manufacturing step. Optimization of the environmental performance of processes can therefore be considered. Environmental performance improvement can be achieved by varying process parameters. However, many of these parameters (such as manufacturing speed, the power of the energy source, or the quantity of support material) directly affect the mechanical properties, surface finish and dimensional accuracy of a functional part. This study aims to improve the environmental performance of an additive manufacturing process without deterioration of part quality. For that purpose, the authors have developed a generic method that has been applied to multiple parts made by additive manufacturing processes. First, a complete analysis of the process parameters is made in order to identify which parameters affect only the environmental performance of the process. Then, multiple parts are manufactured by varying the identified parameters. The aim of this second step is to find the optimum values of the parameters that significantly decrease the environmental impact of the process while keeping the part quality as desired. Finally, a comparison is made between parts made with the initial parameters and parts made with the changed parameters. The major finding claimed by the authors in this study is a reduction of the environmental impact of an additive manufacturing process while respecting three part quality criteria: mechanical properties, dimensional accuracy and surface roughness. Now that additive manufacturing processes can be seen as mature from a technical point of view, environmental improvement of these processes can be considered while respecting part properties. The first part of this study presents the methodology applied to multiple academic parts; then, the validity of the methodology is demonstrated on functional parts.
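
The two-step logic described here (vary the environmentally relevant parameters, then keep only the settings that still meet the quality criteria) can be pictured as a small constrained search; the parameter names, impact model and quality thresholds below are invented placeholders, not the authors' measured data.

```python
from itertools import product

# Candidate process settings (hypothetical ranges, not measured values).
speeds = [30, 40, 50, 60]            # manufacturing speed, mm/s
powers = [150, 175, 200]             # energy source power, W
support_ratios = [0.05, 0.10, 0.15]  # fraction of part volume used as support material

def environmental_impact(speed, power, support):
    """Toy impact score: slower builds and more support mean more energy and material."""
    build_time = 1000 / speed
    return build_time * power * 1e-3 + 50 * support

def part_quality(speed, power, support):
    """Toy quality model returning (tensile strength MPa, dimensional error mm, roughness um)."""
    strength = 40 + 0.1 * power - 0.2 * speed
    dim_error = 0.05 + 0.002 * speed - 0.1 * support
    roughness = 12 - 0.02 * power + 0.1 * speed
    return strength, dim_error, roughness

# Quality thresholds the optimized settings must still satisfy (assumed values).
MIN_STRENGTH, MAX_DIM_ERROR, MAX_ROUGHNESS = 45.0, 0.15, 14.0

feasible = []
for speed, power, support in product(speeds, powers, support_ratios):
    strength, dim_error, roughness = part_quality(speed, power, support)
    if strength >= MIN_STRENGTH and dim_error <= MAX_DIM_ERROR and roughness <= MAX_ROUGHNESS:
        feasible.append(((speed, power, support), environmental_impact(speed, power, support)))

best_setting, best_impact = min(feasible, key=lambda item: item[1])
print("lowest-impact setting meeting all quality criteria:", best_setting, round(best_impact, 2))
```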

Keywords: additive manufacturing, environmental impact, environmental improvement, mechanical properties

Procedia PDF Downloads 269
271 Geometric, Energetic and Topological Analysis of (Ethanol)₉-Water Heterodecamers

Authors: Jennifer Cuellar, Angie L. Parada, Kevin N. S. Chacon, Sol M. Mejia

Abstract:

The purification of bio-ethanol through distillation methods is an unresolved issue in the biofuel industry because of the formation of the ethanol-water azeotrope, which increases the number of steps of the purification process and subsequently increases production costs. Therefore, understanding the nature of the mixture at the molecular level could provide new insights for improving the current methods and/or designing new and more efficient purification methods. For that reason, the present study focuses on the evaluation and analysis of (ethanol)₉-water heterodecamers, as the systems with the minimum molecular proportion that represents the azeotropic concentration (96% m/m in ethanol). The computational modelling was carried out with B3LYP-D3/6-311++G(d,p) in Gaussian 09. Initial explorations of the potential energy surface were done through two methods, simulated annealing runs and molecular dynamics trajectories, in addition to intuitive structures obtained from smaller (ethanol)n-water heteroclusters, n = 7, 8 and 9. The energetic ordering of the seven stable heterodecamers identifies the most stable heterodecamer (Hdec-1) as a structure forming a bicyclic geometry of O-H---O hydrogen bonds (HBs) in which the water acts as a double proton donor molecule. Hdec-1 combines 1 water molecule and the same quantity of every ethanol conformer, i.e., 3 trans, 3 gauche 1 and 3 gauche 2; its abundance is 89%, and its decamerization energy is -80.4 kcal/mol, i.e. 13 kcal/mol more stable than the least stable heterodecamer. In addition, as a way to understand why methanol does not form an azeotropic mixture with water, analogous systems ((ethanol)₁₀, (methanol)₁₀, and (methanol)₉-water) were optimized. Topological analysis of the electron density reveals that Hdec-1 forms 33 weak interactions in total: 11 O-H---O, 8 C-H---O, 2 C-H---C hydrogen bonds and 12 H---H interactions. The strength and abundance of the most unconventional interactions (H---H, C-H---O and C-H---C) seem to explain the preference of ethanol for forming heteroclusters instead of clusters. Besides, the O-H---O HBs present a significant covalent character according to topological parameters such as the Laplacian of the electron density and the ratio between the potential and kinetic energy densities evaluated at the bond critical points, which give negative values and values between 1 and 2 for these two topological parameters, respectively.

Keywords: ADMP, DFT, ethanol-water azeotrope, Grimme dispersion correction, simulated annealing, weak interactions

Procedia PDF Downloads 91
270 Synthesis of High-Pressure Performance Adsorbent from Coconut Shells Polyetheretherketone for Methane Adsorption

Authors: Umar Hayatu Sidik

Abstract:

The use of liquid petroleum-based fuels (petrol and diesel) for transportation causes emissions of greenhouse gases (GHGs), while natural gas (NG) reduces these emissions. At present, compression and liquefaction are the most mature technologies used for transportation systems. For transportation use, compression requires high pressure (200-300 bar), while liquefaction is impractical. A relatively low pressure of 30-40 bar is achievable with adsorbed natural gas (ANG) to store natural gas at a capacity approaching that of compressed natural gas (CNG). In this study, adsorbents for the high-pressure adsorption of methane (CH₄) were prepared from coconut shells and polyetheretherketone (PEEK) using potassium hydroxide (KOH) and microwave-assisted activation. Design-Expert software version 7.1.6 was used for the optimization and prediction of the preparation conditions of the adsorbents for CH₄ adsorption. The effects of microwave power, activation time and quantity of PEEK on the adsorbents' performance toward CH₄ adsorption were investigated. The adsorbents were characterized by Fourier transform infrared spectroscopy (FTIR), thermogravimetric (TG) and derivative thermogravimetric (DTG) analysis, and scanning electron microscopy (SEM). The ideal CH₄ adsorption capacities of the adsorbents were determined using a volumetric method at pressures of 5, 17, and 35 bar, at ambient temperature and at 5°C. Isotherm and kinetics models were used to validate the experimental results. The optimum preparation conditions were found to be 15 wt% PEEK, 3 minutes activation time and 300 W microwave power. The highest CH₄ uptake of 9.7045 mmol CH₄ adsorbed/g adsorbent was recorded by M33P15 (300 W microwave power, 3 min activation time and 15 wt% PEEK) among the sorbents at ambient temperature and 35 bar. The CH₄ equilibrium data are well correlated with the Sips, Toth, Freundlich and Langmuir isotherms; the Sips isotherm gave the best fit, while the kinetics studies revealed that the pseudo-second-order kinetic model best describes the adsorption process. In all scenarios studied, a decrease in temperature led to an increase in adsorption. The adsorbent (M33P15) maintained its stability even after seven adsorption/desorption cycles. The findings reveal the potential of coconut shell-PEEK as a CH₄ adsorbent.
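
As an illustration of how equilibrium data are compared against the isotherm models named here, the sketch below fits the Langmuir and Sips equations to a few synthetic uptake points with scipy; the data points and starting guesses are placeholders, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(p, q_max, b):
    """Langmuir isotherm: q = q_max * b * p / (1 + b * p)."""
    return q_max * b * p / (1 + b * p)

def sips(p, q_max, b, n):
    """Sips isotherm: q = q_max * (b * p)**n / (1 + (b * p)**n)."""
    return q_max * (b * p) ** n / (1 + (b * p) ** n)

# Synthetic equilibrium data: pressure (bar) vs. uptake (mmol/g); placeholder values
# loosely shaped like a type-I isotherm, not the study's measurements.
pressure = np.array([1.0, 5.0, 10.0, 17.0, 25.0, 35.0])
uptake = np.array([1.1, 3.9, 6.0, 7.8, 8.9, 9.7])

for name, model, p0 in [("Langmuir", langmuir, (10.0, 0.1)),
                        ("Sips", sips, (10.0, 0.1, 1.0))]:
    params, _ = curve_fit(model, pressure, uptake, p0=p0, maxfev=10000)
    residuals = uptake - model(pressure, *params)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((uptake - uptake.mean()) ** 2)
    print(f"{name:8s} params={np.round(params, 3)}  R2={1 - ss_res / ss_tot:.4f}")
```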

Keywords: adsorption, desorption, activated carbon, coconut shells, polyetheretherketone

Procedia PDF Downloads 54
269 Assessing the Competitiveness of Green Charcoal Energy as an Alternative Source of Cooking Fuel in Uganda

Authors: Judith Awacorach, Quentin Gausset

Abstract:

Wood charcoal and firewood are the primary sources of cooking energy in most Sub-Saharan African countries, including Uganda. This leads to unsustainable forest use and to rapid deforestation. Green charcoal (made out of agricultural residues that are carbonized, reduced to char powder, and bound into briquettes using a binder such as sugar molasses, cassava flour or clay) is a promising and sustainable alternative to wood charcoal and firewood. It is considered a renewable energy because the carbon emissions released by the combustion of green charcoal are immediately captured again in the next agricultural cycle. If practiced on a large scale, this has the potential to replace wood charcoal and stop deforestation. However, the uptake of green charcoal for cooking remains low in Uganda despite the introduction of the technology 15 years ago. The present paper reviews the barriers to the production and commercialization of green charcoal. The paper is based on the study of 13 production sites, recording the raw materials used, the production techniques, the quantity produced, the frequency of production, and the business model. Observations were made at each site, and interviews were conducted with the managers of the facilities and with one or two employees in the larger facilities. We also interviewed project administrators from four funding agencies interested in financing green charcoal production. The results of our research identify the main barriers as follows: 1) The price of green charcoal is not competitive (it is more labor- and capital-intensive than wood charcoal). 2) There is a problem with quality control and labeling (one finds a wide variety of green charcoal with very different performances). 3) The carbonization of agricultural crop residues is a major bottleneck in green charcoal production; most briquettes are produced with wood charcoal dust or powder, which is a by-product of wood charcoal, so they increase the efficiency of wood charcoal use but do not yet replace it. 4) There is almost no marketing chain for the product (most green charcoal is sold directly from producer to consumer without any middleman). 5) The financing institutions are reluctant to lend money for this kind of activity. 6) Storage can be challenging since briquettes can disintegrate due to moisture. In conclusion, a number of important barriers need to be overcome before green charcoal can become a serious alternative to wood charcoal.

Keywords: briquettes, competitiveness, deforestation, green charcoal, renewable energy

Procedia PDF Downloads 28
268 Food Foam Characterization: Rheology, Texture and Microstructure Studies

Authors: Rutuja Upadhyay, Anurag Mehra

Abstract:

Solid food foams/cellular foods are colloidal systems which impart structure, texture and mouthfeel to many food products such as bread, cakes, ice cream, meringues, etc. Their heterogeneous morphology makes the quantification of structure/mechanical relationships complex. The porous structure of solid food foams is highly influenced by the processing conditions, the ingredient composition, and their interactions. Sensory perception of food foams depends on bubble size, shape, orientation, quantity and distribution, which determine the texture of foamed foods. The state and structure of the solid matrix control the deformation behavior of the food, such as elasticity/plasticity or fracture, which in turn has an effect on the force-deformation curves. The obvious step in obtaining the relationship between the mechanical properties and the porous structure is to quantify them simultaneously. Here, we study food foams such as bread dough, baked bread and steamed rice cakes to determine the link between the ingredients and the corresponding effect of each of them on the rheology, microstructure, bubble size and texture of the final product. Dynamic rheometry (small-amplitude oscillatory shear, SAOS), confocal laser scanning microscopy, flatbed scanning, image analysis and texture profile analysis (TPA) have been used to characterize the foods studied. In all the above systems, there was a common observation: when the mean bubble diameter is smaller, the product becomes harder, as evidenced by an increase in the storage and loss moduli (G′, G″), whereas when the mean bubble diameter is larger, the product is softer, with a decrease in the moduli values (G′, G″). The bubble size distribution also affects the texture of foods. It was found that bread doughs with hydrocolloids (xanthan gum, alginate) have a more uniform bubble size distribution. Bread baking experiments were done to study the rheological changes and mechanisms involved in the structural transition of dough to crumb. Steamed rice cakes with xanthan gum (XG) added at 0.1% concentration showed lower hardness with a narrower pore size distribution and a larger mean pore diameter. Thus, control of bubble size could be an important parameter defining final food texture.

Keywords: food foams, rheology, microstructure, texture

Procedia PDF Downloads 317
267 Farmers Perception in Pesticide Usage in Curry Leaf (Murraya koenigii (L.))

Authors: Swarupa Shashi Senivarapu Vemuri

Abstract:

Curry leaf (Murraya koenigii (L.)) exported from India has had insecticide residues above maximum residue limits, which are hazardous to consumer health and caused rejection of the commodity at the point of entry in Europe and the Middle East, resulting in a check on curry leaf exports. Hence, to study current pesticide usage patterns in major curry leaf growing areas, a survey on pesticide use patterns was carried out in curry leaf growing areas of Guntur district of Andhra Pradesh during 2014-15 by interviewing farmers growing curry leaf, using a questionnaire to assess their knowledge and practices on crop cultivation and their general awareness of pesticide recommendations and use. Education levels of the farmers were low: 13.96 per cent were educated only up to high school, and 13.96% were illiterate. 18.60% of farmers were found cultivating curry leaf on less than 1 acre of land, 32.56% on 2-5 acres, 20.93% on 5-10 acres and 27.91% on more than 10 acres. The majority of the curry leaf farmers (93.03%) used pesticide mixtures rather than applying a single pesticide at a time, basically to save time, labour and money and to combat two or more pests with a single spray. About 53.48% of farmers applied pesticides at 2-day intervals, followed by 34.89% at 4-day intervals, and about 11.63% sprayed at weekly intervals. Only 27.91% of farmers thought that the quantity of pesticides used on their farm was adequate, and 90.69% of farmers had the perception that pesticides are helpful in getting good returns. 83.72% of farmers felt that crop change is the only way to control sucking pests, which damage the whole crop. About 4.65% of the curry leaf farmers opined that integrated pest management practices are an alternative to pesticides, and only 11.63% of farmers regarded natural control as an alternative to pesticides. About 65.12% of farmers had the perception that a high pesticide dose will give higher yields. However, in general, curry leaf farmers preferred to contact pesticide dealers (100%) and were not interested in contacting either an agricultural officer or a scientist. Farmers were aware of the endosulfan ban (93.04%); in contrast, only 65.12 per cent of farmers knew about the ban of monocrotophos on vegetables. Very few farmers knew about pesticide residues and decontamination by washing. Extension educational interventions are necessary to produce fresh curry leaf free from pesticide residues.

Keywords: Curry leaf, decontamination, endosulfan, leaf roller, psyllids, tetranychid mite

Procedia PDF Downloads 314
266 Estimation of Biomedical Waste Generated in a Tertiary Care Hospital in New Delhi

Authors: Priyanka Sharma, Manoj Jais, Poonam Gupta, Suraiya K. Ansari, Ravinder Kaur

Abstract:

Introduction: As much as health care is necessary for the population, so is the management of the biomedical waste produced. Biomedical waste is a broad term used for the waste material produced during the diagnosis, treatment or immunization of human beings and animals, in research, or in the production or testing of biological products. Biomedical waste management is a chain of processes from the point of generation of biomedical waste to its final disposal in the correct and proper way assigned for that particular type of waste. Any deviation from these processes leads to improper disposal of biomedical waste, which is itself a major health hazard. Proper segregation of biomedical waste is the key to biomedical waste management. Improper disposal of BMW can cause sharps injuries, which may lead to HIV, hepatitis B virus and hepatitis C virus infections. Therefore, proper disposal of BMW is of utmost importance. Health care establishments segregate biomedical waste and dispose of it as per the biomedical waste management rules in India. Objectives: This study was done to observe the current trends in biomedical waste generation in a tertiary care hospital in Delhi. Methodology: Biomedical waste management rounds were conducted in the hospital wards. Relevant details were collected and analysed, and the sites with maximum biomedical waste generation were identified. All the data were cross-checked with the common collection site. Results: The total amount of waste generated in the hospital from January 2014 till December 2014 was 639,547 kg, of which 70.5% was general (non-hazardous) waste and the remaining 29.5% was BMW, consisting of highly infectious waste (12.2%), disposable plastic waste (16.3%) and sharps (1%). The sites producing the maximum quantity of biomedical waste were the Obstetrics and Gynaecology wards, with 45.8% of the total biomedical waste production, followed by the Paediatrics, Surgery and Medicine wards with 21.2%, 4.6% and 4.3%, respectively. The maximum average biomedical waste generation was in the Obstetrics and Gynaecology ward with 0.7 kg/bed/day, followed by the Paediatrics, Surgery and Medicine wards with 0.29, 0.28 and 0.18 kg/bed/day, respectively. Conclusions: Hospitals should pay attention to the sites which produce a large amount of BMW to avoid improper segregation of biomedical waste. Induction and refresher training programmes on biomedical waste management should also be conducted to avoid improper management of biomedical waste, and healthcare workers should be made aware of the risks of poor biomedical waste management.

Keywords: biomedical waste, biomedical waste management, hospital-tertiary care, New Delhi

Procedia PDF Downloads 227
265 Quantum Coherence Sets the Quantum Speed Limit for Mixed States

Authors: Debasis Mondal, Chandan Datta, S. K. Sazim

Abstract:

Quantum coherence is a key resource, like entanglement and discord, in quantum information theory. The Wigner-Yanase skew information, which was shown to be the quantum part of the uncertainty, has recently been proposed as an observable measure of quantum coherence. On the other hand, the quantum speed limit (QSL) has been established as an important notion for developing ultra-fast quantum computers and communication channels. Here, we show that these two quantities are related, and thus cast coherence as a resource to control the speed of quantum communication. In this work, we address three basic and fundamental questions. There have been rigorous attempts to achieve more and tighter evolution-time bounds and to generalize them for mixed states. However, we are yet to know: (i) What is the ultimate limit of quantum speed? (ii) Can we measure this speed of quantum evolution in interferometry by measuring a physically realizable quantity? Most of the bounds in the literature are either not measurable in interference experiments or not tight enough; as a result, they cannot be effectively used in experiments on quantum metrology, quantum thermodynamics, and quantum communication, and especially in Unruh effect detection et cetera, where a small fluctuation in a parameter needs to be detected. Therefore, a search for the tightest yet experimentally realisable bound is a need of the hour. It would be much more interesting if one could relate various properties of the states or operations, such as coherence, asymmetry, dimension, quantum correlations et cetera, with the QSL. Although such an understanding may help us to control and manipulate the speed of communication, apart from particular cases like the Josephson junction and the multipartite scenario, there has been little advancement in this direction. Therefore, the third question we ask is: (iii) Can we relate such quantities with the QSL? In this paper, we address these fundamental questions and show that quantum coherence or asymmetry plays an important role in setting the QSL. An important question in the study of the quantum speed limit is how it behaves under classical mixing and partial elimination of states, since this may help us to choose a state or evolution operator properly so as to control the speed limit. In this paper, we address this question and show that the product of the time bound of the evolution and the quantum part of the uncertainty in energy (the quantum coherence or asymmetry of the state with respect to the evolution operator) decreases under classical mixing and partial elimination of states.
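
For reference, two standard expressions underlying this discussion are reproduced below: the Wigner-Yanase skew information, which the abstract identifies with the quantum part of the energy uncertainty, and the textbook Mandelstam-Tamm speed limit for pure states, in which the energy variance plays the role that the mixed-state bounds discussed here seek to refine. These are generic forms from the literature, not the paper's own derived bound.

```latex
% Wigner-Yanase skew information of a state \rho with respect to the Hamiltonian H
I(\rho, H) = -\tfrac{1}{2}\,\mathrm{Tr}\!\left( \left[ \sqrt{\rho},\, H \right]^{2} \right)

% Mandelstam-Tamm bound: minimum time for a pure state to evolve to an orthogonal state,
% where \Delta E = \sqrt{\langle H^{2} \rangle - \langle H \rangle^{2}} is the energy uncertainty
\tau_{\perp} \;\geq\; \frac{\pi \hbar}{2\, \Delta E}
```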

Keywords: completely positive trace preserving maps, quantum coherence, quantum speed limit, Wigner-Yanase Skew information

Procedia PDF Downloads 332
264 Research on Quality Assurance in African Higher Education: A Bibliometric Mapping from 1999 to 2019

Authors: Luís M. João, Patrício Langa

Abstract:

The article reviews the literature on quality assurance (QA) in African higher education studies (HES) through a bibliometric mapping of papers published between 1999 and 2019. Specifically, the article highlights the nuances of knowledge production in four scientific databases: Scopus, Web of Science (WoS), African Journals Online (AJOL), and Google Scholar. The analysis included 531 papers, of which 127 are from Scopus, 30 from Web of Science, 85 from African Journals Online, and 259 from Google Scholar. In essence, these papers were written by 284 authors from 231 institutions in 69 different countries (54 in Africa and 15 outside Africa). The results map the existing knowledge, allowing readers to understand the growth and development of the field over the two-decade period, identify key contributors, and observe potential trends or gaps in the research. The paper employs bibliometric mapping as its primary analytical lens. By utilizing this method, the study quantitatively assesses the publications related to QA in African HES, helping to identify patterns, collaboration networks, and disparities in research output. The bibliometric approach allows for a systematic and objective analysis of large datasets, offering a comprehensive view of knowledge production in the field. Furthermore, the study highlights the lack of shared resources available to enhance quality in higher education institutions (HEIs) in Africa. This finding underscores the importance of promoting collaborative research efforts, knowledge exchange, and capacity building within the region to improve the overall quality of higher education. The paper argues that, despite the growing quantity of QA research in African higher education, there are challenges related to citation impact and access to high-impact publication avenues for African researchers. It emphasises the need to promote collaborative research and resource-sharing to enhance the quality of HEIs in Africa. The analytical lens of bibliometric mapping and the examination of the key publication players contribute to a comprehensive understanding of the field and its implications for African higher education.

Keywords: Africa, bibliometric research, higher education studies, quality assurance, scientific database, systematic review

Procedia PDF Downloads 29
263 Sustainability and Awareness with Natural Dyes in Textile

Authors: Recep Karadag

Abstract:

Natural dyeing of textile materials dates back to prehistoric times and remained the dominant practice until the beginning of the 20th century. At the end of the 19th century, the first synthetic dyes were synthesized. Despite rapid advances in the synthetic dyestuff industry and in dyeing technologies and methods, natural dye processes saw little development, so natural dyeing could not compete with synthetic dyes. It was also very difficult to dye large quantities of coloured textiles with natural dyes and to obtain reproducible results using classical and traditional processes. Natural dyeing has, however, continued to be used on a small scale in textile handicrafts. Reintroducing natural dyes to create awareness in textiles has therefore become an important objective in recent years. Natural dyes have many awareness and sustainability properties: they are more eco-friendly than synthetic dyes, and many of them have antioxidant, antibacterial, antimicrobial, antifungal, and anti-UV properties. It was long believed that only a limited number of colours could be obtained with natural dyes; on the contrary, their colour scale is very wide, and apart from fluorescent colours, numerous colours can be obtained. The fastness properties of naturally dyed textiles (light, washing, rubbing, etc.) are good and can be improved further depending on the dyeing process. Thanks to these properties, mass production with natural dyes is possible in textiles, and a fabric dyeing machine was therefore designed. This machine is well suited to natural dyeing and mass production, and any dyeing machine can be modified for natural dyeing. Whereas dye extraction and dyeing are carried out separately in traditional natural dyeing processes, both procedures are combined in the designed machine. Colouring compounds are first extracted from the natural dye sources, and dyeing is then carried out with the extracted compounds. Because the colouring compounds are only moderately soluble in water, the new technique uses less water for extraction and dyeing, whereas traditional dyeing requires large quantities of water to dissolve the colouring compounds. The technique is therefore a very useful method for mass production with natural dyes, using less energy, less dye material, and less water than traditional natural dyeing techniques. In this work, cotton, silk, linen, and wool fabrics were dyed with several natural dye plants using the technique. According to the analyses, very good results were obtained, demonstrating the sustainability and awareness potential of natural dyes for textiles.

Keywords: antibacterial, antimicrobial, natural dyes, sustainability

Procedia PDF Downloads 499
262 Towards a Vulnerability Model Assessment of The Alexandra Jukskei Catchment in South Africa

Authors: Vhuhwavho Gadisi, Rebecca Alowo, German Nkhonjera

Abstract:

This article details an investigation of groundwater management in the Jukskei catchment of South Africa through spatial mapping of key hydrological relationships, interactions, and parameters in the catchment. The Department of Water Affairs (DWA) noted gaps in the implementation of article 16 of the South African National Water Act of 1998, including the lack of appropriate models for dealing with water quantity parameters. For this reason, this research conducted a DRASTIC-based GIS groundwater vulnerability assessment to improve the groundwater monitoring system in the Jukskei River basin of South Africa. The methodology was a mixed-methods design combining DRASTIC analysis, a questionnaire, a literature review, and observations to gather information on how to help the people who use the Jukskei River. GIS (geographical information system) mapping was carried out using the seven-parameter DRASTIC (Depth to water, Recharge, Aquifer media, Soil media, Topography, Impact of the vadose zone, hydraulic Conductivity) vulnerability methodology. In addition, the developed vulnerability map was subjected to sensitivity analysis as a validation method, including single-parameter sensitivity, map removal sensitivity, and correlation analysis of the DRASTIC parameters. The findings were that approximately 5.7% (45 km²) of the area, in the northern part of the Jukskei watershed, is highly vulnerable, and approximately 53.6% (428.8 km²) of the basin is at high risk of groundwater contamination; this area is mainly located in the central, north-eastern, and western parts of the sub-basin. The medium and low vulnerability classes cover approximately 18.1% (144.8 km²) and 21.7% (168 km²) of the catchment, respectively. The shallow groundwater of the Jukskei River thus lies in a very vulnerable area. Sensitivity analysis indicated that depth to water, recharge, aquifer media, soil media, and topography were the main factors contributing to the vulnerability assessment. The conclusion is that the final vulnerability map shows the Jukskei catchment to be highly susceptible to pollution, and protective measures are therefore needed for sustainable management of groundwater resources in the study area.
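For readers unfamiliar with the method, the sketch below shows how a DRASTIC index is computed for a single raster cell as a weighted sum of the seven rated parameters. The standard DRASTIC weights are used; the cell ratings are hypothetical example values, not figures from the study.

```python
# Illustrative sketch (not the authors' GIS workflow): the DRASTIC vulnerability
# index for one cell is the weighted sum of the seven parameter ratings.
DRASTIC_WEIGHTS = {
    "D": 5,  # Depth to water
    "R": 4,  # net Recharge
    "A": 3,  # Aquifer media
    "S": 2,  # Soil media
    "T": 1,  # Topography (slope)
    "I": 5,  # Impact of the vadose zone
    "C": 3,  # hydraulic Conductivity
}

def drastic_index(ratings: dict[str, int]) -> int:
    """Weighted sum of parameter ratings (each rating typically on a 1-10 scale)."""
    return sum(DRASTIC_WEIGHTS[p] * r for p, r in ratings.items())

# Hypothetical cell ratings; a higher index means higher vulnerability to contamination.
cell = {"D": 9, "R": 6, "A": 8, "S": 6, "T": 10, "I": 8, "C": 4}
print(drastic_index(cell))  # 167, which would fall in a 'high vulnerability' class
```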

Keywords: contamination, DRASTIC, groundwater, vulnerability, model

Procedia PDF Downloads 65
261 Redefining Success Beyond Borders: A Deep Dive into Effective Methods to Boost Morale Among Virtual Workers for Exponential Project Performance

Authors: Florence Ibeh, David Oyewmi Oyekunle, David Boohene

Abstract:

The continuous advancement of information technology has completely transformed how businesses and organizations operate on a global scale. The widespread availability of virtual communication tools enables individuals to opt for remote work. While remote employment offers various benefits, such as facilitating corporate growth and enhancing customer support, it also presents distinct challenges. Therefore, investigating the intricacies of virtual team morale is crucial for ensuring the achievement of project objectives. For this study, content analysis of pre-existing secondary data was employed to examine the phenomenon. Essential elements for improving project success within virtual teams were identified: technology adoption, a distraction-free work environment, effective leadership, trust-building, clear communication channels, well-defined task allocation, active team participation, and motivation. Furthermore, the study established a substantial correlation between morale levels and the participation and productivity of virtual team members, with higher levels of morale associated with optimal performance among virtual teams. The study determined that the key factors for enhancing project performance in virtual teams are the adoption of technology, a focused environment, effective leadership, trust, communication, well-defined tasks, collaborative teamwork, and motivation. Additionally, the study found that adapting the best strategies employed by in-office teams can counteract the diminished morale prevalent in remote teams and sustain a high level of team morale. The findings of this study are highly significant in the dynamic field of project management, as there is currently limited information on strategies that address challenges arising from external factors in virtual teams, such as ambient noise and disruptions caused by family members. The findings underscore the significance of selecting appropriate communication technologies, delineating distinct roles and responsibilities for virtual team members, and nurturing a culture of accountability and trust. Promoting seamless collaboration and instilling motivation among virtual team members are deemed highly effective in augmenting employee engagement and performance within virtual team settings.

Keywords: virtual teams, morale, project performance, distraction-free environment, technology adaptation

Procedia PDF Downloads 61
260 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques

Authors: Soheila Sadeghi

Abstract:

In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.
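A minimal sketch of such a modelling pipeline is shown below, assuming a hypothetical tabular export of the task dataset (the file name and column names are placeholders, not the study's data). It trains the regressors named in the abstract, cross-validates them, scores them with MSE and R-squared on a held-out test set, and reads feature importances from the XGBoost model.

```python
# Illustrative sketch (not the authors' pipeline): predicting cost impact of
# scope changes from task-level features. File and column names are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score
from xgboost import XGBRegressor

df = pd.read_csv("project_tasks.csv")  # hypothetical dataset export
features = ["productivity_rate", "estimated_cost", "actual_cost", "duration",
            "scope_change_magnitude", "scope_change_timing", "num_dependencies"]
X, y = df[features], df["cost_impact"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "linear": LinearRegression(),
    "ridge": Ridge(alpha=1.0),
    "tree": DecisionTreeRegressor(max_depth=6, random_state=42),
    "forest": RandomForestRegressor(n_estimators=300, random_state=42),
    "gbm": GradientBoostingRegressor(random_state=42),
    "xgb": XGBRegressor(n_estimators=300, learning_rate=0.05, random_state=42),
}

for name, model in models.items():
    # 5-fold cross-validation on the training set to check generalization
    cv_r2 = cross_val_score(model, X_train, y_train, cv=5, scoring="r2").mean()
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name}: CV R2={cv_r2:.3f}  test MSE={mean_squared_error(y_test, pred):.1f}  "
          f"test R2={r2_score(y_test, pred):.3f}")

# Feature importances from the fitted XGBoost model highlight influential predictors.
importances = sorted(zip(features, models["xgb"].feature_importances_),
                     key=lambda kv: kv[1], reverse=True)
print(importances)
```

An analogous model with a schedule-delay target would cover the schedule impact described in the abstract.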

Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes

Procedia PDF Downloads 16
259 Automatic Aggregation and Embedding of Microservices for Optimized Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservices are a software development methodology in which applications are built by composing a set of independently deployable, small, modular services. Each service runs as a separate process and is instantiated and deployed on one or more machines (we assume that different microservices are deployed onto different machines). Microservices are becoming the de facto standard for developing distributed cloud applications due to their reduced release cycles. In principle, the responsibility of a microservice can be as simple as implementing a single function, which can lead to two issues: resource fragmentation due to the virtual machine boundary, and poor communication performance between microservices. Two composition techniques can be used to optimize resource fragmentation and communication performance: aggregation and embedding of microservices. Aggregation allows the deployment of a set of microservices on the same machine using a proxy server. Aggregation helps to reduce resource fragmentation and is particularly useful when the aggregated services have similar scalability behavior. Embedding addresses communication performance by deploying on the same virtual machine those microservices that require a communication channel (localhost bandwidth is reported to be about 40 times faster than cloud vendors' local networks and offers better reliability). Embedding can also reduce dependencies on load balancer services, since the communication takes place on a single virtual machine. For example, assume that microservice A has two instances, a1 and a2, and communicates with microservice B, which also has two instances, b1 and b2. One embedding deploys a1 and b1 on machine m1, while a2 and b2 are deployed on a different machine m2. This deployment configuration allows each pair (a1-b1), (a2-b2) to communicate over the localhost interface without the need for a load balancer between microservices A and B. Aggregation and embedding techniques are complex, since different microservices might have incompatible runtime dependencies that prevent them from being installed on the same machine. There is also a security concern, since the attack surface between microservices can be larger. Luckily, container technology allows several processes to run on the same machine in an isolated manner, solving both the incompatibility of runtime dependencies and the security concern above, and thus greatly simplifying aggregation/embedding implementations: a microservice container is simply deployed on the same machine as the aggregated/embedded microservice container. Therefore, a wide variety of deployment configurations can be described by combining aggregation and embedding to create an efficient and robust microservice architecture. This paper presents a formal method that receives a declarative definition of a microservice architecture and proposes different optimized deployment configurations by aggregating/embedding microservices. The first prototype is based on i2kit, a deployment tool also submitted to ICWS 2018. The proposed prototype optimizes the following parameters: network/system performance, resource usage, resource costs, and failure tolerance.
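To make the embedding idea concrete, the sketch below is an illustrative toy, not the i2kit prototype or its formal method: it greedily co-locates the i-th replicas of two communicating microservices on the same machine, reproducing the a1-b1 / a2-b2 pairing described above. Service names and replica counts are hypothetical.

```python
# Illustrative sketch (not the paper's tool): a simple "embedding" pass that
# pairs replicas of two communicating microservices onto shared machines so
# each pair talks over localhost instead of going through a load balancer.
from dataclasses import dataclass, field
from itertools import zip_longest

@dataclass
class Service:
    name: str
    replicas: int

@dataclass
class Machine:
    name: str
    containers: list[str] = field(default_factory=list)

def embed(a: Service, b: Service) -> list[Machine]:
    """Place the i-th replica of A and the i-th replica of B on the same machine."""
    machines = []
    for i, (ra, rb) in enumerate(zip_longest(range(a.replicas), range(b.replicas)), start=1):
        m = Machine(name=f"m{i}")
        if ra is not None:
            m.containers.append(f"{a.name}{ra + 1}")
        if rb is not None:
            m.containers.append(f"{b.name}{rb + 1}")
        machines.append(m)
    return machines

# Example from the abstract: A has instances a1, a2; B has instances b1, b2.
for machine in embed(Service("a", 2), Service("b", 2)):
    print(machine.name, machine.containers)
# m1 ['a1', 'b1']
# m2 ['a2', 'b2']
```

A real deployment tool would additionally check runtime-dependency compatibility and resource capacity before co-locating containers, as discussed above.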

Keywords: aggregation, deployment, embedding, resource allocation

Procedia PDF Downloads 180
258 An Empirical Analysis on the Evolution Characteristics and Textual Content of Campus Football Policy in China

Authors: Shangjun Zou, Zhiyuan Wang, Songhui You

Abstract:

Introduction: In recent years, the Chinese government has issued several policies to promote institutional reform and innovation in the development of campus football, but many problems have been exposed in the process of policy implementation. This paper therefore conducts an empirical analysis of the campus football policy texts to reveal the dynamic development of the microsystem in the process of policy evolution.
Methods: The selected policy contents are coded by constructing a two-dimensional analysis framework of campus football policy tools versus policy objectives. Specifically, the X dimension consists of three policy-tool orientations (environment, supply, and demand), while the Y dimension is divided into six policy objectives: institution, competition, player teaching, coach training, resource guarantee, and popularization. The distribution differences of the textual analysis units on the X and Y dimensions are tested using SPSS 22.0 to evaluate the characteristics and development trend of campus football policy in the respective policy texts.
Results: 1) In the policy evolution process of campus football stepping into the 2.0 era, there were no significant differences in the frequency distribution of policy tools (p=0.582) or policy objectives (p=0.603). Collaborative governance by multiple participants has become the primary trend, and the guiding role of the Chinese Football Association has gradually become prominent. 2) There were significant differences in the distribution of policy tools before the evolution at the 95% confidence level (p=0.041). With environmental tools maintaining the dominant position throughout, the overall synergy of policy tools increased slightly. 3) There were significant differences in the distribution of policy objectives after the evolution at the 90% confidence level (p=0.069). The competition system among the policy objectives has not received enough attention, while the construction of the institution and resource guarantee systems has been strengthened.
Conclusion: The upgraded version of campus football should adhere to the education concept of health first, promote the coordinated development of youth cultural learning and football skills, and strive to achieve broader popularization, a more scientific institution, a more comprehensive resource guarantee, and adequate integration. At the same time, it is necessary to strengthen the collaborative allocation of policy tools and the reasonable planning of policy objectives so as to promote the high-quality and sustainable development of campus football in the New Era.
Endnote: The policy texts selected in this paper are the “Implementation Opinions on Accelerating the Development of Youth Campus Football” and the “Action Plans for the Construction of Eight Systems of National Youth Campus Football”, promulgated on August 13, 2015 and September 25, 2020, respectively.
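For illustration, the sketch below reproduces the type of frequency-distribution test reported in the Results with scipy rather than SPSS; the contingency counts are hypothetical placeholders, not the study's coded data.

```python
# Illustrative sketch (not the authors' SPSS analysis): chi-square test of
# whether coded analysis units are distributed differently across the three
# policy-tool orientations in the two policy documents. Counts are hypothetical.
from scipy.stats import chi2_contingency

#              environment  supply  demand
counts_2015 = [         62,     35,     18]  # hypothetical coding counts, 2015 text
counts_2020 = [         70,     41,     25]  # hypothetical coding counts, 2020 text

chi2, p, dof, expected = chi2_contingency([counts_2015, counts_2020])
print(f"chi2={chi2:.3f}, dof={dof}, p={p:.3f}")
# A p-value above the chosen significance level (e.g. 0.05) would indicate no
# significant difference in the distribution of policy tools across the two texts.
```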

Keywords: campus football, content analysis, evolution characteristics, policy objective, policy tool

Procedia PDF Downloads 174
257 Urban Waste Management for Health and Well-Being in Lagos, Nigeria

Authors: Bolawole F. Ogunbodede, Mokolade Johnson, Adetunji Adejumo

Abstract:

A high population growth rate, reactive infrastructure provision, and the inability of physical planning to cope with the pace of development are responsible for the wastewater crisis in the Lagos metropolis. The septic tank is still the most prevalent wastewater holding system; unfortunately, there is a dearth of septage treatment infrastructure. Public wastewater treatment statistics relative to the 23 million people in Lagos State are worrisome: 1.85 billion cubic meters of wastewater are generated on a daily basis, and only 5% of the 26 million population is connected to a public sewerage system. This is compounded by inadequate budgetary allocation and erratic power supply over the last two decades. This paper explored a community-participatory wastewater management alternative in the Oworonshoki municipality of Lagos. The study is underpinned by decentralized wastewater management systems in built-up areas. The initiative addresses the five steps of the wastewater chain (generation, storage, collection, processing, and disposal) through participatory decision-making in two Oworonshoki Community Development Association (CDA) areas. Drone-assisted mapping highlighted building footprints. Structured interviews and focus group discussions with landlord associations in the CDA areas provided a collaborative platform for decision-making. Water stagnation in primary open drainage channels and in the natural retention ponds of the fringing wetlands is traceable to the frequent climate-change-induced tidal influences of recent decades. A rise in the water table, resulting in septic-tank leakage and water pollution, is reported to be responsible for the increase in waterborne illnesses documented in primary health centers, in addition to the unhealthy dumping of solid wastes in the drainage channels. Uncontrolled disposal renders surface waters and underground water systems unsafe for human and recreational use, destroys biotic life, and poisons the fragile sand barrier-lagoon urban ecosystems. A clustered decentralized system was conceptualized to serve 255 households, and stakeholders agreed on a public-private partnership initiative for efficient wastewater service delivery.

Keywords: health, infrastructure, management, septage, well-being

Procedia PDF Downloads 152
256 Selection and Identification of Some Spontaneous Plant Species Having the Ability to Grow Naturally on Crude Oil Contaminated Soil for a Possible Approach to Decontaminate and Rehabilitate an Industrial Area

Authors: Salima Agoun-Bahar, Ouzna Abrous-Belbachir, Souad Amelal

Abstract:

Industrial areas generally contain heavy metals; thus, negative consequences can appear in the medium and long term for the fauna and flora, but also for the food chain, of which man constitutes the final link. The SONATRACH company, aware of the importance of environmental protection, has set up a rehabilitation program for polluted sites in order to avoid major ecological disasters and to find both curative and preventive solutions. The aim of this work is to study the industrial pollution located around a crude oil storage tank at the Sidi R'cine refinery in Algiers and to select the plants that accumulate the most heavy metals for possible use in phytotechnology. Whole plants with their soil clods were sampled around the pollution source at a depth of twenty centimeters and then transported to the laboratory for identification. The heavy metals lead, zinc, copper, and nickel were quantified by flame atomic absorption spectrophotometry in the soil and in the aerial and underground parts of the plants. Ten plant species were recorded at the polluted site: three belonging to the grass family, with a dominance percentage higher than 50%; three belonging to the Compositae family, representing 12%; and one species from each of the families Linaceae, Plantaginaceae, Papilionaceae, and Boraginaceae. Koeleria phleoïdes L. and Avena sterilis L. of the grass family appear to be the dominant plants, although they are quite far from the pollution source. Lead pollution of the soils is the most pronounced at all stations, with values varying from 237.5 to 2682.5 µg.g⁻¹. Other peaks are observed for zinc (1177 µg.g⁻¹) and copper (635 µg.g⁻¹) at station 8 and for nickel (1800 µg.g⁻¹) at station 10. Among the inventoried plants, some species accumulate significant amounts of metals: Trifolium sp. and K. phleoides for lead and zinc, P. lanceolata and G. tomentosa for nickel, and A. clavatus for zinc. K. phleoides is a particularly interesting species because it accumulates an important quantity of heavy metals, especially in its aerial part, which makes it well suited to the phytoextraction technique, in which pollutants are recovered by the simple removal of shoots.

Keywords: heavy metals, industrial pollution, phytotechnology, rehabilitation

Procedia PDF Downloads 47
255 Technology Road Mapping in the Fourth Industrial Revolution: A Comprehensive Analysis and Strategic Framework

Authors: Abdul Rahman Hamdan

Abstract:

The Fourth Industrial Revolution (4IR) has brought unprecedented technological advancements that have disrupted many industries worldwide. To keep up with this rapid disruption, technology road mapping has emerged as a critical tool for organizations to leverage. It can guide companies to become more adaptable, to anticipate future transformation and innovation, and to avoid becoming redundant or irrelevant amid rapid technological change. This research paper provides a comprehensive analysis of technology road mapping within the context of the 4IR. Its objective is to provide companies with practical insights and a strategic framework of technology road mapping with which to navigate the fast-changing nature of the 4IR. The study thereby contributes to the understanding and practice of technology road mapping in the 4IR, giving organizations the tools and critical insight needed to navigate the 4IR transformation. Based on a literature review and case studies, the study analyses key principles, methodologies, and best practices in technology road mapping and integrates them with the unique characteristics and challenges of the 4IR. The paper gives the background of the Fourth Industrial Revolution, explores the disruptive potential of 4IR technologies, and argues the critical need for technology road mapping, consisting of strategic planning and foresight, to remain competitive and relevant in the 4IR era. It also highlights the importance of technology road mapping as a proactive approach for aligning an organisation's objectives and resources with its technology and product development in the fast-evolving 4IR technological landscape. The paper covers the theoretical foundations of technology road mapping, examines various methodological approaches, and identifies stakeholders in the process, such as external experts, collaborative platforms, and cross-functional teams, to ensure an integrated and robust technology roadmap for the organisation. Moreover, the study presents a comprehensive framework for technology road mapping in the 4IR, incorporating key elements and processes such as technology assessment, competitive intelligence, risk analysis, and resource allocation. It provides a framework for implementing technology road mapping from strategic planning, goal setting, and technology scanning to roadmap visualisation, implementation planning, monitoring, and evaluation. In addition, the study addresses the challenges and limitations of technology road mapping in the 4IR, including a gap analysis. The study concludes by proposing a set of practical recommendations for organizations that intend to leverage technology road mapping as a strategic tool in the 4IR to drive innovation and remain competitive in current and future ecosystems.

Keywords: technology management, technology road mapping, technology transfer, technology planning

Procedia PDF Downloads 50