Search results for: stochastic production function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12027

1527 Advancing Environmental Remediation Through the Production of Functional Porous Materials from Phosphorite Residue Tailings

Authors: Ali Mohammed Yimer, Ayalew Assen, Youssef Belmabkhout

Abstract:

Environmental remediation is a pressing global concern, necessitating innovative strategies to address the challenges posed by industrial waste and pollution. This study aims to advance environmental remediation by developing cutting-edge functional porous materials from phosphorite residue tailings. Phosphorite mining activities generate vast amounts of waste, which pose significant environmental risks due to their contaminants. The proposed approach involved transforming these phosphorite residue tailings into valuable porous materials through a series of physico-chemical processes, including milling, acid-base leaching, templating, and forming. The key components of the tailings were extracted and processed to produce porous frameworks with high surface area and porosity. These materials were engineered to possess specific properties suitable for environmental remediation applications, such as enhanced adsorption capacity and selectivity for target contaminants. The synthesized porous materials were thoroughly characterized using advanced analytical techniques (XRD, SEM-EDX, N2 sorption, TGA, FTIR) to assess their structural, morphological, and chemical properties. The performance of the materials in removing various pollutants, including heavy metals and organic compounds, was evaluated through batch adsorption experiments. Additionally, the potential for material regeneration and reusability was investigated to enhance the sustainability of the proposed remediation approach. The outcome of this research holds significant promise for addressing the environmental challenges associated with phosphorite residue tailings. By valorizing these waste materials into porous materials with exceptional remediation capabilities, this study contributes to the development of sustainable and cost-effective solutions for environmental cleanup. Furthermore, the utilization of phosphorite residue tailings in this manner offers a potential avenue for the remediation of other contaminated sites, thereby fostering a circular economy approach to waste management.

Keywords: functional porous materials, phosphorite residue tailings, adsorption, environmental remediation, sustainable solutions

Procedia PDF Downloads 41
1526 Application of Principal Component Analysis and Ordered Logit Model in Diabetic Kidney Disease Progression in People with Type 2 Diabetes

Authors: Mequanent Wale Mekonen, Edoardo Otranto, Angela Alibrandi

Abstract:

Diabetic kidney disease is one of the main microvascular complications caused by diabetes. Several clinical and biochemical variables are reported to be associated with diabetic kidney disease in people with type 2 diabetes. However, their interrelations could distort the effect estimation of these variables on the disease's progression. The objective of the study is to determine, through advanced statistical methods, how the biochemical and clinical variables in people with type 2 diabetes are interrelated with each other and what their effects are on kidney disease progression. First, principal component analysis was used to explore how the biochemical and clinical variables intercorrelate, which helped us reduce a set of correlated biochemical variables to a smaller number of uncorrelated variables. Then, ordered logit regression models (cumulative, stage, and adjacent) were employed to assess the effect of biochemical and clinical variables on the ordinal response variable (progression of kidney function), considering the proportionality assumption for more robust effect estimation. This retrospective cross-sectional study retrieved data from a type 2 diabetes cohort in a polyclinic hospital at the University of Messina, Italy. The principal component analysis yielded three uncorrelated components: principal component 1, with negative loadings of glycosylated haemoglobin, glycemia, and creatinine; principal component 2, with negative loadings of total cholesterol and low-density lipoprotein; and principal component 3, with a negative loading of high-density lipoprotein and a positive loading of triglycerides. The ordered logit models (cumulative, stage, and adjacent) showed that the first component (glycosylated haemoglobin, glycemia, and creatinine) had a significant effect on the progression of kidney disease. For instance, the cumulative odds model indicated that the first principal component (a linear combination of glycosylated haemoglobin, glycemia, and creatinine) had a strong and significant effect on the progression of kidney disease, with an odds ratio of 0.423 (p < 0.001). However, this effect was inconsistent across levels of kidney disease because the first principal component did not meet the proportionality assumption. To address the proportionality problem and provide robust effect estimates, alternative ordered logit models, such as the partial cumulative odds model, the partial adjacent category model, and the partial continuation ratio model, were used. These models suggested that clinical variables such as age, sex, body mass index, and medication (metformin), and biochemical variables such as glycosylated haemoglobin, glycemia, and creatinine, have a significant effect on the progression of kidney disease.
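
For readers who want to reproduce the two-step workflow sketched above (PCA to decorrelate the biochemical panel, then a cumulative-odds model on the ordinal stage), a minimal Python illustration follows; the DataFrame layout and all column names are hypothetical, not the study's actual variables:

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from statsmodels.miscmodels.ordinal_model import OrderedModel

BIOCHEM = ["hba1c", "glycemia", "creatinine", "total_chol",
           "ldl", "hdl", "triglycerides"]              # hypothetical column names

def fit_ordered_pca(df: pd.DataFrame):
    """PCA on the biochemical panel, then a cumulative-odds model on `stage`."""
    X = StandardScaler().fit_transform(df[BIOCHEM])    # PCA needs comparable scales
    scores = PCA(n_components=3).fit_transform(X)      # three uncorrelated components
    exog = pd.DataFrame(scores, columns=["pc1", "pc2", "pc3"], index=df.index)
    exog[["age", "sex", "bmi"]] = df[["age", "sex", "bmi"]]   # clinical covariates
    model = OrderedModel(df["stage"], exog, distr="logit")    # ordinal response
    return model.fit(method="bfgs", disp=False)  # exponentiate coefs for odds ratios
```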

Keywords: diabetic kidney disease, ordered logit model, principal component analysis, type 2 diabetes

Procedia PDF Downloads 14
1525 Epigenetics and Archeology: A Quest to Re-Read Humanity

Authors: Salma A. Mahmoud

Abstract:

Epigenetics, or alteration in gene expression influenced by extragenetic factors, has emerged as one of the most promising areas that will address some of the gaps in our current knowledge in understanding patterns of human variation. In the last decade, research investigating epigenetic mechanisms in many fields has flourished and witnessed significant progress. It paved the way for a new era of integrated research, especially between anthropology/archeology and the life sciences. Skeletal remains are considered the most significant source of information for studying human variations across history, and by utilizing these valuable remains, we can interpret past events, cultures and populations. In addition to their archeological, historical and anthropological importance, studying bones has great implications in other fields such as medicine and science. Bones can also hold within them the secrets of the future, as they can act as predictive tools for health, societal characteristics and dietary requirements. Bones in their basic form are composed of cells (osteocytes) that are affected by both genetic and environmental factors, which can only explain a small part of their variability. The primary objective of this project is to examine the epigenetic landscape/signature within bones of archeological remains as a novel marker that could reveal new ways to conceptualize chronological events, gender differences, social status and ecological variations. We attempt here to address discrepancies in common variants such as the methylome, as well as novel epigenetic regulators such as chromatin remodelers, which to our best knowledge have not yet been investigated by anthropologists/paleoepigenetists, using a plethora of techniques (biological, computational, and statistical). Moreover, extracting epigenetic information from bones will highlight the importance of osseous material as a vector to study human beings in several contexts (social, cultural and environmental), and strengthen its essential role as a model system that can be used to investigate and reconstruct various cultural, political and economic events. We also address all steps required to plan and conduct an epigenetic analysis of bone materials (modern and ancient), as well as discussing the key challenges facing researchers aiming to investigate this field. In conclusion, this project will serve as a primer for bioarcheologists/anthropologists and human biologists interested in incorporating epigenetic data into their research programs. Understanding the roles of epigenetic mechanisms in bone structure and function will be very helpful for a better comprehension of bone biology, highlighting its essentiality as an interdisciplinary vector and a key material in archeological research.

Keywords: epigenetics, archeology, bones, chromatin, methylome

Procedia PDF Downloads 89
1524 Documenting 15th-Century Prints with RTI

Authors: Peter Fornaro, Lothar Schmitt

Abstract:

The Digital Humanities Lab and the Institute of Art History at the University of Basel are collaborating in the SNSF research project ‘Digital Materiality’. Its goal is to develop and enhance existing methods for the digital reproduction of cultural heritage objects in order to support art historical research. One part of the project focuses on the visualization of a small eye-catching group of early prints that are noteworthy for their subtle reliefs and glossy surfaces. Additionally, this group of objects – known as ‘paste prints’ – is characterized by its fragile state of preservation. Because of the brittle substances that were used for their production, most paste prints are heavily damaged and thus very hard to examine. These specific material properties make a photographic reproduction extremely difficult. To obtain better results we are working with Reflectance Transformation Imaging (RTI), a computational photographic method that is already used in archaeological and cultural heritage research. This technique allows documenting how three-dimensional surfaces respond to changing lighting situations. Our first results show that RTI can capture the material properties of paste prints and their current state of preservation more accurately than conventional photographs, although there are limitations with glossy surfaces because the mathematical models that are included in RTI are kept simple in order to keep the software robust and easy to use. To improve the method, we are currently developing tools for a more detailed analysis and simulation of the reflectance behavior. An enhanced analytical model for the representation and visualization of gloss will increase the significance of digital representations of cultural heritage objects. For collaborative efforts, we are working on a web-based viewer application for RTI images based on WebGL in order to make acquired data accessible to a broader international research community. At the ICDH Conference, we would like to present unpublished results of our work and discuss the implications of our concept for art history, computational photography and heritage science.
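
As background, the "simple" mathematical model the authors refer to is, in classical RTI, a per-pixel polynomial texture map (PTM): each pixel's luminance is fitted as a biquadratic function of the light direction. A minimal sketch of that fit is given below, assuming a stack of grayscale captures `imgs` of shape (N, H, W) and matching light directions `lights` of shape (N, 2); this is a generic illustration, not the project's code:

```python
import numpy as np

def fit_ptm(imgs: np.ndarray, lights: np.ndarray) -> np.ndarray:
    """Least-squares fit of the six-term PTM basis at every pixel."""
    lu, lv = lights[:, 0], lights[:, 1]
    A = np.stack([lu**2, lv**2, lu*lv, lu, lv, np.ones_like(lu)], axis=1)  # (N, 6)
    N, H, W = imgs.shape
    Y = imgs.reshape(N, H * W)                       # one luminance column per pixel
    coeffs, *_ = np.linalg.lstsq(A, Y, rcond=None)   # (6, H*W) coefficient planes
    return coeffs.reshape(6, H, W)

def relight(coeffs: np.ndarray, lu: float, lv: float) -> np.ndarray:
    """Synthesize the image for a new light direction from the fitted coefficients."""
    basis = np.array([lu**2, lv**2, lu*lv, lu, lv, 1.0])
    return np.tensordot(basis, coeffs, axes=1)
```

The glossiness limitation mentioned above follows from this formulation: a low-order polynomial cannot represent sharp specular highlights, which is what the project's enhanced reflectance model targets.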

Keywords: art history, computational photography, paste prints, reflectance transformation imaging

Procedia PDF Downloads 263
1523 Lipid Extraction from Microbial Cell by Electroporation Technique and Its Influence on Direct Transesterification for Biodiesel Synthesis

Authors: Abu Yousuf, Maksudur Rahman Khan, Ahasanul Karim, Amirul Islam, Minhaj Uddin Monir, Sharmin Sultana, Domenico Pirozzi

Abstract:

Traditional biodiesel feedstocks like edible or plant oils, animal fats and waste cooking oil have been replaced by microbial oil in recent biodiesel research. The well-known community of microbial oil producers includes microalgae, oleaginous yeasts and seaweeds. Conventional transesterification of microbial oil to produce biodiesel is slow, energy-consuming, cost-ineffective and environmentally unfriendly. This process follows several steps, such as microbial biomass drying, cell disruption, oil extraction, solvent recovery, oil separation and transesterification. Therefore, direct transesterification for biodiesel synthesis has been studied over the last few years. It combines all the steps in a single reactor and eliminates the steps of biomass drying, oil extraction and separation from solvent. It appears to be a faster, more cost-effective process, but a number of difficulties need to be solved to make it applicable at large scale. The main challenges are disrupting microbial cells in bulk volume and speeding up the esterification reaction, because the water content of the medium slows the reaction rate. Several methods have been proposed, but none of them is ready to be implemented at large scale. It is still a great challenge to extract maximum lipid from microbial cells (yeast, fungi, algae) while investing minimum energy. The electroporation technique results in a significant increase in cell conductivity and permeability caused by the application of an external electric field. Electroporation is used to alter the size and structure of the cells to increase their porosity, as well as to disrupt the microbial cell walls within a few seconds so that the intracellular lipid leaks out into the solution. Therefore, incorporating electroporation techniques contributes to the direct transesterification of microbial lipids by increasing the efficiency of the biodiesel production rate.

Keywords: biodiesel, electroporation, microbial lipids, transesterification

Procedia PDF Downloads 259
1522 Adaptation of the Scenario Test for Greek-speaking People with Aphasia: Reliability and Validity Study

Authors: Marina Charalambous, Phivos Phylactou, Thekla Elriz, Loukia Psychogios, Jean-Marie Annoni

Abstract:

Background: Evidence-based practices for the evaluation and treatment of people with aphasia (PWA) in Greek are mainly impairment-based. Functional and multimodal communication is usually underassessed and neglected by clinicians. This study explores the adaptation and psychometric testing of the Greek (GR) version of The Scenario Test. The Scenario Test assesses the everyday functional communication of PWA in an interactive multimodal communication setting with the support of an active communication facilitator. Aims: To define the reliability and validity of The Scenario Test GR and discuss its clinical value. Methods & Procedures: The Scenario Test-GR was administered to 54 people with chronic stroke (6+ months post-stroke): 32 PWA and 22 people with stroke without aphasia. Participants were recruited from Greece and Cyprus. All measures were performed in an interview format. Standard psychometric criteria were applied to evaluate the reliability (internal consistency, test-retest, and interrater reliability) and validity (construct and known-groups validity) of The Scenario Test GR. Video analysis was performed for the qualitative examination of the communication modes used. Outcomes & Results: The Scenario Test-GR shows high levels of reliability and validity. High scores of internal consistency (Cronbach's α = .95), test-retest reliability (ICC = .99), and interrater reliability (ICC = .99) were found. Interrater agreement on individual items fell between good and excellent levels of agreement. Correlations with a tool measuring language function in aphasia (the Aphasia Severity Rating Scale of the Boston Diagnostic Aphasia Examination), a measure of functional communication (the Communicative Effectiveness Index), and two instruments examining the psychosocial impact of aphasia (the Stroke and Aphasia Quality of Life questionnaire and the Aphasia Impact Questionnaire) revealed good convergent validity (all ps < .05). Results showed good known-groups validity (Mann-Whitney U = 96.5, p < .001), with significantly higher scores for participants without aphasia compared to those with aphasia. Conclusions: The psychometric qualities of The Scenario Test-GR support the reliability and validity of the tool for the assessment of functional communication in Greek-speaking PWA. The Scenario Test-GR can be used to assess multimodal functional communication, orient aphasia rehabilitation goal-setting towards the activity and participation level, and serve as an outcome measure of everyday communication. Future studies will focus on measuring sensitivity to change in PWA with severe non-fluent aphasia.
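
For reference, the internal-consistency figure quoted above (Cronbach's α) can be computed from a subjects-by-items score matrix as follows; this is a generic sketch, not the authors' analysis code:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (subjects x items) score array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                           # number of test items
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

The test-retest and interrater ICCs reported above would typically come from a dedicated routine such as `pingouin.intraclass_corr` rather than a hand-rolled formula.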

Keywords: the scenario test GR, functional communication assessment, people with aphasia (PWA), tool validation

Procedia PDF Downloads 108
1521 Variation in N₂ Fixation and N Contribution by 30 Groundnut (Arachis hypogaea L.) Varieties Grown in Blesbokfontein Mpumalanga Province, South Africa

Authors: Titus Y. Ngmenzuma, Cherian Mathews, Felix D. Dakora

Abstract:

In Africa, poor nutrient availability, particularly of N and P, coupled with low soil moisture due to erratic rainfall, constitutes the major crop production constraint. Although inorganic fertilizers are an option for meeting crop nutrient requirements for increased grain yield, their high cost and scarcity make them inaccessible to resource-poor farmers in Africa. Because crops grown on such nutrient-poor soils are micronutrient deficient, incorporating N₂-fixing legumes into cropping systems can sustainably improve crop yield and nutrient accumulation in the grain. In Africa, groundnut can easily form an effective symbiosis with native soil rhizobia, leading to a marked N contribution in cropping systems. In this study, field experiments were conducted at Blesbokfontein in Mpumalanga Province to assess N₂ fixation and N contribution by 30 groundnut varieties during the 2018/2019 planting season using the ¹⁵N natural abundance technique. The results revealed marked differences in shoot dry matter yield, symbiotic N contribution, soil N uptake and grain yield among the groundnut varieties. The percent N derived from fixation ranged from 37 to 44% for varieties ICGV131051 and ICGV13984. The amount of N-fixed ranged from 21 to 58 kg/ha for varieties Chinese and IS-07273, soil N uptake from 31 to 80 kg/ha for varieties IS-07947 and IS-07273, and grain yield from 193 to 393 kg/ha for varieties ICGV15033 and ICGV131096, respectively. Compared to earlier studies on groundnut in South Africa, this study has shown low N₂ fixation and N contribution to the cropping systems, possibly due to environmental factors such as low soil moisture. Because the groundnut varieties differed in their growth, symbiotic performance and grain yield, more field testing is required over a range of differing agro-ecologies to identify genotypes suitable for different cropping environments.
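
For context, the ¹⁵N natural abundance technique named above estimates the percent N derived from fixation (%Ndfa) with the standard relation sketched below; the reference plants and B value actually used are not stated in the abstract, so the symbols here are generic:

```latex
% Percent N derived from atmospheric N2 fixation (%Ndfa):
\[
\%\mathrm{Ndfa} \;=\; \frac{\delta^{15}\mathrm{N}_{\mathrm{ref}} - \delta^{15}\mathrm{N}_{\mathrm{legume}}}
                           {\delta^{15}\mathrm{N}_{\mathrm{ref}} - B} \times 100,
\qquad
\mathrm{N\ fixed} \;=\; \frac{\%\mathrm{Ndfa}}{100} \times \mathrm{shoot\ N\ content}
\]
% where delta-15N_ref is the 15N abundance of a non-fixing reference plant and
% B is the delta-15N of the legume when fully dependent on N2 fixation.
```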

Keywords: ¹⁵N natural abundance, percent N derived from fixation, amount of N-fixed, grain yield

Procedia PDF Downloads 170
1520 The Price of Knowledge in the Times of Commodification of Higher Education: A Case Study on the Changing Face of Education

Authors: Joanna Peksa, Faith Dillon-Lee

Abstract:

Current developments in the Western economies have turned some universities into corporate institutions driven by practices of production and commodity. Academia is increasingly becoming integrated into national economies as a result of students paying fees, and is consequently using business practices in student retention and engagement. With these changes, pedagogy's status as a priority within the institution has been changing in light of these new demands. New strategies have blurred the boundaries that separate a student from a client. This has led to a change in the dynamic, disrupting the traditional idea of the knowledge market and emphasizing the corporate aspect of universities. In some cases, where students are seen primarily as customers, the purpose of academia is no longer to educate but to sell a commodity and retain fee-paying students. This paper considers opposing viewpoints on the commodification of higher education, reflecting on the reality of maintaining a pedagogic grounding in an increasingly commercialized sector. By analysing a case study of the Student Success Festival, an event that involved academic and marketing teams, it considers the differences between the respective visions of the pedagogic arm of the university and the corporate one. This study argues that the initial concept of the event, based on the principles of gamification, independent learning, and cognitive criticality, was more clearly linked to a grounded pedagogic approach. However, when liaising with the marketing team at a crucial step in the creative process, it became apparent that these principles were not considered a priority within their remit. While the study acknowledges the power of pedagogy, the findings show that an accord between the different stakeholders is necessary for students to benefit fully from their learning experience. Nevertheless, where issues of power prevail and power is unevenly distributed, reaching a consensus becomes increasingly challenging, and further research should closely monitor developments in pedagogy in UK higher education.

Keywords: economic pressure, commodification, pedagogy, gamification, public service, marketization

Procedia PDF Downloads 109
1519 Human Activities Damaging the Ecosystem of Isheri Ogun River, South West Nigeria

Authors: N. B. Ikenweiwe, A. A. Alimi, N. A. Bamidele, O. A. Ewumi, K. Fasina, S. O. Otubusin

Abstract:

A study on the physical, chemical and biological parameters of the lower course of the Ogun River at Isheri-Olofin was carried out between January and December 2014 in order to determine the effects of the anthropogenic activities of the Kara abattoir and domestic waste deposition on the quality of the water. Water samples were taken twice each month at three selected stations A, B and C (chosen for their characteristic features or activity levels) along the water course. Samples were analysed for chemical and biological parameters using standard methods on the same day in the laboratory, while physical parameters were determined in situ with a water parameters kit. Generally, the results for transparency, dissolved oxygen, nitrates, TDS and alkalinity fell below the permissible limits of the WHO and FEPA standards for drinking water and fish production. Results for phosphates, lead and cadmium were also low but still within the permissible limits. Only temperature and pH were within limits. The low plankton community (phytoplankton, zooplankton), which ranged from 3.5 to 40.23, was a result of low levels of DO, transparency and phosphate. The presence of coliform bacteria of public health importance, such as Escherichia coli, Proteus vulgaris, Aeromonas sp., Shigella sp. and Enterobacter aerogenes, as well as the gram-negative bacterium Proteus morganii, is mainly an indicator of faecal pollution. Fish and other resources obtained from this water run the risk of being contaminated with these organisms, and man is at the receiving end. The results of the physical, chemical and some biological parameters of the Ogun River at Isheri, according to this study, showed that the aquatic and fisheries resources there are living under stress as a result of the deposition of bones, horns, faecal components, slurries of suspended solids, fat and blood into the water. Government should therefore establish a good monitoring system against illegal waste deposition and create education programmes that will enlighten the community about the social, ecological and economic values of the river.

Keywords: damage, ecosystem, human activities, Isheri ogun river

Procedia PDF Downloads 518
1518 Synthesis, Characterization and Bioactivity of Methotrexate Conjugated Fluorescent Carbon Nanoparticles in vitro Model System Using Human Lung Carcinoma Cell Lines

Authors: Abdul Matin, Muhammad Ajmal, Uzma Yunus, Noaman-ul Haq, Hafiz M. Shohaib, Ambreen G. Muazzam

Abstract:

Carbon nanoparticles (CNPs) have unique properties that are useful for the diagnosis and treatment of cancer, such as small size (ideal for delivery within the body), stability in solvents, and tunable surface chemistry for targeted delivery. Here, highly fluorescent, monodispersed and water-soluble CNPs were synthesized directly from a suitable carbohydrate source (glucose and sucrose) by one-step acid-assisted ultrasonic treatment at 35 kHz for 4 hours. This method is green, simple, rapid and economical, and can be used for large-scale production and applications. The average particle sizes of the CNPs are less than 10 nm, and they emit bright and colorful green-blue fluorescence under irradiation with UV light at 365 nm. The CNPs were characterized by scanning electron microscopy, fluorescence spectrophotometry, Fourier transform infrared spectrophotometry, ultraviolet-visible spectrophotometry and TGA analysis. The fluorescent CNPs were used as fluorescent probes and nano-carriers for an anticancer drug. Functionalized CNPs (with ethylene diamine) were conjugated with the anticancer drug methotrexate. The in vitro bioactivity and biocompatibility of the CNP-drug conjugates were evaluated by LDH assay and Sulforhodamine B assay using human lung carcinoma cell lines (H157). Our results revealed that the CNPs showed biocompatibility, while the CNP-anticancer drug conjugates showed potent cytotoxic effects and high antitumor activity in lung cancer cell lines. CNPs proved to be an excellent substitute for conventional drug delivery cargo systems and anticancer therapeutics in vitro. Our future studies will focus on using the same nanoparticles in an in vivo model system.

Keywords: carbon nanoparticles, carbon nanoparticles-methotrexate conjugates, human lung carcinoma cell lines, lactate dehydrogenase, methotrexate

Procedia PDF Downloads 286
1517 An Exploratory Study to Understand the Economic Opportunities from Climate Change

Authors: Sharvari Parikh

Abstract:

Climate change has always been looked upon as a threat. Increased use of fossil fuels, depletion of biodiversity, certain human activities, and rising levels of greenhouse gas (GHG) emissions are the factors that have caused climate change. Climate change is creating new risks and aggravating existing ones. The paper focuses on breaking the stereotypical perception of climate change and draws attention to its constructive side. Research around the world has concluded that climate change has provided us with many untapped opportunities. The next 15 years will be crucial, as it is in our hands whether we grab these opportunities or let the situation get worse. The world stands at a stage where we cannot think of choosing between averting climate change and promoting growth and development. In fact, the solution to climate change itself carries economic opportunities. The data evidence in the paper shows how we can create the opportunity to improve the lives of the world's population at large through structural change that promotes environment-friendly investments. Rising investment in green energy and increased demand for climate-friendly products have created ample employment opportunities. The old technologies and machinery employed today lack efficiency and demand heavy maintenance, because of which we face high production costs. These costs can be brought down drastically by the adoption of green technologies, which are more accessible and affordable. The world's overall GDP has been heavily affected by the problems arising out of increasingly severe weather. Shifting to a green economy can not only eliminate these costs but also build a sound economy. Accelerating the economy in the direction of a low-carbon future can lessen burdens such as fossil fuel subsidies, public debt, unemployment, poverty, and healthcare expenses. It is clear that the world will be dragged into a 'darker phase' if the current trends of fossil fuel and carbon consumption continue. Switching to a green economy is the only way we can lift the world out of this darker phase. Climate change has opened the gates for a 'green and clean economy'. It will also bring the countries of the world together in achieving the common goal of a green economy.

Keywords: climate change, economic opportunities, green economy, green technology

Procedia PDF Downloads 224
1516 Analyzing Transit Network Design versus Urban Dispersion

Authors: Hugo Badia

Abstract:

This research addresses which transit network structure is most suitable to serve specific demand requirements in an ongoing process of urban dispersion. Two main approaches to network design are found in the literature. On the one hand, there is the traditional answer, widespread in our cities, that develops a high number of lines to connect most origin-destination pairs by direct trips; an approach based on the idea that users are averse to transfers. On the other hand, some authors advocate an alternative design characterized by simple networks in which transferring is essential to complete most trips. To determine which of them is the better option, we use a two-step methodology. First, by means of an analytical model, three basic network structures are compared: a radial scheme, the starting point for the other two structures; a direct-trip-based network; and a transfer-based one, the latter two representing the alternative transit network designs. The model optimizes the network configuration with regard to the total cost of each structure. For a given dispersion scenario, the best alternative is the structure with the minimum cost. This dispersion degree is defined in a simple way by assuming that only a central area attracts all trips: if this area is small, we have a highly concentrated mobility pattern; if this area is very large, the city is highly decentralized. In this first step, we can determine the area of applicability of each structure as a function of that degree of urban dispersion. The analytical results show that a radial structure is suitable when demand is highly centralized; however, when this demand starts to scatter, new transit lines should be implemented to avoid transfers. If urban dispersion advances further, introducing more lines is no longer a good alternative; in this case, the best solution is a change of structure, from direct trips to a network based on transfers. The area of applicability of each network strategy is not constant; it depends on the characteristics of demand, the city and the transport technology. In the second step, we translate the analytical results to a real case study through the relationship between the dispersion parameters of the model and direct measures of dispersion in a real city. Two dimensions of the urban sprawl process are considered: concentration, measured by the Gini coefficient, and centralization, measured by an area-based centralization index. Once the real dispersion degree is estimated, we are able to identify in which area of applicability the city is located. In summary, from a strategic point of view, this methodology lets us determine the best network design approach for a city by comparing the theoretical results with the real dispersion degree.
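
As an illustration of the second step, the concentration dimension can be computed directly from zonal trip-end totals; below is a minimal sketch of the Gini coefficient on hypothetical data, not the paper's model code:

```python
import numpy as np

def gini(x) -> float:
    """Gini concentration of trip ends across zones (0 = uniform, 1 = one zone)."""
    x = np.sort(np.asarray(x, dtype=float))            # ascending order
    n, total = x.size, x.sum()
    ranks = np.arange(1, n + 1)
    return (2.0 * np.dot(ranks, x) - (n + 1) * total) / (n * total)

print(gini([10, 10, 10, 10]))   # 0.0  -> fully dispersed demand
print(gini([0, 0, 0, 40]))      # 0.75 -> highly concentrated demand
```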

Keywords: analytical network design model, network structure, public transport, urban dispersion

Procedia PDF Downloads 217
1515 Occurrence of White Striping in Broiler Chicken Breast Meat in a Brazilian Commercial Plant

Authors: Talita Kato, Moises Grespan, Elza I. Ida, Massami Shimokomaki, Adriana L. Soares

Abstract:

White striping (WS) is becoming a concern for the poultry industry, as it affects the appearance of broiler chicken breast meat, leading to rejection by consumers. It is characterized by the appearance of varying degrees of white striations on the Pectoralis major muscle surface, following the direction of the muscle fibers. The etiology of this myopathy is still unknown; however, it is suggested to be associated with increased weight-gain rate and the age of the bird, attributing the phenomenon to the genetic selection of birds for higher meat production efficiency. The aim of this study was to evaluate the occurrence of Pectoralis major WS in a commercial plant in southern Brazil and its chemical characterization. The breast meat samples (n=660), from birds of 47 days of age, were classified as: Normal, NG (no apparent white striations); Moderate, MG (fillets presenting thin lines <1 mm); and Severe, SG (white striations >1 mm thick covering a large part of the fillet surface). Thirty samples (n=10 for each level of severity) were analyzed for pH, color (L*, a*, b*), proximate chemical composition (moisture, protein, ash and lipid contents) and hydroxyproline, in order to determine the collagen content. The results revealed an occurrence of 16.97% for the NG group, 51.67% for the MG group and 31.36% for the SG group. Although the total protein content did not differ significantly, the collagen index was 42% higher in SG than in NG, and the lipid fraction was 27% higher in the SG group. NG presented the lowest values of the parameters L* and a* (P ≤ 0.05), as there were no white striations on its surface, while the highest b* value occurred in SG because of its maximum lipid content. These results indicate a contribution of the SG muscle cells to the oversynthesis of connective tissue components in the muscle fascia. In conclusion, this study revealed a high incidence of white striping in a Brazilian commercial broiler line; thus, there is a need to identify the causes of this abnormality in order to diminish or eliminate it.

Keywords: collagen content, commercial line, pectoralis major muscle, proximate composition

Procedia PDF Downloads 239
1514 Characterization of Defense-Related Genes and Metabolite Profiling in Oil Palm Elaeis guineensis during Interaction with Ganoderma boninense

Authors: Mohammad Nazri Abdul Bahari, Nurshafika Mohd Sakeh, Siti Nor Akmar Abdullah

Abstract:

Basal stem rot (BSR) is the most devastating disease in oil palm. Among the oil palm pathogenic fungi, the most prevalent and virulent species associated with BSR is Ganoderma boninense. Early detection of G. boninense attack in oil palm, before physical symptoms have appeared, can offer opportunities to prevent the spread of this necrotrophic fungus. However, poor understanding of the molecular defense responses and of the roles of antifungal metabolites in oil palm against G. boninense has complicated such measures. Hence, characterization of defense-related molecular responses and of the production of antifungal compounds during early interaction with G. boninense is of utmost importance. Four-month-old oil palm (Elaeis guineensis) seedlings were artificially infected with G. boninense-inoculated rubber wood blocks via the sitting technique. RNA was extracted from root and leaf tissues of samples at 0, 3, 7 and 11 days post inoculation (d.p.i.), followed by sequencing using the RNA-Seq method. Differentially expressed genes (DEGs) of the oil palm-G. boninense interaction were identified, and changes in the metabolite profile will be scrutinized in relation to the DEGs. The RNA-Seq data generated a total of 113,829,376 and 313,293,229 paired-end clean reads from untreated (0 d.p.i.) and treated (3, 7, 11 d.p.i.) samples respectively, each with two biological replicates. The paired-end reads were mapped to the Elaeis guineensis reference genome to screen out non-oil palm genes, subsequently generating 74,794 coding sequences. DEG analysis of phytohormone biosynthetic genes in oil palm roots revealed that, at p-value ≤ 0.01, ethylene and jasmonic acid may act in an antagonistic manner with salicylic acid to coordinate the defense response at early interaction with G. boninense. Findings on the metabolite profiling of G. boninense-infected oil palm roots and leaves are expected to explain the defense-related compounds elicited by Elaeis guineensis in response to G. boninense colonization. The study aims to shed light on the molecular defense response of oil palm at early interaction with G. boninense and to promote prevention measures against Ganoderma infection.

Keywords: Ganoderma boninense, metabolites, phytohormones, RNA-Seq

Procedia PDF Downloads 243
1513 Exploratory Analysis and Development of Sustainable Lean Six Sigma Methodologies Integration for Effective Operation and Risk Mitigation in Manufacturing Sectors

Authors: Chukwumeka Daniel Ezeliora

Abstract:

The Nigerian manufacturing sector plays a pivotal role in the country's economic growth and development. However, it faces numerous challenges, including operational inefficiencies and inherent risks that hinder its sustainable growth. This research aims to address these challenges by exploring the integration of Lean and Six Sigma methodologies into manufacturing processes, ultimately enhancing operational effectiveness and risk mitigation. The core of this research involves the development of a sustainable Lean Six Sigma framework tailored to the specific needs and challenges of Nigeria's manufacturing environment. This framework aims to streamline processes, reduce waste, improve product quality, and enhance overall operational efficiency. It incorporates principles of sustainability to ensure that the proposed methodologies align with environmental and social responsibility goals. To validate the effectiveness of the integrated Lean Six Sigma approach, case studies and real-world applications were conducted within selected manufacturing companies in Nigeria. Data were collected to measure the impact of the integration on key performance indicators, such as production efficiency, defect reduction, and risk mitigation. The findings from this research provide valuable insights and practical recommendations for selected manufacturing companies in South East Nigeria. By adopting sustainable Lean Six Sigma methodologies, these organizations can optimize their operations, reduce operational risks, improve product quality, and enhance their competitiveness in the global market. In conclusion, this research aims to bridge the gap between theory and practice by developing a comprehensive framework for the integration of Lean and Six Sigma methodologies in Nigeria's manufacturing sector. This integration is envisioned to contribute significantly to the sector's sustainable growth, improved operational efficiency, and effective risk mitigation strategies, ultimately benefiting the Nigerian economy as a whole.

Keywords: lean six sigma, manufacturing, risk mitigation, sustainability, operational efficiency

Procedia PDF Downloads 175
1512 Automatic Aggregation and Embedding of Microservices for Optimized Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservices are a software development methodology in which applications are built by composing a set of independently deployable, small, modular services. Each service runs as a unique process and is instantiated and deployed on one or more machines (we assume that different microservices are deployed onto different machines). Microservices are becoming the de facto standard for developing distributed cloud applications due to their reduced release cycles. In principle, the responsibility of a microservice can be as simple as implementing a single function, which can lead to two issues: resource fragmentation due to the virtual machine boundary, and poor communication performance between microservices. Two composition techniques can be used to optimize resource fragmentation and communication performance: aggregation and embedding of microservices. Aggregation allows the deployment of a set of microservices on the same machine using a proxy server. Aggregation helps to reduce resource fragmentation, and is particularly useful when the aggregated services have similar scalability behavior. Embedding addresses communication performance by deploying on the same virtual machine those microservices that require a communication channel (localhost bandwidth is reported to be about 40 times faster than cloud vendors' local networks, and it offers better reliability). Embedding can also reduce dependencies on load balancer services, since the communication takes place on a single virtual machine. For example, assume that microservice A has two instances, a1 and a2, and it communicates with microservice B, which also has two instances, b1 and b2. One embedding deploys a1 and b1 on machine m1, and a2 and b2 on a different machine m2. This deployment configuration allows each pair (a1-b1), (a2-b2) to communicate using the localhost interface without the need for a load balancer between microservices A and B. Aggregation and embedding techniques are complex, since different microservices might have incompatible runtime dependencies that forbid them from being installed on the same machine. There is also a security concern, since the attack surface between microservices can be larger. Luckily, container technology allows several processes to run on the same machine in an isolated manner, solving the incompatibility of runtime dependencies and the aforementioned security concern; this greatly simplifies aggregation/embedding implementations, since a microservice container can simply be deployed on the same machine as the aggregated/embedded microservice container. Therefore, a wide variety of deployment configurations can be described by combining aggregation and embedding to create an efficient and robust microservice architecture. This paper presents a formal method that receives a declarative definition of a microservice architecture and proposes different optimized deployment configurations by aggregating/embedding microservices. The first prototype is based on i2kit, a deployment tool also submitted to ICWS 2018. The proposed prototype optimizes the following parameters: network/system performance, resource usage, resource costs and failure tolerance.
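
A minimal sketch of the replica-pairing embedding from the example above (a1 with b1, a2 with b2) follows; the names and the machine-per-pair assignment are illustrative, not i2kit's actual algorithm:

```python
from itertools import zip_longest

def embed(service_a: list[str], service_b: list[str]) -> dict[str, list[str]]:
    """Map machine -> containers, co-locating the i-th replicas of A and B
    so each pair communicates over the localhost interface."""
    placement: dict[str, list[str]] = {}
    for i, (a, b) in enumerate(zip_longest(service_a, service_b), start=1):
        placement[f"m{i}"] = [c for c in (a, b) if c is not None]
    return placement

print(embed(["a1", "a2"], ["b1", "b2"]))
# {'m1': ['a1', 'b1'], 'm2': ['a2', 'b2']}
```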

Keywords: aggregation, deployment, embedding, resource allocation

Procedia PDF Downloads 180
1511 Ecocentric Principles for Changing Anthropocentric Design in Fields Related to Other Species

Authors: Armando Cuspinera

Abstract:

Humans are part of nature, sharing the same ecosystems with non-human species, but praxis reflects that not all relations are treated as equal. In design fields such as biomimicry, biodesign, and biophilic design, different approaches towards nature exist; nevertheless, anthropocentric principles such as domination, objectivization, or exploitation operate in the same fields as ecocentric principles that grant inherent importance to life itself. Anthropocentrism has confronted humanity with pollution of the earth, water, and air, and with the destruction of whole ecosystems through monocultures and the rampant production of useless objects; life cannot withstand this unaware rhythm of living focused only on human benefit. Even if the biosphere is by nature resilient, studies cited in the Paris Agreement warn that humanity will perish if this unconscious praxis continues. This is why it is important to develop a differentiation between anthropocentric and ecocentric principles in the praxis of design; in order to enhance respect, valorization, and positive affectivity towards other life forms, it is necessary to analyze which principles are reproduced by each practice of design. It is only from the study of the immaterial dimensions of design, such as symbolism, epistemology, and ontology, that the relation towards nature can be redesigned; and in order to do so, it must be studied, from the perspective of ontological design, which principles (anthropocentric or ecocentric) the objects enhance and how they focus the perception humans have of their surroundings. 'The things we design also design us' is the principle of ontological design, and in order to develop a way of ecological design in which it is possible to consider other species as users, designers, or collaborators, it is important to extend the studies and the relation to other living forms from a transdisciplinary perspective of techniques, knowledge, practices, and disciplines in general. Materials, technologies, and any kind of knowledge have the character of a tool: they are neither good nor bad; the possibilities lie in the way they are used. The collaboration of disciplines and fields of study gives the opportunity to connect principles from fields such as deep ecology and the environmental humanities in the development of design methodologies that study nature, integrate its strategies into our own species, and consider the life of other species as important as human life; and it is only from the study of ontological design that material and immaterial dimensions can be analyzed and imbued with structures that already exist in other fields.

Keywords: design, anthropocentrism, ecocentrism, ontological design

Procedia PDF Downloads 132
1510 Algorithm for Improved Tree Counting and Detection through Adaptive Machine Learning Approach with the Integration of Watershed Transformation and Local Maxima Analysis

Authors: Jigg Pelayo, Ricardo Villar

Abstract:

The Philippines has long been considered a valuable producer of high-value crops globally. The country's employment and economy have been dependent on agriculture, increasing the demand for efficient agricultural mechanisms. Remote sensing and geographic information technology have proven to provide effective applications for precision agriculture through image-processing techniques, especially considering the development of aerial scanning technology in the country. Accurate information concerning the spatial correlation within the field is very important for precision farming, especially of high-value crops. The availability of height information and high-spatial-resolution images obtained from aerial scanning, together with the development of new image analysis methods, offers a relevant contribution to precision agriculture techniques and applications. In this study, an algorithm was developed and implemented to detect and count high-value crops simultaneously through adaptive scaling of the support vector machine (SVM) algorithm, subjected to an object-oriented approach combining watershed transformation and a local maxima filter to enhance tree counting and detection. The methodology is compared to cutting-edge template-matching algorithm procedures to demonstrate its effectiveness on a demanding tree counting, recognition and delineation problem. Since common data and image-processing techniques are utilized, the approach can be easily implemented in production processes to cover large agricultural areas. The algorithm was tested on high-value crops such as palm, mango and coconut located in Misamis Oriental, Philippines, showing good performance, in particular for young adult and adult trees, with detection rates significantly above 90%. The outputs support inventory and database updating, allowing for the reduction of field work and manual interpretation tasks.
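
A minimal sketch of the watershed-plus-local-maxima stage described above, assuming a canopy-height model `chm` (a 2-D array derived from the aerial scan); the SVM classification and object-oriented steps are omitted, and the parameters are illustrative:

```python
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def count_trees(chm: np.ndarray, min_height: float = 2.0,
                min_distance: int = 5) -> int:
    """Detect treetops as local maxima, then delineate one crown per top."""
    mask = chm > min_height                        # drop ground and low vegetation
    peaks = peak_local_max(chm, min_distance=min_distance, labels=mask)
    markers = np.zeros(chm.shape, dtype=int)       # one seed label per treetop
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    crowns = watershed(-chm, markers, mask=mask)   # invert heights: tops -> basins
    return int(crowns.max())                       # label count = detected trees
```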

Keywords: high value crop, LiDAR, OBIA, precision agriculture

Procedia PDF Downloads 384
1509 Just Not Seeing It: Exploring the Relationship between Inattention Blindness and Banner Blindness

Authors: Carie Cunningham, Kristen Lynch

Abstract:

Despite a viewer's belief that they are paying attention, many times they are missing out on their surroundings, a phenomenon referred to as inattentional blindness. Inattentional blindness refers to the failure of an individual to orient their attention to a particular item in their visual field. This is well defined in the psychology literature. A similar phenomenon has been evaluated for media types in advertising, where failing to comprehend or remember items in one's field of vision is known as banner blindness. Banner blindness occurs when individuals habitually see banners in specific areas of a webpage, and thus condition themselves to ignore those habitual areas. Another reason individuals avoid these habitual areas (usually at the top or sides of a webpage) is the lack of personal relevance or pertinent information for the viewer. Banner blindness, while a web-based concept, may thus relate to inattentional blindness. This paper proposes an analysis of the true similarities and differences between these concepts, bridging the two lines of thinking. Forty participants took part in an eye-tracking and post-survey experiment testing attention and memory measures in both a banner blindness and an inattentional blindness condition. The two conditions were run between subjects in a semi-randomized order. Half of the participants were told to search through the content ignoring the advertising banners; the other half were first told to search through the content ignoring the distractor icon. These groups were switched after five trials, and then five more trials were completed. In a review of the literature, sustainability communication was found to have many inconsistencies between message production and viewer awareness; for the purpose of this study, we used advertising materials as stimuli. Results suggest that there are gaps between the two concepts and that more research should be done testing these effects in a real-world setting versus an online environment. This contributes to theory by exploring the overlapping concepts of inattentional blindness and banner blindness, and provides the advertising industry with support that viewers can still fall victim to ignoring items in their field of view, even if not consciously, which will impact message development.

Keywords: attention, banner blindness, eye movement, inattention blindness

Procedia PDF Downloads 250
1508 Prediction of Springback in U-bending of W-Temper AA6082 Aluminum Alloy

Authors: Jemal Ebrahim Dessie, Lukács Zsolt

Abstract:

High-strength aluminum alloys have drawn a lot of attention because of the expanding demand for lightweight vehicle design in the automotive sector. Due to their poor formability at room temperature, warm and hot forming have been advised. However, warm and hot forming methods need more steps in the production process and an advanced tooling system. In contrast, forming sheets at room temperature in the W-temper condition is advantageous, since ordinary tools can be used. However, the springback of supersaturated sheets and their thinning are critical challenges that must be resolved when using this technique. In this study, AA6082-T6 aluminum alloy was solution heat treated at different oven temperatures and times, using a specially designed and developed furnace, in order to optimize the W-temper heat treatment temperature. A U-shaped bending test was carried out at different time intervals between the W-temper heat treatment and the forming operation. Finite element analysis (FEA) of U-bending was conducted using AutoForm, with the aim of validating the experimental results. A uniaxial tensile and unload test was performed in order to determine the kinematic hardening behavior of the material, which was optimized in the finite element code using systematic process improvement (SPI). In the simulation, the effects of the friction coefficient and the blank holder force were considered. Springback parameters were evaluated using the geometry adopted from the NUMISHEET '93 benchmark problem. It is noted that the change of shape was greater at longer time intervals between the W-temper heat treatment and the forming operation. The die radius was the most influential parameter for flange springback; however, the change of shape on the sidewall shows an overall increasing tendency as the radius of the punch increases relative to the radius of the die. The springback angles on the flange and sidewall appear to be influenced more by the coefficient of friction than by the blank holder force, and the effect increases as the blank holder force increases.

Keywords: aluminum alloy, FEA, springback, SPI, U-bending, W-temper

Procedia PDF Downloads 83
1507 Trehalose Application Increased Membrane Stability and Cell Viability to Affect Growth of Wheat Genotypes under Heat Stress

Authors: S. K. Thind, Aparjot Kaur

Abstract:

Heat stress is one of the major environmental factors drastically reducing wheat production. Crop heat tolerance can be enhanced by preconditioning plants through the exogenous application of osmoprotectants. Here, the effect of trehalose pretreatment (at 1 mM and 1.5 mM) under heat stress of 35±2˚C (moderate) and 40±2˚C (severe) for four and eight hours was studied in wheat (Triticum aestivum L.) genotypes viz. HD2967, PBW 175, PBW 343, PBW 621, and PBW 590. Heat stress affects a wide spectrum of physiological processes within plants, which are irreversibly damaged by stress. Membrane thermal stability (MTS) and cell viability were significantly decreased under heat stress for eight hours. Pretreatment with trehalose improved MTS and cell viability under stress, and this effect was more pronounced at the higher concentration. The thermal stability of the photosynthetic apparatus differed markedly between genotypes, and Hill reaction activity was highest in PBW621, followed by C306, compared with the others. In all genotypes, the photolysis of water declined with increasing temperature stress. Trehalose pretreatment helped sustain Hill reaction activity, probably by stabilizing the photosynthetic apparatus against heat-induced photoinhibition. Both plant growth and development of shoot and root were affected by temperature under heat stress. The reduction was partially compensated by trehalose (1.5 mM) application. Adaptation to heat stress is associated with a metabolic adjustment that leads to the accumulation of soluble sugars, both non-reducing and reducing, for their role in the adaptive mechanism. Higher acid invertase activity in the shoot of tolerant genotypes appeared to be a characteristic of stress tolerance. Sucrose synthase plays a central role in sink strength and, in the studied wheat genotypes, was positively related to dry matter accumulation. Heat stress of eight hours' duration had a more severe effect on these parameters, and trehalose application at 1.5 mM ameliorated it to a certain extent.

Keywords: heat stress, Triticum aestivum, trehalose, membrane thermal stability, triphenyl tetrazolium chloride reduction test, growth, sugar metabolism

Procedia PDF Downloads 309
1506 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, when the algorithm performed better than the best known classical algorithm for Max-Cut on certain graphs. Whilst classical algorithms have improved since then and have returned to being faster and more efficient, this was a huge milestone for quantum computing, and the original work is often used as a benchmark and a foundation for exploring QAOA variants. This, alongside other famous algorithms like Grover's or Shor's, highlights the potential that quantum computing holds. It also points to a real quantum advantage that, if the hardware continues to improve, could usher in a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate when creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of qubits, which restricts the scale of the problems that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can also be parallelized to speed up the search and optimization. The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOAs with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, which is a linear-approximation-based method, and in some instances it can even produce a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
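
A minimal sketch of the hybrid loop described above: a simple (mu+lambda) evolution strategy searching the 2p QAOA angles in place of a gradient optimizer. The function `expectation` stands in for the quantum circuit evaluation returning the Max-Cut cost to minimize, and is an assumption, not this work's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def evolve(expectation, p: int, pop: int = 20, gens: int = 100,
           sigma: float = 0.1) -> np.ndarray:
    """(mu + lambda) evolution strategy over the 2p angles (gammas, betas)."""
    parents = rng.uniform(0.0, np.pi, size=(pop, 2 * p))
    for _ in range(gens):
        children = parents + sigma * rng.standard_normal(parents.shape)
        both = np.vstack([parents, children])
        fitness = np.array([expectation(theta) for theta in both])  # parallelizable
        parents = both[np.argsort(fitness)[:pop]]    # keep the fittest (minimize)
    return parents[0]                                # best angle vector found
```

The per-individual fitness evaluations are independent, which is where the parallelization mentioned above would apply.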

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 42
1505 Life-Cycle Cost and Life-Cycle Assessment of Photovoltaic/Thermal Systems (PV/T) in Swedish Single-Family Houses

Authors: Arefeh Hesaraki

Abstract:

The application of photovoltaic-thermal hybrids (PVT), which deliver both electricity and heat simultaneously from the same system, has become more popular during the past few years. This study addresses the techno-economic and environmental impact assessment of photovoltaic/thermal systems combined with a ground-source heat pump (GSHP) for three single-family houses located in Stockholm, Sweden. The three case studies were: (1) a renovated building built in 1936, (2) a renovated building built in 1973, and (3) a new building built in 2013. Two simulation programs, SimaPro 9.1 and IDA Indoor Climate and Energy 4.8 (IDA ICE), were applied to analyze environmental impacts and energy usage, respectively. The cost-effectiveness of the system was evaluated using the net present value (NPV), internal rate of return (IRR), and discounted payback time (DPBT) methods. In addition to cost payback time, the studied PVT system was evaluated using the energy payback time (EPBT) method. EPBT is the time needed for the installed system to generate the same amount of energy that was utilized during the whole life cycle (fabrication, installation, transportation, and end-of-life) of the system itself. Energy calculation in IDA ICE showed that a 5 m² PVT was sufficient to create a balance between maximum heat production and domestic hot water consumption during the summer months in all three case studies. The techno-economic analysis revealed that combining a 5 m² PVT with GSHP in the second case study gave the shortest DPBT and the highest NPV and IRR among the three case studies: the DPBTs (IRR) were 10.8 years (6%), 12.6 years (4%), and 13.8 years (3%) for the second, first, and third case study, respectively. Moreover, the environmental assessment of embodied energy over the cradle-to-grave life cycle of the studied PVT, including fabrication, delivery of energy and raw materials, the manufacturing process, installation, transportation, the operation phase, and end of life, revealed an EPBT of approximately two years in all cases.
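
The three economic indicators named above follow directly from a discounted cash-flow series. The sketch below computes NPV, IRR (by bisection on the NPV curve), and discounted payback time; the investment and savings figures are invented for illustration and are not taken from the study:

```python
import numpy as np

def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (negative) investment at year 0."""
    years = np.arange(len(cashflows))
    return float(np.sum(np.asarray(cashflows) / (1 + rate) ** years))

def discounted_payback(rate, cashflows):
    """First year in which cumulative discounted cash flow turns non-negative."""
    years = np.arange(len(cashflows))
    cumulative = np.cumsum(np.asarray(cashflows) / (1 + rate) ** years)
    positive = np.nonzero(cumulative >= 0)[0]
    return int(positive[0]) if positive.size else None

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return: the discount rate at which NPV = 0,
    found by bisection (assumes one sign change in the cash flows)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Illustrative only: a 4000 EUR PVT investment saving 370 EUR/year for 25 years
flows = [-4000] + [370] * 25
print(f"NPV @ 3%: {npv(0.03, flows):.0f} EUR")
print(f"IRR: {irr(flows):.1%}")
print(f"DPBT @ 3%: {discounted_payback(0.03, flows)} years")
```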

Keywords: life-cycle cost, life-cycle assessment, photovoltaic/thermal, IDA ICE, net present value

Procedia PDF Downloads 95
1504 The Effects of Computer Game-Based Pedagogy on Graduate Students' Statistics Performance

Authors: Eva Laryea, Clement Yeboah

Abstract:

A pretest-posttest, within-subjects experimental design was employed to examine the effects of a computerized basic statistics learning game on the achievement and statistics-related anxiety of students enrolled in an introductory graduate statistics course. Participants (N = 34) were graduate students in a variety of programs at a state-funded research university in the southeastern United States. We analyzed pretest-posttest differences using paired-samples t-tests for achievement and for statistics anxiety. The t-test for statistical knowledge was statistically significant, indicating significant mean gains in statistical knowledge as a function of the game-based intervention. Likewise, the t-test for statistics-related anxiety was also statistically significant, indicating a decrease in anxiety from pretest to posttest. The implications of the present study are significant for both teachers and students. For teachers, using computer games developed by the researchers can help create a more dynamic and engaging classroom environment, as well as improve student learning outcomes. For students, playing these educational games can help develop important skills such as problem solving, critical thinking, and collaboration. Students can develop interest in the subject matter and spend quality time learning the course material as they play, without realizing that they are studying a course they had presumed to be hard. The future directions of the present study are promising as technology continues to advance and become more widely available. Potential future developments include the integration of virtual and augmented reality into educational games, the use of machine learning and artificial intelligence to create personalized learning experiences, and the development of new and innovative game-based assessment tools. It is also important to consider the ethical implications of computer game-based pedagogy, such as the potential for games to perpetuate harmful stereotypes and biases. As the field continues to evolve, it will be crucial to address these issues and work towards creating inclusive and equitable learning experiences for all students. This study has the potential to revolutionize the way graduate students learn basic statistics and offers exciting opportunities for future development and research. It is an important area of inquiry for educators, researchers, and policymakers, and will continue to be a dynamic and rapidly evolving field for years to come.
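
For readers unfamiliar with the analysis, the sketch below runs a paired-samples t-test on simulated pre/post scores matching the study's sample size (N = 34); the score values themselves are invented, not the authors' data:

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post scores for illustration only (N = 34 as in the study)
rng = np.random.default_rng(1)
pre = rng.normal(60, 10, 34)            # pretest achievement scores
post = pre + rng.normal(8, 6, 34)       # simulated gain after the game intervention

t_stat, p_value = stats.ttest_rel(post, pre)
diff = post - pre
d = diff.mean() / diff.std(ddof=1)      # Cohen's d for paired observations

print(f"t({len(pre) - 1}) = {t_stat:.2f}, p = {p_value:.4f}, d = {d:.2f}")
```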

Keywords: pretest-posttest within-subjects experimental design, achievement, statistics-related anxiety

Procedia PDF Downloads 47
1503 Production of Pre-Reduced Iron Ore Nuggets with Lesser Sulphur Intake by Devolatisation of Boiler Grade Coal

Authors: Chanchal Biswas, Anrin Bhattacharyya, Gopes Chandra Das, Mahua Ghosh Chaudhuri, Rajib Dey

Abstract:

Boiler coals with low fixed carbon and high ash content have always challenged metallurgists to develop a suitable method for their utilization. In the present study, an attempt is made to establish an energy-effective method for the reduction of iron ore fines in the form of nuggets by using syngas. By devolatisation (expulsion of volatile matter by applying heat) of boiler coal, a gaseous product enriched with CO, CO2, H2, and CH4 (of which CO, H2, and CH4 act as reducing agents) is generated, and the iron ore nuggets are reduced by this syngas. Because there is no direct contact between the iron ore nuggets and the coal ash, the sulphur intake of the reduced nuggets can be kept to a minimum. A laboratory-scale devolatisation furnace with a reduction facility was designed and evaluated after in-depth studies and exhaustive experimentation, including thermo-gravimetric (TG-DTA) analysis to determine the volatile fraction present in boiler grade coal, gas chromatography (GC) to determine the syngas composition at different temperatures, and furnace temperature gradient measurements to minimize furnace cost by using a single heating coil. The nuggets were reduced in the devolatisation furnace at three different temperatures and three different holding times. The pre-reduced nuggets were subjected to analytical weight-loss calculations to evaluate the extent of reduction. The phase and surface morphology of the pre-reduced samples were characterized using X-ray diffractometry (XRD), energy dispersive X-ray spectrometry (EDX), scanning electron microscopy (SEM), a carbon-sulphur analyzer, and chemical analysis. The degree of metallization of the reduced nuggets was 78.9% using boiler grade coal. The pre-reduced nuggets with lower sulphur content could be used in the blast furnace as raw material or coolant, which would reduce the furnace's consumption of high-quality coke owing to their pre-reduced character. They can also be used as coolant in the basic oxygen furnace (BOF).
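
The weight-loss evaluation mentioned above typically follows the standard definitions of extent of reduction (oxygen removed over total removable oxygen) and degree of metallization (metallic Fe over total Fe). The sketch below applies those formulas; every numeric value is an illustrative assumption, with the Fe figures chosen merely to reproduce the reported 78.9%:

```python
def extent_of_reduction(mass_initial, mass_final, removable_oxygen_fraction):
    """Extent of reduction (%) from weight loss, assuming the mass loss is
    entirely oxygen removed from the iron oxides.
    removable_oxygen_fraction: mass fraction of reducible oxygen in the
    nugget, taken from chemical analysis of the ore (assumed here)."""
    oxygen_removed = mass_initial - mass_final
    oxygen_total = mass_initial * removable_oxygen_fraction
    return 100.0 * oxygen_removed / oxygen_total

def degree_of_metallization(metallic_fe, total_fe):
    """Degree of metallization (%) = metallic Fe / total Fe * 100."""
    return 100.0 * metallic_fe / total_fe

# Illustrative numbers only (not from the study):
print(extent_of_reduction(100.0, 81.0, 0.25))   # 19 g O removed of 25 g -> 76%
print(degree_of_metallization(55.2, 70.0))      # ~78.9%, as the study reports
```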

Keywords: alternative ironmaking, coal gasification, extent of reduction, nugget making, syngas based DRI, solid state reduction

Procedia PDF Downloads 248
1502 Effect of Psychological Stress on Mucosal IL-6 and Helicobacter pylori Activity in Functional Dyspepsia and Myocytes

Authors: Eryati Darwin, Arina Widya Murni, Adnil Edwin Nurdin

Abstract:

Background: Functional dyspepsia (FD) is a highly prevalent and heterogeneous disorder. Most patients with FD complain of symptoms related to the intake of meals. Psychological stress may promote peptic ulcers, has an effect on ulcer-associated Helicobacter pylori (Hp), and may also trigger a worsening of symptoms in inflammatory disorders of the gastrointestinal tract. Cells in the gastric mucosa stimulate the production of several cytokines, which might be associated with Helicobacter pylori infection. The cascade of biological events leading to stress-induced FD remains poorly understood. Aim of Study: To determine the pro-inflammatory cytokine IL-6 and Helicobacter pylori activity in the gastric mucosa of FD patients and their association with psychological stress. Methods: The subjects of this study were dyspeptic patients who visited M. Djamil General Hospital and two Community Health Centers in Padang. On the basis of the stress index scale, using the Depression Anxiety and Stress Scale (DASS 42) to identify psychological stress, subjects were divided into two groups of 20 each: a stress group and a non-stress group. All diagnoses were confirmed by review of cortisol and esophagogastroduodenoscopy reports. Gastric biopsy samples and peripheral blood were taken during diagnostic procedures. Immunohistochemistry was used to determine the expression of IL-6 and Hp in the gastric mucosa. The data were statistically analyzed by univariate and bivariate analysis. All procedures of this study were approved by the Research Ethics Committee of the Medical Faculty of Andalas University. Results: We enrolled 40 FD patients (26 women and 14 men) aged between 35 and 56 years. The blood cortisol level of FD patients, a stress hormone parameter sampled in the morning, was significantly higher in the stress group than in the non-stress group. The expression of IL-6 in the gastric mucosa was significantly higher in the stress group compared to the non-stress group (p<0.05). Helicobacter pylori activity in the gastric mucosa was also significantly higher in the stress group than in the non-stress group. Conclusion: The present study showed that psychological stress can induce gastric mucosal inflammation and increase Helicobacter pylori activity.

Keywords: functional dyspepsia, Helicobacter pylori, interleukin-6, psychological stress

Procedia PDF Downloads 265
1501 Theta-Phase Gamma-Amplitude Coupling as a Neurophysiological Marker in Neuroleptic-Naive Schizophrenia

Authors: Jun Won Kim

Abstract:

Objective: Theta-phase gamma-amplitude coupling (TGC) was used as a novel evidence-based tool to reflect the dysfunctional cortico-thalamic interaction in patients with schizophrenia. However, to the best of our knowledge, no studies have reported the diagnostic utility of TGC in the resting-state electroencephalogram (EEG) of neuroleptic-naive patients with schizophrenia compared to healthy controls. Thus, the purpose of this EEG study was to understand the underlying mechanisms in patients with schizophrenia by comparing resting-state TGC between the two groups and to evaluate its diagnostic utility. Method: The subjects included 90 patients with schizophrenia and 90 healthy controls. All patients were diagnosed with schizophrenia according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) by two independent psychiatrists using semi-structured clinical interviews. Because patients were either drug-naive (first episode) or had not been taking psychoactive drugs for one month before the study, we could exclude the influence of medications. Six frequency bands were defined for spectral analyses: delta (1–4 Hz), theta (4–8 Hz), slow alpha (8–10 Hz), fast alpha (10–13.5 Hz), beta (13.5–30 Hz), and gamma (30–80 Hz). The spectral power of the EEG data was calculated with the fast Fourier transform using the 'spectrogram.m' function of the Signal Processing Toolbox in Matlab. An analysis of covariance (ANCOVA) was performed to compare the TGC results between the groups, adjusted using a Bonferroni correction (P < 0.05/19 = 0.0026). Receiver operating characteristic (ROC) analysis was conducted to examine the ability of the TGC data to discriminate schizophrenia. Results: The patients with schizophrenia showed a significant increase in resting-state TGC at all electrodes. The delta, theta, slow alpha, fast alpha, and beta powers showed low accuracies of 62.2%, 58.4%, 56.9%, 60.9%, and 59.0%, respectively, in discriminating the patients with schizophrenia from the healthy controls. The ROC analysis performed on the TGC data generated the most accurate result among the EEG measures, displaying an overall classification accuracy of 92.5%. Conclusion: As TGC includes phase, which carries information about neuronal interactions in the EEG recording, TGC is expected to be useful for understanding the mechanisms underlying the dysfunctional cortico-thalamic interaction in patients with schizophrenia. The resting-state TGC value was increased in the patients with schizophrenia compared to the healthy controls and had a higher discriminating ability than the other parameters. These findings may be related to compensatory hyper-arousal patterns of the dysfunctional default-mode network (DMN) in schizophrenia. Further research exploring the association between TGC and medical or psychiatric conditions that may confound EEG signals will help clarify the potential utility of TGC.
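
The abstract does not state which TGC estimator was used; a common choice is the mean-vector-length phase-amplitude coupling measure, sketched below on a synthetic signal whose gamma bursts are locked to the theta peak. The sampling rate and filter settings are assumptions for illustration:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250  # sampling rate in Hz (assumed)

def bandpass(x, lo, hi):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def tgc_mean_vector_length(eeg):
    """Phase-amplitude coupling as |mean(A_gamma * exp(i * phi_theta))|.
    One of several TGC estimators; not necessarily the one used in the study."""
    theta_phase = np.angle(hilbert(bandpass(eeg, 4, 8)))     # theta phase
    gamma_amp = np.abs(hilbert(bandpass(eeg, 30, 80)))       # gamma envelope
    return np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase)))

# Synthetic demo: 40 Hz bursts whose amplitude follows the 6 Hz theta peak
t = np.arange(0, 20, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)
eeg = theta + (1 + theta) * 0.3 * np.sin(2 * np.pi * 40 * t) \
      + 0.1 * np.random.randn(t.size)
print(tgc_mean_vector_length(eeg))   # clearly above an uncoupled baseline
```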

Keywords: quantitative electroencephalography (QEEG), theta-phase gamma-amplitude coupling (TGC), schizophrenia, diagnostic utility

Procedia PDF Downloads 120
1500 A System Dynamics Approach for Assessing Policy Impacts on Closed-Loop Supply Chain Efficiency: A Case Study on Electric Vehicle Batteries

Authors: Guannan Ren, Thomas Mazzuchi, Shahram Sarkani

Abstract:

Electric vehicle battery recycling has emerged as a critical process in the transition toward sustainable transportation. As the demand for electric vehicles continues to rise, so does the need to address the end-of-life management of their batteries. Electric vehicle battery recycling benefits resource recovery and supply chain stability by reclaiming valuable materials such as lithium, cobalt, nickel, and graphite. The reclaimed materials can then be reintroduced into the battery manufacturing process, reducing the reliance on raw material extraction and the environmental impacts of waste. Current battery recycling rates are insufficient to meet the growing demand for raw materials. While significant progress has been made in electric vehicle battery recycling, there is still room for improvement in many areas. Standardization of battery designs, expanded collection and recycling infrastructure, and improved efficiency in recycling processes are essential for scaling up recycling efforts and maximizing material recovery. This work delves into key factors, such as regulatory frameworks, economic incentives, and technological processes, that influence the cost-effectiveness and efficiency of battery recycling systems. A system dynamics model that considers variables such as battery production rates, demand and price fluctuations, recycling infrastructure capacity, and the effectiveness of recycling processes is created to study how these variables are interconnected, forming feedback loops that affect overall supply chain efficiency. Such a model can also help simulate the effects of stricter regulations on battery disposal, incentives for recycling, or investments in research and development for battery designs and advanced recycling technologies. By using the developed model, policymakers, industry stakeholders, and researchers may gain insights into the effects of applying different policies or process updates on electric vehicle battery recycling rates.
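
As a toy illustration of the stock-and-flow feedback described above, the sketch below integrates an in-service battery stock, a collection-rate policy lever, and the recovered-material flow back into production. The structure and every parameter value are assumptions for illustration, not the authors' model:

```python
# Minimal system dynamics sketch: Euler integration of a battery life cycle
dt, years = 0.25, 30
steps = int(years / dt)

production = 1.0          # packs produced per year (normalized)
lifetime = 10.0           # mean service life in years
collection_rate = 0.5     # policy lever: share of retired packs collected
recovery_eff = 0.7        # material reclaimed per collected pack

in_service = 0.0
virgin_share = []
for _ in range(steps):
    retirements = in_service / lifetime          # outflow from the in-use stock
    collected = collection_rate * retirements    # inflow to recycling
    reclaimed = recovery_eff * collected         # material fed back to production
    virgin_share.append(max(production - reclaimed, 0.0) / production)
    in_service += dt * (production - retirements)

print(f"in-service stock after {years} y: {in_service:.2f}")
print(f"virgin-material share of production: {virgin_share[-1]:.1%}")
```

Raising the collection_rate lever toward 1.0 in this toy model pushes the virgin-material share down from about 65% toward 30%, which is exactly the kind of policy sensitivity the abstract proposes to explore.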

Keywords: environmental engineering, modeling and simulation, circular economy, sustainability, transportation science, policy

Procedia PDF Downloads 65
1499 Effects of Rising Cost of Building Materials in Nigeria: A Case Study of Adamawa State

Authors: Ibrahim Yerima Gwalem, Jamila Ahmed Buhari

Abstract:

In recent years, there has been an alarming rate of increase in the costs of building materials in Nigeria, and this ugly phenomenon threatens the contributions of the construction industry to national development. The purpose of this study was to assess the effects of the rising cost of building materials in Adamawa State, Nigeria. Four research questions in line with the purpose of the study were raised to guide the study, and two null hypotheses were formulated and tested at the 0.05 level of significance. The study adopted a survey research design. The population of the study comprised registered contractors, registered builders, selected merchants, and consultants in Adamawa State. Data were collected using a researcher-designed instrument tagged the Effects of the Rising Cost of Building Materials Questionnaire (ERCBMQ). The instrument was subjected to face and content validation by two experts, one from Modibbo Adama University of Technology Yola and the other from Federal Polytechnic Mubi. The reliability of the instrument was determined by the Cronbach alpha method and yielded a reliability index of 0.85, high enough to ascertain reliability. Data collected from a 2019 field survey were analyzed using means and percentages, with the mean prices used in the calculation of price indices and rates of inflation for building materials. Findings revealed that the factors responsible for the rising cost of building materials are the exchange rate of the Nigerian Naira, with a mean rating (MR) = 4.4; the cost of fuel and power supply, MR = 4.3; and changes in government policies and legislation, MR = 4.2, while fluctuations in construction costs, with MR = 2.8; reduced volume of construction output, MR = 2.52; and risk of project abandonment, MR = 2.51, were the three main effects. The study concluded that these adverse effects could depress the construction industry's contribution to the nation's gross domestic product (GDP). Among the recommendations proffered is that the government should formulate a policy to reduce dependence on imported building materials by encouraging research into the production of local building materials.
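
The price-index arithmetic mentioned above reduces to a base-year ratio and a year-on-year change. The sketch below shows that calculation on invented prices for a single material (the survey's actual price data are not reproduced here):

```python
# Illustrative mean prices (e.g. NGN per bag of cement); invented values
mean_price = {2016: 1550.0, 2017: 1820.0, 2018: 2300.0, 2019: 2650.0}
base_year = 2016

# Price index relative to the base year, then year-on-year inflation from it
index = {yr: 100.0 * p / mean_price[base_year] for yr, p in mean_price.items()}
inflation = {
    yr: 100.0 * (index[yr] - index[yr - 1]) / index[yr - 1]
    for yr in list(mean_price)[1:]
}
print(index)      # {2016: 100.0, 2017: ~117.4, 2018: ~148.4, 2019: ~171.0}
print(inflation)  # {2017: ~17.4, 2018: ~26.4, 2019: ~15.2} percent per year
```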

Keywords: effects, rising, cost, building, materials

Procedia PDF Downloads 120
1498 Predictive Modelling of Curcuminoid Bioaccessibility as a Function of Food Formulation and Associated Properties

Authors: Kevin De Castro Cogle, Mirian Kubo, Maria Anastasiadi, Fady Mohareb, Claire Rossi

Abstract:

Background: The bioaccessibility of bioactive compounds is a critical determinant of the nutritional quality of various food products. Despite its importance, comprehensive studies assessing how the composition of a food matrix influences the bioaccessibility of a compound of interest remain scarce. This knowledge gap has prompted a growing need to investigate the intricate relationship between food matrix formulations and the bioaccessibility of bioactive compounds. One class of bioactive compounds that has attracted considerable attention is the curcuminoids. These naturally occurring phytochemicals, extracted from the roots of Curcuma longa, have gained popularity owing to their purported health benefits, and are also well known for their poor bioaccessibility. Project aim: The primary objective of this research project is to systematically assess the influence of matrix composition on the bioaccessibility of curcuminoids. Additionally, the study aims to develop a series of predictive models for bioaccessibility, providing valuable insights for optimising the formulation of functional foods and offering more descriptive nutritional information to potential consumers. Methods: Food formulations enriched with curcuminoids were subjected to simulated in vitro digestion, and their bioaccessibility was characterized with chromatographic and spectrophotometric techniques. The resulting data served as the foundation for the development of predictive models capable of estimating bioaccessibility from specific physicochemical properties of the food matrices. Results: One striking finding of this study was the strong correlation observed between the concentration of macronutrients within the food formulations and the bioaccessibility of curcuminoids. In fact, macronutrient content emerged as a highly informative explanatory variable of bioaccessibility and was used, alongside other variables, as a predictor in a Bayesian hierarchical model that predicted curcuminoid bioaccessibility accurately (optimisation performance of 0.97 R²) for the majority of cross-validated test formulations (LOOCV of 0.92 R²). These preliminary results open the door to further exploration, enabling researchers to investigate a broader spectrum of food matrix types and additional properties that may influence bioaccessibility. Conclusions: This research sheds light on the intricate interplay between food matrix composition and the bioaccessibility of curcuminoids, and it lays a foundation for future investigations, offering a promising avenue for advancing our understanding of bioactive compound bioaccessibility and its implications for the food industry and informed consumer choices.
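
The authors fitted a Bayesian hierarchical model; as a deliberately simplified stand-in, the sketch below cross-validates an ordinary ridge regression with leave-one-out splits on synthetic macronutrient data, illustrating the LOOCV R² reporting style rather than the actual model:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score

# Hypothetical data: one row per formulation, columns are macronutrient
# contents (g/100 g); the response is measured bioaccessibility (%)
rng = np.random.default_rng(7)
fat = rng.uniform(0, 30, 40)
protein = rng.uniform(0, 25, 40)
carbs = rng.uniform(5, 60, 40)
bioacc = 5 + 1.8 * fat - 0.3 * carbs + rng.normal(0, 3, 40)  # fat aids solubilisation

X = np.column_stack([fat, protein, carbs])
y = bioacc

# Leave-one-out cross-validation: each formulation is predicted by a model
# trained on all the others, mirroring the LOOCV R^2 reported in the abstract
model = Ridge(alpha=1.0)
pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
print(f"LOOCV R^2: {r2_score(y, pred):.2f}")
```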

Keywords: bioactive bioaccessibility, food formulation, food matrix, machine learning, probabilistic modelling

Procedia PDF Downloads 55