Search results for: outputs
162 The Politics of Disruption: Disrupting Polity to Influence Policy in Nigeria
Authors: Okechukwu B. C. Nwankwo
Abstract:
The surge of social protests sweeping through the globe is a contemporary phenomenon. Yet the phenomenon in itself is not new. Thus, various scholars have over the years developed conceptual frameworks for evaluating it. Adopting and adapting some of these frameworks, this paper begins from a purely theoretical perspective, exploring the concept and content of social protest within the specific context of Nigeria. It proceeds to build a typology of the phenomenon in terms of form, actors, origin, character, organisation, goal, dynamics, outcome and a range of other context-relevant variables for evaluating it in an operationally useful manner. The centrality of the context in which protest evolves is demonstrated. Adopting Easton’s systems theory, the paper builds on the assumption that protests emerge whenever and wherever political institutions and structures prove unable or unwilling to transform inputs, in the form of basic demands, into outputs, in the form of responsive policies. It argues that protests in Nigeria are simply the crystallisation of opposition in the streets. Protests are thus extra-institutional politics. This is usually the case, as elsewhere, where there is no functional institutionalised opposition. Noting that protest, disruptive or otherwise, is an influence strategy, it argues that every single protest is a new opportunity for reform, for reorganisation of state capacities, and for modifying the rights and obligations of citizens and government to each other. Each reform outcome is, however, only a temporal antecedent: its extent signals the next similar protest event. By providing evidence on how protests in Nigeria create opportunities for reform and for more accountable, more effective governance, the paper shows the positive impact of protests and their importance even in the consolidation effort for the nation’s nascent democracy. Data on protest events will be based on media reports, especially print media.
Keywords: democracy, dialectics, social protest, reform
Procedia PDF Downloads 134
161 Neural Network-based Risk Detection for Dyslexia and Dysgraphia in Sinhala Language Speaking Children
Authors: Budhvin T. Withana, Sulochana Rupasinghe
Abstract:
The problem of Dyslexia and Dysgraphia, two learning disabilities that affect reading and writing abilities, respectively, is a major concern for the educational system. Due to the complexity and uniqueness of the Sinhala language, these conditions are especially difficult for children who speak it. Traditional risk detection methods for Dyslexia and Dysgraphia frequently rely on subjective assessments, which makes broad risk screening difficult and time-consuming. As a result, diagnoses may be delayed and opportunities for early intervention may be lost. The project developed a hybrid model that utilizes several deep learning techniques for detecting the risk of Dyslexia and Dysgraphia. Specifically, ResNet50, VGG16 and YOLOv8 were integrated to detect handwriting issues, and their outputs were fed into an MLP model along with several other inputs. The hyperparameters of the MLP model were fine-tuned using Grid Search CV, which identified the optimal values for the model. This approach proved to be effective in accurately predicting the risk of Dyslexia and Dysgraphia, providing a valuable tool for early detection and intervention of these conditions. The ResNet50 model achieved an accuracy of 0.9804 on the training data and 0.9653 on the validation data. The VGG16 model achieved an accuracy of 0.9991 on the training data and 0.9891 on the validation data. The MLP model achieved an impressive training accuracy of 0.99918 and a testing accuracy of 0.99223, with a loss of 0.01371. These results demonstrate that the proposed hybrid model achieved a high level of accuracy in predicting the risk of Dyslexia and Dysgraphia.
Keywords: neural networks, risk detection system, Dyslexia, Dysgraphia, deep learning, learning disabilities, data science
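The abstract describes a fusion step in which the outputs of the CNN branches and additional inputs feed an MLP tuned with Grid Search CV. A minimal sketch of that step is given below; it assumes the CNN branches have already been reduced to per-sample scores, and the feature names, grid values, and synthetic data are illustrative assumptions rather than the authors' setup.

```python
# Hedged sketch: fusing CNN-branch outputs with extra inputs in an MLP tuned with GridSearchCV.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200
resnet_scores = rng.random((n, 1))   # hypothetical ResNet50 handwriting-risk scores
vgg_scores = rng.random((n, 1))      # hypothetical VGG16 scores
yolo_scores = rng.random((n, 1))     # hypothetical YOLOv8 detection scores
other_inputs = rng.random((n, 5))    # placeholder extra inputs (e.g. age, assessment scores)
X = np.hstack([resnet_scores, vgg_scores, yolo_scores, other_inputs])
y = rng.integers(0, 2, n)            # synthetic labels: 1 = at risk, 0 = not at risk

param_grid = {
    "mlpclassifier__hidden_layer_sizes": [(32,), (64, 32)],
    "mlpclassifier__alpha": [1e-4, 1e-3],
    "mlpclassifier__learning_rate_init": [1e-3, 1e-2],
}
pipe = make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=0))
search = GridSearchCV(pipe, param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```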
Procedia PDF Downloads 114
160 A Sentence-to-Sentence Relation Network for Recognizing Textual Entailment
Authors: Isaac K. E. Ampomah, Seong-Bae Park, Sang-Jo Lee
Abstract:
Over the past decade, there have been promising developments in Natural Language Processing (NLP), with several investigations of approaches focusing on Recognizing Textual Entailment (RTE). These include models based on lexical similarities, models based on formal reasoning, and, most recently, deep neural models. In this paper, we present a sentence encoding model that exploits sentence-to-sentence relation information for RTE. In terms of sentence modeling, convolutional neural networks (CNNs) and recurrent neural networks (RNNs) adopt different approaches. RNNs are known to be well suited for sequence modeling, whilst CNNs are suited for the extraction of n-gram features through their filters and can learn ranges of relations via the pooling mechanism. We combine the strengths of RNNs and CNNs to present a unified model for the RTE task. Our model combines relation vectors computed from the phrasal representations of each sentence with the final encoded sentence representations. Firstly, we pass each sentence through a convolutional layer to extract a sequence of higher-level phrase representations for each sentence, from which the first relation vector is computed. Secondly, the phrasal representation of each sentence from the convolutional layer is fed into a Bidirectional Long Short Term Memory (Bi-LSTM) to obtain the final sentence representations, from which a second relation vector is computed. The relation vectors are combined and then used, in the same fashion as an attention mechanism, over the Bi-LSTM outputs to yield the final sentence representations for the classification. Experiments on the Stanford Natural Language Inference (SNLI) corpus suggest that this is a promising technique for RTE.
Keywords: deep neural models, natural language inference, recognizing textual entailment (RTE), sentence-to-sentence relation
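A minimal sketch of the described architecture (convolutional phrase features feeding a Bi-LSTM, with a relation vector applied like attention over the Bi-LSTM states) is given below. The dimensions, the way the relation vectors are formed, and the classifier head are illustrative assumptions, not the paper's exact model.

```python
# Hedged sketch of a CNN -> BiLSTM sentence encoder with a relation vector used as attention.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationEncoder(nn.Module):
    def __init__(self, vocab_size=10000, emb=100, conv=128, hidden=128, n_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.conv = nn.Conv1d(emb, conv, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(conv, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(4 * hidden, n_classes)   # two sentence vectors of width 2*hidden

    def encode(self, tokens):
        phrases = F.relu(self.conv(self.embed(tokens).transpose(1, 2))).transpose(1, 2)
        states, _ = self.lstm(phrases)                # (batch, seq, 2*hidden)
        return phrases, states

    def attend(self, states, relation):
        # relation vector scores each time step; softmax-weighted sum of BiLSTM states
        scores = torch.einsum("bsd,bd->bs", states, relation)
        return (F.softmax(scores, dim=1).unsqueeze(-1) * states).sum(dim=1)

    def forward(self, premise, hypothesis):
        p_phr, p_states = self.encode(premise)
        h_phr, h_states = self.encode(hypothesis)
        # simple relation vectors: element-wise product of mean-pooled representations
        rel_phrase = p_phr.mean(1) * h_phr.mean(1)    # from the convolutional layer
        rel_sent = p_states.mean(1) * h_states.mean(1)  # from the BiLSTM
        # conv width == hidden here, so duplicating the phrase relation matches 2*hidden
        rel = torch.cat([rel_phrase, rel_phrase], dim=-1) + rel_sent
        p_vec = self.attend(p_states, rel)
        h_vec = self.attend(h_states, rel)
        return self.out(torch.cat([p_vec, h_vec], dim=-1))

model = RelationEncoder()
premise = torch.randint(0, 10000, (4, 20))
hypothesis = torch.randint(0, 10000, (4, 20))
print(model(premise, hypothesis).shape)  # (4, 3): entailment / neutral / contradiction logits
```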
Procedia PDF Downloads 348
159 Power Energy Management for a Grid-Connected PV System Using Rule-Based Fuzzy Logic
Authors: Nousheen Hashmi, Shoab Ahmad Khan
Abstract:
Active collaboration between green energy sources and the load demand leads to serious issues related to power quality and stability. The growing number of green energy resources and distributed generators requires new operating strategies to maintain power stability between these resources and the micro-grid/utility grid. This paper presents a novel technique for power management in a grid-connected photovoltaic system with energy storage under a set of constraints, including weather conditions, load shedding hours, and peak pricing hours, by using a rule-based fuzzy smart grid controller to schedule the power coming from multiple sources (photovoltaic, grid, battery) under the above set of constraints. The technique fuzzifies all the inputs and establishes a fuzzy rule set mapping them to fuzzy outputs before defuzzification. Simulations are run for a 24-hour period and a rule-based power scheduler is developed. The proposed fuzzy control strategy is able to sense the continuous fluctuations in photovoltaic power generation, load demand, grid availability (load shedding patterns), and battery state of charge in order to make correct and quick decisions. The suggested fuzzy rule-based scheduler operates well with vague inputs, thus does not require an exact numerical model, and can handle nonlinearity. This technique provides a framework for extension to handle multiple special cases for optimized working of the system.
Keywords: photovoltaic, power, fuzzy logic, distributed generators, state of charge, load shedding, membership functions
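To illustrate the fuzzify-infer-defuzzify cycle described above, the sketch below hand-rolls a tiny rule-based scheduler with two inputs and one output. The membership breakpoints, rules, and crisp consequents are illustrative assumptions, not the controller designed in the paper.

```python
# Hedged sketch of a rule-based fuzzy scheduler: PV output and battery state of charge
# decide what fraction of the load to draw from the grid (weighted-average defuzzification).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def grid_share(pv_kw, soc_pct):
    pv_low, pv_high = tri(pv_kw, -1, 0, 5), tri(pv_kw, 3, 10, 11)
    soc_low, soc_high = tri(soc_pct, -1, 0, 50), tri(soc_pct, 40, 100, 101)
    # rule base: firing strength = min of antecedents, consequent = crisp grid share
    rules = [
        (min(pv_low, soc_low), 1.0),    # little PV, empty battery -> rely on grid
        (min(pv_low, soc_high), 0.4),   # little PV, full battery  -> partly grid
        (min(pv_high, soc_low), 0.3),   # plenty PV, empty battery -> mostly PV
        (min(pv_high, soc_high), 0.0),  # plenty PV, full battery  -> no grid
    ]
    w = sum(s for s, _ in rules)
    return sum(s * out for s, out in rules) / w if w > 0 else 0.0

for hour, pv, soc in [(2, 0.0, 35), (13, 8.5, 70), (19, 1.0, 90)]:
    print(f"hour {hour:02d}: draw {grid_share(pv, soc):.0%} of load from grid")
```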
Procedia PDF Downloads 479
158 A Deep Learning Model with Greedy Layer-Wise Pretraining Approach for Optimal Syngas Production by Dry Reforming of Methane
Authors: Maryam Zarabian, Hector Guzman, Pedro Pereira-Almao, Abraham Fapojuwo
Abstract:
Dry reforming of methane (DRM) has sparked significant industrial and scientific interest, not only as a viable route for addressing the environmental concerns of two main contributors to the greenhouse effect, i.e., carbon dioxide (CO₂) and methane (CH₄), but also because it produces syngas, i.e., a mixture of hydrogen (H₂) and carbon monoxide (CO) utilized by a wide range of downstream processes as a feedstock for other chemical productions. In this study, we develop an AI-enabled syngas production model to tackle the problem of achieving an equivalent H₂/CO ratio [1:1] with respect to the most efficient conversion. Firstly, the unsupervised density-based spatial clustering of applications with noise (DBSCAN) algorithm removes outlier data points from the original experimental dataset. Then, random forest (RF) and deep neural network (DNN) models employ the error-free dataset to predict the DRM results. DNN models would inherently not be able to obtain accurate predictions without a huge dataset. To cope with this limitation, we employ approaches that reuse pre-trained layers, such as transfer learning and greedy layer-wise pretraining. Compared to the other deep models (i.e., the pure deep model and the transferred deep model), the greedy layer-wise pre-trained deep model provides the most accurate prediction as well as accuracy similar to the RF model, with R² values of 1.00, 0.999, 0.999, 0.999, 0.999, and 0.999 for the total outlet flow, H₂/CO ratio, H₂ yield, CO yield, CH₄ conversion, and CO₂ conversion outputs, respectively.
Keywords: artificial intelligence, dry reforming of methane, artificial neural network, deep learning, machine learning, transfer learning, greedy layer-wise pretraining
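A minimal sketch of greedy layer-wise pretraining for a multi-output regressor is shown below: each hidden layer is trained with a temporary head, frozen, and stacked before a final fine-tune. The layer sizes and synthetic inputs/outputs are placeholders, not the paper's architecture or DRM dataset.

```python
# Hedged sketch of greedy layer-wise pretraining for a regression DNN.
import torch
import torch.nn as nn

def train(model, X, y, epochs=200, lr=1e-3):
    # frozen layers (requires_grad=False) are excluded from the optimiser
    opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(X), y)
        loss.backward()
        opt.step()
    return model

torch.manual_seed(0)
X = torch.rand(500, 6)   # placeholder operating conditions (e.g. temperature, feed ratios)
y = torch.rand(500, 6)   # placeholder targets (e.g. flow, H2/CO ratio, yields, conversions)

hidden_sizes, in_dim = [64, 64, 32], 6
frozen_stack = []                               # layers pretrained so far
for h in hidden_sizes:
    new_block = nn.Sequential(nn.Linear(in_dim, h), nn.ReLU())
    head = nn.Linear(h, y.shape[1])             # temporary head for this stage
    train(nn.Sequential(*frozen_stack, new_block, head), X, y)
    for p in new_block.parameters():            # freeze the newly trained layer
        p.requires_grad = False
    frozen_stack.append(new_block)
    in_dim = h

# fine-tune: unfreeze everything and train the full stack with a final head
for block in frozen_stack:
    for p in block.parameters():
        p.requires_grad = True
final_model = nn.Sequential(*frozen_stack, nn.Linear(in_dim, y.shape[1]))
train(final_model, X, y, epochs=400)
print(final_model(X[:2]))
```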
Procedia PDF Downloads 86
157 Exploring SL Writing and SL Sensitivity during Writing Tasks: Poor and Advanced Writing in a Context of Second Language other than English
Authors: Sandra Figueiredo, Margarida Alves Martins, Carlos Silva, Cristina Simões
Abstract:
This study is part of a larger empirical research project that examines second language (SL) learners’ profiles and valid procedures to perform complete and diagnostic assessment in schools. 102 learners of Portuguese as a SL, aged 7 to 17 years and speakers of distinct home languages, were assessed in several linguistic tasks. In this article, we focus on writing performance in the specific task of narrative essay composition. The written outputs were measured using the scores in six components adapted from an English SL assessment context (Alberta Education): linguistic vocabulary, grammar, syntax, strategy, socio-linguistic, and discourse. The writing processes and strategies in the Portuguese language used by different immigrant students were analysed to determine the features and diversity of deficits in authentic texts produced by SL writers. Differentiated performance was based on the diversity of the following variables: grades, previous schooling, home language, instruction in the first language, and exposure to Portuguese as a Second Language. Speakers of Indo-Aryan languages showed low writing scores compared to their peers, and the type of language and its respective cognitive mapping (such as Mandarin and Arabic) was the predictor, not linguistic distance. Home language instruction should also be prominently considered in further research to understand the specificities of the cognitive academic profile in a Romance language learning context. Additionally, this study also examined the teachers’ representations, which are addressed here to understand the educational implications of second language teaching for the psychological distress of different minorities in schools of specific host countries.
Keywords: home language, immigrant students, Portuguese language, second language, writing assessment
Procedia PDF Downloads 462
156 Co-Operation in Hungarian Agriculture
Authors: Eszter Hamza
Abstract:
The competitiveness of economic operators is based on interoperability, which is relatively low in Hungary. The development of co-operation is a high priority in the Common Agricultural Policy 2014-2020. The aim of the paper is to assess co-operation in Hungarian agriculture and to estimate the economic outputs and benefits of co-operation, based on statistical data processing and the literature. A further objective is to explore the potential of agricultural co-operation with the help of interviews and a questionnaire survey. The research seeks to answer what fundamental factors play a role in the development of co-operation, what the motivations of the actors are, and what the key success factors and pitfalls are. The results were analysed using econometric methods. In Hungarian agriculture we can find several forms of co-operation: cooperatives, producer groups (PG) and producer organizations (PO), machinery cooperatives, integrator companies, product boards and interbranch organisations. Despite the several forms in which agricultural co-operation appears, its economic weight is significantly lower in Hungary than in western European countries. In terms of agricultural importance, the integrator companies carry the most weight among the forms of co-operation. Hungarian farmers are linked to co-operations or organizations mostly in relation to procurement and sales. Less than 30 percent of the surveyed farmers are members of a producer organization or cooperative. The level of trust among farmers is low. The main obstacles to the development of formalized co-operation are producers' risk aversion and the black economy in agriculture. Producers often prefer informal co-operation instead of long-term contractual relationships. Hungarian agricultural co-operations are characterized by non-dynamic development but slow qualitative change. For the future, one breakout point could be the association of producer groups and organizations, which, in addition to the benefits of market concentration, can act more effectively in the dissemination of knowledge, the operation of advisory networks, and innovation.
Keywords: agriculture, co-operation, producer organisation, trust level
Procedia PDF Downloads 394
155 Optimisation of Metrological Inspection of a Developmental Aeroengine Disc
Authors: Suneel Kumar, Nanda Kumar J., Sreelal Sreedhar, Suchibrata Sen, V. Muralidharan
Abstract:
Fan technology is critical and crucial for any aero engine technology, and the fan disc forms a critical part of the fan module. It is an airworthiness requirement to have a metrologically qualified disc. The current study uses tactile probing and scanning on an articulated measuring machine (AMM), a bridge-type coordinate measuring machine (CMM), and metrology software for intermediate and final dimensional and geometrical verification during the prototype development of the disc manufactured through forging and machining processes. The circumferential dovetails manufactured through the milling process are evaluated based on the analysed metrological process. To perform metrological optimisation, a change of philosophy is needed, making quality measurements available as quickly as possible to improve process knowledge and accelerate the process, but with accurate, precise, and traceable measurements. The offline CMM programming for inspection and the optimisation of the CMM inspection plan are crucial portions of the study and are discussed. A dimensional measurement plan as per the ASME B89.7.2 standard is an important requirement for reaching an optimised CMM measurement plan and strategy. The effects of the probing strategy, stylus configuration, and approximation strategy on the circumferential dovetail measurements of the developmental prototype disc are discussed. The results are presented in the form of enhanced R&R (repeatability and reproducibility) values, with uncertainty levels within the desired limits. The findings from the measurement strategy adopted for disc dovetail evaluation and inspection time optimisation are discussed with the help of various analyses and graphical outputs obtained from the verification process.
Keywords: coordinate measuring machine, CMM, aero engine, articulated measuring machine, fan disc
Procedia PDF Downloads 107
154 Empirical Study of Correlation between the Cost Performance Index Stability and the Project Cost Forecast Accuracy in Construction Projects
Authors: Amin AminiKhafri, James M. Dawson-Edwards, Ryan M. Simpson, Simaan M. AbouRizk
Abstract:
Earned value management (EVM) has been introduced as an integrated method to combine schedule, budget, and work breakdown structure (WBS). EVM provides various indices to demonstrate project performance, including the cost performance index (CPI). CPI is also used to forecast the final project cost at completion based on cost performance during project execution. Knowing the final project cost during execution can initiate corrective actions, which can enhance project outputs. CPI, however, is not constant during the project, and calculating the final project cost using a variable index is an inaccurate and challenging task for practitioners. Since CPI is based on cumulative progress values, and because of the learning curve effect, CPI variation dampens and stabilizes as the project progresses. Although various definitions of CPI stability have been proposed in the literature, many scholars have agreed upon the definition that considers a project as stable if the CPI at 20% completion varies less than 0.1 from the final CPI. While the 20% completion point is recognized as the stability point for military development projects, the stability of construction projects has not been studied. In the current study, an empirical study was first conducted using construction project data to determine the stability point for construction projects. Early findings demonstrate that a majority of construction projects stabilize towards completion (i.e., after the 70% completion point). The correlation between CPI stability and the accuracy of the project cost at completion forecast was also investigated. It was determined that as projects progress closer towards completion, the variation of the CPI decreases and the accuracy of the final project cost forecast increases. Most projects were found to have 90% accuracy in the final cost forecast at the 70% completion point, which is in line with the CPI stability findings. It can be concluded that early stabilization of the project CPI results in more accurate cost at completion forecasts.
Keywords: cost performance index, earned value management, empirical study, final project cost
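For reference, the standard EVM relations behind the discussion above can be written as follows (textbook definitions of CPI and the estimate at completion, plus the stability criterion quoted in the abstract; none of the symbols carry study-specific values):

```latex
\begin{align*}
\mathrm{CPI} &= \frac{\mathrm{EV}}{\mathrm{AC}} \quad \text{(earned value over actual cost)}\\
\mathrm{EAC} &= \mathrm{AC} + \frac{\mathrm{BAC}-\mathrm{EV}}{\mathrm{CPI}} \;=\; \frac{\mathrm{BAC}}{\mathrm{CPI}}\\
\text{stability at } p\%\ \text{completion} &\iff \left|\,\mathrm{CPI}_{p\%} - \mathrm{CPI}_{\mathrm{final}}\,\right| < 0.1
\end{align*}
```

The cited definition applies the criterion at p = 20% for military projects; the abstract's finding is that, for construction projects, stability is reached only around p = 70%.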
Procedia PDF Downloads 156
153 Exploration of Industrial Symbiosis Opportunities with an Energy Perspective
Authors: Selman Cagman
Abstract:
A detailed analysis is made within an organized industrial zone (OIZ) that has 1,165 production facilities in sectors such as the manufacture of furniture, fabricated metal products (machinery and equipment), food products, plastic and rubber products, machinery and equipment, non-metallic mineral products, electrical equipment, textile products, and wood and cork products. In this OIZ, a field study was done by choosing facilities that represent the OIZ's overall sectoral distribution. In this manner, 207 facilities were included in the site visits, and a 17-question survey was carried out with each of them to assess their inputs, outputs, and waste amounts during manufacturing processes. The survey results identify MDF/particleboard and chipboard particles, textile, food, foam rubber, sludge (treatment sludge, phosphate-paint sludge, etc.), plastic, paper and packaging, scrap metal (aluminum shavings, steel shavings, iron scrap, profile scrap, etc.), slag (coal slag), ceramic fractures, and ash from the fluidized bed as the wastes coming from these facilities. As a result, five industrial symbiosis projects were established through this study. One of the projects is an Integrated Biomass-Based Waste Incineration-Energy Production Facility with a capacity of 2,840 kW running on 35,000 tons/year of MDF particle and chipboard waste. Another project is a biogas plant using 225 tons/year of whey, 100 tons/year of sesame husk, 40 tons/year of burnt wafer dough, and 2,000 tons/year of biscuit waste. The investment and operational costs of these two plants are given in detail. The payback time of the 2,840 kW plant is almost 4 years, while that of the biogas plant is around 6 years.
Keywords: industrial symbiosis, energy, biogas, waste to incineration
Procedia PDF Downloads 107
152 Feasibility of Solar Distillation as Household Water Supply in Saline Zones of Bangladesh
Authors: Md. Rezaul Karim, Md. Ashikur Rahman, Dewan Mahmud Mim
Abstract:
Scarcity of potable water as a result of rapid climate change and saltwater intrusion into groundwater has been a major problem in coastal regions around the world. In equinoctial countries like Bangladesh, where sunlight is available for more than 10 hours a day, solar distillation provides a promising sustainable way to supply safe drinking water to poor coastal households, with negligible cost and difficulty of construction and maintenance. In this paper, two passive-type solar stills, a Conventional Single Slope Solar Still (CSS) and a Pyramid Solar Still (PSS), are used, and relationships are established between distilled water output and four different factors, temperature, solar intensity, relative humidity, and wind speed, for Gazipur, Bangladesh. The outputs of the two stills are compared over a nine-month period (January-September) and the efficiency is calculated. Later, a thermal mathematical model is developed and the distilled water output for Khulna, Bangladesh is computed. Again, the difference between the outputs of the two cities, Gazipur and Khulna, is demonstrated, and finally an economic analysis is prepared. The distillation output has a positive correlation with temperature and solar intensity, an inverse relation with relative humidity, and wind speed has a negligible effect. The maximum output of the Conventional Solar Still is 3.8 L/m²/day and that of the Pyramid Still is 4.3 L/m²/day for Gazipur, and almost 15% more efficiency is found for the Pyramid Still. Productivity in Khulna is found to be almost 20% higher than in Gazipur. Based on the economic analysis, taking 10 BDT per liter, the net profit, benefit-cost ratio, and payback period all indicate that both stills are feasible, but the Pyramid Still is more feasible than the Conventional Still. Finally, for a 3-4 member family, an area of 4 m² is suggested for the Conventional Still and 3 m² for the Pyramid Solar Still.
Keywords: solar distillation, household water supply, saline zones, Bangladesh
Procedia PDF Downloads 271
151 Impact Assessment of Climate Change on Water Resources in the Kabul River Basin
Authors: Tayib Bromand, Keisuke Sato
Abstract:
This paper presents an introduction to the current water balance and a climate change assessment in the Kabul river basin, covering the historical and future impacts of climate change on different components of water resources and hydrology in the basin. The Kabul river basin, in the eastern part of Afghanistan, was chosen due to rapid population growth and land degradation, in order to quantify the potential influence of global climate change on its hydrodynamic characteristics. Lack of observed meteorological data was the main limitation of the present research; the few existing precipitation stations in the plain area of the Kabul basin were selected for comparison with TRMM precipitation records, and the result was evaluated as satisfactory based on regression and normal ratio methods. The TRMM daily precipitation and NCEP temperature datasets were therefore applied in the SWAT model to evaluate the water balance for 2008 to 2012. The middle of the twenty-first century (2064) was selected as the target period to assess the impacts of climate change on hydrological aspects of the Kabul river basin. For this purpose, three emission scenarios, A2, A1B and B1, and four GCMs, namely MIROC 3.2 (Med), CGCM 3.1 (T47), GFDL-CM2.0 and CNRM-CM3, were selected to estimate the future initial conditions of the proposed model. The outputs of the model were compared and calibrated, with satisfactory R² values, and the hydrodynamic characteristics and precipitation pattern were assessed. The results show that there will be significant impacts on the precipitation pattern, such as a decrease in snowfall in the mountainous area of the basin in the winter season due to a 2.9°C increase in mean annual temperature, as well as land degradation due to deforestation.
Keywords: climate change, emission scenarios, hydrological components, Kabul river basin, SWAT model
Procedia PDF Downloads 465
150 Project Progress Prediction in Software Development Integrating Time Prediction Algorithms and Large Language Modeling
Authors: Dong Wu, Michael Grenn
Abstract:
Managing software projects effectively is crucial for meeting deadlines, ensuring quality, and managing resources well. Traditional methods often struggle to predict project timelines accurately due to uncertain schedules and complex data. This study addresses these challenges by combining time prediction algorithms with Large Language Models (LLMs). It makes use of real-world software project data to construct and validate a model. The model takes detailed project progress data, such as task completion dynamics, team interaction, and development metrics, as its input and outputs predictions of project timelines. To evaluate the effectiveness of this model, a comprehensive methodology is employed, involving simulations and practical applications in a variety of real-world software project scenarios. This multifaceted evaluation strategy is designed to validate the model's significant role in enhancing forecast accuracy and elevating overall management efficiency, particularly in complex software project environments. The results indicate that the integration of time prediction algorithms with LLMs has the potential to optimize software project progress management, and these quantitative results suggest the effectiveness of the method in practical applications. In conclusion, this study demonstrates that integrating time prediction algorithms with LLMs can significantly improve the predictive accuracy and efficiency of software project management. This offers an advanced project management tool for the industry, with the potential to improve operational efficiency, optimize resource allocation, and ensure timely project completion.
Keywords: software project management, time prediction algorithms, large language models (LLMs), forecast accuracy, project progress prediction
Procedia PDF Downloads 79
149 Development of a Fuzzy Logic Based Model for Monitoring Child Pornography
Authors: Mariam Ismail, Kazeem Rufai, Jeremiah Balogun
Abstract:
A study was conducted to apply fuzzy logic to the development of a monitoring model for child pornography based on associated risk factors, which can be used by forensic experts or integrated into forensic systems for the early detection of child pornographic activities. A number of methods were adopted in the study: an extensive review of related works was done in order to identify the factors that are associated with child pornography, following which the factors were validated by an expert sex psychologist and guidance counselor, and relevant data were collected. Fuzzy membership functions were used to fuzzify the identified variables alongside the risk of the occurrence of child pornography, based on the inference rules provided by the experts consulted, and the fuzzy logic expert system was simulated using the Fuzzy Logic Toolbox available in MATLAB Release 2016. The results of the study showed that there were 4 categories of risk factors required for assessing the risk of a suspect committing child pornography offenses; that 2 and 3 triangular membership functions were used to formulate the risk factors, based on the 2 or 3 labels assigned, respectively; and that 5 fuzzy logic models were formulated, such that the first 4 were used to assess the impact of each category on child pornography while the last one takes the outputs of the 4 fuzzy logic models as the inputs required for assessing the risk of child pornography. The following conclusion was made: factors related to the personal traits, social traits, history of child pornography crimes, and self-regulatory deficiency traits of the suspects are required for the assessment of the risk of child pornography crimes committed by a suspect. Using the values of the identified risk factors selected for this study, the risk of child pornography can be easily assessed in order to determine the likelihood of a suspect perpetrating the crime.
Keywords: fuzzy, membership functions, pornography, risk factors
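The hierarchical layout described above (four category-level fuzzy models whose outputs feed a final risk model) can be sketched as below. This is an illustrative Python stand-in for the MATLAB Fuzzy Logic Toolbox setup; the membership breakpoints, labels, and rule consequents are placeholders, not the study's expert-provided rules.

```python
# Hedged sketch of a hierarchical fuzzy risk model: four category models feed one final model.
def trimf(x, a, b, c):
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def category_risk(score):
    """One category-level model: maps a 0-10 factor score to a 0-1 risk."""
    low, med, high = trimf(score, -1, 0, 5), trimf(score, 2, 5, 8), trimf(score, 5, 10, 11)
    # zero-order Sugeno style: weighted average of crisp rule outputs
    return (low * 0.1 + med * 0.5 + high * 0.9) / (low + med + high)

def overall_risk(personal, social, history, self_regulation):
    # the final model takes the four category outputs as its inputs
    subs = [category_risk(v) for v in (personal, social, history, self_regulation)]
    low = [trimf(s, -0.1, 0.0, 0.5) for s in subs]
    high = [trimf(s, 0.5, 1.0, 1.1) for s in subs]
    # two illustrative rules: all categories low -> low risk; any category high -> high risk
    fire_low, fire_high = min(low), max(high)
    return (fire_low * 0.1 + fire_high * 0.9) / (fire_low + fire_high + 1e-9)

print(round(overall_risk(personal=7, social=4, history=9, self_regulation=6), 2))
```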
Procedia PDF Downloads 129
148 Theoretical Comparisons and Empirical Illustration of Malmquist, Hicks–Moorsteen, and Luenberger Productivity Indices
Authors: Fatemeh Abbasi, Sahand Daneshvar
Abstract:
Productivity is one of the essential goals of companies seeking to improve performance and, as a strategy-oriented measure, it determines the basis of a company's economic growth. The history of productivity goes back centuries, but in the early twentieth century most researchers defined productivity as the relationship between a product and the factors used in its production. Productivity as the optimal use of available resources means that 'more output using less input' can increase companies' capacity for economic growth and prosperity. A good quality of life based on economic progress also depends on productivity growth in that society. Therefore, productivity is a national priority for any developed country. There are several methods for measuring productivity growth, which can be divided into parametric and non-parametric methods. Parametric methods rely on the existence of a functional form in their hypotheses, while non-parametric methods do not require a function and are based on empirical evidence. One of the most popular non-parametric methods is Data Envelopment Analysis (DEA), which measures changes in productivity over time. DEA evaluates the productivity of decision-making units (DMUs) based on mathematical models. This method uses multiple inputs and outputs to compare the productivity of similar DMUs such as banks, government agencies, companies, airports, etc. Non-parametric methods are themselves divided into frontier and non-frontier approaches. The Malmquist productivity index (MPI) proposed by Caves, Christensen, and Diewert (1982), the Hicks–Moorsteen productivity index (HMPI) proposed by Bjurek (1996), and the Luenberger productivity indicator (LPI) proposed by Chambers (2002) are powerful tools for measuring productivity changes over time. This study compares the Malmquist, Hicks–Moorsteen, and Luenberger indices theoretically and empirically based on DEA models and reviews their strengths and weaknesses.
Keywords: data envelopment analysis, Hicks–Moorsteen productivity index, Luenberger productivity indicator, Malmquist productivity index
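For orientation, the Malmquist index referred to above is conventionally written as the geometric mean of two distance-function ratios (standard textbook form, with the distance functions D estimated by DEA; this is not a result of the study):

```latex
\[
M^{t,t+1}\!\left(x^{t},y^{t},x^{t+1},y^{t+1}\right) =
\left[
\frac{D^{t}\!\left(x^{t+1},y^{t+1}\right)}{D^{t}\!\left(x^{t},y^{t}\right)}
\cdot
\frac{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D^{t+1}\!\left(x^{t},y^{t}\right)}
\right]^{1/2}
\]
```

Values above 1 are conventionally read as productivity growth between periods t and t+1; the Hicks–Moorsteen index and the Luenberger indicator are built from the same kind of distance functions but combine them as a ratio of output to input quantity indices and as differences, respectively.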
Procedia PDF Downloads 194
147 Quantifying Rumen Enteric Methane Production in Extensive Production Systems
Authors: Washaya Soul, Mupangwa John, Mapfumo Lizwell, Muchenje Voster
Abstract:
Ruminant animals contribute a considerable amount of methane to the atmosphere, which is a cause of concern for global warming. Two studies were conducted, in beef cattle and in goats, aiming to determine the enteric CH₄ levels from a herd of beef cows raised on semi-arid rangelands and to evaluate the effect of supplementing goats with the forage legumes Vigna unguiculata and Lablab purpureus on enteric methane production. A total of 24 cows were selected from Boran and Nguni cows (n = 12 per breed) from two different farms; parity (P1-P4) and season (dry vs. wet) were considered predictor variables in the first experiment. Eighteen goats (weaners, 9 males, 9 females) were used, in which sex and forage species were the predictor variables in the second experiment. Three treatment diets were used in goats. Methane was measured using a laser methane detector (LMD) for six consecutive days, repeated once every three months in beef cows and once every week for 6 weeks in goats during the post-adaptation period. Parity and breed had no effect on CH₄ production in beef cows; however, season significantly influenced CH₄ outputs. Methane production was higher (P<0.05) in the dry than in the wet season: 31.1 and 28.8 g CH₄/kg DMI for the dry and wet seasons, respectively. In goats, forage species and the sex of the animal affected enteric methane production (P<0.05). Animals produced more gas when ruminating than when feeding or just standing, for all treatments. The control treatment exhibited higher (P<0.05) methane emissions per kg of DMI. Male goats produced more methane than females (17.40 L/day, 12.46 g/kg DMI, and 0.126 g/day) versus (15.47 L/day, 12.28 g/kg DMI, 0.0109 g/day), respectively. It was concluded that cows produce more CH₄/DMI during the dry season, that forage legumes reduce enteric methane production in goats, and that male goats produce more gas than females. It is recommended to introduce forage legumes, particularly during the dry season, to reduce the amount of gas produced.
Keywords: beef cows, extensive grazing system, forage legumes, greenhouse gases, goats, laser methane detector
Procedia PDF Downloads 66
146 The Many Faces of Inspiration: A Study on Socio-Cultural Influences in Design
Authors: Nithya Venkataraman
Abstract:
The creative journey in design often starts with a spark of inspiration, the source of which can be any of myriad stimuli: nature, poetry, personal experiences, or even fleeting thoughts and images. While it is indeed an important source of creative exploration, the interpretation of this inspiration may oftentimes be influenced by demographic and psychographic variables of the creator, with age, gender, life-cycle stage, personal experiences, and individual personality traits being some of these factors. Common sources of inspiration can thus be interpreted differently, translating to different elements of design and varied principles in their execution. Do such variables in the creator influence the nature of the creative output? If yes, what are the visible matrices in the output which can be differentiated? An observational study with two groups of design students, studying in the same design institute under the guidance of the same design mentor, was conducted to map this influence. Both groups were unaware of each other but worked with a common source of inspiration provided by the instructor. In order to maintain congruence, both groups were provided with lyrical compositions from well-known ballads and poetry as the source of their inspiration. The outputs were abstract renditions using lines, colors, and shapes, and these were analyzed under matrices for the elements and principles used to create the compositions. The study indicated that there was a demarcation between the two groups in terms of the lines, colors, and shapes chosen to create the compositions. The groups also tended to use repetition, proportion, and emphasis differently, giving rise to varied uses of design principles. The study offered interesting observations on how design interpretation can vary for the same source of inspiration, based on demographic and psychographic variances. The implications can be traced not just to the process of creative design, but also to the deep social roots that bind creative thinking and design ideation, which can provide an interesting commentary between different cohorts on what constitutes 'good design'.
Keywords: design compositions, inspiration, interpretation, psychographic factors, social factors
Procedia PDF Downloads 121
145 Unsupervised Classification of DNA Barcodes Species Using Multi-Library Wavelet Networks
Authors: Abdesselem Dakhli, Wajdi Bellil, Chokri Ben Amar
Abstract:
A DNA barcode is a short mitochondrial DNA fragment made up of nucleotides, each composed of three subunits: a phosphate group, a sugar, and a nucleic base (A, T, C, or G). DNA barcodes provide a good source of the information needed to classify living species. Such intuition has been confirmed by many experimental results. Species classification with DNA barcode sequences has been studied by several researchers. The classification problem assigns unknown species to known ones by analyzing their barcodes. This task has to be supported with reliable methods and algorithms. To analyze species regions or entire genomes, it becomes necessary to use sequence similarity methods. A large set of sequences can be simultaneously compared using Multiple Sequence Alignment, which is known to be NP-complete. To make this type of analysis feasible, heuristics, like progressive alignment, have been developed. Another tool for similarity search against a database of sequences is BLAST, which outputs shorter regions of high similarity between a query sequence and matched sequences in the database. However, all these methods are still computationally very expensive and require significant computational infrastructure. Our goal is to build predictive models that are highly accurate and interpretable. This method permits us to avoid the complex problem of form and structure in different classes of organisms. The approach is tested on empirical data, and its classification performance is compared with other methods. Our system consists of three phases. The first is called transformation and is composed of three steps: Electron-Ion Interaction Pseudopotential (EIIP) for the codification of DNA barcodes, Fourier transform, and power spectrum signal processing. The second is called approximation, which is empowered by the use of Multi-Library Wavelet Neural Networks (MLWNN). The third is the classification of DNA barcodes, which is realized by applying a hierarchical classification algorithm.
Keywords: DNA barcode, electron-ion interaction pseudopotential, Multi-Library Wavelet Neural Networks (MLWNN)
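A small sketch of the transformation phase (EIIP numerical coding followed by Fourier transform and power spectrum) is given below. The EIIP values are the commonly cited nucleotide values, and the sequence is a toy example rather than a real barcode.

```python
# Hedged sketch of EIIP coding of a barcode sequence, followed by FFT and power spectrum.
import numpy as np

EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def power_spectrum(seq):
    signal = np.array([EIIP[b] for b in seq.upper()])
    signal = signal - signal.mean()            # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    return spectrum / len(signal)              # normalised power spectrum

barcode = "ACTGGTACCTGAACCTGATTTCGGAAACCATAAGATATT"  # toy fragment, not a real barcode
features = power_spectrum(barcode)
print(features.shape, features[:5].round(5))
# These spectral features would then feed the wavelet-network approximation
# and hierarchical classification stages described in the abstract.
```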
Procedia PDF Downloads 317
144 The Rapid Industrialization Model
Authors: Fredrick Etyang
Abstract:
This paper presents a Rapid Industrialization Model (RIM) designed to support existing industrialization policies, strategies, and industrial development plans at the national, regional, and constituent level in Africa. The model will reinforce efforts by state and non-state actors towards the attainment of inclusive and sustainable industrialization of Africa. The overall objective of this model is to serve as a framework for rapid industrialization in developing economies; the specific objectives range from supporting rapid industrial development to promoting structural change in the economy, balanced regional industrial growth, and the achievement of local, regional, and international competitiveness in areas of clear comparative advantage in industrial exports; ultimately, the RIM will serve as a step-by-step guideline for the industrialization of African economies. This model is a product of a scientific research process underpinned by desk research, through the review of African countries' development plans, strategies, datasets, and industrialization efforts, and by consultation with key informants. The rigorous research process unearthed multi-directional and renewed efforts towards the industrialization of Africa, premised on the collective commitment of individual states, regional economic communities, and the African Union Commission, among other strategic stakeholders. It was further established that the inputs into the industrialization of Africa outshine the levels of industrial development on the continent. The RIM comes in handy as a step-by-step framework for African countries to follow in their industrial development efforts, transforming inputs into tangible outputs and outcomes in the short, intermediate, and long run. The model postulates three stages of industrialization and three phases toward the rapid industrialization of African economies; it is simple to understand, easily implementable, and contextualizable, with a high return on each unit invested in industrialization supported by the model. Therefore, effective implementation of the model will result in the inclusive and sustainable rapid industrialization of Africa.
Keywords: economic development, industrialization, economic efficiency, exports and imports
Procedia PDF Downloads 83
143 The Application of Line Balancing Technique and Simulation Program to Increase Productivity in Hard Disk Drive Components
Authors: Alonggot Limcharoen, Jintana Wannarat, Vorawat Panich
Abstract:
This study aims to investigate the balancing of the number of operators (the line balancing technique) in the production line of hard disk drive components in order to increase efficiency. At present, the use of hard disk drives has continuously declined, limiting the company's revenue potential. It is important to improve and develop the production process to create market share and to be able to compete with competitors on value and quality. Therefore, an effective tool is needed to support such matters. In this research, the Arena program was applied to analyze the results both before and after the improvement, and the improved scenario was verified in simulation before proceeding with the real process. There were 14 work stations with 35 operators altogether in the RA production process where this study was conducted. In the actual process, the average production time was 84.03 seconds per piece (timed 30 times at each work station), along with a rating assessment implementing the Westinghouse principles. This showed a rating of 123% under an assumption of 5% allowance time; consequently, the standard time was 108.53 seconds per piece. The takt time, calculated from the customer demand and the working duration in one day, was 3.66 seconds per piece. From these figures, the proper number of operators was 30 people, which meant five operators should be removed in order to improve the production process. After that, a production model was created from the actual process using the Arena program to confirm model reliability; the outputs of the simulation were compared with the actual process, and the match between them indicated that the model was reliable. Then, the worker numbers and their job responsibilities were remodeled in the Arena program. Lastly, the efficiency of the production process was enhanced from 70.82% to 82.63%, in line with the target.
Keywords: hard disk drive, line balancing, ECRS, simulation, arena program
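The standard-time and operator-count arithmetic quoted above can be checked with a few lines; the way the rating and allowance are combined below follows the usual time-study convention and is an assumption about the authors' exact procedure.

```python
# Worked check of the figures quoted in the abstract.
import math

observed_time = 84.03   # s/piece, average of 30 observations per work station
rating = 1.23           # Westinghouse performance rating (123%)
allowance = 0.05        # 5% allowance time
takt_time = 3.66        # s/piece, from daily demand and working time

standard_time = observed_time * rating * (1 + allowance)
operators_needed = math.ceil(standard_time / takt_time)

print(f"standard time    = {standard_time:.2f} s/piece")  # ~108.53 s, as reported
print(f"operators needed = {operators_needed}")           # 30, i.e. 5 fewer than the current 35
```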
Procedia PDF Downloads 225
142 Determinants of Green Strategy: Analysis Using Probit and Logit Models
Authors: Ayushi Modi, Eliot Bochet-Merand
Abstract:
This study investigates the structural determinants of green strategies among Small and Medium Enterprises (SMEs) in the European Union and select countries, utilizing data from the Flash Eurobarometer 498 - SMEs, Resource Efficiency, and Green Markets. By applying sequential logit analysis, we explore the drivers behind the adoption and scaling of green actions, such as resource efficiency, waste management, and product innovation, while also examining the provision of green products and services. A key contribution of this research is the novel distinction between the process stage (green actions) and the product stage (green outputs), allowing for a deeper analysis of how green initiatives translate into sustainable business outcomes. Our findings reveal that structural characteristics, such as firm size, sector, and turnover growth, significantly influence the likelihood of both providing green products and implementing comprehensive green actions. Smaller, younger firms in high-impact sectors like construction and industry are more likely to engage in sustainability efforts, particularly when they have a green strategy and a dedicated green workforce. Furthermore, companies serving B2B and B2C clients and experiencing turnover growth are more inclined to offer green products. The study underscores the economic implications of these insights, suggesting that financial flexibility, strategic commitment, and human capital investments are critical for scaling green initiatives. By refining variables and excluding heterogeneous countries, our data management ensures robust results. This research provides novel insights into the distinct roles of the process and product stages in sustainability, offering valuable policy recommendations for promoting environmental performance in SMEs.
Keywords: green strategy, resource efficiency, SMEs, sustainability, product innovation, environmental performance
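As a pointer to the kind of models named in the title, the sketch below fits a binary logit and probit to synthetic firm-level data; the variable names stand in for the structural characteristics discussed above and are not the Eurobarometer 498 coding, and the sequential-logit staging is omitted for brevity.

```python
# Hedged sketch of a binary logit/probit comparison on synthetic firm data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "small_firm": rng.integers(0, 2, n),
    "industry_sector": rng.integers(0, 2, n),
    "turnover_growth": rng.integers(0, 2, n),
})
# synthetic outcome: probability of offering green products rises with each trait
logit_p = -1.0 + 0.6 * df["small_firm"] + 0.5 * df["industry_sector"] + 0.8 * df["turnover_growth"]
df["offers_green_products"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["small_firm", "industry_sector", "turnover_growth"]])
logit_res = sm.Logit(df["offers_green_products"], X).fit(disp=False)
probit_res = sm.Probit(df["offers_green_products"], X).fit(disp=False)
print(logit_res.params.round(3))
print(probit_res.params.round(3))
```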
Procedia PDF Downloads 18
141 Design of Solar Charge Controller and Power Converter with the Multisim
Authors: Sohal Latif
Abstract:
Solar power in the form of photovoltaics (PV) is a form of renewable energy that uses solar panels to produce electricity from the sun. It has a vital role in fulfilling the present need for clean and renewable energy and in moving away from conventional, non-renewable energy sources that emit high levels of greenhouse gases. Solar energy is embraced because of its availability, easy accessibility, and effectiveness in the provision of power, chiefly in rural areas. Solar charging entails the conversion of light into electricity using photovoltaic (PV) panels, which supply direct current (DC) electric power. Here, the solar charge controller plays a crucial role in regulating the voltages and currents coming from the solar panels to meet the changing needs of a battery without overcharging it. Certain devices, such as inverters, are required to transform the DC power produced by the solar panels into AC to serve normal electrical appliances and the existing power network. This project concerns the design of a solar charge controller and power converter with Multisim. The project begins with a literature survey to obtain basic knowledge about power converters, charge controllers, and photovoltaic systems. The fundamentals of solar panel operation are covered, including the process by which light is converted into electricity and a comparison of PWM and MPPT charge controllers. Knowledge of rectifiers is built up to help achieve AC-to-DC and DC-to-AC conversion. Resistors, capacitors, MOSFETs, and op-amps are chosen according to the needs of the system. The circuit diagrams of the converters and charge controllers are designed using the Multisim program. Pulse width modulation, the Bubba oscillator circuit, and inverter circuits are modeled and simulated. In the subsequent steps, the analysis of the simulation outcomes indicates the efficiency of the intended converter systems. The outputs from the different configurations, with and without the transformer incorporated, are then monitored for effective power conversion as well as power regulation.
Keywords: solar charge controller, MULTISIM, converter, inverter
Procedia PDF Downloads 22
140 Integrating Dependent Material Planning Cycle into Building Information Management: A Building Information Management-Based Material Management Automation Framework
Authors: Faris Elghaish, Sepehr Abrishami, Mark Gaterell, Richard Wise
Abstract:
Collaboration and integration between all building information management (BIM) processes and tasks are necessary to ensure that all project objectives can be delivered. A literature review has been used to explore state-of-the-art BIM technologies for managing construction materials, as well as the challenges faced by the construction process when traditional methods are used. Thus, this paper aims to articulate a framework that integrates traditional material planning methods, such as ABC analysis (the Pareto principle) to analyse and categorise the project materials, as well as independent material planning methods such as Economic Order Quantity (EOQ) and Fixed Order Point (FOP), into the BIM 4D and 5D capabilities, in order to embed a dependent material planning cycle into BIM that relies on the constructability method. Moreover, we build a model to connect the material planning outputs with the BIM 4D and 5D data to ensure that all project information is accurately presented through integrated and complementary BIM reporting formats. Furthermore, this paper presents a method to integrate the risk management output with the material management process to ensure that all critical materials are monitored and managed across all project stages. The paper includes browsers which are proposed to be embedded in any 4D BIM platform in order to predict the EOQ as well as the FOP and alert the user during the construction stage. This enables the planner to check the status of the materials on site as well as to receive an alert when a new order should be requested. Therefore, all project information can be managed in a single context, avoiding missing information at the early design stage. Subsequently, the planner will be capable of building a more reliable 4D schedule by allocating the categorised materials with the required EOQ and checking the optimum locations for inventory and the temporary construction facilities.
Keywords: building information management, BIM, economic order quantity, EOQ, fixed order point, FOP, BIM 4D, BIM 5D
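The EOQ and fixed-order-point quantities that the proposed browsers would predict follow standard inventory formulas; the sketch below evaluates them with illustrative demand, cost, and lead-time figures rather than values from the paper.

```python
# Standard EOQ and reorder-point calculation with illustrative placeholder inputs.
import math

annual_demand = 12000      # units of a material per year (D)
order_cost = 150.0         # cost per order (S)
holding_cost = 2.5         # holding cost per unit per year (H)
daily_demand = annual_demand / 365
lead_time_days = 14
safety_stock = 200         # buffer against schedule/supply variability

eoq = math.sqrt(2 * annual_demand * order_cost / holding_cost)   # EOQ = sqrt(2DS/H)
reorder_point = daily_demand * lead_time_days + safety_stock     # FOP trigger level

print(f"EOQ ≈ {eoq:.0f} units per order")
print(f"Reorder when on-site stock falls to {reorder_point:.0f} units")
```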
Procedia PDF Downloads 172
139 Bioinformatics Identification of Rare Codon Clusters in Proteins Structure of HBV
Authors: Abdorrasoul Malekpour, Mohammad Ghorbani, Mojtaba Mortazavi, Mohammadreza Fattahi, Mohammad Hassan Meshkibaf, Ali Fakhrzad, Saeid Salehi, Saeideh Zahedi, Amir Ahmadimoghaddam, Parviz Farzadnia, Mohammadreza Hajyani Asl
Abstract:
Hepatitis B, as an infectious disease, has eight main genotypes (A–H). The aim of this study is to bioinformatically identify Rare Codon Clusters (RCCs) in the protein structures of HBV. For the detection of the protein family accession numbers (Pfam) of HBV proteins, the UniProt database and the Pfam search tool were used. The obtained Pfam IDs were analyzed in the Sherlocc program, and RCCs in HBV proteins were detected. Furthermore, the structures of the TrEMBL entry proteins were studied in the PDB database, and the 3D structures of the HBV proteins and the locations of the RCCs were visualized and studied using the Swiss-PdbViewer software. The Pfam search tool found nine significant hits and no insignificant hits in 3 frames. The results of the Pfams studied in the Sherlocc program show that this program did not identify RCCs in the external core antigen (PF08290) and the truncated HBeAg protein (PF08290). By contrast, RCCs were identified in the hepatitis core antigen (PF00906), large envelope protein S (PF00695), X protein (PF00739), DNA polymerase (viral) N-terminal domain (PF00242), and protein P (PF00336). In the HBV genome, seven RCCs were identified, found in the hepatitis core antigen, large envelope protein S, and DNA polymerase proteins; the protein structures of the TrEMBL entry sequences reported in the Sherlocc program outputs are not complete. Based on the location of the RCCs in the structure of the HBV proteins, it is suggested that these RCCs are important in the HBV life cycle. We hope that this study provides a new and deep perspective for protein research and drug design for the treatment of HBV.
Keywords: rare codon clusters, hepatitis B virus, bioinformatic study, infectious disease
Procedia PDF Downloads 488
138 Study of the Biochemical Properties of the Milk-Clotting Protease Extracted from Sunflower Cake: Manufacturing Test of Uncooked Pressed-Dough Cheeses and Analysis of Sensory Properties
Authors: Kahlouche Amal, Touzene F. Zohra, Betatache Fatiha, Nouani Abdelouahab
Abstract:
The growth of world cheese production in recent decades, as well as the greater demand for cheap coagulating agents, has accentuated the search for new substitutes for rennet. Hence the interest in exploring plant biodiversity, a cheap source of many natural metabolites that scientists praise today (thistle, fig tree latex, cardoon, melon seeds). Indeed, much interest concerns the search for substitutes of plant origin. The objective of the study is to show the possibility of extracting a milk-clotting protease from sunflower cake, an available raw material and a potential source of rennet substitutes. Thus, determining the proteolytic activity of the crude extracts, purifying them, removing the colour pigments from the enzymatic preparations, and gaining a better knowledge of the coagulating properties through the study of the effect of certain factors (temperature, pH, CaCl₂ concentration) all contribute to adding value to milk, particularly that produced by the small ruminants of Algerian dairy farms. Although coagulant extracts of plant origin have already made it possible to develop traditional cheeses, of which the Iberian Peninsula is the main promoter, a manufacturing test of 'uncooked pressed dough' cheese was carried out here at the semi-pilot scale using the enzymatic extract of sunflower (Helianthus annuus), which gave satisfactory results both in terms of yield and at the sensory level; statistically, no significant difference was found between the studied cheeses. These results confirm the possibility of using this coagulant as a substitute for commercial rennet on an industrial scale.
Keywords: characterization, cheese, rennet, sunflower
Procedia PDF Downloads 351
137 Study on Seismic Performance of Reinforced Soil Walls in Order to Offer Modified Pseudo Static Method
Authors: Majid Yazdandoust
Abstract:
This study suggests a displacement-based design method using finite difference numerical modeling of soil retaining walls reinforced with steel strips. Dynamic loading characteristics such as duration, frequency, and peak ground acceleration, the geometrical characteristics of the reinforced soil structure, and the type of site are considered in order to correct the pseudo-static method and, finally, to introduce the pseudo-static coefficient as a function of seismic performance level and peak ground acceleration. For this purpose, the influence of the dynamic loading characteristics, reinforcement length, height of the reinforced system, and type of site on the seismic behavior of steel-strip reinforced soil retaining walls is investigated. Numerical results illustrate that the seismic response of this type of wall is highly dependent on cumulative absolute velocity, maximum acceleration, height, and reinforcement length, so that the reinforcement length can be introduced as the main factor in the shape of failure. Consideration of the loading parameters, the mechanically stabilized earth wall parameters, and the type of site showed that the method used in this study leads to more efficient designs in comparison with other methods generally suggested in codes, which are usually based on the limit-equilibrium concept. The outputs show the over-estimation of the equilibrium design methods in comparison with the displacement-based method proposed here.
Keywords: pseudo static coefficient, seismic performance design, numerical modeling, steel strip reinforcement, retaining walls, cumulative absolute velocity, failure shape
Procedia PDF Downloads 485
136 A Hierarchical Bayesian Calibration of Data-Driven Models for Composite Laminate Consolidation
Authors: Nikolaos Papadimas, Joanna Bennett, Amir Sakhaei, Timothy Dodwell
Abstract:
Composite modeling of consolidation processes plays an important role in process and part design by indicating the possible formation of unwanted defects prior to expensive experimental iterative trial and development programs. Composite materials in their uncured state display complex constitutive behavior, which has received much academic interest, with different models proposed. Modeling and statistical errors which arise from this fitting will propagate through any simulation in which the material model is used. A general hyperelastic polynomial representation was proposed, which can be readily implemented in various nonlinear finite element packages; in our case, FEniCS was chosen. The coefficients are assumed uncertain, and therefore the distribution of parameters is learned using Markov Chain Monte Carlo (MCMC) methods. In engineering, the approach often followed is to select a single set of model parameters which, on average, best fits a set of experiments. There are good statistical reasons why this is not a rigorous approach to take. To overcome these challenges, a hierarchical Bayesian framework was proposed in which the population distribution of model parameters is inferred from an ensemble of experimental tests. The resulting sampled distribution of hyperparameters is approximated using Maximum Entropy methods so that the distribution can be readily sampled when embedded within a stochastic finite element simulation. The methodology is validated and demonstrated on a set of consolidation experiments of AS4/8852 with various stacking sequences. The resulting distributions are then applied to stochastic finite element simulations of the consolidation of curved parts, leading to a distribution of possible model outputs. With this, the paper, as far as the authors are aware, represents the first stochastic finite element implementation in composite process modelling.
Keywords: data-driven, material consolidation, stochastic finite elements, surrogate models
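A minimal sketch of the hierarchical structure described above, population-level hyperparameters over per-test material parameters inferred by MCMC, is given below on synthetic data. It assumes a PyMC v5-style API and is not the authors' FEniCS-embedded implementation.

```python
# Hedged sketch of a hierarchical Bayesian calibration on synthetic "experiments".
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_tests, n_obs = 6, 20
theta_true = rng.normal(2.0, 0.3, n_tests)          # per-test "material parameter"
test_idx = np.repeat(np.arange(n_tests), n_obs)
y = rng.normal(theta_true[test_idx], 0.1)           # noisy observations per test

with pm.Model() as model:
    mu = pm.Normal("mu", 0.0, 10.0)                  # population mean (hyperparameter)
    tau = pm.HalfNormal("tau", 1.0)                  # between-test variability (hyperparameter)
    theta = pm.Normal("theta", mu=mu, sigma=tau, shape=n_tests)  # per-test parameters
    sigma = pm.HalfNormal("sigma", 1.0)              # observation noise
    pm.Normal("obs", mu=theta[test_idx], sigma=sigma, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

print(idata.posterior["mu"].mean().item(), idata.posterior["tau"].mean().item())
```

The sampled hyperparameter posterior is what would then be summarised (e.g. by Maximum Entropy) and fed into the stochastic finite element simulations.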
Procedia PDF Downloads 145
135 Tourism Oriented Planning Experience in the Historical City Center of Trabzon (Turkey) with Strategic Spatial Planning Approach: Evaluation of Approach and Process
Authors: Emrehan Ozcan, Dilek Beyazlı
Abstract:
The development of tourism depends on an accurate planning approach as well as on the right planning process, and this dependency is also a key factor in ensuring the sustainability of tourism. The types of tourism, social expectations, planning practice, and the socio-economic and cultural structure of the region are determinants of planning approaches for tourism development. Tourism plans prepared for historic city centers are usually based on the revitalization of cultural and historical values. The preservation and development of the tourism potential of historic city centers are important for providing an economic contribution to the locality, creating livable solutions for local residents, and ensuring the sustainability of tourism. This research is about experiencing and discussing a planning approach that will provide tourism development based on historical and cultural values. The historical city center of Trabzon, which has a settlement history of approximately 4,000 years and is located on the Black Sea coast of Turkey, has seen its historical and cultural values wear out over the years and lose their tourism potential. A planning study based on the strategic spatial planning approach has been carried out for Trabzon, for which no tourism-oriented planning study had previously been done. The stages of the planning process provided by the strategic spatial planning approach are: assessment of the current situation; vision, strategies, and actions; action planning; design and implementation of actions; and monitoring-evaluation. In the discussion section, the advantages, planning process, methods, and techniques of the approach are discussed in terms of the possibilities and constraints of tourism planning. In this context, it is aimed to set out the tourism planning process, its stages, and its implementation tools within the scope of the strategic spatial planning approach by comparing the approaches used in the tourism-oriented/priority planning of historical city centers. Suggestions on the position and effect of the preferred planning approach within existing spatial planning practice are the outputs of the study.
Keywords: cultural heritage, tourism oriented planning, Trabzon, strategic spatial planning
Procedia PDF Downloads 258
134 Assessing the Benefits of Super Depo Sutorejo as a Model of Integration of Waste Pickers in a Sustainable City Waste Management
Authors: Yohanes Kambaru Windi, Loetfia Dwi Rahariyani, Dyah Wijayanti, Eko Rustamaji
Abstract:
Surabaya, the second largest city in Indonesia, has been struggling for years with waste production and its management. Nearly 11,000 tons of waste are generated daily by domestic, commercial, and industrial areas. It was estimated that approximately 1,300 tons of waste overflowed the Benowo Landfill daily in 2013, and the landfill operation was projected to become critical by 2015. The Super Depo Sutorejo (SDS) is a pilot project on waste management launched by the government of Surabaya in March 2013. The project aims to reduce the amount of waste dumped in the landfill by employing waste pickers to sort recyclable and organic waste for composting before the residue is transported to the landfill. This study assesses the capacity of SDS to process and reduce waste, along with its complementary benefits. It also reviews the benefits of the project to the waste pickers in terms of job satisfaction. Waste processing datasheets were used to assess the difference between input and output waste. A survey was distributed to 30 waste pickers, and interviews were conducted for further insight into particular issues. The analysis showed that SDS is able to reduce waste by up to 50% before it is dumped in the final disposal area. The cost-benefit analysis, using a cost differential calculation, revealed that the direct economic benefit is considerably low, but composting may provide tangible benefits for maintaining the city's parks. Waste pickers are mostly satisfied with their jobs (i.e., salary, health coverage, job security) and with the services and facilities available at SDS, and they enjoy a rewarding social life within the project. It is concluded that SDS is an effective and efficient model for sustainable waste management that can reliably be developed in developing countries. It is a strategic approach to empowering and opening up working opportunities for the poor urban community and prolonging the operation of landfills.
Keywords: cost-benefits, integration, satisfaction, waste management
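The input-output comparison and the cost differential calculation described above could be summarised, under purely illustrative figures, as in the following sketch; none of the numbers or parameter names below are values reported by the study.

```python
# Illustrative-only sketch of the two calculations described above:
# (1) waste reduction from the processing datasheets and (2) a simple
# cost-differential comparison. All figures are placeholders, not data
# reported by the study.

def waste_reduction(input_tons, residual_tons):
    """Fraction of incoming waste diverted from the landfill."""
    return (input_tons - residual_tons) / input_tons

def cost_differential(operating_cost, avoided_landfill_cost, compost_value):
    """Net benefit of the depot relative to direct landfilling."""
    return (avoided_landfill_cost + compost_value) - operating_cost

diversion = waste_reduction(input_tons=20.0, residual_tons=10.0)   # ~50 %
net = cost_differential(operating_cost=1_000.0,
                        avoided_landfill_cost=700.0,
                        compost_value=250.0)
print(f"diversion rate: {diversion:.0%}, net benefit: {net:+.0f}")
```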
Procedia PDF Downloads 476
133 Destruction of Coastal Wetlands in Harper City-Liberia: Setting Nature against the Future Society
Authors: Richard Adu Antwako
Abstract:
Coastal wetland destruction and its consequences have recently taken center stage in global discussions. The phenomenon is no gray area to humanity, as coastal wetland-human interaction has been ingrained in civilizations since the earliest times, amidst the demanding use of wetland resources to meet human necessities. The severity of coastal wetland destruction has grown in parallel with civilizations, and it is against this backdrop that this paper interrogates the causes of coastal wetland destruction in Harper City in Liberia, compares the degree of coastal wetland stressors against a non-equilibrium thermodynamic scale, and suggests an integrated coastal zone management approach to address the problems. Literature complemented the primary data gathered via global positioning system devices, field observation, questionnaires, and interviews. Multiple sampling techniques were used to generate data from sand miners, institutional heads, fisherfolk, community-based groups, and other stakeholders. Non-equilibrium thermodynamic theory remains a vibrant lens for discerning ecological stability, and it is employed here to further understand coastal wetland destruction in Harper City, Liberia, and to measure coastal wetland stresses in terms of amplitude and elasticity. Non-equilibrium thermodynamics postulates that coastal wetlands are capable of assimilating resources (inputs) as well as discharging products (outputs). However, the input-output relationship stretches far beyond the thresholds of the coastal wetlands, leading to coastal wetland disequilibrium. Findings revealed that sand mining, mangrove removal, and crude dumping have transformed the coastal wetlands, resulting in water pollution, flooding, habitat loss, and disfigured beaches in Harper City in Liberia. This paper demonstrates that the coastal wetlands are being converted into development projects and agricultural fields, thus endangering the future society against nature.
Keywords: amplitude, crude dumping, elasticity, non-equilibrium thermodynamics, wetland destruction
Procedia PDF Downloads 141