Search results for: activity-based benefit approach
11547 Rank-Based Chain-Mode Ensemble for Binary Classification
Authors: Chongya Song, Kang Yen, Alexander Pons, Jin Liu
Abstract:
In the field of machine learning, ensembles are commonly employed to improve performance over multiple base classifiers. However, true predictions are often canceled out by false ones during consensus due to a phenomenon called the “curse of correlation”, which manifests as strong interference among the predictions produced by the base classifiers. In addition, existing practices are still unable to effectively mitigate the problem of imbalanced classification. Based on the analysis of our experimental results, we conclude that the two problems are caused by inherent deficiencies in the consensus approach. Therefore, we propose an enhanced ensemble algorithm that adopts a purpose-designed rank-based chain-mode consensus to overcome the two problems. To evaluate the proposed ensemble algorithm, we employ the well-known benchmark dataset NSL-KDD (the improved version of the KDDCup99 dataset produced by the University of New Brunswick) to compare the proposed algorithm against 8 common ensemble algorithms. In particular, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in accuracy and reliability improvements over the base classifiers can be revealed fairly. As a result, the proposed rank-based chain-mode consensus proves to be a more effective ensemble solution than the traditional consensus approach, outperforming the 8 ensemble algorithms by 20% on almost all compared metrics, which include accuracy, precision, recall, F1-score and area under the receiver operating characteristic curve.
Keywords: consensus, curse of correlation, imbalance classification, rank-based chain-mode ensemble
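As an illustration of the evaluation protocol described above, the following minimal Python sketch scores a simple soft-voting ensemble on synthetic imbalanced data with the same metrics the paper compares; it does not reproduce the rank-based chain-mode consensus itself, and the base classifiers, data, and parameters are assumptions for illustration only.

```python
# Illustrative sketch (not the authors' rank-based chain-mode algorithm):
# evaluate a simple soft-voting ensemble with the metrics compared in the
# paper (accuracy, precision, recall, F1, ROC AUC) on synthetic binary data.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)

# Imbalanced synthetic data standing in for NSL-KDD features/labels.
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y,
                                          random_state=0)

# Three base classifiers combined by a soft majority vote (the "consensus").
ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("dt", DecisionTreeClassifier(max_depth=8)),
                ("nb", GaussianNB())],
    voting="soft")
ensemble.fit(X_tr, y_tr)

y_pred = ensemble.predict(X_te)
y_prob = ensemble.predict_proba(X_te)[:, 1]
print("accuracy :", accuracy_score(y_te, y_pred))
print("precision:", precision_score(y_te, y_pred))
print("recall   :", recall_score(y_te, y_pred))
print("F1-score :", f1_score(y_te, y_pred))
print("ROC AUC  :", roc_auc_score(y_te, y_prob))
```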
Procedia PDF Downloads 138
11546 Integrated Coastal Management for the Sustainable Development of Coastal Cities: The Case of El-Mina, Tripoli, Lebanon
Authors: G. Ghamrawi, Y. Abunnasr, M. Fawaz, S. Yazigi
Abstract:
Coastal cities are constantly exposed to environmental degradation and economic regression fueled by rapid and uncontrolled urban growth as well as continuous resource depletion. This is the case of the city of Mina in Tripoli (Lebanon), where a lack of awareness of the need to preserve social, ecological, and historical assets, coupled with increasing development pressures, is threatening the socioeconomic status of the city residents, the quality of life, and accessibility to the coast. To address these challenges, a holistic coastal urban design and planning approach was developed to analyze the environmental, political, legal, and socioeconomic context of the city. This approach aims to investigate the potential of balancing urban development with the protection and enhancement of cultural, ecological, and environmental assets under an integrated coastal zone management (ICZM) approach. The analysis of Mina's different sectors adopted several tools that include direct field observation, interviews with stakeholders, and analysis of available data, historical maps, and previously proposed projects. The findings from the analysis were mapped and graphically represented, allowing the recognition of character zones that become the design intervention units. Consequently, the study proposes an urban, city-scale intervention that identifies 6 different character zones (the historical fishing port, Abdul Wahab island, the abandoned Port Said, Hammam el Makloub, the sand beach, and the new developable area) and proposes context-specific design interventions that capitalize on the main characteristics of each zone. Moreover, the intervention builds on the institutional framework of ICZM as well as other studies previously conducted for the coast, and adopts nature-based solutions with hybrid systems to provide better environmental design solutions for developing the coast. This enables the realization of an all-inclusive, well-connected shoreline with easy and free access toward the sea; a developed shoreline with an active local economy; and an improved urban environment.
Keywords: blue green infrastructure, coastal cities, hybrid solutions, integrated coastal zone management, sustainable development, urban planning
Procedia PDF Downloads 156
11545 Artificial Cells Capable of Communication by Using Polymer Hydrogel
Authors: Qi Liu, Jiqin Yao, Xiaohu Zhou, Bo Zheng
Abstract:
The first artificial cell was produced by Thomas Chang in the 1950s when he was trying to make a mimic of red blood cells. Since then, many different types of artificial cells have been constructed from one of two approaches: a so-called bottom-up approach, which aims to create a cell from scratch, and a top-down approach, in which genes are sequentially knocked out from organisms until only the minimal genome required for sustaining life remains. In this project, the bottom-up approach was used to build a new cell-free expression system which mimics an artificial cell capable of protein expression and of communicating with other cells. The artificial cells constructed from the bottom-up approach are usually lipid vesicles, polymersomes, hydrogels or aqueous droplets containing the nucleic acids and transcription-translation machinery. However, lipid-vesicle-based artificial cells capable of communication present several issues for cell communication research: (1) the lipid vesicles normally lose important functions such as protein expression within a few hours; (2) the lipid membrane allows the permeation of only small molecules and limits the types of molecules that can be sensed and released to the surrounding environment for chemical communication; (3) the lipid vesicles are prone to rupture due to an imbalance of osmotic pressure. To address these issues, hydrogel-based artificial cells were constructed in this work. To construct the artificial cell, polyacrylamide hydrogel was functionalized with Acrylate PEG Succinimidyl Carboxymethyl Ester (ACLT-PEG2000-SCM) moieties on the polymer backbone. Proteinaceous factors can then be immobilized on the polymer backbone by the reaction between the primary amines of proteins and the N-hydroxysuccinimide esters (NHS esters) of ACLT-PEG2000-SCM; the plasmid template and ribosomes were encapsulated inside the hydrogel particles. Because the artificial cell could continuously express protein with the supply of nutrients and energy, artificial cell-artificial cell communication and artificial cell-natural cell communication could be achieved by combining the artificial cell vector with designed plasmids. The plasmids were designed with reference to the quorum sensing (QS) system of bacteria, which relies largely on cognate acyl-homoserine lactone (AHL)/transcription pairs. In one communication pair, the “sender” is the artificial or natural cell that produces the AHL signal molecule by synthesizing the corresponding signal synthase that catalyzes the conversion of S-adenosyl-L-methionine (SAM) into AHL, while the “receiver” is the artificial or natural cell that senses the quorum sensing signaling molecule from the “sender” and in turn expresses the gene of interest. In the experiment, GFP was first immobilized inside the hydrogel particle to prove that the functionalized hydrogel particles could be used for protein binding. After that, successful communication between artificial cell and artificial cell and between artificial cell and natural cell was demonstrated; the successful signaling could be observed by recording the increase in fluorescence signal. The hydrogel-based artificial cell designed in this work can help to study the complex communication systems of bacteria, and it can also be further developed for therapeutic applications.
Keywords: artificial cell, cell-free system, gene circuit, synthetic biology
Procedia PDF Downloads 152
11544 Measuring the Extent of Equalization in Fiscal Transfers in India: An Index-Based Approach
Authors: Ragini Trehan, D.K. Srivastava
Abstract:
In the post-planning era, India’s fiscal transfers from the central to state governments are solely determined by the Finance Commissions (FCs). While in some of the well-established federations such as Australia, Canada, and Germany, equalization serves as the guiding principle of fiscal transfers and is constitutionally mandated, in India, it is not explicitly mandated, and FCs attempt to implement it indirectly by a combination of a formula-based share in the divisible pool of central taxes supplemented by a set of grants. In this context, it is important to measure the extent of equalization that is achieved through FC transfers with a view to improving the design of such transfers. This study uses an index-based methodology for measuring the degree of equalization achieved through FC transfers covering the period from FC12 to the first year of FC15, spanning from 2005-06 to 2020-21. The ‘Index of Equalization’ shows that the extent of equalization has remained low in the range of 30% to 37% for the four Commission periods under review. The highest degree of equalization at 36.7% was witnessed in the FC12 period and the lowest equalization at 29.5% was achieved during the FC15(1) period. The equalizing efficiency of recommended transfers also shows a consistent fall from 11.4% in the FC12 period to 7.5% by the FC15(1) period. Further, considering progressivity in fiscal transfers as a special case of equalizing transfers, this study shows that the scheme of per capita total transfers when determined using the equalization approach is more progressive and is characterized by minimal deviations as compared to the profile of transfers recommended by recent FCs.
Keywords: fiscal transfers, index of equalization, equalizing efficiency, fiscal capacity, expenditure needs, finance Commission, tax effort
Procedia PDF Downloads 74
11543 Systems Approach on Thermal Analysis of an Automatic Transmission
Authors: Sinsze Koo, Benjin Luo, Matthew Henry
Abstract:
In order to increase the performance of an automatic transmission, the automatic transmission fluid must be warmed up to an optimal operating temperature. In a conventional vehicle, cold starts result in friction losses in the gearbox and engine. The stop-and-go nature of city driving dramatically affects the warm-up of engine oil and automatic transmission fluid and delays the time needed to reach an optimal operating temperature. This temperature phenomenon impacts both engine and transmission performance and also increases fuel consumption and CO2 emissions. The aim of this study is to develop know-how of the thermal behavior in order to identify thermal impacts and functional principles in automatic transmissions. Thermal behavior was studied using models and simulations of one-dimensional thermal and flow transport developed in GT-SUITE. The powertrain of a conventional vehicle was modeled in order to emphasize the thermal phenomena occurring in the various components and how they impact automatic transmission performance. The simulation demonstrates the thermal model of a transmission fluid cooling system and its component parts during warm-up after a cold start. The results of these analyses will support future designs of transmission systems and components in an attempt to obtain better fuel efficiency and transmission performance. Therefore, these thermal analyses could identify ways to improve existing thermal management techniques with prioritization on fuel efficiency.
Keywords: thermal management, automatic transmission, hybrid, and systematic approach
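For readers unfamiliar with the warm-up behaviour being simulated, the following is a minimal lumped-capacitance sketch in Python, not the GT-SUITE model used in the study; all parameter values are assumed for illustration only.

```python
# Minimal lumped-capacitance sketch of transmission-fluid warm-up after a
# cold start (not the study's GT-SUITE model; all values are assumed).
m_cp = 9.0e3        # assumed thermal capacity of fluid + casing [J/K]
UA = 15.0           # assumed heat-loss coefficient to ambient [W/K]
T_amb = -5.0        # cold-start ambient temperature [degC]
T_opt = 80.0        # target optimal operating temperature [degC]

def heat_input(t):
    """Assumed heat input from gear/clutch friction and the cooler [W]."""
    return 2500.0

dt, t_end = 1.0, 3600.0          # 1 s steps over one hour
T = T_amb
for step in range(int(t_end / dt)):
    t = step * dt
    dTdt = (heat_input(t) - UA * (T - T_amb)) / m_cp   # energy balance
    T += dTdt * dt
    if T >= T_opt:
        print(f"Reached {T_opt} degC after {t/60:.1f} min")
        break
else:
    print(f"Temperature after {t_end/60:.0f} min: {T:.1f} degC")
```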
Procedia PDF Downloads 377
11542 Improving Food Security and Commercial Development through Promotion of High Value Medicinal and Industrial Plants in the Swat Valley of Pakistan
Authors: Hassan Sher
Abstract:
Agriculture has a pivotal role in Pakistan’s economy, accounting for about one-fourth of GDP and employing almost half the population. However, the competitiveness, productivity, growth, employment potential, export opportunity, and contribution to GDP of the sector are significantly hampered by agricultural marketing laws/regulations at the provincial level that reward rent-seeking behavior, promote monopoly power, artificially reduce farmer incomes while inflating prices to consumers, and act as disincentives to investment. Although of more recent vintage than some other provincial agricultural marketing laws, the NWFP Agricultural and Livestock Produce Markets Act, 2007 is a throwback to a colonial paradigm, where restrictions on agricultural produce marketing and government control of distribution channels are the norm. The Swat Valley (in which we include its tributary valleys) is an area of Pakistan in which poverty is both extreme and pervasive. For many, a significant portion of the family’s income comes from selling plants that are used as herbs, medicines, and perfumes. Earlier studies have shown that the benefit they derive from this work is less than it might be because of: lack of knowledge concerning which plants and which plant parts are valuable; lack of knowledge concerning optimal preservation and storage of material; and illiteracy. Another concern is that much of the plant material sold from the valley is collected in the wild, without an appreciation of the negative impact continued collecting has on wild populations. We propose: creating colored cards to help inhabitants recognize the 25 most valuable plants in their area; developing and sharing protocols for growing the 25 most valuable plants in a home garden; developing and sharing efficient mechanisms for drying plants so they do not lose value; and encouraging increased literacy by incorporating numbers and a few words in the handouts.
Keywords: food security, medicinal plants, industrial plants, economic development
Procedia PDF Downloads 326
11541 Design and Optimization of Open Loop Supply Chain Distribution Network Using Hybrid K-Means Cluster Based Heuristic Algorithm
Authors: P. Suresh, K. Gunasekaran, R. Thanigaivelan
Abstract:
Radio frequency identification (RFID) technology has been attracting considerable attention with the expectation of improved supply chain visibility for consumer goods, apparel, and pharmaceutical manufacturers, as well as retailers and government procurement agencies. It is also expected to improve the consumer shopping experience by making it more likely that the products they want to purchase are available. Recent announcements from some key retailers have brought interest in RFID to the forefront. A modified K-Means cluster-based heuristic approach, a hybrid Genetic Algorithm (GA)-Simulated Annealing (SA) approach, a hybrid K-Means cluster-based heuristic-GA, and a hybrid K-Means cluster-based heuristic-GA-SA for the open loop supply chain network problem are proposed. The study incorporated a uniform crossover operator and a combined crossover operator in the GAs for solving the open loop supply chain distribution network problem. The algorithms are tested on 50 randomly generated data sets and compared with each other. The results of the numerical experiments show that the hybrid K-Means cluster-based heuristic-GA-SA, when tested on the 50 randomly generated data sets, shows superior performance to the other methods for solving the open loop supply chain distribution network problem.
Keywords: RFID, supply chain distribution network, open loop supply chain, genetic algorithm, simulated annealing
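The clustering stage of such hybrid algorithms can be illustrated with a short Python sketch: K-Means groups randomly generated customer locations so that each cluster can be served by a depot placed at its centroid, giving an initial network cost that a GA/SA stage could then refine. The data and the number of depots are assumptions for illustration only.

```python
# Illustrative sketch of the K-Means stage only (the GA/SA refinement
# described in the abstract is not reproduced here; all data are assumed).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
customers = rng.uniform(0, 100, size=(200, 2))   # assumed (x, y) coordinates

k = 5                                            # assumed number of depots
km = KMeans(n_clusters=k, n_init=10, random_state=42).fit(customers)
depots = km.cluster_centers_

# Total customer-to-depot distance: a simple cost that a GA/SA stage could
# reduce further by relocating depots or reassigning customers.
dist = np.linalg.norm(customers - depots[km.labels_], axis=1).sum()
print(f"{k} depots, total assignment distance: {dist:.1f}")
```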
Procedia PDF Downloads 165
11540 An Assessment of the Role of Actors in the Medical Waste Management Policy-Making Process of Bangladesh
Authors: Md Monirul Islam, Shahaduz Zaman, Mosarraf H. Sarker
Abstract:
Context: Medical waste management (MWM) is a critical sector in Bangladesh due to its impact on human health and the environment. There is a need to assess the current policies and identify the role of policy actors in the policy formulation and implementation process. Research Aim: The study aimed to evaluate the role of policy actors in the medical waste management policy-making process in Bangladesh, identify policy gaps, and provide actionable recommendations for improvement. Methodology: The study adopted a qualitative research method and conducted key informant interviews. The data collected were analyzed using the thematic coding approach through Atlas.ti software. Findings: The study found that policies are formulated at higher administrative levels and implemented in a top-down approach. Higher-level institutions predominantly contribute to policy development, while lower-level institutions focus on implementation. However, due to negligence, ignorance, and lack of coordination, medical waste management receives insufficient attention from the actors. The study recommends the need for immediate strategies, a comprehensive action plan, regular policy updates, and inter-ministerial meetings to enhance medical waste management practices and interventions. Theoretical Importance: The research contributes to evaluating the role of policy actors in medical waste management policymaking and implementation in Bangladesh. It identifies policy gaps and provides actionable recommendations for improvement. Data Collection: The study used key informant interviews as the data collection method. Thirty-six participants were interviewed, including influential policymakers and representatives of various administrative spheres. Analysis Procedures: The data collected were analyzed using the inductive thematic analysis approach. Question Addressed: The study aimed to assess the role of policy actors in medical waste management policymaking and implementation in Bangladesh. Conclusion: In conclusion, the study provides insights into the current medical waste management policy in Bangladesh, the role of policy actors in policy formulation and implementation, and the need for improved strategies and policy updates. The findings of this study can guide future policy-making efforts to enhance medical waste management practices and interventions in Bangladesh.
Keywords: key informant, medical waste management, policy maker, qualitative study
Procedia PDF Downloads 81
11539 Numerical Simulation of Lifeboat Launching Using Overset Meshing
Authors: Alok Khaware, Vinay Kumar Gupta, Jean Noel Pederzani
Abstract:
Lifeboat launching from a marine vessel or offshore platform is one of the important areas of research in offshore applications. With the advancement of computational fluid dynamics (CFD) technology to solve fluid-induced motions coupled with a six-degree-of-freedom (6DOF) rigid body dynamics solver, it is now possible to predict the motion of the lifeboat precisely in different challenging conditions. Traditionally, a dynamic remeshing approach is used to solve this kind of problem, but it has bottlenecks in maintaining good mesh quality in transient moving-mesh cases. In the present study, an overset method with higher-order interpolation is used to simulate a lifeboat launched from an offshore platform into calm water, and the volume of fluid (VOF) method is used to track the free surface. An overset mesh consists of a set of overlapping component meshes, which allows complex geometries to be meshed with less effort. A good-quality mesh with local refinement is generated at the beginning of the simulation and stays unchanged throughout the simulation. Overset mesh accuracy depends on the precise interpolation technique; the present study includes a robust and accurate least-squares interpolation method, and the results obtained with the overset mesh show good agreement with experiment.
Keywords: computational fluid dynamics, free surface flow, lifeboat launching, overset mesh, volume of fluid
Procedia PDF Downloads 277
11538 Prediction of Slaughter Body Weight in Rabbits: Multivariate Approach through Path Coefficient and Principal Component Analysis
Authors: K. A. Bindu, T. V. Raja, P. M. Rojan, A. Siby
Abstract:
A multivariate path coefficient approach was employed to study the effects of various production and reproduction traits on the slaughter body weight of rabbits. Information on 562 rabbits maintained at the university rabbit farm attached to the Centre for Advanced Studies in Animal Genetics and Breeding, Kerala Veterinary and Animal Sciences University, Kerala State, India was utilized. The manifest variables used in the study were age and weight of the dam, birth weight, litter size at birth and weaning, and weight at the first, second and third months. Linear multiple regression analysis was performed by keeping the slaughter weight as the dependent variable and the remaining traits as independent variables. The model explained 48.60 percent of the total variation present in the market weight of the rabbits. Even though the model was significant, the standardized beta coefficients for the independent variables, viz., age and weight of the dam, birth weight, and litter sizes at birth and weaning, were less than one, indicating their negligible influence on the slaughter weight. However, the standardized beta coefficient of the second-month body weight was the highest, followed by that of the first-month weight, indicating their major role in determining the market weight. All the other factors influence it only indirectly, through these two variables. Hence, it was concluded that the slaughter body weight can be predicted using the first- and second-month body weights. Principal components were also developed so as to achieve more accuracy in the prediction of the market weight of rabbits.
Keywords: component analysis, multivariate, slaughter, regression
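A minimal Python sketch of the two analysis steps, standardized multiple regression followed by principal component analysis, is given below on simulated stand-in data; the farm records themselves are not reproduced, the variable names follow the abstract, and the assumed relationships are for illustration only.

```python
# Sketch: standardized betas from multiple regression, then PCA on predictors
# (synthetic data standing in for the 562 farm records).
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n = 562
df = pd.DataFrame({
    "dam_age": rng.normal(24, 6, n), "dam_weight": rng.normal(3.2, 0.4, n),
    "birth_weight": rng.normal(0.06, 0.01, n),
    "litter_birth": rng.poisson(7, n), "litter_weaning": rng.poisson(6, n),
    "weight_m1": rng.normal(0.55, 0.08, n), "weight_m2": rng.normal(1.1, 0.15, n),
    "weight_m3": rng.normal(1.7, 0.2, n),
})
# Assumed relationship: slaughter weight driven mainly by month-1/2 weights.
df["slaughter_weight"] = (0.4 * df["weight_m1"] + 0.9 * df["weight_m2"]
                          + rng.normal(0, 0.15, n))

X = StandardScaler().fit_transform(df.drop(columns="slaughter_weight"))
y = (df["slaughter_weight"] - df["slaughter_weight"].mean()) / df["slaughter_weight"].std()

model = LinearRegression().fit(X, y)
print("R^2:", round(model.score(X, y), 3))
print("standardized betas:", dict(zip(df.columns[:-1], model.coef_.round(3))))

pca = PCA(n_components=3).fit(X)
print("variance explained by first 3 components:", pca.explained_variance_ratio_.round(3))
```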
Procedia PDF Downloads 165
11537 A Comparative Soft Computing Approach to Supplier Performance Prediction Using GEP and ANN Models: An Automotive Case Study
Authors: Seyed Esmail Seyedi Bariran, Khairul Salleh Mohamed Sahari
Abstract:
In multi-echelon supply chain networks, optimal supplier selection significantly depends on the accuracy of suppliers’ performance prediction. Different methods of multi-criteria decision making, such as ANN, GA, fuzzy methods, and AHP, have previously been used to predict supplier performance, but the “black-box” characteristic of these methods is still a major concern to be resolved. Therefore, the primary objective of this paper is to implement an artificial intelligence-based gene expression programming (GEP) model and compare its prediction accuracy with that of an ANN. A full factorial design with a 95% confidence interval is initially applied to determine the appropriate set of criteria for supplier performance evaluation. A train-test approach is then utilized for both the ANN and GEP. The training results are used to find the optimal network architecture, and the testing data determine the prediction accuracy of each method based on the root mean square error (RMSE) and the coefficient of determination (R2). The results of a case study conducted in Supplying Automotive Parts Co. (SAPCO), with more than 100 local and foreign supply chain members, revealed that, in comparison with the ANN, gene expression programming performs significantly better in predicting supplier performance, as indicated by the respective RMSE and R-squared values. Moreover, using GEP, a mathematical function was also derived to resolve the issue of the ANN’s black-box structure in modeling the performance prediction.
Keywords: Supplier Performance Prediction, ANN, GEP, Automotive, SAPCO
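The ANN side of the comparison and the two reported measures can be sketched in Python as follows; GEP has no standard scikit-learn implementation, so only a neural-network baseline is shown, and the supplier criteria and data are assumed for illustration.

```python
# Sketch: train-test evaluation of an ANN regressor with RMSE and R^2
# (synthetic supplier data; GEP itself is not reproduced here).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(7)
n = 300                               # assumed number of supplier records
X = rng.uniform(0, 1, size=(n, 6))    # assumed criteria (quality, delivery, cost, ...)
y = 0.5*X[:, 0] + 0.3*X[:, 1] - 0.2*X[:, 2] + rng.normal(0, 0.05, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=7)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                                 random_state=7))
ann.fit(X_tr, y_tr)

pred = ann.predict(X_te)
rmse = mean_squared_error(y_te, pred) ** 0.5
print("RMSE:", round(rmse, 4))
print("R^2 :", round(r2_score(y_te, pred), 4))
```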
Procedia PDF Downloads 419
11536 Gas Network Noncooperative Game
Authors: Teresa Azevedo PerdicoúLis, Paulo Lopes Dos Santos
Abstract:
The conceptualisation of the problem of network optimisation as a noncooperative game sets up a holistic interactive approach that brings together different network features (e.g., compressor stations, sources, and pipelines, in the gas context) whose optimisation objectives are different, and a single optimisation procedure becomes possible without having to feed results from diverse software packages into each other. A mathematical model of this type, where independent entities take action, offers the ideal modularity and subsequent problem decomposition with a view to designing a decentralised algorithm to optimise the operation and management of the network. In a game framework, compressor stations and sources are understood as players which communicate through network connectivity constraints, i.e., the pipeline model. That is, in a scheme similar to tâtonnement, the players choose their best settings and then interact to check for network feasibility. The resulting degree of network infeasibility informs the players about the ‘quality’ of their settings, and this two-phase iterative scheme is repeated until a global optimum is obtained. Due to network transients, the optimisation needs to be assessed at different points of the control interval. For this reason, the proposed approach to optimisation has two stages: (i) the first stage computes along the period of optimisation in order to fulfil the requirement just mentioned; (ii) the second stage is initialised with the solution found at the first stage and computes at the end of the period of optimisation to rectify the solution found at the first stage. The validity of the proposed scheme is demonstrated on an abstract prototype and three example networks.
Keywords: connectivity matrix, gas network optimisation, large-scale, noncooperative game, system decomposition
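The tâtonnement-like interaction can be illustrated with an abstract toy in Python: two "players" repeatedly choose their own settings while a shared penalty on infeasibility couples their decisions. This is not the gas network model of the paper; the costs, penalty weight, and demand are assumed for illustration.

```python
# Toy best-response iteration: each player minimises its own quadratic cost
# plus a shared penalty on infeasibility (total supply vs. demand).
a = [1.0, 2.0]          # assumed individual operating-cost weights
rho = 10.0              # penalty weight on infeasibility (the coupling)
demand = 8.0            # assumed demand the two players must jointly supply
x = [0.0, 0.0]          # initial settings

for it in range(100):
    x_old = list(x)
    for i in (0, 1):
        j = 1 - i
        # Best response: minimise a[i]*x_i^2 + rho*(x_i + x_j - demand)^2
        x[i] = rho * (demand - x[j]) / (a[i] + rho)
    infeasibility = abs(sum(x) - demand)
    if max(abs(x[k] - x_old[k]) for k in (0, 1)) < 1e-9:
        break

print(f"settings after {it+1} iterations: {x[0]:.3f}, {x[1]:.3f}")
print(f"remaining infeasibility: {infeasibility:.3f}")
```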
Procedia PDF Downloads 152
11535 The Role of the University of Zululand in Documenting and Disseminating Indigenous Knowledge, in KwaZulu-Natal, South Africa
Authors: Smiso Buthelezi, Petros Dlamini, Dennis Ocholla
Abstract:
The study assesses the University of Zululand's practices for documenting, sharing, and accessing indigenous knowledge. It was guided by two research objectives: to determine how indigenous knowledge (IK) is developed at the University of Zululand and how it is documented there. The study adopted both interpretive and positivist research paradigms, and consequently both qualitative and quantitative research methods were used. The qualitative research approach collected data from academic and non-academic staff members: interviews were conducted with 18 academic staff members and 5 support staff members. The quantitative research approach was used to collect data from IK theses and dissertations in the University of Zululand Institutional Repository between 2009 and 2019. The study results revealed that many departments across the University of Zululand were involved in creating IK-related content, with the Department of African Languages noted as being especially involved. Moreover, documentation of IK-related content at the University of Zululand is done frequently but is not widely known. It was found that the creation and documentation of indigenous knowledge by different departments faced several challenges. The common challenges are a lack of interest among indigenous knowledge owners in sharing their knowledge, the local language as a barrier, and a shortage of proper tools for recording and capturing indigenous knowledge. One of the study's recommendations is the need for an indigenous knowledge systems (IKS) policy to be put in place at the University of Zululand.
Keywords: knowledge creation, SECI model, information and communication technology, indigenous knowledge
Procedia PDF Downloads 112
11534 Video Compression Using Contourlet Transform
Authors: Delara Kazempour, Mashallah Abasi Dezfuli, Reza Javidan
Abstract:
Video compression is used for channels with limited bandwidth and for storage devices with limited capacity. One of the most popular approaches in video compression is the use of transforms. The discrete cosine transform is one such method, but it suffers from problems such as blocking artifacts, noise, and high distortion, which adversely affect the compression ratio. The wavelet transform is another approach that is better than the cosine transform at balancing compression and quality, but its ability to represent curved discontinuities is limited. Because of the importance of compression and the problems of the cosine and wavelet transforms, the contourlet transform has become popular in video compression. In the proposed method, we use the contourlet transform for video image compression. The contourlet transform can preserve image details better than the previous transforms because it is multi-scale and directional, and it can capture discontinuities such as edges; with this approach, less data is lost than with the previous approaches. The contourlet transform provides a discrete-domain structure that is well suited to representing two-dimensional smooth images, and it produces compressed images with a high compression ratio along with texture and edge preservation. Finally, the results show that for the majority of images, the mean square error and peak signal-to-noise ratio of the new contourlet-based method are improved compared with the wavelet transform, but for most images, these measures for the cosine transform are better than for the contourlet-based method.
Keywords: video compression, contourlet transform, discrete cosine transform, wavelet transform
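The two comparison measures used above, mean square error and peak signal-to-noise ratio, can be computed as in the following Python sketch; the frames are random stand-ins, since contourlet codecs are not part of the standard Python libraries.

```python
# Sketch: MSE and PSNR between an original frame and its lossy reconstruction.
import numpy as np

def mse(original, reconstructed):
    return np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)

def psnr(original, reconstructed, peak=255.0):
    m = mse(original, reconstructed)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
# Crude stand-in for lossy reconstruction: add small quantisation-like noise.
recon = np.clip(frame + rng.integers(-4, 5, size=frame.shape), 0, 255).astype(np.uint8)

print("MSE :", round(mse(frame, recon), 2))
print("PSNR:", round(psnr(frame, recon), 2), "dB")
```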
Procedia PDF Downloads 444
11533 Disaster Management Approach for Planning an Early Response to Earthquakes in Urban Areas
Authors: Luis Reynaldo Mota-Santiago, Angélica Lozano
Abstract:
Determining appropriate measures to face earthquakes is a challenge for practitioners. In the literature, some analyses consider disaster scenarios while disregarding some important field characteristics. Sometimes, software that allows estimating the number of victims and infrastructure damages is used; other times, historical information from previous events is used, or the scenarios’ information is assumed to be available even if this is not usual in practice. Humanitarian operations start immediately after an earthquake strikes, and the first hours of relief efforts are important; local efforts are critical to assess the situation and deliver relief supplies to the victims. One preparation action is prepositioning stockpiles, most of them at central warehouses placed away from damage-prone areas, which requires large facilities and budgets. Usually, the decisions in the first 12 hours (the standard relief time, SRT) after the disaster are the location of temporary depots and the design of distribution paths. The motivation for this research was the delay in the reaction time of the early relief efforts, which led to the late arrival of aid to some areas after the magnitude 7.1 Mexico City earthquake in 2017. Hence, a preparation approach for planning the immediate response to earthquake disasters is proposed, intended for local governments and considering their capabilities for planning and for responding during the SRT, in order to reduce the start-up time of immediate response operations in urban areas. The first steps are the generation and analysis of disaster scenarios, which allow estimating the relief demand before and in the early hours after an earthquake. The scenarios can be based on historical data and/or the seismic hazard analysis of an Atlas of Natural Hazards and Risk, as a way to address the limited or unavailable information. The following steps include the decision processes for: a) locating local depots (places for prepositioning stockpiles) and aid-giving facilities as close as possible to risk areas; and b) designing the vehicle paths for aid distribution (from local depots to the aid-giving facilities), which can be used at the beginning of the response actions. This approach speeds up the delivery of aid in the early moments of the emergency, which could reduce the suffering of the victims and allow additional time to integrate a broader and more streamlined response (according to new information) from national and international organizations into these efforts. The proposed approach is applied to two case studies in Mexico City. These areas were affected by the 2017 earthquake and received a limited aid response. The approach generates disaster scenarios in an easy way and plans a faster early response with a small quantity of stockpiles which can be managed in the early hours of the emergency by local governments. Considering long-term storage, the estimated quantities of stockpiles require a limited budget to maintain and a small storage space. These stockpiles are also useful for addressing other kinds of emergencies in the area.
Keywords: disaster logistics, early response, generation of disaster scenarios, preparation phase
Procedia PDF Downloads 110
11532 The Functional-Engineered Product-Service System Model: An Extensive Review towards a Unified Approach
Authors: Nicolas Haber
Abstract:
The study addresses the design process of integrated product-service offerings as a means of answering environmental sustainability concerns by replacing stand-alone physical artefacts with comprehensive solutions relying on functional results rather than conventional product sales. However, views regarding this transformation are dissimilar and differentiated: the study discusses the importance and requirements of product-service systems before analysing the theoretical studies on their design and development processes. Based on this, a framework built on a design science approach is proposed, in which the distinct approaches from the literature are merged into a unified structure serving as a generic methodology for designing product-service systems. Each stage of this model is then developed to present a holistic design proposal called the Functional Engineered Product-Service System (FEPSS) model. Product-service systems are portrayed as customisable solutions tailored to specific settings and defined circumstances, and the approaches adopted to guide the design process are diversified. A thorough analysis of the design strategies and development processes, however, allowed the extraction of a design backbone valid for varied situations and contexts, whether they are product-oriented, use-oriented or result-oriented. The goal is to guide manufacturers towards an eased adoption of these integrated offerings, given their inherent environmental benefits, by proposing a robust all-purpose design process.
Keywords: functional product, integrated product-service offerings, product-service systems, sustainable design
Procedia PDF Downloads 293
11531 Authentication and Traceability of Meat Products from South Indian Market by Species-Specific Polymerase Chain Reaction
Authors: J. U. Santhosh Kumar, V. Krishna, Sebin Sebastian, G. S. Seethapathy, G. Ravikanth, R. Uma Shaanker
Abstract:
Food is one of the basic needs of human beings; it is required for the normal functioning of the body and for healthy growth. Recently, food adulteration has increased day by day to increase quantity and profit. Animal source foods can provide a variety of micronutrients that are difficult to obtain in adequate quantities from plant source foods alone. Particularly in the meat industry, products from animals are susceptible targets for fraudulent labeling due to the economic profit that results from selling cheaper meat as meat from more profitable and desirable species. This work presents an overview of the main PCR-based techniques applied to date to verify the authenticity of beef meat and meat products from beef species. We analyzed 25 market beef samples in South India and examined PCR methods based on the sequence of the cytochrome b gene for source species identification. We found that all samples sold as beef were Bos taurus. Interestingly, however, male meat is more valuable and commands a higher price than female meat, and for this reason most market samples are susceptible to substitution. We therefore used a cattle sex-determination gene, TSPY (the Y-encoded, testis-specific protein gene, a Y-specific gene). TSPY homologs exist in several mammalian species, including humans, horses, and cattle. This Y-coded testis protein gene amplifies only in males. Multiple PCR products form species-specific “fingerprints” on gel electrophoresis, which may be useful for meat authentication. Amplicons were obtained only by the cattle-specific PCR. We found 13 market meat samples sold as female beef. These results suggest that the species-specific PCR methods established in this study would be useful for simple and easy detection of adulteration of meat products.
Keywords: authentication, meat products, species-specific, TSPY
Procedia PDF Downloads 375
11530 Coarse-Grained Computational Fluid Dynamics-Discrete Element Method Modelling of the Multiphase Flow in Hydrocyclones
Authors: Li Ji, Kaiwei Chu, Shibo Kuang, Aibing Yu
Abstract:
Hydrocyclones are widely used to classify particles by size in industries such as mineral processing and chemical processing. The particles to be handled usually have a broad range of size distributions, and sometimes density distributions, which have to be properly considered, causing challenges in the modelling of hydrocyclones. The combined approach of Computational Fluid Dynamics (CFD) and the Discrete Element Method (DEM) offers a convenient way to model particle size/density distributions. However, its direct application to hydrocyclones is computationally prohibitive because there are billions of particles involved. In this work, a CFD-DEM model based on the concept of the coarse-grained (CG) model is developed to model the solid-fluid flow in a hydrocyclone. The DEM is used to model the motion of discrete particles by applying Newton’s laws of motion. Here, a particle assembly containing a certain number of particles with the same properties is treated as one CG particle. The CFD is used to model the liquid flow by numerically solving the locally averaged Navier-Stokes equations facilitated with the Volume of Fluid (VOF) model to capture the air core. The results are analyzed in terms of fluid and solid flow structures and particle-fluid, particle-particle and particle-wall interaction forces. Furthermore, the calculated separation performance is compared with measurements. The results obtained from the present study indicate that this approach can offer an alternative way to examine the flow and performance of hydrocyclones.
Keywords: computational fluid dynamics, discrete element method, hydrocyclone, multiphase flow
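The DEM side, Newton's second law for a single particle subject to gravity, buoyancy and a Stokes-type drag from the surrounding fluid, can be sketched in Python as below; the property scaling between real and coarse-grained particles and the full CFD coupling are omitted, and all values are assumed for illustration.

```python
# Sketch: explicit time integration of Newton's second law for one particle
# in water (gravity + buoyancy + Stokes drag); CG scaling and CFD coupling omitted.
import numpy as np

rho_p, rho_f = 2650.0, 1000.0      # particle and fluid densities [kg/m^3]
mu_f = 1.0e-3                      # fluid viscosity [Pa s]
d = 1.0e-4                         # assumed particle diameter [m]
m = rho_p * np.pi * d**3 / 6.0     # particle mass [kg]
g = np.array([0.0, 0.0, -9.81])

v = np.zeros(3)                    # particle velocity [m/s]
u_f = np.array([0.05, 0.0, 0.0])   # assumed local fluid velocity [m/s]

dt = 1.0e-4
for _ in range(20000):             # 2 s of motion
    drag = 3.0 * np.pi * mu_f * d * (u_f - v)          # Stokes drag force
    buoyancy = -rho_f * np.pi * d**3 / 6.0 * g         # opposes gravity
    a = (m * g + buoyancy + drag) / m
    v += a * dt

print("terminal velocity [m/s]:", np.round(v, 4))
```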
Procedia PDF Downloads 407
11529 Lagrangian Approach for Modeling Marine Litter Transport
Authors: Sarra Zaied, Arthur Bonpain, Pierre Yves Fravallo
Abstract:
The permanent supply of marine litter leads to its accumulation in the oceans, which causes the formation of increasingly compact waste layers. Its spatio-temporal distribution is never homogeneous and depends mainly on the hydrodynamic characteristics of the environment and on the size and location of the waste. As part of optimizing the collection of marine plastic waste, it is important to measure and monitor its evolution over time. To this end, many research studies have been dedicated to describing waste behavior in order to identify its accumulation in ocean areas. Several models have therefore been developed to understand the mechanisms that drive the accumulation and displacement of marine litter. These models are able to accurately simulate the drift of waste in order to study its behavior and stranding. However, these works aim to study waste behavior over a long period of time and not at the time of waste collection. This work investigates the transport of floating marine litter (FML) to provide basic information that can help in optimizing waste collection, by proposing a model for predicting its behavior during collection. The proposed study is based on a Lagrangian modeling approach that uses the main factors influencing the dynamics of the waste. The performance of the proposed method was assessed on real data collected from the Copernicus Marine Environment Monitoring Service (CMEMS). Evaluation results in the Java Sea (Indonesia) show that the proposed model can effectively predict the position and velocity of marine waste during collection.
Keywords: floating marine litter, lagrangian transport, particle-tracking model, wastes drift
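A minimal Lagrangian particle-tracking sketch in Python is shown below: litter particles are advected through a prescribed surface-current field with a forward-Euler step. The analytic velocity field is an assumed stand-in for the CMEMS current data used in the study.

```python
# Minimal Lagrangian particle-tracking sketch: advect floating particles
# through a prescribed (assumed) surface-current field with forward Euler.
import numpy as np

def current(x, y, t):
    """Assumed surface current [m/s]: a steady eastward drift plus an eddy."""
    u = 0.15 - 0.05 * np.sin(2 * np.pi * y / 5000.0)
    v = 0.05 * np.sin(2 * np.pi * x / 5000.0)
    return u, v

rng = np.random.default_rng(3)
pos = rng.uniform(0, 2000, size=(50, 2))     # 50 particles, metres
start = pos.copy()
dt, n_steps = 600.0, 144                     # 10-min steps over 24 h

for step in range(n_steps):
    u, v = current(pos[:, 0], pos[:, 1], step * dt)
    pos[:, 0] += u * dt
    pos[:, 1] += v * dt

disp = pos - start
print("mean displacement east [km]:", round(disp[:, 0].mean() / 1000.0, 2))
print("mean displacement north [km]:", round(disp[:, 1].mean() / 1000.0, 2))
```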
Procedia PDF Downloads 191
11528 Experimental and Computational Investigations on the Mitigation of Air Pollutants Using Pulsed Radio Waves
Authors: Gangadhara Siva Naga Venkata Krishna Satya Narayana Swamy Undi
Abstract:
Particulate matter (PM) pollution in ambient air is a major environmental health risk factor contributing to disease and mortality worldwide. Current air pollution control methods have limitations in reducing real-world ambient PM levels. This study demonstrates the efficacy of using pulsed radio wave technology as a distinct approach to lower outdoor particulate pollution. Experimental data were compared with computational models to evaluate the efficiency of pulsed waves in coagulating and settling PM. Results showed 50%+ reductions in PM2.5 and PM10 concentrations at the city scale, with particle removal rates exceeding gravity settling by over 3X. Historical air quality data further validated the significant PM reductions achieved in test cases. Computational analyses revealed the underlying coagulation mechanisms induced by the pulsed waves, supporting the feasibility of this strategy for ambient particulate control. The pulsed electromagnetic technology displayed robustness in sustainably managing PM levels across diverse urban and industrial environments. Findings highlight the promise of this advanced approach as a next-generation solution to mitigate particulate air pollution and associated health burdens globally. The technology's scalability and energy efficiency can help address a key gap in current efforts to improve ambient air quality.
Keywords: particulate matter, mitigation technologies, clean air, ambient air pollution
Procedia PDF Downloads 51
11527 Digital Design and Practice of The Problem Based Learning in College of Medicine, Qassim University, Saudi Arabia
Authors: Ahmed Elzainy, Abir El Sadik, Waleed Al Abdulmonem, Ahmad Alamro, Homaidan Al-Homaidan
Abstract:
Problem-based learning (PBL) is an educational modality which stimulates critical and creative thinking. PBL has been practiced in the College of Medicine, Qassim University, Saudi Arabia, since 2002 with offline, face-to-face activities. Therefore, crucial technological changes towards paperless work were needed. The aim of the present study was to design and implement the digitalization of PBL activities and to evaluate its impact on students' and tutors’ performance. This approach promoted the involvement of all stakeholders after raising their awareness of the techniques for using online tools. IT support, learning resource facilities, and the required multimedia were prepared. Student and staff perception surveys reflected their satisfaction with these remarkable changes. The students were interested in the new digitalized materials and educational design, which facilitated the conduct of PBL sessions and provided sufficient time for discussion and peer sharing of knowledge. It also enabled tutors to supervise and track students’ activities on the Learning Management System. It could be concluded that introducing the digitalization of PBL activities promoted students’ performance and engagement and enabled a better evaluation of PBL materials as well as prompt feedback from both students and staff. These positive findings encouraged the college to implement the digitalization approach in other educational activities, such as Team-Based Learning, as an additional opportunity for further development.
Keywords: multimedia in PBL, online PBL, problem-based learning, PBL digitalization
Procedia PDF Downloads 120
11526 Feature Based Unsupervised Intrusion Detection
Authors: Deeman Yousif Mahmood, Mohammed Abdullah Hussein
Abstract:
The goal of a network-based intrusion detection system is to classify the activities of network traffic into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps in improving the efficiency, performance and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we have used the new NSL-KDD dataset, which is a modified version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification has been implemented (normal, attack). The Weka framework, a Java-based open-source software package that consists of a collection of machine learning algorithms for data mining tasks, has been used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and it takes less learning time in comparison with using the full features of the dataset with the same algorithm.
Keywords: information gain (IG), intrusion detection system (IDS), k-means clustering, Weka
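A Python sketch of the same pipeline, information-gain-style feature ranking followed by two-cluster K-Means, is given below on synthetic stand-in data; the study itself used Weka and NSL-KDD, and mutual information is used here as the information-gain analogue available in scikit-learn.

```python
# Sketch: feature selection by mutual information, then 2-cluster K-Means
# standing in for the "normal" vs "attack" grouping (synthetic data).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.cluster import KMeans
from sklearn.metrics import confusion_matrix

X, y = make_classification(n_samples=4000, n_features=41, n_informative=8,
                           random_state=0)          # 41 features, as in NSL-KDD

# Keep the 10 features with the highest mutual information with the label.
selector = SelectKBest(mutual_info_classif, k=10).fit(X, y)
X_sel = selector.transform(X)

# Unsupervised step: two clusters standing for "normal" and "attack".
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_sel)
# Align cluster ids with true labels for evaluation only.
if (labels == y).mean() < 0.5:
    labels = 1 - labels

tn, fp, fn, tp = confusion_matrix(y, labels).ravel()
print("true positive rate :", round(tp / (tp + fn), 3))
print("false positive rate:", round(fp / (fp + tn), 3))
```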
Procedia PDF Downloads 296
11525 Performative Acts Exhibited in Selected Ghanaian Newspaper Headlines
Authors: Charlotte Tetebea Asiamah
Abstract:
This paper sought to highlight the use of performative acts as exhibited in a Ghanaian newspaper, the Daily Graphic. The study discusses and analyzes thirty headlines containing performative acts captured in June and July 2024. The paper draws on J. L. Austin's and J. R. Searle's theory of speech acts. Although a lot has been done in the area of performative acts, there is still a gap as far as newspaper headlines are concerned. Establishing how performative acts operate in the domain of newspaper headlines will contribute to the discussion in the literature, thereby extending the scope of discourse on performative acts. Some of the questions for this study, among others, are: Do performative acts exhibited in newspaper headlines follow felicity conditions? Are the utterances explicitly stated or otherwise? A qualitative approach was used in gathering and analyzing the data; this approach was chosen in order to gain an in-depth insight into the study. The headlines were selected using the instrument of document analysis. Out of the numerous headlines, the researcher photographed over 60 headlines, after which thirty (30) headlines were carefully selected for the study. The 30 newspaper headlines were purposively selected based on the element of performativity in them that was related to the study. The findings show that performative acts are exhibited in the Ghanaian Daily Graphic newspaper headlines and are expressed in all five categories of performative acts that J. R. Searle discussed in his writing. These acts were seen in all the categories of the newspaper headlines, be it governance or politics, social news, international news or sports. It was also observed in the data that directives were the most used performative act. The performative acts found in the newspaper headlines helped to grab readers' attention and also served as a way of influencing how readers perceive an utterance made by an individual in the headlines.
Keywords: explicit, headline, illocutionary, newspaper, performative
Procedia PDF Downloads 19
11524 Mother-Child Conversations about Emotions and Socio-Emotional Education in Children with Autism Spectrum Disorder
Authors: Beaudoin Marie-Joelle, Poirier Nathalie
Abstract:
Introduction: Children with autism spectrum disorder (ASD) tend to lack socio-emotional skills (e.g., emotional regulation and theory of mind). Eisenberg’s theoretical model of emotion-related socialization behaviors suggests that mothers of children with ASD could play a central role in fostering the acquisition of socio-emotional skills by engaging in frequent educational conversations about emotions. However, mothers’ perceptions of their own emotional skills and of their child’s personality traits and social deficits could moderate the benefit of their educative role. Objective: Our study aims to explore the association between mother-child conversations about emotions and the socio-emotional skills of children with ASD, while accounting for the moderating role of the mothers’ perceptions. Forty-nine mothers completed five questionnaires about emotion-related conversations, self-openness to emotions, and perceptions of the personality and socio-emotional skills of their children with ASD. Results: Regression analyses showed that frequent mother-child conversations about emotions predicted better emotional regulation and theory of mind skills in children with ASD (p < 0.01). Children’s theory of mind was moderated by mothers’ perceptions of their own emotional openness (p < 0.05) and by their perceptions of their children’s openness to experience (p < 0.01) and conscientiousness (p < 0.05). Conclusion: Mothers likely play an important role in the socio-emotional education of children with ASD. Further, mothers may be most helpful when they perceive that their interventions improve their child’s behaviors. Our findings corroborate the Eisenberg model, which holds that mother-child conversations about emotions predict socio-emotional development in children with ASD. Our results also help clarify the moderating role of mothers’ perceptions, which could affect their willingness to engage in educational conversations about emotions with their children. Therefore, in the education of children with special needs, school professionals could collaborate with mothers to increase the frequency of emotion-related conversations for students with ASD who show emotion dysregulation or theory of mind difficulties.
Keywords: autism, parental socialization of emotion, emotional regulation, theory of mind
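The moderation analysis can be illustrated with a short Python sketch using an OLS model with an interaction term on simulated data; the questionnaire data are not public, and the variable names and effect sizes are assumptions for illustration only.

```python
# Sketch: moderation tested as an interaction term in OLS (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 49
df = pd.DataFrame({
    "conversations": rng.normal(0, 1, n),  # frequency of emotion talk (z-scored)
    "openness": rng.normal(0, 1, n),       # mother's self-openness (z-scored)
})
# Assumed data-generating process with a positive interaction (moderation).
df["theory_of_mind"] = (0.5 * df["conversations"] + 0.2 * df["openness"]
                        + 0.3 * df["conversations"] * df["openness"]
                        + rng.normal(0, 0.5, n))

model = smf.ols("theory_of_mind ~ conversations * openness", data=df).fit()
print(model.summary().tables[1])   # coefficients, including the interaction term
```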
Procedia PDF Downloads 88
11523 Augmented Reality: New Relations with the Architectural Heritage Education
Authors: Carla Maria Furuno Rimkus
Abstract:
The technologies of virtual reality and augmented reality, in combination with mobile technologies, are becoming more consolidated and more widely used each day. Their increasing availability, along with the decrease in their acquisition and maintenance costs, has favored the expansion of their use in the field of historic heritage. In this context, this article focuses on the potential of mobile applications for the dissemination of architectural heritage using augmented reality technology. From this perspective, it discusses the process of producing an application for mobile devices on the Android platform which combines geometric modeling with augmented reality (AR) and access to interactive multimedia content carrying cultural, social and historical information about the historic building taken as the object of study: a block with a set of buildings built in the 18th century, known as "Quarteirão dos Trapiches", which was modeled in 3D, coated with the original texture of its facades and displayed in AR. The paper then discusses the methodological aspects of the development of this application with regard to the process and the project development tools, and presents our considerations on developing an application for the Android system focused on the dissemination of architectural heritage, in order to encourage the tourist potential of the city in a sustainable way and to contribute to the digital documentation of the city's heritage, meeting a demand from tourists visiting the city and from the professionals who work in its preservation and restoration, including architects, historians, archaeologists and museum specialists, among others.
Keywords: augmented reality, architectural heritage, geometric modeling, mobile applications
Procedia PDF Downloads 478
11522 New Environmental Culture in Algeria: Eco Design
Authors: S. Tireche, A. Tairi abdelaziz
Abstract:
Environmental damage has increased steadily in recent decades: depletion of natural resources, destruction of the ozone layer, the greenhouse effect, degradation of the quality of life, land use, etc. New terms have emerged: although "prevention rather than cure" and "polluter pays" fall within the principles of common sense, their practical implementation still remains fragmented. Among the avenues to be explored, one of the most promising is certainly the one that focuses on product design. Indeed, where better than during the design phase can the sources of future impacts on the environment be reduced? Which design choices most influence the environmental characteristics of products? The approach most widely recognized at the international level is Life Cycle Assessment (LCA), which is the subject of international standardization (ISO 14040-14043). LCA provides a scientific and objective assessment of the potential impacts of a product or service, considering its entire life cycle. This approach makes it possible to minimize impacts at the source, in the spirit of pollution prevention; it is widely preferable to the curative approach that currently predominates in industry, driven mostly by pollution reporting. The "product" approach seeks to reduce the environmental impacts of a given product, taking into account all or part of its life cycle. Emerging tools, known as eco-design tools, are intended to establish an environmental profile of the product so as to improve its environmental performance. They require a sufficient quantity of information on the product for each phase of its life cycle: raw material extraction, manufacturing, distribution, usage, end of life (recycling, incineration or disposal) and all stages of transport. The assessment results indicate the sensitive points of the product studied, the points on which the developer must act.
Keywords: eco design, impact, life cycle analysis (LCA), sustainability
Procedia PDF Downloads 427
11521 Ethical Considerations for Conducting Research on Violence against Women with Disabilities: Discussing Issues of Reasonable Accommodation, Capacity and Equal Participation
Authors: Ingrid Van Der Heijden, Naeemah Abrahams, Jane Harries
Abstract:
Background: Women with disabilities are largely missing from global research on violence prevention, yet research shows that women with disabilities are a particularly marginalised group who experience heightened levels and unique forms of violence compared with men with disabilities and with women without disabilities. They face heightened stigma, discrimination, and violence due to their gender and their disability. Including women with disabilities in violence research helps inform policy and prevention interventions that are relevant and inclusive. To ensure their inclusion in violence research, we need ethical guidelines that are sensitive to their heightened risk and vulnerability, that recognize the diversity of the disabled population, but that also promote disabled people’s agency in defining their own violence prevention needs and agendas. Objective: To highlight pertinent ethical issues around the inclusion and participation of women with disabilities in violence research. Methodology: Considering the lack of formalized guidelines for research with people with disabilities, we draw from the literature on international ethics guidelines for researching violence against women and from the Emancipatory Disability Research paradigm, as well as from our own field experience in applying the guidelines when doing research with disabled women. Findings: Following the guiding ethical principles of respect, benefit, justice, and doing no harm, we argue that reasonable accommodation, capacity, and equal participation need to be considered in conceptualizing and conducting ethical violence research with women with disabilities. We conclude that disability research in the area of violence is highly politicized and must be carefully scrutinized to ensure justice and the contribution of women with disabilities to their own welfare. Implications: We suggest that these issues be practically applied in the field and tested and critiqued to enhance best practice for undertaking ethical research with this particular group. It is important that not only researchers and ethics committees but also disabled women and disabled people's organizations are involved in enhancing and formalizing ethical research guidelines for marginalized populations.
Keywords: capacity, emancipatory disability research paradigm, equal participation, reasonable accommodation, research ethics, violence against women with disabilities
Procedia PDF Downloads 341
11520 Development of Carrot Puree with Algae for the Elderly with Dysphagia
Authors: Obafemi Akinwotu, Aylin Tas, Tony Taylor, Bukola Onarinde
Abstract:
The study was conducted to explore methods and tools to improve the texture and preserve the total phenolic and antioxidant compounds of dysphagia foods produced from carrot-based puree with decolourised Chlorella algae. Textural properties (texture profile analysis [TPA]; the International Dysphagia Diet Standardization Initiative particle size test [PST]) and rheological properties (viscosity and viscoelastic properties) of carrot puree defrosted by different treatments (microwave, steamer, oven) were characterised using hydrocolloids (guar gum, k-carrageenan, and xanthan gum), and the results were compared to a Level 4 commercial sample. DPPH (2,2-diphenyl-1-picrylhydrazyl) radical scavenging and total phenolic content assays were employed to evaluate the total phenolics and radical scavenging properties of defrosted carrot puree, sonicated carrot puree (20 Hz, 30 min, 60 °C), and vacuum-dried carrot powder with the addition of algae. The results show that the viscosity, viscoelasticity, TPA, and PST of the commercial sample were comparable to those of the guar gum- and xanthan gum-containing purees, suggesting that they could be used in dysphagia diets. There was no noticeable decolourisation of the Chlorella pigment. Additionally, the use of the microwave, steamer, and oven for the defrosting treatment had an impact on the textural characteristics of the moulded samples upon cooling and also contributed to the reduction in the total phenolic and antioxidant properties of the samples. Sonication treatment of the algae reduced the cloudiness of the green pigment and lightened the colour of the samples containing algae, and it also reduced the drying time from 2.5 to 1.5 hours during the preliminary work. The low-temperature vacuum- and freeze-dried samples increased the concentration of the powder and resulted in an increase in the total phenolic content of the dry samples. The dried products may therefore have the potential to become more nutrient-dense to benefit the health of individuals with dysphagia.
Keywords: dysphagia, elderly, hydrocolloids, carrot puree
Procedia PDF Downloads 63
11519 Restructuring the College Classroom: Scaffolding Student Learning and Engagement in Higher Education
Authors: Claire Griffin
Abstract:
Recent years have witnessed a surge in the use of innovative teaching approaches to support student engagement and higher-order learning within higher education. This paper seeks to explore the use of collaborative, interactive teaching and learning strategies to support student engagement in a final-year undergraduate Developmental Psychology module. In particular, the jigsaw method, in-class presentations and online discussion fora were adopted in a ‘lectorial’-style teaching approach aimed at scaffolding learning, fostering social interdependence and supporting various levels of student engagement in higher education. Using the ‘Student Course Engagement Questionnaire’, the impact of these teaching strategies on students’ college classroom experience was measured, with additional qualitative student feedback gathered. Results illustrate the positive impact of the teaching methodologies on students’ levels of engagement, with positive implications emerging across the four engagement factors: skills engagement, emotional engagement, participation/interaction engagement and performance engagement. Thematic analysis of students’ qualitative comments also provided greater insight into the positive impact of the ‘lectorial’ teaching approach on students’ classroom experience within higher-level education. Implications of the findings are presented in terms of informing effective teaching practices within higher education. Additional avenues for future research and strategy usage are also discussed, in light of evolving practice and cutting-edge literature within the field.
Keywords: learning, higher education, scaffolding, student engagement
Procedia PDF Downloads 378
11518 6-Degree-Of-Freedom Spacecraft Motion Planning via Model Predictive Control and Dual Quaternions
Authors: Omer Burak Iskender, Keck Voon Ling, Vincent Dubanchet, Luca Simonini
Abstract:
This paper presents a Guidance and Control (G&C) strategy for approaching and synchronizing with potentially rotating targets. The proposed strategy generates and tracks a safe trajectory for space servicing missions, including tasks such as approaching, inspecting, and capturing. The main objective of this paper is to validate the G&C laws using a Hardware-In-the-Loop (HIL) setup with realistic rendezvous and docking equipment. Throughout this work, the assumption of full relative state feedback is relaxed by using onboard sensors that introduce realistic errors and delays, and the proposed closed-loop approach demonstrates robustness to this challenge. Moreover, the G&C blocks are unified via the Model Predictive Control (MPC) paradigm, and the coupling between translational and rotational motion is addressed via a dual-quaternion-based kinematic description. In this work, G&C is formulated as a convex optimization problem where constraints such as thruster limits and output constraints are explicitly handled. Furthermore, the Monte Carlo method is used to evaluate the robustness of the proposed method to initial condition errors, uncertainty in the target's motion and attitude, and actuator errors. A capture scenario is tested with the robotic test bench, which has onboard sensors that estimate the position and orientation of a drifting satellite through camera imagery. Finally, the approach is compared with currently used robust H-infinity controllers and with the guidance profile provided by the industrial partner. The HIL experiments demonstrate that the proposed strategy is a potential candidate for future space servicing missions because 1) the algorithm is real-time implementable, as convex programming offers deterministic convergence properties and guarantees a finite-time solution, 2) critical physical and output constraints are respected, 3) robustness to sensor errors and uncertainties in the system is proven, and 4) it couples translational motion with rotational motion.
Keywords: dual quaternion, model predictive control, real-time experimental test, rendezvous and docking, spacecraft autonomy, space servicing
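A simplified sketch of one MPC step for the translational part of such a rendezvous is given below in Python with cvxpy: double-integrator relative dynamics, a per-axis thrust limit, and a terminal condition at the target. The full dual-quaternion coupling of translation and rotation used in the paper is not reproduced, and all numbers are assumed for illustration.

```python
# Sketch: one convex MPC solve for translational rendezvous dynamics only
# (double integrator, thrust limits, terminal constraint); values assumed.
import numpy as np
import cvxpy as cp

dt, N = 1.0, 40                      # step [s] and horizon length
A = np.block([[np.eye(3), dt * np.eye(3)],
              [np.zeros((3, 3)), np.eye(3)]])
B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])

x0 = np.array([10.0, -5.0, 3.0, 0.0, 0.1, 0.0])   # initial relative pos/vel
u_max = 0.05                                      # assumed accel limit [m/s^2]

x = cp.Variable((6, N + 1))
u = cp.Variable((3, N))
constraints = [x[:, 0] == x0, x[:, N] == np.zeros(6)]   # reach target at rest
for k in range(N):
    constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                    cp.norm(u[:, k], "inf") <= u_max]

cost = cp.sum_squares(u) + 1e-3 * cp.sum_squares(x[:3, :])
prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()
print("status:", prob.status, "| first commanded accel:", np.round(u.value[:, 0], 4))
```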
Procedia PDF Downloads 146