Search results for: information management system
A Review of Transformer Modeling for Power Line Communication Applications
Authors: Balarabe Nkom, Adam P. R. Taylor, Craig Baguley
Abstract:
Power Line Communications (PLC) is being employed in existing power systems, despite the infrastructure not being designed with PLC considerations in mind. Given that power transformers can last for decades, the distribution transformer in particular exists as a relic of un-optimized technology. To determine issues that may need to be addressed in subsequent designs of such transformers, it is essential to have a highly accurate transformer model for simulations and subsequent optimization for the PLC environment, with a view to increasing data speed, throughput, and efficiency while improving overall system stability and reliability. This paper reviews various methods currently available for creating transformer models and provides insights into the requirements of each for obtaining high accuracy. The review indicates that combining traditional analytical methods in a hybrid approach gives good accuracy at reasonable cost.
Keywords: distribution transformer, modelling, optimization, power line communications
Technology Blending as an Innovative Construction Mechanism in the Global South
Authors: Janet Kaningen, Richard N. Kaningen, Jonas Kaningen
Abstract:
This paper aims to discover the best ways to improve production efficiency, cost efficiency, community cohesion, and long-term sustainability in Ghana's housing delivery. Advanced Construction Technologies (ACTs) are set to become the sustainable mainstay of the construction industry due to the demand for innovative housing solutions. Advances in material science, building component production, and assembly technologies are leading to the development of next-generation materials such as polymeric-fiber-based products, light-metal alloys, and eco-materials. Modular housing construction has become more efficient and cost-effective than traditional building methods and is becoming increasingly popular for commercial, industrial, and residential projects. Effective project management and logistics will be imperative to the future speed and cost of modular housing construction.
Keywords: technology blending, sustainability, housing, Ghana
Utilising Unground Oil Palm Ash in Producing Foamed Concrete and Its Implementation as an Interlocking Mortar-Less Block
Authors: Hanizam Awang, Mohammed Zuhear Al-Mulali
Abstract:
In this study, the possibility of using unground oil palm ash (UOPA) for producing foamed concrete is investigated. The UOPA used in this study is produced by incinerating palm oil biomass at a temperature exceeding 1000ºC. A semi-structural density of 1300 kg/m3 was used, with a filler-to-binder ratio of 1.5 and a preliminary water-to-binder ratio of 0.45. Cement was replaced by UOPA at replacement levels of 0, 25, 35, 45, 55 and 65% by weight of binder. Properties such as density, compressive strength, drying shrinkage and water absorption were investigated up to the age of 90 days. The mix with a 35% UOPA content was chosen as the base material of a newly designed interlocking, mortar-less block system.
Keywords: foamed concrete, oil palm ash, strength, interlocking block
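For illustration, the mix arithmetic implied by these ratios can be sketched as follows; the sketch assumes the wet density is simply the sum of the binder, filler and water masses (foam mass neglected), which is our simplification rather than the authors' mix design.

```python
# Hedged sketch: approximate constituent masses per cubic metre of foamed concrete
# for the parameters quoted in the abstract (wet density 1300 kg/m3, filler/binder
# = 1.5, water/binder = 0.45). Treating the target density as the sum of binder,
# filler and water (foam mass neglected) is our assumption, not the authors'.
TARGET_DENSITY = 1300.0   # kg/m3
FILLER_TO_BINDER = 1.5
WATER_TO_BINDER = 0.45

def mix_proportions(uopa_replacement: float) -> dict:
    """Approximate masses (kg per m3) for a given UOPA replacement level (0-1)."""
    binder = TARGET_DENSITY / (1.0 + FILLER_TO_BINDER + WATER_TO_BINDER)
    return {
        "cement": binder * (1.0 - uopa_replacement),
        "UOPA": binder * uopa_replacement,
        "filler": binder * FILLER_TO_BINDER,
        "water": binder * WATER_TO_BINDER,
    }

for level in (0.0, 0.25, 0.35, 0.45, 0.55, 0.65):
    m = mix_proportions(level)
    print(f"UOPA {level:.0%}: " + ", ".join(f"{k} {v:.0f} kg" for k, v in m.items()))
```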
Green Building Practices: Harmonizing Non-Governmental Organizations Roles and Energy Efficiency
Authors: Abimbola A. Adebayo, Kikelomo I. Adebayo
Abstract:
Green buildings pose serious challenges for governments all over the world with regard to achieving energy efficiency in buildings. Energy-efficient buildings are needed to keep environmental impacts minimal throughout their life cycle and to enhance sustainable development. The lack of awareness of the benefits of energy-efficient buildings has given rise to NGOs playing an important role in filling data gaps, publicizing information, and undertaking awareness-raising and policy engagement activities. However, these roles are countered by concerns about subsidies for evaluations, incentives to facilitate data-sharing, and incentives to finance independent research. On the basis of a literature review of experiences with NGO involvement in energy-efficient buildings, this article identifies governance strategies that stimulate the harmonization of NGOs' roles in green buildings with the objective of increasing energy efficiency in buildings.
Keywords: energy efficiency, green buildings, NGOs, sustainable development
Urban Poor: The Situations and Characteristics of the Problem and Social Welfare Service of Bangkok Metropolis
Authors: Sanchai Ratthanakwan
Abstract:
This research aims to study the situations and characteristics of the problems facing the urban poor. The data and information were collected through focus groups and in-depth interviews with leaders and members of the Four Regions Slum Network, community representatives, and social welfare officers. The research concludes that the urban poor face three major problems: firstly, the shortage of housing and housing stability issues; secondly, substandard quality of life; and thirdly, debt. The study found that a solution can be pursued in two ways: the first is the creation of housing by the state for the urban poor in slum or encroachment communities; the second is stability in housing and subsistence provided by the community center, an approach called "housing stability".
Keywords: urban poor, social welfare, Bangkok metropolis, housing stability
Mining Multicity Urban Data for Sustainable Population Relocation
Authors: Xu Du, Aparna S. Varde
Abstract:
In this research, we propose to conduct diagnostic and predictive analysis of the key factors and consequences of urban population relocation. To achieve this goal, urban simulation models extract urban development trends as land use change patterns from a variety of data sources. The results are treated as part of urban big data together with other information such as population change and economic conditions. Multiple data mining methods are deployed on this data to analyze nonlinear relationships between parameters. The result determines the driving forces of population relocation with respect to urban sprawl and urban sustainability and their related parameters. Experiments so far reveal that data mining methods discover useful knowledge from the multicity urban data. This work sets the stage for developing a comprehensive urban simulation model that caters to specific questions by targeted users. It contributes towards achieving sustainability as a whole.
Keywords: data mining, environmental modeling, sustainability, urban planning
Feasibility Study of Distributed Lightless Intersection Control with Level 1 Autonomous Vehicles
Authors: Bo Yang, Christopher Monterola
Abstract:
Urban intersection control without the use of traffic lights has the potential to vastly improve the efficiency of urban traffic flow. For most proposals in the literature, such lightless intersection control depends on the mass-market commercialization of highly intelligent autonomous vehicles (AVs), which limits the prospects of near-future implementation. We present an efficient lightless intersection traffic control scheme that only requires Level 1 AVs as defined by NHTSA. The technological barriers to such lightless intersection control are thus very low. Our algorithm can also accommodate a mixture of AVs and conventional vehicles. We also carry out large-scale numerical analysis to illustrate the feasibility, safety and robustness, comfort level, and control efficiency of our intersection control scheme.
Keywords: intersection control, autonomous vehicles, traffic modelling, intelligent transport system
Analysis of Risk Factors Affecting the Motor Insurance Pricing with Generalized Linear Models
Authors: Puttharapong Sakulwaropas, Uraiwan Jaroengeratikun
Abstract:
In the casualty insurance business, optimal premium pricing and adequate costing are important to an insurance company's risk management. Normally, the insurance pure premium can be determined by multiplying the claim frequency by the claim cost. The aim of this research was to study the application of generalized linear models in selecting the risk factors for models of claim frequency and claim cost used to estimate a pure premium. In this study, the data set consisted of comprehensive motor insurance claims provided by one of the insurance companies in Thailand. The results of this study found that the risk factors significantly related to the pure premium at the 0.05 level were the no-claim bonus (NCB) and the use of the car (car code).
Keywords: generalized linear models, risk factor, pure premium, regression model
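A minimal sketch of the frequency-severity approach described above, assuming a Poisson GLM for claim frequency and a Gamma GLM for claim cost (the abstract does not name the distributions); the synthetic data and column names are illustrative only.

```python
# Hedged sketch: pure premium = expected claim frequency x expected claim cost,
# each fitted with a GLM. The columns (ncb, car_code) mirror the factors named in
# the abstract, but the data here are synthetic; a real tariff model would also
# include a policy-exposure offset in the frequency model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "ncb": rng.choice(["0%", "20%", "40%"], n),            # no-claim bonus class
    "car_code": rng.choice(["private", "commercial"], n),  # use of the car
})
df["claims"] = rng.poisson(0.1, n)
df["avg_cost"] = np.where(df["claims"] > 0, rng.gamma(2.0, 5000.0, n), np.nan)

# Claim frequency: Poisson GLM with log link
freq = smf.glm("claims ~ ncb + car_code", data=df,
               family=sm.families.Poisson()).fit()

# Claim severity: Gamma GLM with log link, fitted only on policies with claims
sev = smf.glm("avg_cost ~ ncb + car_code",
              data=df.dropna(subset=["avg_cost"]),
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()

# Pure premium per policy = predicted frequency x predicted severity
df["pure_premium"] = freq.predict(df) * sev.predict(df)
print(df[["ncb", "car_code", "pure_premium"]].head())
```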
Quartic Spline Method for Numerical Solution of Self-Adjoint Singularly Perturbed Boundary Value Problems
Authors: Reza Mohammadi
Abstract:
Using quartic splines, we develop a method for the numerical solution of singularly perturbed two-point boundary-value problems. The proposed method is fourth-order accurate and applicable to problems in both the singular and non-singular cases. The convergence analysis of the method is given. The resulting linear system of equations is solved using a tri-diagonal solver. We applied the presented method to test problems that have been solved by other existing methods in the literature, in order to compare it with those methods. Numerical results are given to illustrate the efficiency of our method.
Keywords: second-order ordinary differential equation, singularly-perturbed, quartic spline, convergence analysis
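The tri-diagonal system mentioned above can be solved in O(n) with the Thomas algorithm; a minimal sketch follows, checked on an illustrative system rather than one of the paper's benchmark problems.

```python
# Hedged sketch of a tri-diagonal (Thomas) solver of the kind the abstract refers to.
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b, super-diagonal c
    and right-hand side d. a[0] and c[-1] are ignored."""
    n = len(d)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Illustrative check on a small diagonally dominant system
a = np.array([0.0, 1.0, 1.0, 1.0])
b = np.array([4.0, 4.0, 4.0, 4.0])
c = np.array([1.0, 1.0, 1.0, 0.0])
d = np.array([5.0, 6.0, 6.0, 5.0])
x = thomas_solve(a, b, c, d)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(x, np.allclose(A @ x, d))   # expected solution: [1, 1, 1, 1]
```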
Potential of Aerodynamic Feature on Monitoring Multilayer Rough Surfaces
Authors: Ibtissem Hosni, Lilia Bennaceur Farah, Saber Mohamed Naceur
Abstract:
In order to assess the water availability in the soil, it is crucial to have information about the distributed soil moisture content; this parameter helps to understand the effect of humidity on the exchange between soil, plant cover and atmosphere, in addition to fully understanding the surface processes and the hydrological cycle. On the other hand, the aerodynamic roughness length is a surface parameter that scales the vertical profile of the horizontal component of the wind speed and characterizes the surface's ability to absorb the momentum of the airflow. In numerous applications of surface hydrology and meteorology, the aerodynamic roughness length is an important parameter for estimating momentum, heat and mass exchange between the soil surface and the atmosphere. It is therefore important to consider the impact of atmospheric factors in general, and of natural erosion in particular, in the process of soil evolution, its characterization and the prediction of its physical parameters. The study of wind-induced movements over vegetated soil surfaces, whether spaced plants or full plant cover, is motivated by significant research efforts in agronomy and biology; the major known problem here is crop damage by wind, which is a booming field of research. Obviously, most soil surface models require information about the aerodynamic roughness length and its temporal and spatial variability. We have used a bi-dimensional multi-scale (2D MLS) roughness description in which the surface is considered as a superposition of a finite number of one-dimensional Gaussian processes, each with its own spatial scale, using the wavelet transform and the Mallat algorithm to describe natural surface roughness. We have introduced the multi-layer aspect of the humidity of the soil surface to take into account a volume component in the problem of the backscattered radar signal. As humidity increases, the dielectric constant of the soil-water mixture increases, and this change is detected by microwave sensors. Nevertheless, many existing models in the field of radar imagery cannot be applied directly to areas covered with vegetation, due to the vegetation backscattering. Thus, the radar response corresponds to the combined signature of the vegetation layer and the underlying soil surface. Therefore, the key issue in the numerical estimation of soil moisture is to separate the two contributions and calculate the scattering behavior of each layer by defining the scattering of the vegetation and of the soil below. This paper presents a synergistic methodology for estimating roughness and soil moisture from C-band radar measurements. The methodology employs a microwave/optical model that has been used to calculate the scattering behavior of the aerodynamically rough, vegetation-covered area by defining the scattering of the vegetation and the soil below.
Keywords: aerodynamic, bi-dimensional, vegetation, synergistic
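As a rough illustration of the multiscale idea, the following sketch decomposes a synthetic 1D height profile with the discrete wavelet transform (Mallat algorithm) and reports an RMS roughness per scale; the wavelet choice, number of levels and the profile itself are assumptions, and the authors' 2D multi-layer formulation is not reproduced.

```python
# Hedged sketch: per-scale roughness of a synthetic surface profile via the discrete
# wavelet transform. The 'db4' wavelet and 5 decomposition levels are our choices.
import numpy as np
import pywt

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 2048)                      # position along the profile (m)
profile = (0.05 * np.sin(2 * np.pi * 0.3 * x)         # large-scale undulation
           + 0.01 * np.sin(2 * np.pi * 3.0 * x)       # intermediate scale
           + 0.002 * rng.standard_normal(x.size))     # small-scale roughness

coeffs = pywt.wavedec(profile, "db4", level=5)        # [cA5, cD5, cD4, ..., cD1]
details = coeffs[1:]

for level, d in zip(range(5, 0, -1), details):
    rms = np.sqrt(np.mean(d ** 2))
    print(f"scale level {level}: RMS of detail coefficients = {rms:.4f}")
```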
Isolation and Identification of Compounds from the Leaves of Actinodaphne sesquipedalis Hook. F. Var. Glabra (Lauraceae)
Authors: O. Hanita, S. A. Ainnul Hamidah, A. H. Yang Zalila, M. R. Siti Nadiah, M. H. Najihah, M. A. Hapipah
Abstract:
The crude extract of the leaves of Actinodaphne sesquipedalis Hook. F. Var. Glabra (Kochummen) was subjected to phytochemical investigation. The crude methanolic extract was partitioned with solvent systems of increasing polarity (n-hexane, dichloromethane, and methanol). The compounds were fractionated and isolated from the n-hexane partition using column chromatography, with silica gel 60 or Sephadex LH-20 as the stationary phase, and preparative thin-layer chromatography. Isolates were characterized using TLC, FTIR, UV spectrophotometry and NMR spectroscopy. The n-hexane fraction yielded a total of four compounds, namely N-methyllaurotetanine (1), dicentrine (2), β-sitosterol (3), and stigmasterol (4). The result indicates that the leaves of Actinodaphne sesquipedalis may provide a rich source of alkaloids and triterpenoids.
Keywords: Actinodaphne sesquipedalis, alkaloids, phytochemical investigation, triterpenoids
Neural Network Monitoring Strategy of Cutting Tool Wear of Horizontal High Speed Milling
Authors: Kious Mecheri, Hadjadj Abdechafik, Ameur Aissa
Abstract:
Cutting tool wear degrades product quality in manufacturing processes. Online monitoring of the cutting tool wear level is necessary to prevent deterioration of machining quality. Unfortunately, there is no direct way to measure cutting tool wear online. Consequently, an indirect method must be adopted, in which wear is estimated from the measurement of one or more physical parameters appearing during the machining process, such as the cutting force, vibrations, or acoustic emission. In this work, a neural network system is developed in order to estimate flank wear from cutting force measurements and the cutting conditions.
Keywords: flank wear, cutting forces, high speed milling, signal processing, neural network
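A minimal sketch of such an indirect wear estimator, assuming synthetic cutting-force and cutting-condition data and a small scikit-learn network (the abstract does not specify the architecture).

```python
# Hedged sketch: a regression neural network mapping cutting forces and cutting
# conditions to flank wear. The data-generating rule below is an assumption made
# only so the example runs; it does not reflect the authors' measurements.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(100, 600, n),      # cutting force Fx (N)
    rng.uniform(50, 400, n),       # cutting force Fy (N)
    rng.uniform(5000, 20000, n),   # spindle speed (rpm)
    rng.uniform(0.02, 0.2, n),     # feed per tooth (mm)
])
# Assumed relation: flank wear VB (mm) grows with force and feed, plus noise
VB = 1e-4 * X[:, 0] + 5e-5 * X[:, 1] + 0.3 * X[:, 3] + 0.01 * rng.standard_normal(n)

X_train, X_test, y_train, y_test = train_test_split(X, VB, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                                   random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out cuts:", round(model.score(X_test, y_test), 3))
```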
Texture and Twinning in Selective Laser Melting Ti-6Al-4V Alloys
Authors: N. Kazantseva, P. Krakhmalev, I. Yadroitsev, A. Fefelov, N. Vinogradova, I. Ezhov, T. Kurennykh
Abstract:
A martensitic texture-phase transition was found in Selective Laser Melting (SLM) Ti-6Al-4V (ELI) alloys. Electron Backscatter Diffraction (EBSD) analysis showed an initial cubic beta < 100 > (001) BCC texture. This kind of texture is observed in BCC metals with a flat rolling texture, where the axis lies along the rolling direction and the texture plane coincides with the rolling plane. It was found that the texture of the parent BCC beta-phase determined the texture of the low-temperature HCP alpha-phase and limited the choice of its orientation variants. The {10-12} < -1011 > twinning system in titanium alloys after SLM was determined. An analysis of the oxygen contamination in SLM alloys was carried out. A comparison of the obtained results with conventional titanium alloys is also provided.
Keywords: additive technology, texture, twins, Ti-6Al-4V, oxygen content
Mobility and Effective Regulatory Policies in the 21st Century Transport Sector
Authors: Pedro Paulino
Abstract:
The majority of the world's population is already living in urban areas, and the urban population is expected to keep increasing in the next decades. This exponential increase in urban population carries with it obvious mobility problems. Not only is a new paradigm needed in the transport sector in order to address these problems; effective regulatory policies to ensure the quality of services, passenger rights, competition between operators and consistency of the entire mobility ecosystem are needed as well. The purpose of this paper is to present the problems the world faces in this sector and contribute to their solution. Indeed, our study concludes that only through active supervision of the markets and monitoring of the various operators will it be possible to develop a sustainable and efficient transport system which meets the needs of a changing world.
Keywords: mobility, regulation policies, sanctioning powers, sustainable transport
Probabilistic Safety Assessment of Koeberg Spent Fuel Pool
Authors: Sibongiseni Thabethe, Ian Korir
Abstract:
The effective management of spent fuel pool (SFP) safety has been raised as one of the emerging issues to further enhance nuclear installation safety after the Fukushima accident on March 11, 2011. Before then, SFP safety-related issues had mainly focused on (a) controlling the configuration of the fuel assemblies in the pool with no loss of pool coolant and (b) ensuring adequate pool storage space to prevent fuel criticality owing to chain reactions of the fission products, maintaining the ability for neutron absorption, and keeping the fuel cool. A probabilistic safety assessment (PSA) was performed using the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) computer code. Event and fault tree analysis was done to develop a PSA model for the Koeberg SFP. We present preliminary PSA results of events that lead to boiling and cause fuel uncovering, resulting in possible fuel damage in the Koeberg SFP.
Keywords: computer code, fuel assemblies, probabilistic risk assessment, spent fuel pool
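For illustration, the kind of fault-tree quantification performed in such a PSA can be sketched as below; the tree structure and basic-event probabilities are purely illustrative assumptions, not Koeberg data or SAPHIRE output.

```python
# Hedged sketch: a toy fault tree for 'SFP fuel uncovering' quantified under the
# usual independence assumption. All probabilities below are invented placeholders.
def p_or(*probs):
    """Probability that at least one independent input event occurs."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*probs):
    """Probability that all independent input events occur."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Illustrative basic-event probabilities (assumed)
loss_offsite_power = 1e-2
diesel_fails = 5e-2
cooling_pump_fails = 3e-3
makeup_fails = 1e-2
operator_fails_to_recover = 1e-1

loss_of_cooling = p_or(p_and(loss_offsite_power, diesel_fails), cooling_pump_fails)
fuel_uncovering = p_and(loss_of_cooling, makeup_fails, operator_fails_to_recover)
print(f"P(loss of SFP cooling) ~ {loss_of_cooling:.2e}")
print(f"P(fuel uncovering)     ~ {fuel_uncovering:.2e}")
```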
Cortex-M3 Based Virtual Platform Implementation for Software Development
Authors: Jun Young Moon, Hyeonggeon Lee, Jong Tae Kim
Abstract:
In this paper, we present a Cortex-M3 based virtual platform which can virtualize a wearable hardware platform and evaluate hardware performance. The Cortex-M3 is a very popular microcontroller in wearable devices, hardware sensors and display devices. This platform can be used to implement the software layer for a specific hardware architecture. By using the proposed platform, the software development process can be parallelized with the hardware development process. We present the internal mechanism used to implement the proposed virtual platform and describe how to use it to develop software through a case study of a low-cost wearable device that uses the Cortex-M3.
Keywords: electronic system level design, software development, virtual platform, wearable device
Some Studies on Endometritis in Pure Arabian Mares
Authors: Khairi El Battawy, Monika Skalicki
Abstract:
The present investigation was carried out on pure Egyptian Arabian mares reared in private horse studs. Fifty non-pregnant mares were selected and examined to classify them as either reproductively healthy or subfertile, the latter including mares with clinical endometritis, early embryonic death, granulosa cell tumor, repeat breeding (post-breeding endometritis), and anoestrus. The purpose of the study was to assess oxidative/antioxidant biochemical metabolites, the lipogram, trace elements and reproductive hormones across reproductive conditions in mares: regular estrous, anestrum, early pregnancy, granulosa cell tumor, ovulation failure, and endometritis. Results showed an intensification of free radical-dependent processes in the blood of infertile mares, especially mares with endometritis. Ultrasonography as a diagnostic tool was an important step in the diagnosis of endometritis in mares, as it revealed much information concerning the infertility problem.
Keywords: endometritis, ovulation, oxidative, mare
Intertextuality in Tourism Advertising: Sources of Knowledge Asymmetries in Translating Vocative Texts
Authors: Maria Ilyushkina
Abstract:
The article addresses the problem of translating vocative texts with intertextual references and describes the influence of language on how knowledge and meaning are developed in the field of advertising. The starting point of the article is advertisements from the sphere of tourism and the way we choose, translate, and interpret intertexts. The article focuses on how representatives of other cultures perceive and understand the information in printed texts advertising recreational facilities and services for tourists as the target audience, and on the knowledge the intertexts convey. The authors argue that intertextuality complicates translation, leading to knowledge asymmetries. Studying typical communicative failures is considered to be of great importance, allowing for improvement in the practice of translation in the sphere of advertising as well as preventing the fallacious transfer of knowledge when translating foreign intertexts.
Keywords: advertising, translation, intertext, Russian culture, knowledge asymmetries, tourism, vocative texts
Nursing Experience in the Intensive Care of a Lung Cancer Patient with Pulmonary Embolism on Extracorporeal Membrane Oxygenation
Authors: Huang Wei-Yi
Abstract:
Objective: This article explores the intensive care nursing experience of a lung cancer patient with pulmonary embolism who was placed on ECMO. Following a sudden change in the patient's condition and a consensus reached during a family meeting, the decision was made to withdraw life-sustaining equipment and collaborate with the palliative care team. Methods: The nursing period was from October 20 to October 27, 2023. The author monitored physiological data, observed, provided direct care, conducted interviews, performed physical assessments, and reviewed medical records. Together with the critical care team and bypass personnel, a comprehensive assessment was conducted using Gordon's Eleven Functional Health Patterns to identify the patient's health issues, which included pain related to lung cancer and invasive devices, fear of death due to sudden deterioration, and altered tissue perfusion related to hemodynamic instability. Results: The patient was admitted with fever, back pain, and painful urination. During hospitalization, the patient experienced sudden discomfort followed by cardiac arrest, requiring multiple CPR attempts and ECMO placement. A subsequent CT angiogram revealed a pulmonary embolism. The patient's condition was further complicated by severe pain due to compression fractures, and a diagnosis of terminal lung cancer was unexpectedly confirmed, leading to emotional distress and uncertainty about future treatment. During the critical care process, ECMO was removed on October 24, with the patient's body temperature stabilized between 36.5 and 37°C and the mean arterial pressure maintained at 60-80 mmHg. Pain management, including Morphine 8 mg in 0.9% N/S 100 ml IV drip q6h PRN and Ultracet 37.5 mg/325 mg 1# PO q6h, kept the pain level below 3. The patient was transferred to the ward on October 27 and discharged home on October 30. Conclusion: During the care period, collaboration with the medical team and palliative care professionals was crucial. Adjustments to pain medication, symptom management, and lung cancer-targeted therapy improved the patient's physical discomfort and pain levels. By applying the unique functions of nursing and the four principles of palliative care, positive encouragement was provided. Family members, along with social workers, clergy, psychologists, and nutritionists, participated in cross-disciplinary care, alleviating anxiety and fear. The consensus to withdraw ECMO and life-sustaining equipment enabled the patient and family to receive high-quality care and maintain autonomy in decision-making. A follow-up call on November 1 confirmed that the patient was emotionally stable, pain-free, and continuing with targeted lung cancer therapy.
Keywords: intensive care, lung cancer, pulmonary embolism, ECMO
Automatic Extraction of Water Bodies Using Whole-R Method
Authors: Nikhat Nawaz, S. Srinivasulu, P. Kesava Rao
Abstract:
Feature extraction plays an important role in many remote sensing applications. Automatic extraction of water bodies is of great significance in many remote sensing applications such as change detection and image retrieval. This paper presents a procedure for automatic extraction of water information from remote sensing images. The algorithm uses the relative location of the R-colour component in the chromaticity diagram. This method is then integrated with the spatial scale transformation of the whole method, which is based on a water index fitted from a spectral library. Experimental results demonstrate the improved accuracy and effectiveness of the integrated method for automatic extraction of water bodies.
Keywords: feature extraction, remote sensing, image retrieval, chromaticity, water index, spectral library, integrated method
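A minimal sketch of the chromaticity step, assuming the r-component is computed as R/(R+G+B) and thresholded; the threshold value and the tiny synthetic image are assumptions, and the spectral-library water index of the whole method is not reproduced here.

```python
# Hedged sketch: per-pixel R-chromaticity followed by a simple threshold to flag
# candidate water pixels. The threshold 0.25 is an illustrative assumption.
import numpy as np

def r_chromaticity(rgb: np.ndarray) -> np.ndarray:
    """rgb: (H, W, 3) float array; returns the r component of the chromaticity diagram."""
    total = rgb.sum(axis=2) + 1e-12           # avoid division by zero
    return rgb[..., 0] / total

def water_mask(rgb: np.ndarray, r_max: float = 0.25) -> np.ndarray:
    """Water pixels tend to have a low red share; flag pixels with r below r_max."""
    return r_chromaticity(rgb) < r_max

# Tiny synthetic scene: left half 'water' (blue-ish), right half 'land' (red-ish)
img = np.zeros((4, 8, 3))
img[:, :4] = [0.1, 0.3, 0.6]
img[:, 4:] = [0.5, 0.4, 0.2]
print(water_mask(img).astype(int))
```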
Task Scheduling on Parallel System Using Genetic Algorithm
Authors: Jasbir Singh Gill, Baljit Singh
Abstract:
Scheduling and mapping an application task graph onto a multiprocessor parallel system is considered one of the most crucial and critical NP-complete problems. Many genetic algorithms have been proposed to solve such problems. In this paper, two genetic-approach-based algorithms have been designed and developed, with and without task duplication. The proposed algorithms work with two fitness functions. The first fitness function, i.e., task fitness, is used to minimize the total finish time of the schedule (schedule length), while the second fitness function, i.e., process fitness, is concerned with allocating tasks to the most efficient processors available from the list of processors (load balance). The proposed genetic-based algorithms have been experimentally implemented and evaluated against other state-of-the-art, popular and widely used algorithms.
Keywords: parallel computing, task scheduling, task duplication, genetic algorithm
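A minimal sketch of the two fitness functions on a toy task graph follows; the graph, durations and the simple mutation-only loop are illustrative assumptions, and task duplication is not modelled.

```python
# Hedged sketch: a chromosome assigns each task of a small DAG to a processor;
# 'task fitness' is the makespan of the decoded schedule and 'process fitness'
# is a load-imbalance measure, both to be minimized.
import random

TASKS = ["A", "B", "C", "D", "E"]                       # already topologically ordered
DURATION = {"A": 2, "B": 3, "C": 2, "D": 4, "E": 1}
PREDECESSORS = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}
N_PROC = 2

def decode(assign):
    """List-schedule the tasks; return (makespan, per-processor load)."""
    finish, proc_free, load = {}, [0.0] * N_PROC, [0.0] * N_PROC
    for t in TASKS:
        p = assign[t]
        ready = max([finish[q] for q in PREDECESSORS[t]] or [0.0])
        start = max(ready, proc_free[p])
        finish[t] = start + DURATION[t]
        proc_free[p] = finish[t]
        load[p] += DURATION[t]
    return max(finish.values()), load

def task_fitness(assign):
    return decode(assign)[0]                            # schedule length

def process_fitness(assign):
    load = decode(assign)[1]
    return max(load) - min(load)                        # load imbalance

def mutate(assign):
    child = dict(assign)
    child[random.choice(TASKS)] = random.randrange(N_PROC)
    return child

random.seed(0)
population = [{t: random.randrange(N_PROC) for t in TASKS} for _ in range(20)]
for _ in range(50):
    population.sort(key=lambda a: (task_fitness(a), process_fitness(a)))
    population = population[:10] + [mutate(random.choice(population[:10])) for _ in range(10)]
population.sort(key=lambda a: (task_fitness(a), process_fitness(a)))
best = population[0]
print("best assignment:", best, "makespan:", task_fitness(best))
```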
Effect of Precursor’s Grain Size on the Conversion of Microcrystalline Gallium Antimonide GaSb to Nanocrystalline Gallium Nitride GaN
Authors: Jerzy F. Janik, Mariusz Drygas, Miroslaw M. Bucko
Abstract:
A simple precursor system has recently been developed in our laboratory for the conversion of affordable microcrystalline gallium antimonide GaSb to a range of nanocrystalline powders of gallium nitride GaN – a wide bandgap semiconductor indispensable in modern optoelectronics. The process relies on high-temperature nitridation reactions of GaSb with ammonia. Topochemical relationships set up by the cubic lattice of GaSb result in some metastable cubic GaN being formed in addition to the stable hexagonal GaN. A prior application of high-energy ball milling to the initially microcrystalline GaSb precursor is shown to alter the nitridation output.
Keywords: nanocrystalline, gallium nitride, GaN, gallium antimonide, GaSb, nitridation, ball milling
Definition of Aerodynamic Coefficients for Microgravity Unmanned Aerial System
Authors: Gamaliel Salazar, Adriana Chazaro, Oscar Madrigal
Abstract:
The evolution of Unmanned Aerial Systems (UAS) has made it possible to develop new vehicles capable of performing microgravity experiments which, due to their cost and complexity, were previously beyond the reach of many institutions. In this study, the aerodynamic behavior of a UAS is studied through its deceleration stage after an initial free-fall phase (where the microgravity effect is generated) using Computational Fluid Dynamics (CFD). Because the payload is analyzed under a microgravity environment, and given the nature of the payload itself, the speed of the UAS must be reduced smoothly. Moreover, the terminal speed of the vehicle should be low enough to preserve the integrity of the payload and vehicle during the landing stage. The UAS model consists of a study pod, control surfaces with fixed and mobile sections, landing gear and two semicircular wing sections. The speed of the vehicle is decreased by increasing the angle of attack (AoA) of each wing section from 2° (where the S1091 airfoil has its greatest aerodynamic efficiency) to 80°, creating a circular wing geometry. Drag coefficients (Cd) and forces (Fd) are obtained employing CFD analysis. A simplified 3D model of the vehicle is analyzed using Ansys Workbench 16. The distance between the object of study and the walls of the control volume is eight times the length of the vehicle. The domain is discretized using an unstructured mesh based on tetrahedral elements. The mesh is refined by defining an element size of 0.004 m on the wing and control surfaces in order to resolve the fluid behavior in the most important zones, as well as to obtain accurate approximations of the Cd. The k-epsilon turbulence model is selected to solve the governing equations of the fluid, while a pair of monitors is placed on the wing and on the whole vehicle body to visualize the variation of the coefficients during the simulation process. Employing a statistical response surface methodology, the case of study is parametrized considering the AoA of the wing as the input parameter and Cd and Fd as output parameters. Based on a Central Composite Design (CCD), the Design Points (DP) are generated so that the Cd and Fd for each DP can be estimated. Applying a 2nd-degree polynomial approximation, the drag coefficients for every AoA are determined. Using these values, the terminal speed at each position is calculated considering the specific Cd. Additionally, the distance required to reach the terminal velocity at each AoA is calculated, so that the minimum distance for the entire deceleration stage that does not compromise the payload can be determined. The maximum Cd of the vehicle is 1.18, so its maximum drag will be almost like the drag generated by a parachute. This guarantees that the vehicle can be braked aerodynamically, so it could be utilized for several missions, allowing repeatability of microgravity experiments.
Keywords: microgravity effect, response surface, terminal speed, unmanned system
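For illustration, the post-processing chain (a 2nd-degree fit of Cd versus AoA followed by the terminal-speed evaluation) can be sketched as below; apart from the maximum Cd of 1.18 quoted above, the Cd values, vehicle mass and reference area are assumptions.

```python
# Hedged sketch: fit Cd(AoA) with a 2nd-degree polynomial and evaluate the terminal
# speed v_t = sqrt(2*m*g / (rho*A*Cd)). Only Cd = 1.18 at maximum deflection comes
# from the abstract; everything else is an illustrative assumption.
import numpy as np

aoa = np.array([2.0, 20.0, 40.0, 60.0, 80.0])     # deg, assumed design points
cd = np.array([0.10, 0.35, 0.70, 1.00, 1.18])     # assumed except the final 1.18

cd_fit = np.poly1d(np.polyfit(aoa, cd, deg=2))    # one-variable response surface

mass = 4.0     # kg, assumed vehicle + payload mass
area = 0.35    # m^2, assumed reference area
rho = 1.225    # kg/m^3, sea-level air density
g = 9.81       # m/s^2

def terminal_speed(angle_deg: float) -> float:
    """Speed at which drag balances weight for the fitted Cd at this AoA."""
    return np.sqrt(2.0 * mass * g / (rho * area * cd_fit(angle_deg)))

for a in (2, 20, 40, 60, 80):
    print(f"AoA {a:2d} deg: Cd ~ {cd_fit(a):.2f}, terminal speed ~ {terminal_speed(a):.1f} m/s")
```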
Transforming Ganges to be a Living River through Waste Water Management
Authors: P. M. Natarajan, Shambhu Kallolikar, S. Ganesh
Abstract:
By size and volume of water, the Ganges River basin is the biggest among the fourteen major river basins in India. In the Hindu faith, it is the main 'holy river' of the nation. But, of late, pollution loads from both domestic and industrial sources are deteriorating the surface water and groundwater as well as the land resources, and hence the environment of the Ganges River basin is under threat. Seeing this scenario, the Indian government began to reclaim this river through the two Ganges Action Plans I and II from 1986, spending Rs. 2,747.52 crores ($457.92 million). But the result was no improvement in the water quality of the river, the groundwater or the environment, even after almost three decades of reclamation, and hence the new Indian government is now taking extra care to rejuvenate this river, allotting Rs. 2,037 crores ($339.50 million) in 2014 and Rs. 20,000 crores ($3,333.33 million) in 2015. The reason for the poor water quality and stinking environment even after three decades of reclamation is that the sewage receives either no treatment or only partial treatment. Hence, the authors now suggest a tertiary-level treatment standard for sewage of all sources and origins in the Ganges River basin, and recycling of the entire treated water for non-domestic uses. At a capacity of 20 million litres per day (MLD) per sewage treatment plant (STP), this basin needs about 2,020 plants to treat the entire sewage load. The cost of the STPs is Rs. 3,43,400 million ($5,723.33 million), and the annual maintenance cost is Rs. 15,352 million ($255.87 million). The advantages of the proposed exercise are as follows. A volume of 1,769.52 million m3 of biogas can be produced. Since biogas is energy, it can be used as a fuel for any heating purpose, such as cooking. It can also be used in a gas engine to convert the energy in the gas into electricity and heat. It is possible to generate about 3,539.04 million kilowatt-hours of electricity per annum from the biogas generated in the process of wastewater treatment in the Ganges basin. The income generated from electricity works out to Rs. 10,617.12 million ($176.95 million). This power can be used to bridge the energy supply and demand gap in the power-hungry villages – 300 million people are without electricity in India even today – and to run these STPs as well. The 664.18 million tonnes of sludge generated by the treatment plants per annum can be used in agriculture as manure, with suitable amendments. By arresting the pollution load, the 187.42 cubic kilometres (km3) of groundwater potential of the Ganges River basin could be protected from deterioration. Since the sewage can be recycled for non-domestic purposes, about 14.75 km3 of fresh water per annum can be conserved for future use. The total value of the water saving per annum is Rs. 22,11,916 million ($36,865.27 million), and each citizen of the Ganges River basin can save Rs. 4,423.83 ($73.73) per annum and Rs. 12.12 ($0.202) per day by recycling the treated water for non-domestic uses. Further, the environment of this basin could be kept clean by arresting the foul smell as well as the 3% of greenhouse gas emissions from the stinking waterways and land. These are the ways to reclaim the waterways of the Ganges River basin from deterioration.
Keywords: Holy Ganges River, lifeline of India, wastewater treatment and management, making Ganges permanently holy
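For readers checking such cost-benefit figures, the following sketch back-calculates a few implied quantities from the numbers quoted above; the derived values are our own back-calculations, not figures given by the authors.

```python
# Hedged arithmetic check using only the figures stated in the abstract.
biogas_million_m3 = 1769.52           # million m3 of biogas per annum
electricity_million_kwh = 3539.04     # million kWh per annum
total_saving_million_rs = 2211916.0   # Rs million per annum
saving_per_citizen_rs = 4423.83       # Rs per citizen per annum
saving_per_day_rs = 12.12             # Rs per citizen per day (as stated)

print("implied electricity yield :", electricity_million_kwh / biogas_million_m3, "kWh per m3 of biogas")
print("implied basin population  :", total_saving_million_rs / saving_per_citizen_rs, "million citizens")
print("annual saving / 365 days  :", round(saving_per_citizen_rs / 365, 2), "Rs per day (stated:", saving_per_day_rs, ")")
```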
Efficient Bargaining versus Right to Manage in the Era of Liberalization
Authors: Panagiota Koliousi, Natasha Miaouli
Abstract:
We compare product and labour market liberalization under the two trade union bargaining models: the Right-to-Manage (RTM) model and the Efficient Bargaining (EB) model. The vehicle is a dynamic general equilibrium (DGE) model that incorporates two types of agents (capitalists and workers) and imperfectly competitive product and labour markets. The model is solved numerically employing common parameter values and data from the euro area. A key message is that product market deregulation is favourable under any labour market structure, while when opting for labour market deregulation one should pay special attention to the structure of the labour market, such as the unions' bargaining system.
Keywords: market structure, structural reforms, trade unions, unemployment
Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing
Authors: Rowan P. Martnishn
Abstract:
During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables – historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours' worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to both clean and format all the data. First, a basic spelling and grammar check was applied, as well as a Python script for normalized formatting which used an open-source grammatical formatter to make the data as coherent as possible. Ten documents were randomly selected for manual review, in which words frequently mis-transcribed during transcription were recorded and replaced throughout all other documents. Then, to remove all banter and side comments, the transcripts were spliced into paragraphs (separated by change in speaker) and all paragraphs with fewer than 300 characters were removed. Secondly, a keyword extractor, a form of natural language processing in which significant words in a document are selected, was run on each paragraph of every interview. Every proper noun was put into a data structure corresponding to its respective interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was applied to each paragraph that included any of the proper nouns selected from the interview. At this stage the information to review had been reduced from about 60 hours' worth of data to 20. The data was further processed through light manual observation – any summaries which proved to fit the criteria of the proposed deliverable were selected, along with their locations within the document. This narrowed the data down to about 5 hours' worth of processing. The qualitative researchers were then able to find 8 more connections in addition to the previous 4, exceeding the minimum quota of 3 required to satisfy the grant. Major findings of the study and the subsequent curation of this methodology raised a conceptual point crucial to working with qualitative data of this magnitude. In the use of artificial intelligence there is a general trade-off in a model between breadth of knowledge and specificity. If the model has too much knowledge, the user risks leaving out important data (too general). If the tool is too specific, it has not seen enough data to be useful. Thus, this methodology proposes a solution to this trade-off. The data is never altered outside of grammatical and spelling checks. Instead, the important information is marked, creating an indicator of where the significant data is without compromising its purity. Secondly, the data is chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords (allowing generalization over the whole document). This way, no data is harmed, and qualitative experts can go over the raw data instead of using highly manipulated results. Given the success in deliverable creation as well as the circumvention of this trade-off, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
Keywords: B.A.R.T. model, keyword extractor, natural language processing, qualitative coding
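A minimal sketch of the reduction pipeline described above, assuming a simple speaker-line transcript format, a regex heuristic standing in for the unnamed keyword extractor, and the facebook/bart-large-cnn checkpoint (the abstract does not name a specific BART model); running it downloads the summarization model.

```python
# Hedged sketch: split a transcript into speaker turns, drop turns under 300
# characters, collect proper nouns, and summarize only turns containing them with
# a BART model. The sample transcript and the proper-noun heuristic are assumptions.
import re
from transformers import pipeline

transcript = """INTERVIEWER: Could you talk about how the plotter work started?
ARTIST: The work began at Bell Labs in the late sixties, where engineers such as Kenneth
Knowlton were writing graphics routines that artists later reused for their own prints,
and that collaboration eventually fed into the tools many studios still rely on today.
Those early collaborations between programmers and printmakers are exactly the kind of
connection this project documents."""

# 1. Split into speaker turns and keep only substantial ones (>= 300 characters)
turns = [t.strip() for t in re.split(r"\n(?=[A-Z]+:)", transcript)]
substantial = [t for t in turns if len(t) >= 300]

# 2. Heuristic proper-noun extraction: capitalized words not starting a sentence
def proper_nouns(text):
    return {m.group(0) for m in re.finditer(r"(?<![.!?:]\s)\b[A-Z][a-z]+\b", text)}

keywords = set().union(*(proper_nouns(t) for t in substantial)) if substantial else set()

# 3. Summarize only the turns that mention at least one extracted proper noun
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
for turn in substantial:
    if any(k in turn for k in keywords):
        summary = summarizer(turn, max_length=60, min_length=15, do_sample=False)
        print(summary[0]["summary_text"])
```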
Procedia PDF Downloads 3413379 Research on Reducing Food Losses by Extending the Date of Minimum Durability on the Example of Cereal Products
Authors: Monika Trzaskowska, Dorota Zielinska, Anna Lepecka, Katarzyna Neffe-Skocinska, Beata Bilska, Marzena Tomaszewska, Danuta Kolozyn-Krajewska
Abstract:
Microbiological quality and food safety are important food characteristics. Regulation (EU) No 1169/2011 of the European Parliament and of the Council on the provision of food information to consumers introduces the obligation to provide information on the 'use-by' date or the date of minimum durability (DMD). The second term is the date until which a properly stored or transported foodstuff retains its physical, chemical, microbiological and organoleptic properties. The date should be preceded by 'best before'. It is used for durable products, e.g., pasta. In relation to reducing food losses, the question may be asked whether products retain quality and safety beyond the currently declared date of minimum durability. The aim of the study was to assess the sensory quality and microbiological safety of selected cereal products, i.e., pasta and millet, after the DMD. The scope of the study was to determine markers of microbiological quality, i.e., the total viable count (TVC), the number of bacteria from the Enterobacteriaceae family and the total yeast and mold count (TYMC), on the last day of the DMD and after 1 and 3 months of storage. In addition, the presence of Salmonella and Listeria monocytogenes was examined on the last day of the DMD. The sensory quality of the products was assessed by quantitative descriptive analysis (QDA); the intensity of 14 attributes and the overall quality were defined and determined. In the tested samples of millet and pasta, the pathogenic bacteria Salmonella and Listeria monocytogenes were not found. The values of the selected quality and microbiological safety indicators on the last day of the DMD were in the range of about 1-3 log cfu/g. This demonstrates the good microbiological quality of the tested food. Comparing the products, a higher number of microorganisms was found in the samples of millet. After 3 months of storage, the TVC decreased in millet, while in pasta it was found to increase. In both products, the number of bacteria from the Enterobacteriaceae family decreased. In contrast, the TYMC increased in samples of millet and decreased in pasta. The intensity of the sensory attributes varied over the studied period; it remained at a similar level or increased. In millet, the intensity of the odor and flavor of 'cooked porridge' was found to have increased 3 months after the DMD. Similarly, in the pasta, the smell and taste of 'cooked pasta' were more intense. To sum up, the researched products on the last day of the date of minimum durability were characterized by very good microbiological and sensory quality, which was maintained for 3 months after this date. Based on these results, the date of minimum durability of the tested products could be extended. The publication was financed on the basis of an agreement with the National Center for Research and Development No. Gospostrateg 1/385753/1/NCBR/2018 for the implementation and financing of the project under the strategic research and development program 'social and economic development of Poland in the conditions of globalizing markets – GOSPOSTRATEG' – acronym PROM.
Keywords: date of minimum durability, food losses, food quality and safety, millet, pasta
Time Efficient Color Coding for Structured-Light 3D Scanner
Authors: Po-Hao Huang, Pei-Ju Chiang
Abstract:
The structured-light 3D scanner is commonly used for measuring the 3D shape of an object. By projecting designed light patterns on the object, deformed patterns can be obtained and used for geometric shape reconstruction. At present, Gray code is the most reliable and commonly used light pattern in structured-light 3D scanners. However, the trade-off between scanning efficiency and accuracy is a long-standing and challenging problem. The design of the light patterns plays a significant role in scanning efficiency and accuracy. Therefore, we propose a novel encoding method integrating color information with Gray code to improve scanning efficiency. We demonstrate that, with the proposed method, the scanning time can be reduced to approximately half of that needed by Gray code without loss of precision.
Keywords: gray-code, structured light scanner, 3D shape acquisition, 3D reconstruction
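For illustration, the baseline Gray-code encoding and decoding can be sketched as below; the color extension the paper proposes is only indicated in a comment, and the projector/camera details are not modelled.

```python
# Hedged sketch: binary-reflected Gray code for stripe indexing in structured light.
def gray_encode(index: int) -> int:
    """Gray code of an integer stripe index."""
    return index ^ (index >> 1)

def gray_decode(code: int) -> int:
    """Invert the Gray code back to the original stripe index."""
    index = 0
    while code:
        index ^= code
        code >>= 1
    return index

N_BITS = 5                                    # 5 projected patterns -> 32 stripe positions
patterns = [gray_encode(i) for i in range(2 ** N_BITS)]

# Adjacent stripes differ in exactly one bit, so a single mis-read bit shifts the
# decoded position by at most one stripe - the property that makes Gray code robust.
assert all(bin(a ^ b).count("1") == 1 for a, b in zip(patterns, patterns[1:]))
assert all(gray_decode(gray_encode(i)) == i for i in range(2 ** N_BITS))

# The paper's idea: pack several of these bits per projected frame using the RGB
# channels, so fewer frames (roughly half the scanning time) carry the same code.
print([format(p, "05b") for p in patterns[:8]])
```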
Investigation of Astrocyte Physiology on Stiffness-Controlled Cellulose Acetate Nanofiber as a Tissue Scaffold
Authors: Sun Il Yu, Jung Hyun Joo, Hwa Sung Shin
Abstract:
Astrocytes are known as the dominant cells in the CNS and play a role in supporting CNS activity and regeneration. Recently, three-dimensional cultures of astrocytes have been actively applied to understand how astrocytes work in vivo. Electrospun nanofibers are attractive for 3D cell culture systems because they have a high surface-to-volume ratio and a porous structure, and they have already been used for 3D astrocyte cultures. In this research, the stiffness of cellulose acetate (CA) nanofibers was controlled by heat treatment. As stiffness increased, astrocyte cell viability and adhesion increased. Astrocyte reactivity was also upregulated on stiffer CA nanofibers, in terms of GFAP, an intermediate filament protein. Finally, we demonstrate that stiffness-controllable CA is attractive for astrocyte tissue engineering.
Keywords: astrocyte, cellulose acetate, nanofiber, tissue scaffold
Evaluation of the Effect of Learning Disabilities and Accommodations on the Prediction of the Exam Performance: Ordinal Decision-Tree Algorithm
Abstract:
Providing students with learning disabilities (LD) with extra time to grant them equal access to an exam is a necessary but insufficient condition to compensate for their LD; there should also be a clear indication that the additional time was actually used. For example, if students with LD use more time than students without LD and yet receive lower grades, this may indicate that a different accommodation is required. If they achieve higher grades but use the same amount of time, then the effectiveness of the accommodation has not been demonstrated. The main goal of this study is to evaluate the effect of including parameters related to LD and extended exam time, along with other commonly used characteristics (e.g., student background and ability measures such as high-school grades), on the ability of ordinal decision-tree algorithms to predict exam performance. We use naturally occurring data collected from hundreds of undergraduate engineering students. The sub-goals are i) to examine the improvement in prediction accuracy when the indicator of exam performance includes 'actual time used' in addition to the conventional indicator (exam grade) employed in most research; ii) to explore the effectiveness of extended exam time on exam performance for different courses and for LD students with different profiles (i.e., sets of characteristics). This is achieved by using the patterns (i.e., subgroups) generated by the algorithms to identify pairs of subgroups that differ in just one characteristic (e.g., course or type of LD) but have different outcomes in terms of exam performance (grade and time used). Since grade and time used exhibit an ordinal form, we propose a method based on ordinal decision trees, which applies a weighted information-gain ratio (WIGR) measure for selecting the classifying attributes. Unlike other known ordinal algorithms, our method does not assume monotonicity in the data. The proposed WIGR is an extension of an information-theoretic measure, in the sense that it adjusts to the case of an ordinal target and takes into account the error severity between two different target classes. Specifically, we use ordinal C4.5, random-forest, and AdaBoost algorithms, as well as an ensemble technique composed of ordinal and non-ordinal classifiers. Firstly, we find that the inclusion of LD and extended exam-time parameters improves the prediction of exam performance (compared to specifications of the algorithms that do not include these variables). Secondly, when the indicator of exam performance includes 'actual time used' together with grade (as opposed to grade only), the prediction accuracy improves. Thirdly, our subgroup analyses show clear differences in the effect of extended exam time on exam performance among different courses and different student profiles. From a methodological perspective, we find that the ordinal decision-tree-based algorithms outperform their conventional, non-ordinal counterparts. Further, we demonstrate that the ensemble-based approach leverages the strengths of each type of classifier (ordinal and non-ordinal) and yields better performance than each classifier individually.
Keywords: actual exam time usage, ensemble learning, learning disabilities, ordinal classification, time extension
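A minimal sketch in the spirit of an ordinal, error-severity-aware splitting criterion follows; this is our own illustrative formulation with toy data, not the authors' exact WIGR definition.

```python
# Hedged sketch: replace the usual impurity with a class-distance-weighted impurity,
# so that confusing far-apart ordinal grades costs more than confusing neighbours,
# and normalize the gain by the split information as in a gain ratio.
import math
from collections import Counter

def ordinal_impurity(labels):
    """Expected absolute distance between the ordinal classes of two random samples."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = Counter(labels)
    return sum((ci / n) * (cj / n) * abs(i - j)
               for i, ci in counts.items() for j, cj in counts.items())

def weighted_gain_ratio(parent, children):
    """Ordinal-impurity gain of a candidate split, normalized by split information."""
    n = len(parent)
    gain = ordinal_impurity(parent) - sum(len(c) / n * ordinal_impurity(c) for c in children)
    split_info = -sum((len(c) / n) * math.log2(len(c) / n) for c in children if c)
    return gain / split_info if split_info > 0 else 0.0

# Toy example: exam grades on an ordinal scale 0 (fail) .. 3 (excellent), split by
# whether the extended exam time was actually used (hypothetical subgroups).
parent = [0, 1, 1, 2, 2, 2, 3, 3, 0, 1]
used_extra_time = [2, 2, 3, 3, 1]
no_extra_time = [0, 1, 1, 2, 0]
print(round(weighted_gain_ratio(parent, [used_extra_time, no_extra_time]), 3))
```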