Search results for: hydrologic modeling system
9264 The Role and Challenges of Social Workers in Child Protection: The Case of Indonesia
Authors: B. Rusyidi
Abstract:
Since 2009, the Indonesian Ministry of Social Affairs has been implementing Program Kesejahteraan Sosial Anak (PKSA, Child Welfare Program), a conditional cash transfer program that targets neglected children, children with disabilities, street children, children in conflict with the law, and children in need of special protection, all from poor households. PKSA integrates three elements: transfer of cash, care and social services through social workers, and institutional childcare assistance. This qualitative study analyzed the roles and challenges of social workers in implementing PKSA and laid out recommendations to inform policy changes. Data were collected in late 2014 from national and local government and non-government child welfare agencies, social workers, and childcare institution representatives through interviews and Focus Group Discussions (FGDs). Fieldwork took place in six districts in the provinces of Jakarta, Central Java and South Sulawesi. The study found that the social workers' role was significant in facilitating cash transfer, providing education and guidance, and linking children and families to basic social services. This improved utilization of basic social services enhanced children and families' behaviors and contributed to the well-being of the children. However, only a small number of childcare institutions have social workers, leaving many children and families without care and social service linkages, depriving them of rehabilitative components to help them regain their social functions. Some social workers reported their struggles with heavy workloads, lack of professional competencies and training, limited job security, and inadequate professional acknowledgment from other professions. Some of those challenges were due to the centralized nature of the program and the lack of shared vision and commitment about the child protection system among related government agencies at both the national and local levels. The study highlights the necessity to implement an integrated child protection system, decentralize the PKSA program, and increase the number and competence of social workers as well as improve their case management, management, and monitoring. The most recent progress of the program and its impacts on social workers are also discussed.
Keywords: child protection, conditional cash transfer, program decentralization, social worker, working conditions
Procedia PDF Downloads 218
9263 Rapid, Direct, Real-Time Method for Bacteria Detection on Surfaces
Authors: Evgenia Iakovleva, Juha Koivisto, Pasi Karppinen, J. Inkinen, Mikko Alava
Abstract:
Preventing the spread of infectious diseases worldwide is one of the most important tasks of modern health care. Infectious diseases not only account for one fifth of deaths in the world but also cause many pathological complications for human health. Touch surfaces pose an important vector for the spread of infections by various microorganisms, including antimicrobial-resistant organisms. Further, antimicrobial resistance is the response of bacteria to the overuse or inappropriate use of antibiotics. The biggest challenges in bacterial detection by existing methods are indirect determination, long analysis times, sample preparation, the use of chemicals and expensive equipment, and the need for qualified specialists. Therefore, high-performance, rapid, real-time detection is needed for practical bacterial detection and for controlling epidemiological hazards. Among the known methods for detecting bacteria on surfaces, hyperspectral methods can be used as direct and rapid methods for microorganism detection on different kinds of surfaces based on fluorescence, without sampling, sample preparation, or chemicals. The aim of this study was to assess the relevance of such systems to remote sensing of surfaces for microorganism detection in order to prevent the global spread of infectious diseases. Bacillus subtilis and Escherichia coli at different concentrations (from 0 to 10⁸ cells/100 µL) were detected with a hyperspectral camera using different filters, with visible visualization of bacteria and background spots on a steel plate. A method of internal standards was applied to monitor the correctness of the analysis results. The distances from the sample to the hyperspectral camera and to the light source are 25 cm and 40 cm, respectively. Each sample is optically imaged from the surface by a hyperspectral imaging system utilizing a JAI CM-140GE-UV camera. The light source is a BeamZ FLATPAR DMX Tri-light with 3 W tri-colour LEDs (red, blue and green). Light colors are changed through a DMX USB Pro interface. The developed system was calibrated following a standard procedure of setting exposure and was focused for light with λ = 525 nm. The filter is a Thorlabs Kurios™ hyperspectral filter controller with wavelengths from 420 to 720 nm. All data collection, pre-processing, and multivariate analysis were performed using LabVIEW and Python software. Bacterial stains, both visible and invisible to the human eye, clustered apart from the reference steel material in clustering analysis using different light sources and filter wavelengths. The calculation of random and systematic errors of the analysis results proved the applicability of the method in real conditions. Validation experiments were carried out with photometry and an ATP swab test. The lower detection limit of the developed method is several orders of magnitude lower than that of both validation methods. All parameters of the experiments were the same, except for the light. The hyperspectral imaging method allows separating not only bacteria and surfaces but also different types of bacteria, such as Gram-negative Escherichia coli and Gram-positive Bacillus subtilis. The developed method skips sample preparation and the use of chemicals, unlike other microbiological methods. The analysis time with the novel hyperspectral system is a few seconds, which is innovative in the field of microbiological tests.
Keywords: Escherichia coli, Bacillus subtilis, hyperspectral imaging, microorganisms detection
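A minimal sketch of the clustering step described in this abstract: pixel spectra from a hyperspectral cube are grouped with k-means so that stained regions separate from the bare steel background. The cube size, the spectral shape of the stain, and the number of clusters below are assumptions for illustration, not the authors' LabVIEW/Python pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
wavelengths = np.linspace(420, 720, 31)                          # 31 bands, 420-720 nm (assumed)
steel = 0.20 + 0.001 * (wavelengths - 420) / 300                 # nearly flat background reflectance
stain = 0.20 + 0.40 * np.exp(-((wavelengths - 525) ** 2) / 800)  # fluorescence-like peak near 525 nm

# Hypothetical 64 x 64 pixel hyperspectral cube with one square bacterial spot
cube = np.tile(steel, (64, 64, 1))
cube[20:40, 20:40, :] = stain
cube += rng.normal(0, 0.01, cube.shape)                          # sensor noise

# Flatten to (n_pixels, n_bands) and cluster spectra into background vs. stain
pixels = cube.reshape(-1, cube.shape[-1])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
print("pixels per cluster:", np.bincount(labels))                # expect ~400 stain pixels vs. background
```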
Procedia PDF Downloads 224
9262 Cars Redistribution Optimization Problem in the Free-Float Car-Sharing
Authors: Amine Ait-Ouahmed, Didier Josselin, Fen Zhou
Abstract:
Free-float car-sharing is a one-way car-sharing service where cars are available anytime and anywhere on the streets, so that no dedicated stations are needed. This means that after driving a car you can park it anywhere. This car-sharing system creates an imbalanced car distribution in the cities, which can be regulated by staff agents through the redistribution of cars. In this paper, we aim to solve the car-reservation and agent-traveling problem so that the number of successful car reservations is maximized. Besides, we also aim to minimize the distance traveled by agents for car redistribution. To this end, we present a mixed integer linear programming formulation for the car-sharing problem.
Keywords: one-way car-sharing, vehicle redistribution, car reservation, linear programming
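For illustration, a toy version of the kind of mixed integer linear program described above can be written with PuLP: binary variables assign parked cars to reservation requests, the objective rewards served reservations and penalizes relocation distance, and each car or request is used at most once. The sets, distances, and weighting are invented placeholders, not the authors' formulation.

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

# Hypothetical data: 3 parked cars, 4 reservation requests, relocation distances [km]
cars = ["c1", "c2", "c3"]
requests = ["r1", "r2", "r3", "r4"]
dist = {("c1", "r1"): 1.0, ("c1", "r2"): 4.0, ("c1", "r3"): 6.0, ("c1", "r4"): 2.5,
        ("c2", "r1"): 3.0, ("c2", "r2"): 0.5, ("c2", "r3"): 5.0, ("c2", "r4"): 4.5,
        ("c3", "r1"): 7.0, ("c3", "r2"): 2.0, ("c3", "r3"): 1.5, ("c3", "r4"): 3.0}
alpha = 0.1   # weight of relocation distance against served reservations (assumed)

x = LpVariable.dicts("assign", list(dist), cat=LpBinary)   # x[c, r] = 1 if car c serves request r

prob = LpProblem("car_redistribution", LpMaximize)
prob += lpSum(x[c, r] for (c, r) in dist) - alpha * lpSum(dist[c, r] * x[c, r] for (c, r) in dist)
for c in cars:       # each car serves at most one request
    prob += lpSum(x[c, r] for r in requests) <= 1
for r in requests:   # each request is served by at most one car
    prob += lpSum(x[c, r] for c in cars) <= 1

prob.solve(PULP_CBC_CMD(msg=False))
print("served assignments:", [(c, r) for (c, r) in dist if x[c, r].value() == 1])
```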
Procedia PDF Downloads 348
9261 Wildlife Habitat Corridor Mapping in Urban Environments: A GIS-Based Approach Using Preliminary Category Weightings
Authors: Stefan Peters, Phillip Roetman
Abstract:
The global loss of biodiversity is threatening the benefits nature provides to human populations, has become a more pressing issue than climate change, and requires immediate attention. While there have been successful global agreements for environmental protection, such as the Montreal Protocol, these are rare, and we cannot rely on them solely. Thus, it is crucial to take national and local actions to support biodiversity. Australia is one of the 17 countries in the world with a high level of biodiversity, and its cities are vital habitats for endangered species, with more of them found in urban areas than in non-urban ones. However, the protection of biodiversity in metropolitan Adelaide has been inadequate, with over 130 species disappearing since European colonization in 1836. In this research project, we conceptualized, developed, and implemented a framework for wildlife Habitat Hotspot and Habitat Corridor modelling in an urban context using geographic data and GIS modelling and analysis. We used detailed topographic and other geographic data provided by a local council, including spatial and attributive properties of trees, parcels, water features, vegetated areas, roads, verges, traffic, and census data. Weighted factors considered in our raster-based Habitat Hotspot model include parcel size, parcel shape, population density, canopy cover, habitat quality, and proximity to habitats and water features. Weighted factors considered in our raster-based Habitat Corridor model include habitat potential (resulting from the Habitat Hotspot model), verge size, road hierarchy, road widths, human density, and the presence of remnant indigenous vegetation species. We developed a GIS model, using Python scripting and ArcGIS Pro ModelBuilder, to establish an automated, reproducible, and adjustable geoprocessing workflow adaptable to any study area of interest. Our habitat hotspot and corridor modelling framework allows existing habitat hotspots and wildlife habitat corridors to be determined and mapped. Our research was applied to the case study of Burnside, a local council in Adelaide, Australia, which encompasses an area of 30 km². We applied end-user, expertise-based category weightings to refine our models and optimize the use of our habitat map outputs towards informing local strategic decision-making.
Keywords: biodiversity, GIS modeling, habitat hotspot, wildlife corridor
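The weighted-sum raster logic behind a Habitat Hotspot model of this kind can be sketched in a few lines of NumPy; the factor rasters and the preliminary category weightings below are placeholders, not the weights applied to the Burnside data.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (100, 100)   # hypothetical 100 x 100 cell study-area raster

# Factor rasters rescaled to 0-1 (placeholders for canopy cover, parcel size, proximities, ...)
factors = {
    "canopy_cover":      rng.random(shape),
    "parcel_size":       rng.random(shape),
    "habitat_proximity": rng.random(shape),
    "water_proximity":   rng.random(shape),
}
# Preliminary category weightings (assumed values; they sum to 1)
weights = {"canopy_cover": 0.35, "parcel_size": 0.15,
           "habitat_proximity": 0.30, "water_proximity": 0.20}

# Weighted overlay: habitat-hotspot potential per raster cell
hotspot = sum(weights[name] * raster for name, raster in factors.items())
print("hotspot potential range:", round(float(hotspot.min()), 2), "-", round(float(hotspot.max()), 2))
```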
Procedia PDF Downloads 115
9260 Study of Variation of Winds Behavior on Micro Urban Environment with Use of Fuzzy Logic for Wind Power Generation: Case Study in the Cities of Arraial do Cabo and São Pedro da Aldeia, State of Rio de Janeiro, Brazil
Authors: Roberto Rosenhaim, Marcos Antonio Crus Moreira, Robson da Cunha, Gerson Gomes Cunha
Abstract:
This work provides details on wind speed behavior within the cities of Arraial do Cabo and São Pedro da Aldeia, located in the Lakes Region of the State of Rio de Janeiro, Brazil. This region has one of the best potentials for wind power generation. In the interurban layer, wind conditions are very complex and depend on physical geography, the size and orientation of surrounding buildings and constructions, population density, and land use. In the same context, the fundamental surface parameter that governs the production of flow turbulence in urban canyons is the surface roughness. Such factors can influence the potential for power generation from the wind within cities. Moreover, wind on a small scale is not fully utilized due to the complexity of wind flow measurement inside cities, and it is difficult to accurately predict this type of resource. This study demonstrates how fuzzy logic can facilitate the assessment of the complexity of the wind potential inside cities. It presents a decision support tool and its ability to deal with inaccurate information using linguistic variables created by a heuristic method. It relies on previously published studies about the variables that influence wind speed in the urban environment. These variables were turned into verbal expressions used in the computer system, which facilitated the establishment of rules for fuzzy inference and integration with a smartphone application used in the research. The first part of the study describes the challenges of sustainable development, followed by incentive policies for the use of renewable energy in Brazil. The next chapter covers the characteristics of the study area and the concepts of fuzzy logic. Data were collected in a field experiment using qualitative and quantitative assessment methods. As a result, a map of various points within the studied cities is presented, with wind viability evaluated by a decision support system using multivariate classification based on fuzzy logic.
Keywords: behavior of winds, wind power, fuzzy logic, sustainable development
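The style of fuzzy inference used for wind viability can be illustrated with a minimal Mamdani-type sketch in plain NumPy: shoulder-shaped membership functions for two linguistic inputs, a two-rule base, and centroid defuzzification. The variables, membership ranges, and rules are illustrative assumptions, not the rule base built in the study.

```python
import numpy as np

def ramp_down(x, a, b):
    """Membership that is 1 below a, falls linearly, and is 0 above b."""
    return np.clip((b - x) / (b - a), 0.0, 1.0)

def ramp_up(x, a, b):
    """Membership that is 0 below a, rises linearly, and is 1 above b."""
    return np.clip((x - a) / (b - a), 0.0, 1.0)

# Crisp inputs (assumed): mean wind speed [m/s] and a 0-10 surface roughness score
wind_speed, roughness = 6.5, 3.0

speed_low, speed_high = ramp_down(wind_speed, 3, 6), ramp_up(wind_speed, 4, 10)
rough_low, rough_high = ramp_down(roughness, 2, 5), ramp_up(roughness, 3, 8)

# Rule base (Mamdani, min for AND, max for OR):
#   R1: high speed AND low roughness -> high viability
#   R2: low speed OR high roughness  -> low viability
r1 = min(speed_high, rough_low)
r2 = max(speed_low, rough_high)

# Output fuzzy sets for "wind viability" on a 0-100 universe, clipped by rule strengths
u = np.linspace(0, 100, 501)
aggregated = np.maximum(np.minimum(ramp_up(u, 40, 80), r1),
                        np.minimum(ramp_down(u, 20, 60), r2))

# Centroid defuzzification gives a crisp viability score for this micro-site
score = np.sum(u * aggregated) / np.sum(aggregated)
print(f"wind viability score: {score:.1f} / 100")
```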
Procedia PDF Downloads 293
9259 How Virtualization, Decentralization, and Network-Building Change the Manufacturing Landscape: An Industry 4.0 Perspective
Authors: Malte Brettel, Niklas Friederichsen, Michael Keller, Marius Rosenberg
Abstract:
The German manufacturing industry has to withstand increasing global competition on product quality and production costs. As labor costs are high, several industries have suffered severely from the relocation of production facilities towards aspiring countries, which have managed to close the productivity and quality gap substantially. Established manufacturing companies have recognized that customers are not willing to pay large price premiums for incremental quality improvements. As a consequence, many companies from the German manufacturing industry adjust their production to focus on customized products and fast time to market. Leveraging the advantages of novel production strategies such as Agile Manufacturing and Mass Customization, manufacturing companies transform into integrated networks in which companies unite their core competencies. Hereby, virtualization of the process and supply chain ensures smooth inter-company operations, providing real-time access to relevant product and production information for all participating entities. Boundaries between companies blur as autonomous systems exchange data gained by embedded systems throughout the entire value chain. By including Cyber-Physical Systems, advanced communication between machines becomes tantamount to their dialogue with humans. The increasing utilization of information and communication technology allows digital engineering of products and production processes alike. Modular simulation and modeling techniques allow decentralized units to flexibly alter products and thereby enable rapid product innovation. The present article describes the developments of Industry 4.0 within the literature and reviews the associated research streams. Hereby, we analyze eight scientific journals with regard to the following research fields: individualized production, end-to-end engineering in a virtual process chain, and production networks. We employ cluster analysis to assign sub-topics to the respective research fields. To assess the practical implications, we conducted face-to-face interviews with managers from industry as well as from the consulting business, using a structured interview guideline. The results reveal reasons for the adoption and rejection of Industry 4.0 practices from a managerial point of view. Our findings contribute to the upcoming research stream of Industry 4.0 and support decision-makers in assessing their need for transformation towards Industry 4.0 practices.
Keywords: Industry 4.0, mass customization, production networks, virtual process-chain
Procedia PDF Downloads 277
9258 Combination of Modelling and Environmental Life Cycle Assessment Approach for Demand Driven Biogas Production
Authors: Juan A. Arzate, Funda C. Ertem, M. Nicolas Cruz-Bournazou, Peter Neubauer, Stefan Junne
Abstract:
One of the biggest challenges the world faces today is global warming, caused by greenhouse gases (GHGs) from the combustion of fossil fuels for energy generation. In order to mitigate climate change, the European Union has committed to reducing GHG emissions to 80–95% below the 1990 level by the year 2050. Renewable technologies are vital to diminish energy-related GHG emissions. Since water and biomass are limited resources, the largest contributions to renewable energy (RE) systems will have to come from wind and solar power. Nevertheless, high proportions of fluctuating RE will present a number of challenges, especially regarding the need to balance the variable energy demand with the weather-dependent fluctuation of energy supply. Therefore, biogas plants would play an important role in this context, since they are easily adaptable. Feedstock availability varies locally and seasonally; however, there is a lack of knowledge about how biogas plants should be operated in a stable manner on local feedstock. This problem may be prevented through suitable control strategies. Such strategies require the development of convenient mathematical models which fairly describe the main processes. Modelling allows us to predict the system behavior of biogas plants when different feedstocks are used with different loading rates. Life cycle assessment (LCA) is a technique for analyzing several aspects of a product, from its creation to its disposal, from an environmental point of view, and it is highly recommended as a decision-making tool. Suitable strategies that combine flexible energy generation by biogas plants, a secure production process, and maximal environmental benefits can be obtained by combining process modelling and LCA approaches. For this reason, this study focuses on a biogas plant that flexibly generates the required energy from the co-digestion of maize, grass, and cattle manure while emitting the lowest amount of GHGs. To achieve this goal, the AMOCO model was combined with LCA. The program was structured in Matlab to simulate any biogas process based on the AMOCO model and combined with the equations necessary to obtain the climate change, acidification, and eutrophication potentials of the whole production system based on the ReCiPe midpoint v1.06 methodology. The developed simulation was optimized based on real data from operating biogas plants and the existing literature. The results prove that the AMOCO model can successfully imitate the system behavior of biogas plants and the time required for the process to adapt in order to generate the demanded energy from the available feedstock. Combination with the LCA approach provided the opportunity to keep the resulting emissions from operation at the lowest possible level. This would allow for a prediction of the process when the feedstock utilization supports the establishment of closed material circles within a smart bio-production grid, under the constraint of minimal drawbacks for the environment and maximal sustainability.
Keywords: AMOCO model, GHG emissions, life cycle assessment, modelling
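As a rough illustration of the kind of two-step anaerobic digestion dynamics that models such as AMOCO describe (acidogenesis followed by methanogenesis), the sketch below integrates a strongly simplified two-population ODE system with SciPy. The state variables, Monod/Haldane kinetics, and parameter values are generic assumptions for illustration only, not the calibrated AMOCO model used in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Simplified two-step digestion: substrate S1 -> volatile fatty acids S2 -> methane
# X1: acidogenic biomass (Monod on S1), X2: methanogenic biomass (Haldane on S2)
mu1_max, K1 = 1.2, 7.1                         # 1/d, g/L        (assumed)
mu2_max, K2, KI2 = 0.74, 9.3, 20.0             # 1/d, mmol/L     (assumed)
alpha, k1, k2, k3 = 0.5, 42.1, 116.5, 268.0    # yield-type coefficients (assumed)
D, S1_in, S2_in = 0.05, 10.0, 50.0             # dilution rate and feed composition (assumed)

def rhs(t, y):
    X1, X2, S1, S2 = y
    mu1 = mu1_max * S1 / (K1 + S1)                  # Monod growth of acidogens
    mu2 = mu2_max * S2 / (K2 + S2 + S2 ** 2 / KI2)  # Haldane growth of methanogens
    dX1 = (mu1 - alpha * D) * X1
    dX2 = (mu2 - alpha * D) * X2
    dS1 = D * (S1_in - S1) - k1 * mu1 * X1
    dS2 = D * (S2_in - S2) + k2 * mu1 * X1 - k3 * mu2 * X2
    return [dX1, dX2, dS1, dS2]

sol = solve_ivp(rhs, (0, 200), [0.5, 0.1, 5.0, 10.0], max_step=0.5)
X2_end, S2_end = sol.y[1, -1], sol.y[3, -1]
mu2_end = mu2_max * S2_end / (K2 + S2_end + S2_end ** 2 / KI2)
print(f"methane flow proxy (k3*mu2*X2): {k3 * mu2_end * X2_end:.2f} (arbitrary units)")
```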
Procedia PDF Downloads 188
9257 Association between Single Nucleotide Polymorphism of Calpain1 Gene and Meat Tenderness Traits in Different Genotypes of Chicken: Malaysian Native and Commercial Broiler Line
Authors: Abtehal Y. Anaas, Mohd. Nazmi Bin Abd. Manap
Abstract:
Meat tenderness is one of the most important factors affecting consumers' assessment of meat quality. Variation in meat tenderness is genetically controlled and varies among breeds, and it is also influenced by environmental factors that affect its development during rigor mortis and postmortem. The final postmortem meat tenderization relies on the extent of proteolysis of myofibrillar proteins caused by the endogenous activity of the proteolytic calpain system. This calpain system includes different calcium-dependent cysteine proteases and an inhibitor, calpastatin. It is widely accepted that in farm animals, including chickens, the μ-calpain gene (CAPN1) is a physiological candidate gene for meat tenderness. This study aimed to identify the association of single nucleotide polymorphism (SNP) markers in the CAPN1 gene with the tenderness of chicken breast meat from two breed crosses, a Malaysian native breed and a commercial broiler line. Ten five-month-old native chickens and ten 42-day-old commercial broilers were collected from the local market, and breast muscles were removed two hours after slaughter, packed separately in plastic bags, and kept at -20 °C for 24 h. The tenderness phenotype of all chicken breast meat was determined by Warner-Bratzler Shear Force (WBSF). Thawing and cooking losses were also measured in the same breast samples before use in WBSF determination. Polymerase chain reaction (PCR) was used to identify the previously reported C7198A and G9950A SNPs in the CAPN1 gene and assess their associations with meat tenderness in the two breeds. The broiler breast meat showed lower shear force values and lower thawing loss rates than the native chickens (p<0.05), whereas the two were similar in cooking loss rates. The study confirms previous results that the CAPN1 C7198A and G9950A markers are not significantly associated with variation in meat tenderness in chickens. Therefore, further study is needed to confirm the functional molecular mechanism of these SNPs and evaluate their associations in different chicken populations.
Keywords: CAPN1, chicken, meat tenderness, meat quality, SNPs
Procedia PDF Downloads 245
9256 Voltage Stability Assessment and Enhancement Using STATCOM - A Case Study
Authors: Puneet Chawla, Balwinder Singh
Abstract:
Recently, increased attention has been devoted to the voltage instability phenomenon in power systems. Many techniques have been proposed in the literature for evaluating and predicting voltage stability using steady-state analysis methods. In this paper, P-V and Q-V curves have been generated for the 57-bus Patiala-Rajpura circle of India. The power-flow program is developed in MATLAB using the Newton-Raphson method. Using Q-V curves, the weakest bus of the power system and the maximum reactive power change permissible on that bus are calculated. STATCOMs are placed on the weakest bus to improve the voltage, and hence the voltage stability, as well as the power transmission capability of the line.
Keywords: voltage stability, reactive power, power flow, weakest bus, STATCOM
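The Q-V screening idea can be illustrated on a two-bus equivalent: for a fixed active load and an assumed voltage at the study bus, the reactive power needed to hold that voltage follows from the lossless power-flow equations, and the minimum of the resulting curve gives the reactive margin. The sketch below is a didactic NumPy example with assumed parameters, not the 57-bus Patiala-Rajpura model.

```python
import numpy as np

V1 = 1.0       # slack-bus voltage magnitude [p.u.]
X = 0.25       # series reactance of the equivalent (lossless) line [p.u.] (assumed)
P_load = 1.5   # active power drawn at the study bus [p.u.] (assumed)

V2 = np.linspace(0.5, 1.1, 200)           # swept voltage magnitude at the study bus
sin_delta = P_load * X / (V1 * V2)        # from P = (V1*V2/X) * sin(delta)
feasible = np.abs(sin_delta) <= 1.0       # beyond this, no power-flow solution exists
delta = np.arcsin(sin_delta[feasible])
V2f = V2[feasible]

# Reactive power that must be injected at the bus to hold each voltage level (Q-V curve)
Q_inj = V2f ** 2 / X - (V1 * V2f / X) * np.cos(delta)

k = int(np.argmin(Q_inj))
print(f"Q-V minimum: Q = {Q_inj[k]:.3f} p.u. at V = {V2f[k]:.3f} p.u. "
      f"(reactive margin before voltage collapse)")
```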
Procedia PDF Downloads 515
9255 Model Predictive Controller for Pasteurization Process
Authors: Tesfaye Alamirew Dessie
Abstract:
Our study focuses on developing a Model Predictive Controller (MPC) and evaluating it against a traditional PID controller for a pasteurization process. Using system identification on the experimental data, the dynamics of the pasteurization process were estimated. The quality of several model architectures was evaluated using best fit with data validation, residual analysis, and stability analysis. The validation data fit the auto-regressive with exogenous input (ARX322) model of the pasteurization process by roughly 80.37 percent. The ARX322 model structure was then used to create MPC and PID control strategies. After comparing controller performance based on settling time, overshoot percentage, and stability analysis, it was found that the MPC controller outperforms PID on those parameters.
Keywords: MPC, PID, ARX, pasteurization
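An ARX structure of the type referred to above (here an ARX(3,2,2)-style model, consistent with the 'ARX322' label) can be identified by ordinary least squares once a regressor matrix of past outputs and delayed inputs is assembled. The sketch below uses synthetic data; the true coefficients and noise level are assumptions, not the pasteurizer measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
u = rng.uniform(-1, 1, N)              # excitation input (assumed PRBS-like signal)
a_true = [0.6, -0.2, 0.1]              # 3 autoregressive coefficients (na = 3)
b_true = [0.5, 0.3]                    # 2 input coefficients (nb = 2), input delay nk = 2

y = np.zeros(N)
for k in range(3, N):
    y[k] = (a_true[0] * y[k - 1] + a_true[1] * y[k - 2] + a_true[2] * y[k - 3]
            + b_true[0] * u[k - 2] + b_true[1] * u[k - 3] + 0.02 * rng.normal())

# ARX regressor: [y(k-1) y(k-2) y(k-3) u(k-2) u(k-3)] -> y(k), solved by least squares
Phi = np.array([[y[k - 1], y[k - 2], y[k - 3], u[k - 2], u[k - 3]] for k in range(3, N)])
theta, *_ = np.linalg.lstsq(Phi, y[3:], rcond=None)
print("estimated [a1 a2 a3 b1 b2]:", np.round(theta, 3))
```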
Procedia PDF Downloads 163
9254 Modeling Standpipe Pressure Using Multivariable Regression Analysis by Combining Drilling Parameters and a Herschel-Bulkley Model
Authors: Seydou Sinde
Abstract:
The aim of this paper is to formulate mathematical expressions that can be used to estimate the standpipe pressure (SPP). The developed formulas take into account the main factors that, directly or indirectly, affect the behavior of SPP values. Fluid rheology and well hydraulics are some of these essential factors. Mud plastic viscosity, yield point, flow power, consistency index, flow rate, and drillstring and annular geometries are represented by the frictional pressure (Pf), which is one of the input independent parameters and is calculated, in this paper, using the Herschel-Bulkley rheological model. Other input independent parameters include the rate of penetration (ROP), applied load or weight on bit (WOB), bit revolutions per minute (RPM), bit torque (TRQ), and hole inclination and direction coupled in the hole curvature or dogleg (DL). The technique of repeating parameters and the Buckingham Pi theorem are used to reduce the input independent parameters to the dimensionless revolutions per minute (RPMd), the dimensionless torque (TRQd), and the dogleg, which is already in the dimensionless form of radians. Multivariable linear and polynomial regression using PTC Mathcad Prime 4.0 is applied to determine the relationships between the dependent parameter, SPP, and the three dimensionless groups. Three models proved sufficiently satisfactory for estimating the standpipe pressure: multivariable linear regression model 1, containing three regression coefficients, for vertical wells; multivariable linear regression model 2, containing four regression coefficients, for deviated wells; and a multivariable polynomial quadratic regression model, containing six regression coefficients, for both vertical and deviated wells. Although linear regression model 2 (with four coefficients) is relatively more complex and contains an additional term compared with linear regression model 1 (with three coefficients), the former did not add significant improvement over the latter except for some minor values. Thus, the effect of the hole curvature or dogleg is insignificant and can be omitted from the input independent parameters without significant loss of accuracy. The polynomial quadratic regression model is considered the most accurate model due to its relatively higher accuracy in most cases. Data from nine wells in the Middle East were used to run the developed models, with satisfactory results provided by all of them, although the multivariable polynomial quadratic regression model gave the best and most accurate results. These models are useful not only to monitor and predict SPP values with accuracy but also to check the integrity of the well hydraulics early and to support corrective actions should any unexpected problems appear, such as pipe washouts, jet plugging, excessive mud losses, fluid gains, kicks, etc.
Keywords: standpipe, pressure, hydraulics, nondimensionalization, parameters, regression
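The multivariable quadratic fit described above can be reproduced in outline with scikit-learn: a second-order polynomial expansion of the three dimensionless groups (RPMd, TRQd, DL) followed by a linear least-squares fit against SPP. The data below are random placeholders standing in for the nine-well dataset.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
n = 200
# Placeholder dimensionless groups: RPMd, TRQd, dogleg [rad]
X = np.column_stack([rng.uniform(0.5, 2.0, n),
                     rng.uniform(0.1, 1.0, n),
                     rng.uniform(0.0, 0.1, n)])
# Synthetic SPP [psi] with an assumed quadratic dependence plus noise
spp = (900 + 400 * X[:, 0] + 250 * X[:, 1] - 300 * X[:, 2]
       + 80 * X[:, 0] * X[:, 1] + 60 * X[:, 0] ** 2
       + rng.normal(0, 25, n))

# Quadratic multivariable regression (squares and cross terms of the three groups)
model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False), LinearRegression())
model.fit(X, spp)
print("R^2 on the placeholder data:", round(model.score(X, spp), 3))
print("fitted coefficients:", np.round(model.named_steps["linearregression"].coef_, 1))
```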
Procedia PDF Downloads 84
9253 Analyzing the Connection between Productive Structure and Communicable Diseases: An Econometric Panel Study
Authors: Julio Silva, Lia Hasenclever, Gilson G. Silva Jr.
Abstract:
The aim of this paper is to check for possible convergence in health measures (age-standardized rates of morbidity and mortality) for communicable diseases between developed and developing countries, conditional on features of their productive structures. Understanding the interrelations between health patterns and economic development is particularly important in the context of low- and middle-income countries, where economic development comes along with deep social inequality. Developing countries with less diversified productive structures (measured through the economic complexity index) but highly heterogeneous inter-sectoral labor productivity (using the inter-sectoral coefficient of variation of labor productivity as a proxy) have, on average, lower health levels for communicable diseases compared to developed countries with highly diversified productive structures and low labor market heterogeneity. Structural heterogeneity and productive diversification may influence health levels even when per capita income is considered. We set up a panel dataset for 139 countries from 1995 to 2015, combining country data on economic development, health, health system coverage, and environmental and socioeconomic aspects. This information was obtained from the World Bank, the International Labour Organization, the Atlas of Economic Complexity, the United Nations (Development Report), and the Institute for Health Metrics and Evaluation database. Evidence from econometric panel models shows that the level of communicable diseases has a positive relationship with structural heterogeneity, even when other factors such as per capita income are considered. On the other hand, the recent process of convergence in terms of communicable diseases has been driven by other factors not directly related to productive structure, such as health system coverage and environmental aspects. This evidence suggests a joint dynamic between the unequal distribution of communicable diseases and aspects of countries' productive structures. This evidence is quite important for public policy aimed at meeting the health targets of the Millennium Development Goals. It also highlights the process of structural change as fundamental to shifting levels of health in terms of communicable diseases and contributes to the debate on the relation between economic development and changes in health patterns.
Keywords: economic development, inequality, population health, structural change
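A fixed-effects panel specification of the kind implied above can be sketched with statsmodels: an age-standardized communicable-disease rate regressed on structural heterogeneity and per capita income with country dummies as entity effects. The variable names and random data below are placeholders for the 139-country panel.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
countries = [f"c{i}" for i in range(30)]
years = list(range(1995, 2016))
df = pd.DataFrame([(c, t) for c in countries for t in years], columns=["country", "year"])
df["log_gdp_pc"] = rng.normal(9, 1, len(df))        # placeholder per capita income (log)
df["struct_het"] = rng.uniform(0, 1, len(df))       # placeholder structural heterogeneity index
df["disease_rate"] = (50 + 20 * df["struct_het"] - 3 * df["log_gdp_pc"]
                      + rng.normal(0, 5, len(df)))  # placeholder age-standardized rate

# Country fixed effects via dummies (C(country)); a linear year trend as an extra control
fe_model = smf.ols("disease_rate ~ struct_het + log_gdp_pc + C(country) + year", data=df).fit()
print(fe_model.params[["struct_het", "log_gdp_pc"]])
```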
Procedia PDF Downloads 144
9252 A Life History of a Female Counselor Participated in Sewol Ferry Disaster Counseling Korea: Based on Qualitative Analysis of Mandelbaum's Life History
Authors: Donghun Lee, Jiyoung Shin, Youjin Kim, Jin Joo Kim
Abstract:
The sinking of the Sewol ferry occurred in Korea on the morning of 16 April 2014 while the ship was carrying 476 people. In all, 304 passengers, mostly secondary school students from Danwon High School in Ansan City, died in the disaster. The sinking of the Sewol ferry resulted in widespread social and political turmoil within South Korea. Many criticized the actions of the captain and crew of the ferry as well as the ferry operator and the regulators who oversaw its operations. However, huge criticism was directed at the South Korean government for its national disaster response system. This disaster led the Korean government to build a new disaster management and psychological support system. The purpose of this study was to understand the developmental and change process of a female counselor in her late fifties who participated in Sewol ferry disaster counseling for a year. As a counselor, she provided counseling and psychological support for the victims' families of the Sewol ferry disaster; additionally, as the director of a community youth counseling center operated by the local government, she helped establish a governmental psychological support plan for recovering from the collective trauma in the community, and through this work she gained self-reflection on her whole life. For the in-depth interview data analysis, Mandelbaum's three conceptual frameworks were employed: dimensions, turnings, and adaptation. The results of the study indicate the extracted categories of life dimensions, turning points, and adaptation. The details of these categories are 'having a self-image in youth', 'marriage in fairy-tale', 'unexpected death of husband', 'taking a step forward from darkness', 'the way of counselor', 'nice grown child', and 'Sewol ferry disaster' in life dimensions; 'death in front of life' and 'milestone in life, counseling' in turning points; and 'before Sewol ferry disaster' and 'after Sewol ferry disaster' in adaptation. The life history method revealed the counselor's internal developmental process by analyzing how the Sewol ferry disaster influenced an individual life, especially a counselor's, what changes she went through, and how she adapted herself to them. Based on the results, discussions and suggestions are provided.
Keywords: development and change, disaster counseling, identity of female counselor, Mandelbaum's life history, Sewol ferry
Procedia PDF Downloads 336
9251 Analysis of Histogram Asymmetry for Waste Recognition
Authors: Janusz Bobulski, Kamila Pasternak
Abstract:
Despite many years of effort and research, the problem of waste management is still current. So far, no fully effective waste management system has been developed. Many programs and projects improve statistics on the percentage of waste recycled every year. In these efforts, it is worth using modern computer vision techniques supported by artificial intelligence. In this article, we present a method of identifying plastic waste based on asymmetry analysis of the histogram of the image containing the waste. The method is simple but effective (94%), which allows it to be implemented on devices with low computing power, in particular on microcomputers. Such devices can be used both at home and in waste sorting plants.
Keywords: waste management, environmental protection, image processing, computer vision
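The core of the histogram-asymmetry idea is compact enough to sketch directly: compute the grey-level histogram of the image, measure its skewness, and classify by a threshold. The grey-scale conversion and the threshold below are assumptions for illustration; the paper's own decision rule and its 94% accuracy come from its dataset.

```python
import numpy as np
from PIL import Image
from scipy.stats import skew

def histogram_skewness(path):
    """Skewness (asymmetry) of the grey-level histogram of an image."""
    grey = np.asarray(Image.open(path).convert("L"), dtype=np.uint8)
    hist, _ = np.histogram(grey, bins=256, range=(0, 255))
    levels = np.arange(256)
    # Expand the histogram back into a sample of grey levels and take its skewness
    return skew(np.repeat(levels, hist))

def classify(path, threshold=0.5):   # threshold is an assumed placeholder, not the paper's value
    s = histogram_skewness(path)
    return ("plastic waste" if s > threshold else "other"), s

# Usage (hypothetical file name):
# label, s = classify("camera_frame.jpg"); print(label, round(s, 2))
```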
Procedia PDF Downloads 119
9250 Identification of Nonlinear Systems Structured by Hammerstein-Wiener Model
Authors: A. Brouri, F. Giri, A. Mkhida, A. Elkarkri, M. L. Chhibat
Abstract:
Standard Hammerstein-Wiener models consist of a linear subsystem sandwiched between two memoryless nonlinearities. Here, the linear subsystem is allowed to be parametric or not, and continuous- or discrete-time. The input and output nonlinearities are polynomial and may be noninvertible. A two-stage identification method is developed such that the parameters of all nonlinear elements are estimated first using the Kozen-Landau polynomial decomposition algorithm. The obtained estimates are then used in the identification of the linear subsystem, making use of suitable pre- and post-compensators.
Keywords: nonlinear system identification, Hammerstein-Wiener systems, frequency identification, polynomial decomposition
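The Hammerstein-Wiener structure itself (static polynomial nonlinearity, linear dynamic block, second static polynomial nonlinearity) is straightforward to simulate, which is useful when testing any two-stage identification scheme. The polynomials and the discrete-time linear filter below are arbitrary illustrative choices, not the systems studied in the paper.

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)

def f_in(u):    # input memoryless nonlinearity (polynomial, assumed)
    return 1.0 * u + 0.5 * u ** 2 - 0.2 * u ** 3

def f_out(x):   # output memoryless nonlinearity (polynomial, assumed)
    return 2.0 * x + 0.3 * x ** 2

# Linear subsystem between the two nonlinearities: discrete transfer function B(q)/A(q) (assumed)
b, a = [0.0, 0.4, 0.3], [1.0, -0.8, 0.15]

u = rng.uniform(-1, 1, 1000)                    # excitation signal
x = lfilter(b, a, f_in(u))                      # Hammerstein part: static nonlinearity, then linear block
y = f_out(x) + 0.01 * rng.normal(size=u.size)   # Wiener part: output nonlinearity plus measurement noise

print("simulated output range:", round(y.min(), 2), "to", round(y.max(), 2))
```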
Procedia PDF Downloads 511
9249 Computer-Assisted Management of Building Climate and Microgrid with Model Predictive Control
Authors: Vinko Lešić, Mario Vašak, Anita Martinčević, Marko Gulin, Antonio Starčić, Hrvoje Novak
Abstract:
Accounting for 40% of total world energy consumption, building systems are developing into technically complex, large energy consumers suitable for the application of sophisticated power management approaches that largely increase energy efficiency and even make them active energy market participants. A centralized control system for building heating and cooling managed by economically optimal model predictive control shows promising results, with an estimated 30% increase in energy efficiency. The research focuses on the implementation of such a method in a case study performed on two floors of our faculty building, with corresponding wireless sensor data acquisition, remote heating/cooling units, and a central climate controller. Building walls are mathematically modeled with the corresponding material types, surface shapes, and sizes. The models are then used to predict thermal characteristics and changes in different building zones. Exterior influences such as environmental conditions and weather forecasts, occupant behavior, and comfort demands are all taken into account in deriving price-optimal climate control. Finally, a DC microgrid with photovoltaics, a wind turbine, a supercapacitor, batteries, and fuel cell stacks is added to make the building a unit capable of active participation in a price-varying energy market. The computational burden of applying model predictive control to such a complex system is relaxed through a hierarchical decomposition of the microgrid and climate control, where the former is designed as the higher hierarchical level with pre-calculated price-optimal power flow control, and the latter as the lower-level control responsible for ensuring thermal comfort and exploiting the optimal supply conditions enabled by microgrid energy flow management. Such an approach is expected to enable the inclusion of more complex building subsystems in order to further increase energy efficiency.
Keywords: price-optimal building climate control, microgrid power flow optimisation, hierarchical model predictive control, energy efficient buildings, energy market participation
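A stripped-down version of the price-optimal climate control problem can be posed as a convex program: a single-zone resistor-capacitor thermal model, a time-varying electricity price, comfort bounds, and heating power as the decision variable. The CVXPY sketch below uses assumed thermal parameters, prices, and comfort limits, not the faculty-building model.

```python
import numpy as np
import cvxpy as cp

T, dt = 24, 1.0                # 24 hourly steps
R, C = 5.0, 10.0               # thermal resistance [K/kW] and capacitance [kWh/K] (assumed)
T_out = 5 + 5 * np.sin(np.linspace(0, 2 * np.pi, T))   # outdoor temperature forecast (assumed)
price = 0.10 + 0.10 * (np.arange(T) % 24 >= 8)         # higher daytime tariff (assumed)

u = cp.Variable(T, nonneg=True)   # heating power [kW]
Tz = cp.Variable(T + 1)           # zone temperature [deg C]

constraints = [Tz[0] == 20]
for k in range(T):
    # Discretized single-zone RC model: T[k+1] = T[k] + dt*((T_out - T[k])/(R*C) + u[k]/C)
    constraints.append(Tz[k + 1] == Tz[k] + dt * ((T_out[k] - Tz[k]) / (R * C) + u[k] / C))
constraints += [Tz[1:] >= 20, Tz[1:] <= 24, u <= 5]    # comfort band and actuator limit

problem = cp.Problem(cp.Minimize(dt * (price @ u)), constraints)
problem.solve()
print("energy cost:", round(problem.value, 2),
      "| peak heating:", round(float(u.value.max()), 2), "kW")
```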
Procedia PDF Downloads 465
9248 Optimum Design of Helical Gear System on Basis of Maximum Power Transmission Capability
Authors: Yasaman Esfandiari
Abstract:
Mechanical engineering has always dealt with the amplification of input power in power trains. One of the ways to achieve this goal is to use gears to change the amplitude and direction of the torque and the speed. However, the gears should be optimally designed to best achieve these objectives. In this study, helical gear systems are optimized to achieve maximum power transmission. Material selection, space restrictions, available manufacturing facilities, the probability of tooth breakage, and tooth wear are taken into account, and the governing equations are derived. Finally, a Matlab code was written to solve the optimization problem, and the results are verified.
Keywords: design, gears, Matlab, optimization
Procedia PDF Downloads 240
9247 Reduction of Nitrogen Monoxide with Carbon Monoxide from Gas Streams by 10% wt. Cu-Ce-Fe-Co/Activated Carbon
Authors: K. L. Pan, M. B. Chang
Abstract:
Nitrogen oxides (NOₓ) are regarded as among the most important air pollutants. They not only cause adverse environmental effects but also harm the human lungs and respiratory system. As a post-combustion treatment, selective catalytic reduction (SCR) possesses the highest NO removal efficiency (≥ 85%) and is considered the most effective technique for removing NO from gas streams. However, injection of a reducing agent such as NH₃ is required, which is costly and may cause secondary pollution. Reduction of NO with carbon monoxide (CO) as the reducing agent has been previously investigated. In this process, the key step involves NO adsorption and dissociation. The performance relies mainly on the number of oxygen vacancies on the catalyst surface and the redox ability of the catalyst, because oxygen vacancies can activate the N-O bond to promote its dissociation, while good redox ability promotes the adsorption of NO and the oxidation of CO. Typically, noble metals such as iridium (Ir), platinum (Pt), and palladium (Pd) are used as catalysts for the reduction of NO with CO; however, their high cost has limited their application. Recently, transition metal oxides have been investigated for the reduction of NO with CO; in particular, CuₓOy, CoₓOy, Fe₂O₃, and MnOₓ are considered effective catalysts. However, deactivation is inevitable when oxygen (O₂) exists in the gas streams because the active sites (oxygen vacancies) of the catalyst are occupied by O₂. In this study, Cu-Ce-Fe-Co is prepared and supported on activated carbon by the impregnation method to form a 10% wt. Cu-Ce-Fe-Co/activated carbon catalyst. Generally, the addition of activated carbon to a catalyst can bring several advantages: (1) NO can be effectively adsorbed through the interaction between the catalyst and the activated carbon, resulting in improved NO removal; (2) direct NO decomposition may be achieved over carbon associated with the catalyst; and (3) reduction of NO can be enhanced by a reducing agent over a carbon-supported catalyst. Therefore, 10% wt. Cu-Ce-Fe-Co/activated carbon may have better performance for the reduction of NO with CO. Experimental results indicate that the NO conversion achieved with 10% wt. Cu-Ce-Fe-Co/activated carbon reaches 83% at 150°C with 300 ppm NO and 10,000 ppm CO. As the temperature is further increased to 200°C, 100% NO conversion can be achieved, implying that the prepared 10% wt. Cu-Ce-Fe-Co/activated carbon has good activity for the reduction of NO with CO. In order to investigate the effect of O₂ on the reduction of NO with CO, 1-5% O₂ was introduced into the system. The results indicate that NO conversion is still maintained at ≥ 90% under 1-5% O₂ at 200°C. It is worth noting that the resistance to O₂ in the reduction of NO with CO is significantly improved when carbon is used as the support; it is inferred that the carbon support can react with O₂ to produce CO₂ when O₂ exists in the gas streams. Overall, 10% wt. Cu-Ce-Fe-Co/activated carbon demonstrates good potential for the reduction of NO with CO, and possible mechanisms will be elucidated in this paper.
Keywords: nitrogen oxides (NOₓ), carbon monoxide (CO), reduction of NO with CO, carbon material, catalysis
Procedia PDF Downloads 256
9246 Educational Related Information Technology Department Transformation: A Case Study
Authors: P. Joongsiri, K. Pattanapisuth, P. Siwatintuko, S. Vasupongayya
Abstract:
This paper presents a case study of developing a four-year plan for the information technology department at the Faculty of Engineering, Prince of Songkla University, Thailand. This work can be used as a case study for other in-house information technology departments in a higher educational environment. The result of this paper is a guideline for the four-year plan creation process, which is generated by analyzing the related theories and several best practices.
Keywords: strategic plan, management information system, information technology department governance, best practices, organization transformation
Procedia PDF Downloads 458
9245 Development of a Fire Analysis Drone for Smoke Toxicity Measurement for Fire Prediction and Management
Authors: Gabrielle Peck, Ryan Hayes
Abstract:
This research presents the design and creation of a drone gas analyser aimed at addressing the need for independent data collection and analysis of gas emissions during large-scale fires, particularly wasteland fires. The analyser drone, comprising a lightweight gas analysis system attached to a remote-controlled drone, enables real-time assessment of smoke toxicity and the monitoring of gases released into the atmosphere during such incidents. The key components of the analyser unit include two gas line inlets connected to glass wool filters, a pump with flow regulated by a mass flow controller, and electrochemical cells for detecting nitrogen oxides, hydrogen cyanide, and oxygen levels. Additionally, a non-dispersive infrared (NDIR) analyser is employed to monitor carbon monoxide (CO), carbon dioxide (CO₂), and hydrocarbon concentrations. Thermocouples can be attached to the analyser to monitor temperature, as can McCaffrey probes combined with pressure transducers to monitor air velocity and wind direction. These additions allow monitoring of the large fire and can be used for predictions of fire spread. The innovative system not only provides crucial data for assessing smoke toxicity but also contributes to fire prediction and management. The remote-controlled drone's mobility allows for safe and efficient data collection in proximity to the fire source, reducing the need for human exposure to hazardous conditions. The data obtained from the gas analyser unit facilitate informed decision-making by emergency responders, aiding in the protection of both human health and the environment. This abstract highlights the successful development of a drone gas analyser, illustrating its potential for enhancing smoke toxicity analysis and fire prediction capabilities. The integration of this technology into fire management strategies offers a promising solution for addressing the challenges associated with wildfires and other large-scale fire incidents. The project's methodology and results contribute to the growing body of knowledge in the field of environmental monitoring and safety, emphasizing the practical utility of drones for critical applications.
Keywords: fire prediction, drone, smoke toxicity, analyser, fire management
Procedia PDF Downloads 89
9244 Characteristics Influencing Response of a Base Isolated Building
Authors: Ounis Hadj Mohamed, Ounis Abdelhafid
Abstract:
In order to illustrate the effect of damping on the response of a base-isolated building, a parametric study is carried out, taking into account the progressive variation of the damping ratio (10% to 30%) under different types of seismic excitation (near and far field). A time history analysis is used to determine the response of the structure in terms of relative displacement and interstory drift at various levels of the building. The results show that the efficiency of the isolator increases with the assumed damping ratio, provided that this ratio is less than or equal to 20%. Beyond this value, the isolator becomes less effective. Furthermore, a strong dissipation of energy by the LRB (Lead Rubber Bearing) system is recorded.
Keywords: damping, base isolation, LRB, seismic excitation, hysteresis
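The damping-ratio parametric study can be mimicked on a single-degree-of-freedom idealization of an isolated structure: integrate the equation of motion under a ground acceleration record for several damping ratios and compare peak relative displacements. The synthetic ground motion and structural properties below are assumptions, not the building and records analyzed here.

```python
import numpy as np
from scipy.integrate import solve_ivp

# SDOF idealization of an isolated structure: x'' + 2*zeta*wn*x' + wn^2*x = -a_g(t)
T_iso = 2.5                               # isolated period [s] (assumed)
wn = 2 * np.pi / T_iso
a_g = lambda t: 3.0 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.1 * t)  # synthetic ground accel. [m/s^2]

def peak_displacement(zeta):
    def rhs(t, y):
        x, v = y
        return [v, -2 * zeta * wn * v - wn ** 2 * x - a_g(t)]
    sol = solve_ivp(rhs, (0, 40), [0.0, 0.0], max_step=0.01)
    return np.max(np.abs(sol.y[0]))       # peak relative displacement [m]

for zeta in (0.10, 0.20, 0.30):
    print(f"zeta = {zeta:.2f}: peak relative displacement = {peak_displacement(zeta) * 100:.1f} cm")
```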
Procedia PDF Downloads 416
9243 Evaluation of Reservoir Quality in Cretaceous Sandstone Complex, Western Flank of Anambra Basin, Southern Nigeria
Authors: Bayole Omoniyi
Abstract:
This study demonstrates the value of outcrops as analogues for evaluating the reservoir quality of sandbodies in a typical high-sinuosity fluvial system. The study utilized data acquired from selected outcrops in the Campanian-Maastrichtian siliciclastic succession of the western flank of the Anambra Basin, southern Nigeria. Textural properties derived from outcrop samples were correlated and compared with porosity and permeability using established standard charts. Porosity was estimated from thin sections of selected samples to reduce uncertainty in the estimates. Following facies classification, 14 distinct facies were grouped into three facies associations (FA1-FA3) and were subsequently modeled as discrete properties in a block-centered Cartesian grid on a scale that captures the geometry of the principal sandbodies. Porosity and permeability estimated from the charts were populated in the grid using comparable geostatistical techniques that reflect their spatial distribution. The resultant models were conditioned to the facies property to honour the available data. The results indicate a strong control of geometrical parameters on facies distribution, lateral continuity, and connectivity, with a resultant effect on porosity and permeability distribution. The sand-prone FA1 and FA2 display reservoir quality that varies internally from channel axis to margin in each succession. Furthermore, the isolated stacking pattern of sandbodies reduces static connectivity and thus increases the risk of poor communication between reservoir-quality sandbodies. FA3 is non-reservoir because it is mud-prone. In conclusion, the risk of poor communication between sandbodies may be accentuated in reservoirs that have similar architecture, because thick lateral accretion deposits, usually mudstone, tend to disconnect good-quality point-bar sandbodies. In such reservoirs, mudstone may act as a barrier that impedes flow vertically from one sandbody to another and laterally at the margins of each channel-fill succession in the system. The development plan, therefore, must be designed to effectively mitigate these risks and the risk of stratigraphic compartmentalization for maximum hydrocarbon recovery.
Keywords: analogues, architecture, connectivity, fluvial
Procedia PDF Downloads 24
9242 Connecting MRI Physics to Glioma Microenvironment: Comparing Simulated T2-Weighted MRI Models of Fixed and Expanding Extracellular Space
Authors: Pamela R. Jackson, Andrea Hawkins-Daarud, Cassandra R. Rickertsen, Kamala Clark-Swanson, Scott A. Whitmire, Kristin R. Swanson
Abstract:
Glioblastoma Multiforme (GBM), the most common primary brain tumor, often presents with hyperintensity on T2-weighted or T2-weighted fluid attenuated inversion recovery (T2/FLAIR) magnetic resonance imaging (MRI). This hyperintensity corresponds with vasogenic edema; however, there are likely many infiltrating tumor cells within the hyperintensity as well. While MRIs do not directly indicate tumor cells, they do reflect the microenvironmental water abnormalities caused by the presence of tumor cells and edema. The inherent heterogeneity of GBMs and the resulting MRI features complicate assessing disease response. To understand how hyperintensity on T2/FLAIR MRI may correlate with edema in the extracellular space (ECS), we explored a multi-compartmental MRI signal equation that takes into account tissue compartments and their associated volumes, with input coming from a mathematical model of glioma growth that incorporates edema formation. The reasonableness of two possible extracellular space schemes was evaluated by varying the T2 of the edema compartment and calculating the possible resulting T2s in tumor and peripheral edema. In the mathematical model, gliomas were comprised of vasculature and three tumor cellular phenotypes: normoxic, hypoxic, and necrotic. Edema was characterized as fluid leaking from abnormal tumor vessels. Spatial maps of tumor cell density and edema for virtual tumors were simulated with different rates of proliferation and invasion and various ECS expansion schemes. These spatial maps were then passed into a multi-compartmental MRI signal model for generating simulated T2/FLAIR MR images. Individual compartments' T2 values in the signal equation were either taken from the literature or estimated, and the T2 for edema specifically was varied over a wide range (200 ms to 9200 ms). T2 maps were calculated from the simulated images. T2 values based on the simulated images were evaluated for regions of interest (ROIs) in normal-appearing white matter, tumor, and peripheral edema. The ROI T2 values were compared to T2 values reported in the literature. The expanding extracellular space scheme had T2 values similar to the literature-calculated values. The static extracellular space scheme had much lower T2 values, and no matter what T2 was associated with edema, the intensities did not come close to the literature values. Expanding the extracellular space is necessary to achieve simulated edema intensities commensurate with acquired MRIs.
Keywords: extracellular space, glioblastoma multiforme, magnetic resonance imaging, mathematical modeling
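The multi-compartmental signal equation referred to above can be written as a volume-weighted sum of exponentially decaying compartment signals; the sketch below evaluates such a mixture for one voxel and the apparent mono-exponential T2 it would produce on a T2 map. The compartment volume fractions and T2 values are illustrative assumptions, not the parameters of the published model.

```python
import numpy as np
from scipy.optimize import curve_fit

TE = np.linspace(10, 300, 30)   # echo times [ms]

# Hypothetical voxel: volume fractions and T2 values [ms] for three compartments (assumed)
fractions = {"tumor": 0.35, "normal": 0.25, "edema": 0.40}
T2 = {"tumor": 90.0, "normal": 80.0, "edema": 400.0}
PD = {"tumor": 0.8, "normal": 0.7, "edema": 1.0}   # proton-density-like weights (assumed)

# Multi-compartment T2-weighted signal: S(TE) = sum_c v_c * PD_c * exp(-TE / T2_c)
signal = sum(fractions[c] * PD[c] * np.exp(-TE / T2[c]) for c in fractions)

# Apparent (mono-exponential) T2 of the mixed voxel, as it would appear on a T2 map
mono = lambda te, s0, t2: s0 * np.exp(-te / t2)
(s0_fit, t2_fit), _ = curve_fit(mono, TE, signal, p0=(1.0, 100.0))
print(f"apparent voxel T2 = {t2_fit:.0f} ms")
```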
Procedia PDF Downloads 235
9241 Bioreactor for Cell-Based Impedance Measuring with Diamond Coated Gold Interdigitated Electrodes
Authors: Roman Matejka, Vaclav Prochazka, Tibor Izak, Jana Stepanovska, Martina Travnickova, Alexander Kromka
Abstract:
Cell-based impedance spectroscopy is a suitable method for the electrical monitoring of cell activity, especially on substrates that cannot easily be inspected by optical microscopy (without fluorescent markers), such as decellularized tissues, nano-fibrous scaffolds, etc. A special sensor for this measurement was developed. This sensor consists of a Corning glass substrate with gold interdigitated electrodes covered with a diamond layer. The diamond layer provides a biocompatible, non-conductive surface for cells. A special PPFC flow cultivation chamber was also developed. This chamber is able to fix the sensor in place, and spring contacts connect the sensor pads with the external measuring device. The construction allows real-time live-cell imaging. Combining it with a perfusion system allows medium circulation and the generation of shear stress stimulation. The experimental evaluation consisted of several setups, including the pure sensor without any coating, as well as collagen and fibrin coatings. Adipose-derived stem cells (ASC) and human umbilical vein endothelial cells (HUVEC) were seeded onto the sensor in the cultivation chamber. The chamber was then installed into the microscope system for live-cell imaging. The impedance measurement was performed with a vector impedance analyzer. The measured range was from 10 Hz to 40 kHz. These impedance measurements were correlated with live-cell microscopic imaging and immunofluorescent staining. Data analysis of the measured signals showed responses to cell adhesion on the substrates, their proliferation, and changes after shear stress stimulation, which are important parameters during cultivation. Further experiments plan to use decellularized tissue as a scaffold fixed on the sensor. This kind of impedance sensor can provide feedback about cell culture conditions on opaque surfaces and scaffolds that can be used in tissue engineering in the development of artificial prostheses. This work was supported by the Ministry of Health, grants No. 15-29153A and 15-33018A.
Keywords: bio-impedance measuring, bioreactor, cell cultivation, diamond layer, gold interdigitated electrodes, tissue engineering
Procedia PDF Downloads 301
9240 Deep Neck Infection Associated with Peritoneal Sepsis: A Rare Death Case
Authors: Sait Ozsoy, Asude Gokmen, Mehtap Yondem, Hanife A. Alkan, Gulnaz T. Javan
Abstract:
Deep neck infection often develops due to upper respiratory tract and odontogenic infections. Gastrointestinal system perforation can occur for many reasons and requires early diagnosis and prompt surgical treatment. In both cases, late or incorrect diagnosis may lead to increased morbidity and high mortality. A patient with a diagnosis of deep neck abscess died while under treatment due to sepsis and multiple organ failure. Autopsy findings showed a duodenal ulcer, and this case is reported in the literature.
Keywords: peptic ulcer perforation, peritonitis, retropharyngeal abscess, sepsis
Procedia PDF Downloads 498
9239 Cyclostationary Analysis of Polytime Coded Signals for LPI Radars
Authors: Metuku Shyamsunder, Kakarla Subbarao, P. Prasanna
Abstract:
In radars, an electromagnetic waveform is transmitted, and an echo of the same signal is received by the receiver. From this received signal, by extracting various parameters such as round-trip delay and Doppler frequency, it is possible to find distance, speed, altitude, etc. However, as technology advances, intruders can intercept the transmitted signal as it reaches them, extract its characteristics, and try to modify them. So there is a need to develop a system whose signal cannot be identified by non-cooperative intercept receivers; that is why LPI radars came into existence. In this paper, a brief discussion of LPI radar, its modulation (polytime code PT1), and its detection by cyclostationary techniques (DFSM and FAM) is presented, and the techniques are compared with respect to computational complexity.
Keywords: LPI radar, polytime codes, cyclostationary DFSM, FAM
Procedia PDF Downloads 476
9238 Sensing Endocrine Disrupting Chemicals by Virus-Based Structural Colour Nanostructure
Authors: Lee Yujin, Han Jiye, Oh Jin-Woo
Abstract:
The adverse effects of endocrine disrupting chemicals (EDCs) have attracted considerable public interest. The benzene-like structure of EDCs mimics hormones naturally occurring in vivo and alters the physiological function of the endocrine system. Although some of the most representative EDCs, such as polychlorinated biphenyls (PCBs) and phthalate compounds, are already prohibited from production and use in many countries, PCBs and phthalates are still circulated in plastic products as flame retardants and plasticizers. EDCs can be released from products during use and disposal, causing serious environmental and health issues. Here, we developed a virus-based structurally coloured nanostructure that can detect minute EDC concentrations sensitively and selectively. This structurally coloured nanostructure exhibits characteristic angle-independent colors due to the regular virus bundle structure formed through a simple pulling technique. A designed number of different colour bands can be formed by controlling the concentration of the virus solution and the pulling speed. The virus, M-13 bacteriophage, was genetically engineered to react with specific EDCs, typically PCBs and phthalates. The M-13 bacteriophage surface (pVIII major coat protein) was decorated with benzene-derivative-binding peptides (WHW) through the phage library method. In the initial assessment, the virus-based colour sensor was exposed to several organic chemicals, including benzene, toluene, phenol, chlorobenzene, and phthalic anhydride. Along with the selectivity evaluation, the virus-based colour sensor was also tested for sensitivity. Concentrations of 10 to 300 ppm of phthalic anhydride and chlorobenzene were detected by the colour sensor, which showed significant sensitivity, with a dissociation constant of about 90. Notably, all measurements were analyzed through principal component analysis (PCA) and linear discriminant analysis (LDA) and exhibited clear discrimination upon exposure to the two categories of EDCs (PCBs and phthalates). Because of its easy fabrication, high sensitivity, and superior selectivity, the M-13 bacteriophage-based colour sensor could be a simple and reliable portable sensing system for environmental monitoring, healthcare, social security, and so on.
Keywords: M-13 bacteriophage, colour sensor, genetic engineering, EDCs
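The PCA/LDA analysis mentioned above can be reproduced in outline with scikit-learn: reduce the colour-sensor response vectors with PCA, then train a linear discriminant to separate the two EDC categories. The synthetic response matrix below stands in for the measured colour-change data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Placeholder sensor responses: 40 exposures x 6 colour-band features, two EDC classes
pcb = rng.normal(loc=[0.8, 0.2, 0.5, 0.1, 0.9, 0.3], scale=0.1, size=(20, 6))
phthalate = rng.normal(loc=[0.3, 0.7, 0.4, 0.6, 0.2, 0.8], scale=0.1, size=(20, 6))
X = np.vstack([pcb, phthalate])
y = np.array([0] * 20 + [1] * 20)        # 0 = PCBs, 1 = phthalates

scores = PCA(n_components=2).fit_transform(X)        # 2-D principal-component scores
lda = LinearDiscriminantAnalysis().fit(scores, y)    # linear discrimination of the two categories
print("training discrimination accuracy:", lda.score(scores, y))
```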
Procedia PDF Downloads 242
9237 Verification Protocols for the Lightning Protection of a Large Scale Scientific Instrument in Harsh Environments: A Case Study
Authors: Clara Oliver, Oibar Martinez, Jose Miguel Miranda
Abstract:
This paper is devoted to the study of the most suitable protocols for verifying the lightning protection and ground resistance quality of a large-scale scientific facility located in a harsh environment. We illustrate this work by reviewing a case study: the largest telescopes of the Northern Hemisphere Cherenkov Telescope Array, CTA-N. This array hosts sensitive, high-speed optoelectronic instrumentation and sits on clear, obstacle-free terrain at around 2400 m above sea level. The site offers a top-quality sky but also features challenging conditions for a lightning protection system: the terrain is volcanic and has resistivities well above 1 kOhm·m. In addition, the environment often exhibits humidities well below 5%. On the other hand, the high complexity of a Cherenkov telescope structure does not allow a straightforward application of lightning protection standards. CTA-N has been conceived as an array of fourteen Cherenkov telescopes of two different sizes, which will be constructed on La Palma Island, Spain. Cherenkov telescopes can provide valuable information on different astrophysical sources from the gamma rays reaching the Earth's atmosphere. The largest telescopes of CTA are called LSTs, and the construction of the first one was finished in October 2018. The LST has a shape that resembles a large parabolic antenna, with a 23-meter reflective surface supported by a tubular structure made of carbon fibers and steel tubes. The reflective surface has 400 square meters and is made of an array of segmented mirrors that can be controlled individually by a subsystem of actuators. This surface collects and focuses the Cherenkov photons into the camera, where 1855 photo-sensors convert the light into electrical signals that can be processed by dedicated electronics. We describe here how the risk assessment of direct strike impacts was made and how the down conductors and ground system were both tested. The verification protocols that should be applied for the commissioning and operation phases are then explained. We focus our attention on the ground resistance quality assessment.
Keywords: grounding, large scale scientific instrument, lightning risk assessment, lightning standards and safety
Procedia PDF Downloads 1239236 GenAI Agents in Product Management: A Case Study from the Manufacturing Sector
Authors: Aron Witkowski, Andrzej Wodecki
Abstract:
Purpose: This study aims to explore the feasibility and effectiveness of utilizing Generative Artificial Intelligence (GenAI) agents as product managers within the manufacturing sector. It seeks to evaluate whether current GenAI capabilities can fulfill the complex requirements of product management and deliver comparable outcomes to human counterparts. Study Design/Methodology/Approach: This research involved the creation of a support application for product managers, utilizing high-quality sources on product management and generative AI technologies. The application was designed to assist in various aspects of product management tasks. To evaluate its effectiveness, a study was conducted involving 10 experienced product managers from the manufacturing sector. These professionals were tasked with using the application and providing feedback on the tool's responses to common questions and challenges they encounter in their daily work. The study employed a mixed-methods approach, combining quantitative assessments of the tool's performance with qualitative interviews to gather detailed insights into the user experience and perceived value of the application. Findings: The findings reveal that GenAI-based product management agents exhibit significant potential in handling routine tasks, data analysis, and predictive modeling. However, there are notable limitations in areas requiring nuanced decision-making, creativity, and complex stakeholder interactions. The case study demonstrates that while GenAI can augment human capabilities, it is not yet fully equipped to independently manage the holistic responsibilities of a product manager in the manufacturing sector. Originality/Value: This research provides an analysis of GenAI's role in product management within the manufacturing industry, contributing to the limited body of literature on the application of GenAI agents in this domain. It offers practical insights into the current capabilities and limitations of GenAI, helping organizations make informed decisions about integrating AI into their product management strategies. Implications for Academic and Practical Fields: For academia, the study suggests new avenues for research in AI-human collaboration and the development of advanced AI systems capable of higher-level managerial functions. Practically, it provides industry professionals with a nuanced understanding of how GenAI can be leveraged to enhance product management, guiding investments in AI technologies and training programs to bridge identified gaps.Keywords: generative artificial intelligence, GenAI, NPD, new product development, product management, manufacturing
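To make the kind of support application described above more concrete, the sketch below shows one plausible retrieve-then-generate structure for such a tool; the knowledge base, the retrieve() helper, and the call_llm() placeholder are all hypothetical and do not describe the application actually built in the study.

```python
# Hypothetical sketch of a GenAI support tool for product managers:
# retrieve relevant passages from a curated knowledge base, then pass them
# to a generative model. All sources, names, and the call_llm() placeholder
# are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Passage:
    source: str
    text: str

KNOWLEDGE_BASE = [
    Passage("pm_handbook", "Prioritise the backlog by expected customer value and effort."),
    Passage("npd_notes", "Gate reviews in manufacturing NPD check design, cost and tooling readiness."),
]

def retrieve(question: str, top_k: int = 2) -> list[Passage]:
    """Naive keyword-overlap retrieval over the curated sources."""
    terms = set(question.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda p: len(terms & set(p.text.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def call_llm(prompt: str) -> str:
    """Placeholder for a call to an external generative model API."""
    return f"[model answer grounded in a prompt of {len(prompt)} characters]"

def answer(question: str) -> str:
    # Ground the model's answer in the retrieved product management material.
    context = "\n".join(p.text for p in retrieve(question))
    prompt = f"Context:\n{context}\n\nProduct management question: {question}"
    return call_llm(prompt)

print(answer("How should I prioritise the backlog for a new manufacturing line?"))
```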
Procedia PDF Downloads 499235 A Four-Step Ortho-Rectification Procedure for Geo-Referencing Video Streams from a Low-Cost UAV
Authors: B. O. Olawale, C. R. Chatwin, R. C. D. Young, P. M. Birch, F. O. Faithpraise, A. O. Olukiran
Abstract:
Ortho-rectification is the process of geometrically correcting an aerial image so that its scale is uniform. The ortho-image formed by the process is corrected for lens distortion, topographic relief, and camera tilt, and can therefore be used to measure true distances, because it is an accurate representation of the Earth’s surface. Ortho-rectification and geo-referencing are essential for pinpointing the exact location of targets in video imagery acquired from the UAV platform. This can only be achieved by comparing such video imagery with an existing digital map, and such a comparison is possible only when the image is ortho-rectified in the same coordinate system as the existing map. The video image sequences from the UAV platform must be geo-registered, that is, each video frame must carry the necessary camera information before the ortho-rectification process is performed. Each rectified image frame can then be mosaicked together to form a seamless image map covering the selected area, which can be compared with an existing map for geo-referencing. In this paper, we present a four-step ortho-rectification procedure for real-time geo-referencing of video data from a low-cost UAV equipped with a multi-sensor system. The basic steps of the real-time ortho-rectification are: (1) decompilation of the video stream into individual frames; (2) finding the interior camera orientation parameters; (3) finding the relative exterior orientation parameters of the video frames with respect to each other; (4) finding the absolute exterior orientation parameters using a self-calibration adjustment with the aid of a mathematical model. Each ortho-rectified video frame is then mosaicked together to produce a 2-D planimetric map, which can be compared with a well-referenced existing digital map for the purpose of geo-referencing and aerial surveillance. A test field located in Abuja, Nigeria was used to evaluate our method. Fifteen minutes of video and telemetry data were collected with the UAV and processed using the four-step ortho-rectification procedure. The results demonstrated that geometric measurements of the control field taken from the ortho-images are more reliable than those taken from the original perspective photographs when used to pinpoint the exact location of targets in the video imagery acquired by the UAV. The 2-D planimetric accuracy, when compared with the 6 control points measured by a GPS receiver, is between 3 and 5 meters.Keywords: geo-referencing, ortho-rectification, video frame, self-calibration
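As a minimal sketch of the first two steps of the procedure (decompiling the video stream into frames and applying the interior orientation), the Python/OpenCV code below undistorts each frame with an assumed camera calibration; the calibration values and file names are illustrative, and the exterior orientation and self-calibration adjustment of steps 3 and 4 are not reproduced here.

```python
# Sketch of steps 1-2 only: split a UAV video into frames and remove lens
# distortion using assumed interior orientation parameters (camera matrix
# and distortion coefficients). Calibration values and file names are
# illustrative assumptions.
import cv2
import numpy as np

# Assumed interior orientation from a prior camera calibration.
camera_matrix = np.array([[1200.0,    0.0, 640.0],
                          [   0.0, 1200.0, 360.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

capture = cv2.VideoCapture("uav_stream.mp4")  # hypothetical input file
frame_index = 0
while True:
    ok, frame = capture.read()
    if not ok:
        break
    # Each decompiled frame is corrected with the interior orientation.
    undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
    cv2.imwrite(f"frame_{frame_index:05d}.png", undistorted)
    frame_index += 1
capture.release()
```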
Procedia PDF Downloads 478