Search results for: location based data
43407 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models
Authors: Rossella Arcucci, Luisa D'Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti
Abstract:
This work is the first step in a rather wide research activity, in collaboration with the Euro-Mediterranean Center on Climate Change, aimed at introducing scalable approaches into Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model, previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
Keywords: data assimilation, GPU architectures, ocean models, parallel algorithm
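The 3DVar analysis the abstract builds on can be illustrated with a minimal scalar sketch: the analysis minimizes a cost function balancing the background state against the observation, weighted by their error variances. This is a toy single-variable case under assumed values, not the authors' DD-DA implementation.

```python
# Minimal sketch of a scalar 3DVar analysis (illustrative, not the DD-DA code).
# The analysis x_a minimizes
#   J(x) = (x - x_b)^2 / (2*sigma_b^2) + (y - x)^2 / (2*sigma_o^2)
# whose closed form is a precision-weighted average of background and observation.

def three_dvar_scalar(x_b, sigma_b, y, sigma_o):
    """Return the 3DVar analysis for one state variable and one observation."""
    w_b = 1.0 / sigma_b ** 2    # background precision
    w_o = 1.0 / sigma_o ** 2    # observation precision
    return (w_b * x_b + w_o * y) / (w_b + w_o)

# Hypothetical values: background SST 15 degC (sigma 2), observation 17 degC (sigma 1).
x_a = three_dvar_scalar(x_b=15.0, sigma_b=2.0, y=17.0, sigma_o=1.0)
# The analysis is pulled toward the more precise observation: 16.6
```

In the full DD-DA model, the same minimization is carried out per subdomain over vector states, which is what makes the domain-decomposed problem map naturally onto independent GPU kernels.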
Procedia PDF Downloads 412
43406 An Assessment into Impact of Regional Conflicts upon Socio-Political Sustainability in Pakistan
Authors: Syed Toqueer Akhter, Muhammad Muzaffar Abbas
Abstract:
Conflicts in Pakistan are the result of a configuration of factors directly related to the system of the state, the unstable regional setting, and the geo-strategic location of Pakistan at large. This paper examines the impact of regional conflict on the socio-political sustainability of Pakistan. The magnitude of the spillover from a conflicted region is similar in size to the equivalent increase in domestic conflict. Pakistan has gone to war three times with India, and the border with India is named among the tensest borderlines of the world. Disagreements with India and the lack of dispute settlement mechanisms have negatively affected peace in the region, while the influx of illegal weapons and refugees from Afghanistan in the aftermath of the 9/11 incident has exacerbated the level of internal conflict in Pakistan. Our empirical findings are based on data collected on regional conflict levels, regional trade, global trade, comparative defence capabilities of the region in contrast to Pakistan, and the government regime (autocratic, democratic) over 1972-2007. It is proposed in this paper that the intensity of domestic conflict is associated with conflict in the region, regional trade, global trade, and the government regime of Pakistan. The estimated model (OLS) implies that domestic conflict is affected positively and significantly by the long-term impact of conflict in the region. Also, if the defence capabilities of the region are better than those of Pakistan, domestic conflict is affected positively and significantly. Conflicts in neighbouring countries are found to be a source of domestic conflict in Pakistan, whereas regional trade as well as the type of government regime in Pakistan lowered the intensity of domestic conflict significantly, while globalized trade implies a reduced risk of domestic conflict, though not significantly.
Keywords: conflict, regional trade, socio-political instability
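The OLS estimation the abstract reports can be sketched for the one-regressor case via the normal equations. The toy series below are invented for illustration, not the authors' 1972-2007 dataset.

```python
# Hedged sketch of ordinary least squares (OLS) via the normal equations:
#   beta = cov(x, y) / var(x),  alpha = mean(y) - beta * mean(x).
# Data are illustrative, not the study's conflict/trade series.

def ols_fit(x, y):
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))
    alpha = my - beta * mx
    return alpha, beta

# Hypothetical toy series: domestic conflict index vs. regional conflict index.
regional = [1.0, 2.0, 3.0, 4.0, 5.0]
domestic = [3.1, 5.0, 7.1, 8.9, 11.0]   # roughly 1 + 2*x with small noise
alpha, beta = ols_fit(regional, domestic)
# A positive, sizeable beta mirrors the abstract's finding that regional
# conflict raises domestic conflict.
```

The full model in the paper adds further regressors (regional trade, global trade, defence capability, regime type), but the estimation principle is the same.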
Procedia PDF Downloads 321
43405 Tourism Area Development Optimization Based on Solar-Generated Renewable Energy Technology at Karimunjawa, Central Java Province, Indonesia
Authors: Yanuar Tri Wahyu Saputra, Ramadhani Pamapta Putra
Abstract:
Karimunjawa is among the Indonesian islands that lack a sufficient electricity supply. Despite this, Karimunjawa is an important tourism destination in Indonesia's Central Java Province. A solar power plant is a potential technology to be applied in Karimunjawa in order to meet the island's electricity needs and to improve daily life and tourism quality for tourists and the local population. This optimization modeling of Karimunjawa uses the HOMER software program. The data used include wind speed data for Karimunjawa from BMKG (Indonesian Agency for Meteorology, Climatology and Geophysics), annual weather data for Karimunjawa from NASA, and assumed electricity requirements based on the number of houses and business infrastructures in Karimunjawa. The modeling compares three system categories to determine which offers the highest financial benefit, i.e., the lowest total Net Present Cost (NPC). The first category uses only PV with 8000 kW of electrical power and an NPC of $6,830,701. The second category uses a hybrid system combining 1000 kW of PV and a 100 kW generator, which results in a total NPC of $6,865,590. The last category uses only a generator with 750 kW of electrical power, resulting in a total NPC of $16,368,197, the highest among the three categories. Based on this analysis, we conclude that the most optimal way to fulfill the electricity needs in Karimunjawa is to use 8000 kW of PV, which also has a lower maintenance cost.
Keywords: Karimunjawa, renewable energy, solar power plant, HOMER
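The NPC comparison driving the conclusion can be sketched as capital cost plus discounted recurring costs over the project lifetime. All costs and the discount rate below are illustrative assumptions, not the HOMER inputs used by the authors.

```python
# Sketch of a total Net Present Cost (NPC) comparison (simplified: no salvage
# value or replacement costs). Figures are invented, not the study's inputs.

def net_present_cost(capital, annual_cost, rate, years):
    """Capital cost plus discounted recurring costs over the project lifetime."""
    discounted = sum(annual_cost / (1.0 + rate) ** t for t in range(1, years + 1))
    return capital + discounted

# Hypothetical configurations (USD): PV-only, PV+generator hybrid, generator-only.
pv_only  = net_present_cost(capital=5_000_000, annual_cost=150_000,  rate=0.06, years=25)
hybrid   = net_present_cost(capital=4_000_000, annual_cost=250_000,  rate=0.06, years=25)
gen_only = net_present_cost(capital=1_000_000, annual_cost=1_200_000, rate=0.06, years=25)
# The fuel-heavy generator-only option accrues the largest NPC, mirroring the
# abstract's finding that the 750 kW generator scenario was the most expensive.
```

Low upfront cost does not help the generator option: high recurring fuel and maintenance costs dominate once discounted over 25 years.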
Procedia PDF Downloads 467
43404 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Human action recognition modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors to identify complex action patterns efficiently. Still, there are considerable gaps and challenges between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing and detection techniques for the human figure and its outer shape. Next, we extract valuable information in the form of cues, using two distinct features: fuzzy local binary patterns and sequence representation. Then, we apply a greedy randomized adaptive search procedure for data optimization and dimension reduction, and for classification, we use a random forest. We tested our model on two benchmark datasets, AAMAZ and the KTH multi-view football dataset. Our HMR framework significantly outperforms the other state-of-the-art approaches, achieving recognition rates of 91% and 89.6% on the AAMAZ and KTH multi-view football datasets, respectively.
Keywords: computer vision, human motion analysis, random forest, machine learning
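The local binary pattern (LBP) feature named in the abstract can be sketched in a few lines for the plain (crisp) case. The paper uses a fuzzy variant, which softens the hard threshold below into membership degrees; this sketch only illustrates the basic encoding.

```python
# Sketch of the plain 8-neighbour local binary pattern (LBP): each neighbour
# at least as bright as the centre pixel sets one bit of an 8-bit code.

def lbp_code(patch):
    """8-bit LBP code for the centre pixel of a 3x3 grayscale patch."""
    c = patch[1][1]
    # Clockwise neighbour order starting at the top-left pixel.
    neighbours = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                  patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for bit, p in enumerate(neighbours):
        if p >= c:                  # neighbour at least as bright as centre
            code |= 1 << bit
    return code

flat = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]   # uniform region -> all bits set
edge = [[9, 9, 9], [1, 5, 9], [1, 1, 1]]   # bright/dark boundary
```

Histograms of these codes over image cells form the texture descriptor that, together with the sequence representation, feeds the random forest classifier.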
Procedia PDF Downloads 38
43403 Compartmental Model Approach for Dosimetric Calculations of ¹⁷⁷Lu-DOTATOC in Adenocarcinoma Breast Cancer Based on Animal Data
Authors: M. S. Mousavi-Daramoroudi, H. Yousefnia, S. Zolghadri, F. Abbasi-Davani
Abstract:
Dosimetry is an indispensable and valuable factor in patient treatment planning, as it minimizes the absorbed dose in vital tissues. In this study, in accordance with the favorable characteristics of DOTATOC and ¹⁷⁷Lu, ¹⁷⁷Lu-DOTATOC was prepared under optimal conditions for the first time in Iran, and the radionuclidic and radiochemical purity of the solution was investigated using an HPGe spectrometer and the ITLC method, respectively. The biodistribution of the compound was assayed for the treatment of adenocarcinoma breast cancer in tumor-bearing BALB/c mice. The results demonstrated that ¹⁷⁷Lu-DOTATOC is a profitable selection for therapy of these tumors. Because of the vital role of internal dosimetry before and during therapy, efforts to improve the accuracy and speed of dosimetric calculations are necessary. For this reason, a new method was developed to calculate the absorbed dose by combining a compartmental model, animal dosimetry, and data extrapolated from animal to human, using the MIRD method. Since the compartmental model is fitted to experimental data, this approach may increase the accuracy of the dosimetric results.
Keywords: ¹⁷⁷Lu-DOTATOC, biodistribution modeling, compartmental model, internal dosimetry
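The compartmental-model step can be sketched with a minimal two-compartment kinetic model integrated numerically; its time-integrated activity is the quantity MIRD dosimetry multiplies by an S-factor to obtain absorbed dose. The structure and rate constants below are illustrative assumptions, not the model fitted to the authors' biodistribution data.

```python
# Sketch of two-compartment kinetics: activity transfers from blood (A1) to
# tumour (A2) and clears. Rate constants are hypothetical.

def two_compartment(k12, k20, dt, steps):
    """Explicit-Euler integration; returns the (A1, A2) time series (A1 starts at 1)."""
    a1, a2 = 1.0, 0.0
    series = [(a1, a2)]
    for _ in range(steps):
        da1 = -k12 * a1              # transfer out of blood
        da2 = k12 * a1 - k20 * a2    # uptake into tumour minus clearance
        a1 += da1 * dt
        a2 += da2 * dt
        series.append((a1, a2))
    return series

series = two_compartment(k12=0.5, k20=0.1, dt=0.01, steps=1000)
# Time-integrated activity in the tumour compartment (simple Riemann sum);
# in MIRD dosimetry the absorbed dose is this integral times the S-factor.
a_tilde = sum(a2 for _, a2 in series) * 0.01
```

A real model would add physical decay of ¹⁷⁷Lu and more compartments (kidneys, liver, etc.), fitted to the measured biodistribution.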
Procedia PDF Downloads 220
43402 Evaluation of Broiler Parent Breeds under Libyan Conditions
Authors: Salem A. Abdalla Bozrayda, Abulgasem M. Hubara
Abstract:
The use of commercial poultry breeds in Libya may result in large economic losses because genotypes selected in temperate climates may respond differently under other climatic conditions and management. Therefore, three commercial breeds (Hypeco, Avian, and Shaver) were evaluated in two regions. The data were obtained from weekly records of three parental flocks for each breed at the Ghout El-sultan and Tawargha regions. Feed per Hen Housed (FHH), Hen-Housed Egg Production (HHEP), and mortality (%) were the studied traits. The statistical model included location, year, month, age, and breed. Hypeco produced a higher HHEP (68.6) with less FHH (22.9 kg) but with higher mortality (8.5%) than the Avian and Shaver breeds. The breeds exhibited different responses to the different months in Libya. In conclusion, the differences exhibited between the breeds in the studied traits indicate that genotype x environment interaction must be considered when selecting a breed to perform under Libyan conditions.
Keywords: Hypeco, Avian, Shaver, feed per hen housed, hen housed egg production, mortality, Libya
Procedia PDF Downloads 289
43401 Automatic Extraction of Water Bodies Using Whole-R Method
Authors: Nikhat Nawaz, S. Srinivasulu, P. Kesava Rao
Abstract:
Feature extraction plays an important role in many remote sensing applications. Automatic extraction of water bodies is of great significance in applications such as change detection and image retrieval. This paper presents a procedure for the automatic extraction of water information from remote sensing images. The algorithm uses the relative location of the R-colour component in the chromaticity diagram. This method is then integrated with the spatial scale transformation of the whole method, which is based on a water index fitted from a spectral library. Experimental results demonstrate the improved accuracy and effectiveness of the integrated method for automatic extraction of water bodies.
Keywords: feature extraction, remote sensing, image retrieval, chromaticity, water index, spectral library, integrated method
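The abstract does not spell out the whole-R formulation, so the idea of index-based water extraction can be illustrated with the standard NDWI instead: water reflects strongly in green and absorbs near-infrared, so a positive index flags water pixels. This is a stand-in sketch, not the paper's method.

```python
# Illustrative water masking with NDWI = (green - nir) / (green + nir);
# a common default threshold is 0 (values are reflectances in [0, 1]).

def ndwi_mask(green, nir, threshold=0.0):
    """Boolean water mask for two co-registered band rasters (lists of rows)."""
    mask = []
    for g_row, n_row in zip(green, nir):
        row = []
        for g, n in zip(g_row, n_row):
            idx = (g - n) / (g + n) if (g + n) else 0.0
            row.append(idx > threshold)
        mask.append(row)
    return mask

green = [[0.30, 0.10]]
nir   = [[0.05, 0.40]]          # water absorbs NIR; vegetation reflects it
mask  = ndwi_mask(green, nir)   # first pixel water, second not
```

The whole-R method plays an analogous role, but derives its index from the R chromaticity component and a spectral-library fit rather than a green/NIR ratio.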
Procedia PDF Downloads 385
43400 Meat Qualities and Death on Arrival (DOA) of Broiler Chickens Transported under Brazilian Tropical Conditions
Authors: Arlan S. Freitas, Leila M. Carvalho, Adriana L. Soares, Arnoud Neto, Marta S. Madruga, Elza I. Ida, Massami Shimokomaki
Abstract:
The objective of this work was to evaluate the influence of the microclimatic profile of broiler transport trucks, under commercial conditions, on breast meat quality and DOA (Death On Arrival) in a tropical Brazilian region, the Northeast, where the year is routinely divided into dry and wet seasons; the temperature remains fairly constant and the relative humidity changes accordingly. Three loads of 4,100 forty-seven-day-old broilers were monitored from farm to slaughterhouse over a distance of 4.3 km, in the morning on rainy days of October 2015. The profile of the environmental variables inside the container truck throughout the journey was obtained by installing thermo-anemometers in 6 different locations, monitoring the heat index (HI), air velocity (AV), temperature (T), and relative humidity (RH). Meat quality was evaluated by determining the occurrence of PSE (pale, soft, exudative) meat and DFD (dark, firm, dry) meat. The percentage of DOA birds per loaded truck was determined by counting the dead broilers during the hanging step at the slaughtering plant. The analysis of variance was performed using statistical software (Statistica 8 for Windows, Statsoft 2007, Tulsa, OK, USA). The Tukey significance test (P<0.05) was applied to compare means from the microenvironmental data, PSE, DFD, and DOA. Fillet samples were collected at 24 h post mortem for pH and color (L*, a*, b*) determination through the CIELAB system. Results showed an occurrence of 2.98% PSE, 0.66% DFD, and only 0.016% DOA; overall, the most uncomfortable container location was the frontal inferior position of the truck, presenting 6.25% PSE. DFD values of 2.0% were obtained from birds located at the central and inferior rear locations. These values were unexpected in comparison to results obtained in our laboratories in previous experiments carried out in the country's southern states; the results reported herein were lower in every aspect.
A reasonable explanation would be the shorter distance, the wet conditions throughout the roughly 15-20 min journeys, and the lower T and RH values, as higher DFD values were observed in samples taken from the rear location. These facts suggest the animals were not under heat stress but in fact under cold stress, as the DFD results indicate, in association with the lower number of DOA.
Keywords: cold stress, DFD, microclimatic profile, PSE
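The PSE/DFD classification implied by the pH and L* measurements can be sketched as a simple decision rule. The thresholds below are typical of the poultry-meat literature but are assumptions here, not the authors' exact cut-offs.

```python
# Hedged sketch of a PSE/DFD decision rule from 24 h post-mortem pH and
# CIELAB lightness L*. Threshold values are assumed, not from this study.

def classify_breast_meat(ph24, lightness):
    """Classify a breast fillet as PSE, DFD, or normal."""
    if lightness > 53.0 and ph24 < 5.8:
        return "PSE"     # pale, soft, exudative (fast post-mortem glycolysis)
    if lightness < 44.0 and ph24 > 6.1:
        return "DFD"     # dark, firm, dry (glycogen depleted before slaughter)
    return "normal"

samples = [(5.6, 56.0), (6.4, 42.0), (5.9, 50.0)]   # (pH24, L*), invented
labels = [classify_breast_meat(ph, l_star) for ph, l_star in samples]
```

Cold-stressed birds deplete muscle glycogen in transit, so their fillets stay at high pH and low L*, which is why elevated DFD rather than PSE points to cold stress.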
Procedia PDF Downloads 235
43399 A Web Service-Based Framework for Mining E-Learning Data
Authors: Felermino D. M. A. Ali, S. C. Ng
Abstract:
E-learning is an evolutionary form of distance learning that has improved over time as new technologies emerged. Today, efforts are still being made to embrace E-learning systems with emerging technologies in order to improve them further. Among these advancements, Educational Data Mining (EDM) is one that is gaining increasing popularity due to its wide application for improving the teaching-learning process in online practices. However, even though EDM promises many benefits to the education industry in general and E-learning environments in particular, its principal drawback is the lack of easy-to-use tools. Current EDM tools usually require users to have additional technical expertise to perform EDM tasks effectively. In response to these limitations, this study designs and implements an EDM application framework which aims at automating and simplifying the development of EDM in E-learning environments. The application framework introduces a Service-Oriented Architecture (SOA) that hides the complexity of technical details and enables users to perform EDM in an automated fashion. The framework was designed based on abstraction, extensibility, and interoperability principles, and its implementation was made up of three major modules. The first module provides an abstraction for data gathering, done by extending the Moodle LMS (Learning Management System) source code. The second module provides data mining methods and techniques as services, done by converting the Weka API into a set of Web services. The third module acts as an intermediary between the first two; it contains a user-friendly interface that allows dynamically locating data provider services and running knowledge discovery tasks on data mining services. An experiment was conducted to evaluate the overhead of the proposed framework through a combination of simulation and implementation.
The experiments showed that the overhead introduced by the SOA mechanism is relatively small; therefore, it was concluded that a service-oriented architecture can be effectively used to facilitate educational data mining in E-learning environments.
Keywords: educational data mining, e-learning, distributed data mining, Moodle, service-oriented architecture, Weka
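The third module's job, dynamically locating a mining service by name and running a task against it, can be sketched with a plain in-process registry standing in for the SOAP/REST web services that wrap the Weka API. Everything below (names, the example service) is hypothetical.

```python
# Minimal sketch of service location and dispatch (in-process stand-in for the
# framework's web-service registry; not the authors' implementation).

class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        self._services[name] = handler

    def locate(self, name):
        if name not in self._services:
            raise KeyError(f"no service registered under '{name}'")
        return self._services[name]

registry = ServiceRegistry()
# Hypothetical mining service: flag learners below a pass mark (stand-in for a
# Weka classifier exposed as a web service).
registry.register("at_risk_filter",
                  lambda grades, cutoff=50: [g for g in grades if g < cutoff])

service = registry.locate("at_risk_filter")
at_risk = service([35, 72, 48, 90])
```

In the real framework the registry lookup is a service-discovery call and the handler invocation is a web-service request, which is where the measured (small) SOA overhead comes from.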
Procedia PDF Downloads 236
43398 The Phonemic Inventory of Tenyidie Affricates: An Acoustic Study
Authors: Neisa Kuonuo Tungoe
Abstract:
Tenyidie, also known as Angami, is spoken by the Angami tribe of Nagaland, North-East India, bordering Myanmar (Burma). It belongs to the Tibeto-Burman language group, falling under the Kuki-Chin-Naga sub-family. Tenyidie studies have seen sporadic attempts at explaining the phonemic inventory of Tenyidie, with different scholars variously emphasizing the grammar or the history of the language. Many of these claims have been stimulating, but they were often based on a small amount of merely suggestive data or on auditory perception alone. The principal objective of this paper is to analyse the affricate segments of Tenyidie acoustically. There are eight categories in the inventory of Tenyidie: plosives, nasals, affricates, laterals, rhotics, fricatives, semivowels, and vowels; in all, there are sixty phonemes. As mentioned above, the only prominent readings on Tenyidie, and on affricates in particular, rely on auditory perception, so this study aims to lay out the affricate segments based on acoustic evidence alone. There are seven affricates in Tenyidie: 1) voiceless labiodental affricate /pf/, 2) voiceless aspirated labiodental affricate /pfʰ/, 3) voiceless alveolar affricate /ts/, 4) voiceless aspirated alveolar affricate /tsʰ/, 5) voiced alveolar affricate /dz/, 6) voiceless post-alveolar affricate /tʃ/, and 7) voiced post-alveolar affricate /dʒ/. Since the study is based on the acoustic features of affricates, five informants were asked to record their voices producing Tenyidie phonemes and English phonemes. Throughout the study of the recorded data, Praat, a scientific software program that has made itself indispensable for the analysis of speech in phonetics, was used as the main software. These data were then used in a comparative study of Tenyidie and English affricates.
Comparisons have also been drawn between this study and the work of another author who has stated that there are only six affricates in Tenyidie. The study is detailed regarding the specifics of the data: detailed accounts of duration and acoustic cues have been noted, and the data are presented in the form of spectrograms. Since there are no other acoustic data on Tenyidie, this study will be the first in a long line of acoustic research on the language.
Keywords: Tenyidie, affricates, Praat, phonemic inventory
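One of the acoustic cues the study measures, segment duration, can be sketched as reading an interval off an amplitude envelope (what one would mark on a Praat waveform or spectrogram). The signal below is synthetic; real measurements come from the informants' recordings.

```python
# Sketch of duration measurement from an amplitude threshold (a crude stand-in
# for hand-segmenting the frication interval of an affricate in Praat).

def segment_duration(samples, sample_rate, threshold):
    """Duration (s) between first and last samples above an amplitude threshold."""
    above = [i for i, s in enumerate(samples) if abs(s) > threshold]
    if not above:
        return 0.0
    return (above[-1] - above[0] + 1) / sample_rate

rate = 1000                                      # 1 kHz for the toy example
silence, burst = [0.0] * 100, [0.5, -0.5] * 50   # 0.1 s silence, 0.1 s frication
signal = silence + burst + silence
duration = segment_duration(signal, rate, threshold=0.1)   # 0.1 s
```

Duration contrasts of this kind (e.g. longer frication in aspirated /tsʰ/ than plain /ts/) are exactly what distinguishes affricate pairs in the spectrographic data.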
Procedia PDF Downloads 417
43397 Poultry in Motion: Text Mining Social Media Data for Avian Influenza Surveillance in the UK
Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves
Abstract:
Background: Avian influenza, more commonly known as bird flu, is a viral zoonotic respiratory disease stemming from various species of poultry, including pets and migratory birds. Researchers have argued that the accessibility of health information online, in addition to the low-cost data collection methods the internet provides, has revolutionized the ways in which epidemiological and disease surveillance data are utilized. This paper examines the feasibility of using internet data sources, such as Twitter and livestock forums, for the early detection of avian flu outbreaks, through the use of text mining algorithms and social network analysis. Methods: Social media mining was conducted on Twitter between 01/01/2021 and 31/12/2021 via the Twitter API in Python. The results were filtered first by hashtags (#avianflu, #birdflu) and word occurrences (avian flu, bird flu, H5N1), and then refined further by location to include only results from within the UK. Analysis was conducted on this text in a time-series manner to determine keyword frequencies, with topic modeling used to uncover insights in the text prior to a confirmed outbreak. Further analysis was performed by examining clinical signs (e.g., swollen head, blue comb, dullness) within the time series prior to the avian flu outbreak confirmed by the Animal and Plant Health Agency (APHA). Results: Increased Google search results and avian flu-related tweets showed a correlation in time with the confirmed cases. Topic modeling uncovered clusters of word occurrences relating to livestock biosecurity, disposal of dead birds, and prevention measures. Conclusions: Text mining social media data can prove useful for analysing discussed topics for epidemiological surveillance purposes, especially given the lack of applied research in the veterinary domain.
However, the small sample size of tweets in certain weekly time periods, together with a great amount of textual noise in the data, makes it difficult to provide statistically robust results.
Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, avian influenza, social media
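The keyword-frequency time-series step can be sketched as counting avian-flu-related terms per week. The tweets below are invented examples, not data from the study, and the matching is deliberately simple (substring search, no tokenization).

```python
# Sketch of weekly keyword-frequency counting for surveillance time series.
# Sample tweets are fabricated for illustration.

from collections import Counter

KEYWORDS = ("avian flu", "bird flu", "h5n1")

def weekly_keyword_counts(tweets):
    """tweets: iterable of (iso_week, text) -> Counter of keyword matches per week."""
    counts = Counter()
    for week, text in tweets:
        lowered = text.lower()
        counts[week] += sum(lowered.count(k) for k in KEYWORDS)
    return counts

tweets = [
    ("2021-W44", "Bird flu confirmed at a nearby farm"),
    ("2021-W44", "Keeping hens indoors after the avian flu warning"),
    ("2021-W45", "H5N1 precautions: disinfect boots and housing"),
]
counts = weekly_keyword_counts(tweets)
```

A rise in these weekly counts ahead of an APHA confirmation is the signal the paper looks for; the same loop extended to clinical-sign terms (swollen head, blue comb, dullness) gives the second time series analysed.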
Procedia PDF Downloads 105
43396 Reinforced Concrete Bridge Deck Condition Assessment Methods Using Ground Penetrating Radar and Infrared Thermography
Authors: Nicole M. Martino
Abstract:
Reinforced concrete bridge deck condition assessments primarily use visual inspection methods, where an inspector looks for and records locations of cracks, potholes, efflorescence, and other signs of probable deterioration. Sounding is another technique used to diagnose the condition of a bridge deck; this method listens for damage within the subsurface as the surface is struck with a hammer or chain. Even though extensive procedures are in place for using these inspection techniques, neither one provides the inspector with a comprehensive understanding of the internal condition of a bridge deck, the location where damage originates. In order to make accurate estimates of repair locations and quantities, in addition to allocating the necessary funding, a total understanding of the deck's deteriorated state is key. The research presented in this paper collected infrared thermography and ground penetrating radar data from reinforced concrete bridge decks without an asphalt overlay. These decks were of various ages, and their condition varied from brand new to in need of replacement. The goals of this work were to first verify that these nondestructive evaluation methods could identify similar areas of healthy and damaged concrete, and then to see if combining the results of both methods would provide higher confidence than a condition assessment completed using only one method. The results from each method were presented as plan-view color contour plots. The results from one of the decks assessed as part of this research, including these plan-view plots, are presented in this paper. Furthermore, to answer the interest of transportation agencies throughout the United States, this research developed a step-by-step guide demonstrating how to collect and assess a bridge deck using these nondestructive evaluation methods.
This guide addresses setup procedures on the deck during the day of data collection, system setups and settings for different bridge decks, data post-processing for each method, and data visualization and quantification.
Keywords: bridge deck deterioration, ground penetrating radar, infrared thermography, NDT of bridge decks
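The combination step the abstract describes, overlaying the two plan-view maps and trusting cells where both methods agree, can be sketched as a simple grid fusion. Grids and thresholds below are illustrative assumptions, not the study's processing chain.

```python
# Sketch of fusing an infrared thermography map with a GPR map: flag a cell as
# damaged only where both methods agree, for higher confidence than either alone.

def fuse_condition_maps(ir_map, gpr_map, ir_thresh, gpr_thresh):
    """True where both methods flag damage (grids are lists of rows)."""
    fused = []
    for ir_row, gpr_row in zip(ir_map, gpr_map):
        fused.append([ir >= ir_thresh and gpr <= gpr_thresh
                      for ir, gpr in zip(ir_row, gpr_row)])
    return fused

# Hypothetical 2x3 deck grids: IR surface-temperature contrast (degC) and GPR
# rebar reflection amplitude (dB; low amplitude suggests deterioration).
ir  = [[0.2, 1.5, 1.8], [0.1, 0.3, 1.6]]
gpr = [[-4.0, -9.5, -8.8], [-3.5, -9.0, -5.0]]
fused = fuse_condition_maps(ir, gpr, ir_thresh=1.0, gpr_thresh=-8.0)
```

In practice both datasets are interpolated onto a common plan-view grid first, which is what the color contour plots in the paper visualize.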
Procedia PDF Downloads 154
43395 Implementation of a Distant Learning Physician Assistant Program in Northern Michigan to Address Health Care Provider Shortage: Importance of Evaluation
Authors: Theresa Bacon-Baguley, Martina Reinhold
Abstract:
Introduction: The purpose of this paper is to discuss the importance of both formative and summative evaluation of a Physician Assistant (PA) program with a distant campus delivered through Interactive Television (ITV) to assure equity of educational experiences. Methodology: A needs assessment utilizing a case-control design determined the need and interest in expanding the existing PA program to northern Michigan. A federal grant was written and funded, which supported the hiring of two full-time faculty members and support staff at the distant site. The strengths and weaknesses of delivering a program through ITV were evaluated using weekly formative evaluation, and bi-semester summative evaluation. Formative evaluation involved discussion of lecture content to be delivered, special ITV needs, orientation of new lecturers to the system, student concerns, support staff updates, and scheduling of student/faculty traveling between the two campuses. The summative evaluation, designed from a literature review of barriers to ITV, included 19 statements designed to evaluate the following items: quality of technology (audio, video, etc.), confidence in the ITV system, quality of instruction and instructor interaction between the two locations, and availability of resources at each location. In addition, students were given the opportunity to write qualitative remarks for each course delivered between the two locations. This summative evaluation was given to all students at mid-semester and at the end of the semester. The goal of the summative evaluation was to have 80% or greater of the students respond favorably (‘Very Good’ or ‘Good’) to each of the 19 statements. Results: Prior to the start of the first cohort at the distant campus, the technology was tested. During this time period, the formative evaluations identified key components needing modification, which were rapidly addressed: ability to record lectures, lighting, sound, and content delivery. 
When the mid-semester summative survey was given to the first cohort of students, 18 of the 19 statements in the summative evaluation met the goal of 80% or greater in the favorable category. When the summative evaluation statements were stratified by the two cohorts, the summative evaluation identified that students at the home location responded that they did not have adequate access to printers, and students at the expansion location responded that they did not have adequate access to library resources. These results allowed the program to address the deficiencies by contacting information technology for additional printers and by showing students how to access library resources. Conclusion: Successful expansion of programs to a distant site utilizing ITV technology requires extensive monitoring using both formative and summative evaluation. The formative evaluation allowed for quick identification of issues that could immediately be addressed, both at the planning and development stage and during implementation. Through use of the summative evaluation, the program is able to monitor the success and effectiveness of the expansion and identify the specific needs of students at each location.
Keywords: assessment, distance learning, formative feedback, interactive television (ITV), student experience, summative feedback, support
Procedia PDF Downloads 246
43394 A QoS Aware Cluster Based Routing Algorithm for Wireless Mesh Network Using LZW Lossless Compression
Authors: J. S. Saini, P. P. K. Sandhu
Abstract:
The multi-hop nature of Wireless Mesh Networks and the rapid growth of throughput demands result in multi-channel, multi-radio structures in mesh networks, but co-channel interference reduces the total throughput, specifically in multi-hop networks. Quality of Service (QoS) refers to a broad collection of networking technologies and techniques that guarantee the ability of a network to deliver desired services with predictable results. QoS can be directed at a network interface, toward a specific server's or router's performance, or at specific applications. Due to interference among various transmissions, QoS routing in multi-hop wireless networks is a formidable task; in a multi-channel wireless network, two transmissions using the same channel may interfere with each other. This paper considers the Destination Sequenced Distance Vector (DSDV) routing protocol to locate a secure and optimised path. The proposed technique also utilizes Lempel-Ziv-Welch (LZW) lossless data compression and intra-cluster data aggregation to enhance the communication between the source and the destination. Clustering makes it possible to aggregate multiple packets and locate a single route using the clusters, improving intra-cluster data aggregation, while LZW lossless compression reduces the data packet size and hence the energy consumed, thus increasing the network QoS. The MATLAB tool was used to evaluate the effectiveness of the proposed technique, and the comparative analysis has shown that it outperforms the existing techniques.
Keywords: WMNs, QoS, flooding, collision avoidance, LZW, congestion control
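The LZW scheme named in the abstract is the textbook dictionary compressor; a minimal version shows why aggregated, repetitive sensor payloads shrink well. This is the standard algorithm over bytes, not the authors' MATLAB implementation.

```python
# Minimal textbook LZW compressor: repeated substrings are replaced by
# dictionary codes, so repetitive payloads emit far fewer symbols than bytes.

def lzw_compress(data: bytes):
    """Return the list of LZW dictionary codes for the input bytes."""
    table = {bytes([i]): i for i in range(256)}   # single-byte seed dictionary
    next_code = 256
    w = b""
    out = []
    for b in data:
        wc = w + bytes([b])
        if wc in table:
            w = wc                     # grow the current match
        else:
            out.append(table[w])       # emit code for the longest known prefix
            table[wc] = next_code      # learn the new string
            next_code += 1
            w = bytes([b])
    if w:
        out.append(table[w])
    return out

payload = b"ABABABABABABABAB"          # stand-in for an aggregated cluster packet
codes = lzw_compress(payload)          # 7 codes for 16 input bytes
```

Fewer symbols per packet means fewer transmitted bits per hop, which is exactly the energy saving the routing scheme exploits after intra-cluster aggregation.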
Procedia PDF Downloads 338
43393 The Effect of Inlet Baffle Position in Improving the Efficiency of Oil and Water Gravity Separator Tanks
Authors: Haitham A. Hussein, Rozi Abdullah, Issa Saket, Md. Azlin
Abstract:
The gravitational effect has been extensively applied to separate oil from water in water and wastewater treatment systems. The maximum oil globule removal efficiency is achieved by obtaining the best flow uniformity in separator tanks. This study used 2D computational fluid dynamics (CFD) to investigate the effect of different inlet baffle positions inside the separator tank. A laboratory experiment was conducted, and the velocity fields measured with a Nortek Acoustic Doppler Velocimeter (ADV) were used to verify the CFD model. The computational results indicated that constructing an inlet baffle in a suitable location provides the minimum recirculation zone volume, creates the best flow uniformity, and dissipates kinetic energy in the oil and water separator tank. Useful formulas were derived for designing the geometry of oil and water separator tanks based on the experimental model.
Keywords: oil/water separator tanks, inlet baffles, CFD, VOF
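The gravitational separation the abstract relies on is conventionally sized from the Stokes rise velocity of an oil globule; a quick sketch follows (SI units, illustrative property values, not the authors' design formulas).

```python
# Stokes' law rise velocity of a small oil globule in water:
#   v = g * d^2 * (rho_water - rho_oil) / (18 * mu_water)
# Valid for small globules (creeping flow); property values are illustrative.

def stokes_rise_velocity(diameter, rho_water, rho_oil, mu_water, g=9.81):
    """Terminal rise velocity (m/s) of an oil globule in water."""
    return g * diameter ** 2 * (rho_water - rho_oil) / (18.0 * mu_water)

# 150-micron globule, typical densities/viscosity near 20 degC.
v = stokes_rise_velocity(diameter=150e-6, rho_water=998.0, rho_oil=870.0,
                         mu_water=1.0e-3)     # about 1.6 mm/s
```

Since v scales with the globule diameter squared, small droplets rise slowly; that is why minimizing recirculation zones with a well-placed inlet baffle, so droplets get a quiet, uniform path to the surface, directly improves removal efficiency.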
Procedia PDF Downloads 368
43392 Comparative Analysis of the Third Generation of Research Data for Evaluation of Solar Energy Potential
Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag
Abstract:
Renewable energy sources are dependent on climatic variability, so adequate energy planning requires observations of the meteorological variables, preferably as long-period series. Despite the scientific and technological advances that meteorological measurement systems have undergone in recent decades, there is still a considerable lack of meteorological observations forming long-period series. Reanalysis is a data assimilation system prepared using general atmospheric circulation models, based on the combination of data collected at surface stations, ocean buoys, satellites, and radiosondes, allowing the production of long-period data for a wide range of variables. The third generation of reanalysis data emerged in 2010; among them is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), whose data have a spatial resolution of 0.5° x 0.5°. To overcome the difficulties above, this study evaluates the performance of solar radiation estimation from alternative databases, such as reanalysis and meteorological satellite data, which can satisfactorily compensate for the absence of solar radiation observations at the global and/or regional level. The analysis of the solar radiation data indicated that the reanalysis data of the CFSR model presented a good performance in relation to the observed data, with a coefficient of determination around 0.90. Therefore, it is concluded that these data have the potential to be used as an alternative source at locations lacking stations or long series of solar radiation observations, which is important for the evaluation of solar energy potential.
Keywords: climate, reanalysis, renewable energy, solar radiation
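The validation statistic cited in the abstract, the coefficient of determination between observed and reanalysis radiation, is a one-function computation. The series below are invented, not the CFSR/station data used in the study.

```python
# Coefficient of determination R^2 = 1 - SS_res / SS_tot between an observed
# series and a modeled (reanalysis) series. Data are illustrative.

def r_squared(observed, modeled):
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - m) ** 2 for o, m in zip(observed, modeled))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

obs  = [18.2, 20.5, 22.1, 19.8, 21.0]   # MJ/m^2/day, hypothetical station data
cfsr = [18.0, 20.9, 21.8, 20.1, 20.6]   # hypothetical reanalysis values
score = r_squared(obs, cfsr)            # near 1 indicates good agreement
```

An R² around 0.90, as the abstract reports for CFSR, means the reanalysis reproduces about 90% of the variance in the observed radiation series.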
Procedia PDF Downloads 209
43391 Advances in Design Decision Support Tools for Early-stage Energy-Efficient Architectural Design: A Review
Authors: Maryam Mohammadi, Mohammadjavad Mahdavinejad, Mojtaba Ansari
Abstract:
The main driving force for increasing movement towards the design of High-Performance Buildings (HPB) are building codes and rating systems that address the various components of the building and their impact on the environment and energy conservation through various methods like prescriptive methods or simulation-based approaches. The methods and tools developed to meet these needs, which are often based on building performance simulation tools (BPST), have limitations in terms of compatibility with the integrated design process (IDP) and HPB design, as well as use by architects in the early stages of design (when the most important decisions are made). To overcome these limitations in recent years, efforts have been made to develop Design Decision Support Systems, which are often based on artificial intelligence. Numerous needs and steps for designing and developing a Decision Support System (DSS), which complies with the early stages of energy-efficient architecture design -consisting of combinations of different methods in an integrated package- have been listed in the literature. While various review studies have been conducted in connection with each of these techniques (such as optimizations, sensitivity and uncertainty analysis, etc.) and their integration of them with specific targets; this article is a critical and holistic review of the researches which leads to the development of applicable systems or introduction of a comprehensive framework for developing models complies with the IDP. Information resources such as Science Direct and Google Scholar are searched using specific keywords and the results are divided into two main categories: Simulation-based DSSs and Meta-simulation-based DSSs. The strengths and limitations of different models are highlighted, two general conceptual models are introduced for each category and the degree of compliance of these models with the IDP Framework is discussed. 
The research shows a movement towards Multi-Level of Development (MOD) models that combine well with the early stages of integrated design (the schematic design and design development stages) and that are heuristic, hybrid, and meta-simulation-based, relying on big real-world data (such as Building Energy Management System data or web data). Obtaining, using, and combining these data with simulation data to create models that handle higher uncertainty, that are more dynamic and more sensitive to context and culture, and that can generate economy- and energy-efficient design scenarios using local data (to harmonize better with circular economy principles) are important research areas in this field. The results of this study provide a roadmap for researchers and developers of these tools.Keywords: integrated design process, design decision support system, meta-simulation based, early stage, big data, energy efficiency
Procedia PDF Downloads 162
43390 A Platform to Analyze Controllers for Solar Hot Water Systems
Authors: Aziz Ahmad, Guillermo Ramirez-Prado
Abstract:
Governments around the world encourage the use of solar water heating in residential houses due to the low maintenance requirements and efficiency of solar collector water heating systems. The aim of this work is to study a domestic solar water heating system in a residential building and to develop a model of the entire system, including the flat-plate solar collector and the storage tank. The proposed model is adaptable to any household and location. The model can be used to test different types of controllers and can provide efficiency as well as economic analysis. It is based on heat and mass transfer equations, along with assumptions that can be modified for a variety of solar water heating system types and sizes. Simulation results of the model were compared with the actual system and show similar trends.Keywords: solar thermal systems, solar water heating, solar collector model, hot water tank model, solar controllers
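The abstract does not give the governing equations, but the kind of lumped heat-balance model it describes (a single well-mixed tank fed by a flat-plate collector) can be sketched as dT/dt = (Q_collector − Q_loss) / (m·c_p). All parameter values below are illustrative assumptions, not figures from the paper:

```python
# Minimal lumped model of a solar hot water tank (illustrative sketch).
# Assumed, not from the paper: parameter values and the forward-Euler scheme.

def simulate_tank(hours=8.0, dt_s=60.0,
                  t_tank=20.0,        # initial tank temperature [C]
                  t_ambient=15.0,     # ambient temperature [C]
                  irradiance=700.0,   # solar irradiance on collector [W/m^2]
                  a_coll=2.0,         # collector area [m^2]
                  eta=0.6,            # collector efficiency [-]
                  ua=5.0,             # tank loss coefficient [W/K]
                  mass=150.0,         # water mass [kg]
                  cp=4186.0):         # specific heat of water [J/(kg K)]
    """Forward-Euler integration of the tank energy balance."""
    steps = int(hours * 3600 / dt_s)
    for _ in range(steps):
        q_gain = eta * a_coll * irradiance       # collector input [W]
        q_loss = ua * (t_tank - t_ambient)       # standing losses [W]
        t_tank += (q_gain - q_loss) * dt_s / (mass * cp)
    return t_tank

final_temp = simulate_tank()
```

A controller under test would modulate the gain term (e.g. switch a pump) inside the loop; the economic analysis then follows from the energy delivered.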
Procedia PDF Downloads 271
43389 A Similar Image Retrieval System for Auroral All-Sky Images Based on Local Features and Color Filtering
Authors: Takanori Tanaka, Daisuke Kitao, Daisuke Ikeda
Abstract:
The aurora is an attractive phenomenon, but its whole mechanism is difficult to understand. A data-intensive science approach might be effective for elucidating such a difficult phenomenon. For that we need labeled data, which show when and what types of auroras have appeared. In this paper, we propose an image retrieval system for auroral all-sky images, some of which include discrete and diffuse auroras while others contain no aurora at all. The proposed system retrieves images similar to a query image by using a popular image recognition method. Using 300 all-sky images obtained at Tromso, Norway, we evaluate two image recognition methods, with and without our original color filtering method. The best performance is achieved when SIFT is combined with the color filtering: its accuracy is 81.7% for discrete auroras and 86.7% for diffuse auroras.Keywords: data-intensive science, image classification, content-based image retrieval, aurora
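The combination of color filtering and local-feature matching can be sketched as follows. The exact hue rule and the toy descriptor matching below are illustrative assumptions; the paper uses SIFT features from a standard implementation:

```python
import numpy as np

# Sketch of the retrieval idea: keep only auroral-colored (greenish)
# pixels before extracting local features, then score database images by
# how many query descriptors find a close match. The hue rule and the
# toy descriptors are assumptions, not the authors' exact filter.

def green_mask(rgb):
    """Boolean mask of pixels where green dominates (candidate aurora)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (g > r) & (g > b) & (g > 0.2)

def match_score(query_desc, db_desc, threshold=0.5):
    """Fraction of query descriptors with a near neighbour in db_desc."""
    if len(query_desc) == 0 or len(db_desc) == 0:
        return 0.0
    # Pairwise Euclidean distances: query x database.
    d = np.linalg.norm(query_desc[:, None, :] - db_desc[None, :, :], axis=2)
    return float(np.mean(d.min(axis=1) < threshold))

# A pure-green image is fully kept by the filter.
pure_green = np.zeros((4, 4, 3))
pure_green[..., 1] = 1.0
kept = green_mask(pure_green)
```

Retrieval then ranks the 300 database images by `match_score` against the query's descriptors.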
Procedia PDF Downloads 449
43388 Wavelet Based Advanced Encryption Standard Algorithm for Image Encryption
Authors: Ajish Sreedharan
Abstract:
With the fast evolution of digital data exchange, information security has become very important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. Because AES applies the encryption process to the whole image, it is difficult to improve efficiency. In this paper, wavelet decomposition is used to concentrate the main information of the image into the low-frequency part. Then, AES encryption is applied to the low-frequency part. The high-frequency parts are XORed with the encrypted low-frequency part, and a wavelet reconstruction is applied. Theoretical analysis and experimental results show that the proposed algorithm has high efficiency and satisfactory security, and is suitable for image data transmission.Keywords: discrete wavelet transforms, AES, dynamic SBox
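The structure of the scheme can be sketched on a single image row: a one-level Haar transform splits the data into approximation (low-frequency) and detail (high-frequency) coefficients, the approximation is block-encrypted, and the details are XORed with the encrypted approximation. To keep the sketch self-contained, the SHA-256-based keystream below is a stand-in where a real implementation would use AES:

```python
import hashlib

# Sketch of the wavelet+cipher structure from the abstract. The
# keystream function is a stdlib stand-in for AES, flagged as such.

def haar_1d(row):
    """One-level Haar transform of an even-length list of ints."""
    avg = [(row[2 * i] + row[2 * i + 1]) // 2 for i in range(len(row) // 2)]
    diff = [row[2 * i] - row[2 * i + 1] for i in range(len(row) // 2)]
    return avg, diff

def keystream(key, n):
    """Stand-in keystream; a real implementation would use AES here."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_row(row, key):
    avg, diff = haar_1d(row)
    ks = keystream(key, len(avg))
    enc_avg = [(a & 0xFF) ^ k for a, k in zip(avg, ks)]        # cipher on LL
    enc_diff = [(d & 0xFF) ^ e for d, e in zip(diff, enc_avg)]  # XOR details
    return enc_avg, enc_diff

enc_avg, enc_diff = encrypt_row([10, 12, 200, 202, 90, 94, 7, 9], b"secret-key")
```

The efficiency gain comes from running the expensive cipher only on the half-size approximation band.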
Procedia PDF Downloads 432
43387 The Spatial Pattern of Economic Rents of an Airport Development Area: Lessons Learned from the Suvarnabhumi International Airport, Thailand
Authors: C. Bejrananda, Y. Lee, T. Khamkaew
Abstract:
With the rise in importance of air transportation in the 21st century, the role of economics in airport planning and decision-making has become more important to the surrounding urban structure and land values. Therefore, this research examines the relationship between an airport and its impacts on the distribution of urban land uses and land values by applying Alonso's bid-rent model. The New Bangkok International Airport (Suvarnabhumi International Airport) was taken as a case study. The analysis was made over three different periods of airport development (after the airport site was proposed, during airport construction, and after the opening of the airport). The statistical results confirm that Alonso's model can explain the impacts of the new airport only for the northeast quadrant, while proximity to the airport showed an inverse relationship with the land value of all six types of land use activities through the three periods. This indicates that the land value of commercial land use is the most sensitive to the location of the airport, that is, it has the strongest requirement for accessibility to the airport compared with residential and manufacturing land uses. Also, the bid-rent gradients of the six types of land use activities declined dramatically through the three time periods because of the Asian Financial Crisis in 1997. A lesson learned from this research concerns the reliability of the data used. The major concern involves the use of different areal units for assessing land value in different time periods, zone blocks (1995) versus grid blocks (2002, 2009). As a result, the overall trends of land value assessment are not readily apparent. A further concern is the availability of historical data.
Because the government has not systematically collected historical land value assessment data, some land value data and aerial photos do not cover the entire study area. Finally, the different formats of the aerial photos, hard copy (1995) versus digital (2002, 2009), made measuring distances difficult. These problems also affect the accuracy of the results of the statistical analyses.Keywords: airport development area, economic rents, spatial pattern, suvarnabhumi international airport
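The bid-rent gradient the study estimates amounts to a regression of land value on distance from the airport; under Alonso's model the slope should be negative. The log-linear functional form and the synthetic data below are illustrative assumptions, not the Suvarnabhumi data set:

```python
import numpy as np

# Fit log(land value) = a + b * distance. A negative gradient b means
# land values fall with distance from the airport, as the bid-rent
# model predicts. Synthetic data, not the study's observations.

def bid_rent_gradient(distance_km, land_value):
    x = np.column_stack([np.ones_like(distance_km), distance_km])
    y = np.log(land_value)
    (a, b), *_ = np.linalg.lstsq(x, y, rcond=None)
    return a, b

rng = np.random.default_rng(1)
dist = rng.uniform(0.5, 20.0, size=50)
# True gradient -0.08 per km, with small multiplicative noise.
value = 5000.0 * np.exp(-0.08 * dist) * np.exp(rng.normal(0, 0.05, size=50))
intercept, gradient = bid_rent_gradient(dist, value)
```

Comparing such gradients fitted per land-use type and per period is exactly the kind of analysis the abstract describes.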
Procedia PDF Downloads 274
43386 Geographic Information System Using Google Fusion Table Technology for the Delivery of Disease Data Information
Authors: I. Nyoman Mahayasa Adiputra
Abstract:
Health data can be useful for data analysis; one example of health data is disease data. Disease data are usually plotted geographically according to the area where the data were collected, in this case the city of Denpasar, Bali. Disease data reports are still published in tabular form; disease information has not yet been mapped in GIS form. In this research, disease information for Denpasar city is digitized in the form of a geographic information system, with the district as the smallest administrative area. Denpasar City consists of 4 districts: North Denpasar, East Denpasar, West Denpasar, and South Denpasar. We use Google Fusion Table technology for the map digitization process, which simplifies the work both for the administrator and for the recipient of the information. On the administrator side, disease data entry can be done easily and quickly. On the receiving side, the resulting GIS application can be published as a website-based application so that it can be accessed anywhere and at any time. In general, the results obtained in this study are divided into two parts. (1) Geolocation of Denpasar and all of its districts: the process of digitizing the map of Denpasar city produces a polygon geolocation for each district. These results can be utilized in subsequent GIS studies using the same administrative areas. (2) Dengue fever mapping for 2014 and 2015: the disease data used in this study are dengue fever case data for 2014 and 2015, taken from the profile reports of the Denpasar Health Department for 2015 and 2016. This mapping can be useful for analyzing the spread of dengue hemorrhagic fever in the city of Denpasar.Keywords: geographic information system, Google fusion table technology, delivery of disease data information, Denpasar city
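The core idea, attaching case counts to district polygons for web publishing, can be sketched with plain GeoJSON, a format most web mapping tools accept (Google Fusion Tables itself has since been retired). The coordinates and case counts below are placeholders, not real district boundaries or health-department figures:

```python
import json

# Build a GeoJSON FeatureCollection attaching dengue case counts to
# district polygons. Coordinates and counts are placeholders.

def district_feature(name, polygon, cases):
    return {
        "type": "Feature",
        "properties": {"district": name, "dengue_cases": cases},
        "geometry": {"type": "Polygon", "coordinates": [polygon]},
    }

districts = {
    "North Denpasar": ([[115.20, -8.62], [115.23, -8.62],
                        [115.23, -8.64], [115.20, -8.64],
                        [115.20, -8.62]], 120),
    "South Denpasar": ([[115.20, -8.68], [115.23, -8.68],
                        [115.23, -8.70], [115.20, -8.70],
                        [115.20, -8.68]], 95),
}

collection = {
    "type": "FeatureCollection",
    "features": [district_feature(n, poly, c)
                 for n, (poly, c) in districts.items()],
}
geojson_text = json.dumps(collection)
```

The resulting text can be served directly to a web map, which colors each polygon by its `dengue_cases` property.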
Procedia PDF Downloads 129
43385 A Human Centered Design of an Exoskeleton Using Multibody Simulation
Authors: Sebastian Kölbl, Thomas Reitmaier, Mathias Hartmann
Abstract:
Trial-and-error approaches to adapting wearable support structures to human physiology are time-consuming and elaborate. During preliminary design, however, the focus lies on understanding the interaction between the exoskeleton and the human body in terms of forces and moments, namely body mechanics. For the study at hand, a multi-body simulation approach has been enhanced to evaluate actual forces and moments in a human dummy model with and without a digital mock-up of an active exoskeleton. Therefore, different motion data have been gathered and processed to perform a musculoskeletal analysis. The motion data comprise ground reaction forces, electromyography data (EMG), and human motion data recorded with a marker-based motion capture system. Based on the experimental data, the response of the human dummy model has been calibrated. Subsequently, the scalable human dummy model, in conjunction with the motion data, is connected with the exoskeleton structure. The results of the human-machine interaction (HMI) simulation platform are in particular the resulting contact forces and human joint forces, which are compared with admissible values with regard to human physiology. Furthermore, the platform provides feedback for the sizing of the exoskeleton structure in terms of resulting interface forces (stress justification) and the effect of its compliance. A stepwise approach for the setup and validation of the modeling strategy is presented, and the potential for a more time- and cost-effective development of wearable support structures is outlined.Keywords: assistive devices, ergonomic design, inverse dynamics, inverse kinematics, multibody simulation
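The inverse-dynamics step named in the keywords can be illustrated on the simplest possible case, a single rigid link rotating about one joint, where the net joint torque follows from measured kinematics as tau = I * theta_ddot + m * g * l_c * sin(theta). The segment parameters below are illustrative assumptions, not values from the study:

```python
import math

# Inverse dynamics for a single rigid link pivoting at one joint:
# given the measured angle and angular acceleration plus segment
# parameters, recover the net joint torque. Values are illustrative.

def joint_torque(theta_rad, alpha_rad_s2,
                 mass=1.5,       # segment mass [kg]
                 length_c=0.15,  # joint-to-center-of-mass distance [m]
                 inertia=0.02,   # moment of inertia about the joint [kg m^2]
                 g=9.81):
    """Net torque needed to produce the measured motion [N m]."""
    gravity_term = mass * g * length_c * math.sin(theta_rad)
    inertial_term = inertia * alpha_rad_s2
    return inertial_term + gravity_term

# Static hold at 90 degrees: the torque balances gravity only.
tau_static = joint_torque(math.pi / 2, 0.0)
```

A full musculoskeletal solver repeats this balance across all body segments and adds the exoskeleton's contact forces as external loads.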
Procedia PDF Downloads 162
43384 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer that is easy to integrate into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for a deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
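The signal-handling mechanism the abstract describes can be sketched as follows: install a handler for a fault-style signal and emit a timestamped report with the current stack instead of dying silently. The real DAQ Debugger is a C++/Qt component; this stdlib Python version only illustrates the mechanism (and assumes a POSIX platform for SIGUSR1):

```python
import datetime
import signal
import traceback

# Sketch of signal-based crash reporting: on receipt of a signal, dump
# a report containing the time, signal name, and call stack. A real
# tool would write the report to a file for offline analysis.

reports = []

def crash_report_handler(signum, frame):
    report = {
        "time": datetime.datetime.now().isoformat(),
        "signal": signal.Signals(signum).name,
        "stack": traceback.format_stack(frame),
    }
    reports.append(report)

signal.signal(signal.SIGUSR1, crash_report_handler)

# Simulate a fault notification delivered to this process.
signal.raise_signal(signal.SIGUSR1)
```

Because the handler only runs when a signal arrives, instrumenting a process this way adds no overhead on the hot path, which matches the "no impact on process performance" claim.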
Procedia PDF Downloads 284
43383 Q-Map: Clinical Concept Mining from Clinical Documents
Authors: Sheikh Shams Azam, Manoj Raju, Venkatesh Pagidimarri, Vamsi Kasivajjala
Abstract:
Over the past decade, there has been a steep rise in data-driven analysis in major areas of medicine, such as clinical decision support systems, survival analysis, patient similarity analysis, image analytics, etc. Most of the data in the field are well structured and available in numerical or categorical formats, which can be used for experiments directly. But on the opposite end of the spectrum, there exists a wide expanse of data that is intractable for direct analysis owing to its unstructured nature; such data can be found in the form of discharge summaries, clinical notes, and procedural notes, which are written in human narrative format and have neither a relational model nor any standard grammatical structure. An important step in utilizing these texts for such studies is to transform and process the data to retrieve structured information from the haystack of irrelevant data using information retrieval and data mining techniques. To address this problem, the authors present Q-Map, a simple yet robust system that can sift through massive datasets with unregulated formats to retrieve structured information aggressively and efficiently. It is backed by an effective mining technique based on a string matching algorithm indexed on curated knowledge sources, which is both fast and configurable. The authors also briefly examine its comparative performance against MetaMap, one of the most reputed tools for medical concept retrieval, and present the advantages the former displays over the latter.Keywords: information retrieval, unified medical language system, syntax based analysis, natural language processing, medical informatics
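The general shape of a string matching algorithm indexed on a curated knowledge source can be sketched as follows. The tiny vocabulary (and its concept identifiers) is a placeholder, not the UMLS-scale knowledge source Q-Map actually uses, and the greedy longest-match scan is an assumption about the algorithm, not the paper's exact method:

```python
# Dictionary-indexed concept matching: curated phrases are indexed by
# first token, and the text is scanned greedily for the longest match.
# Vocabulary and concept IDs below are illustrative placeholders.

CONCEPTS = {
    "myocardial infarction": "C0027051",
    "diabetes mellitus": "C0011849",
    "diabetes": "C0011847",
    "hypertension": "C0020538",
}

def build_index(concepts):
    """Map first token -> candidate phrases, longest first."""
    index = {}
    for phrase in concepts:
        index.setdefault(phrase.split()[0], []).append(phrase.split())
    for cands in index.values():
        cands.sort(key=len, reverse=True)
    return index

def extract_concepts(text, concepts=CONCEPTS):
    tokens = text.lower().replace(",", " ").replace(".", " ").split()
    index = build_index(concepts)
    found, i = [], 0
    while i < len(tokens):
        for cand in index.get(tokens[i], []):
            if tokens[i:i + len(cand)] == cand:
                found.append((" ".join(cand), concepts[" ".join(cand)]))
                i += len(cand) - 1  # skip the matched span
                break
        i += 1
    return found

hits = extract_concepts("History of diabetes mellitus and hypertension.")
```

Because the index is an ordinary dictionary, swapping in a different curated source is a configuration change, which is the configurability the abstract highlights.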
Procedia PDF Downloads 133
43382 Classification Earthquake Distribution in the Banda Sea Collision Zone with Point Process Approach
Authors: H. J. Wattimanela, U. S. Passaribu, N. T. Puspito, S. W. Indratno
Abstract:
The Banda Sea collision zone (BSCZ) is the result of the interaction and convergence of the Indo-Australian, Eurasian, and Pacific plates. It is located in the eastern part of Indonesia and has very high seismic activity. In this research, we calculate the rate (λ) and the Mean Square Error (MSE). From these results, we identify whether the distribution of earthquakes in the BSCZ follows a Poisson distribution, using a point process approach. The chi-square test and the Anscombe test are applied in the process of identifying a Poisson distribution in each partitioned area. The data used are earthquakes with magnitude ≥ 6 on the Richter scale for the period 1964-2013, sourced from BMKG Jakarta. This research is expected to help the Moluccas provincial government and surrounding local governments in preparing spatial planning documents related to disaster management.Keywords: molluca banda sea collision zone, earthquakes, mean square error, poisson distribution, chi-square test, anscombe test
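A chi-square goodness-of-fit check of per-partition event counts against a Poisson model can be sketched as below. The counts are illustrative, not the BMKG catalogue, and for simplicity the sketch ignores the tail category above the largest observed count:

```python
import math

# Do per-partition earthquake counts look Poisson? Estimate lambda by
# the sample mean, then compare observed category frequencies with
# Poisson expectations via the chi-square statistic. Illustrative data.

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def chi_square_poisson(counts, max_k=None):
    """Return (lambda_hat, chi-square statistic) for observed counts."""
    n = len(counts)
    lam = sum(counts) / n  # MLE of the Poisson rate
    max_k = max(counts) if max_k is None else max_k
    stat = 0.0
    for k in range(max_k + 1):
        observed = sum(1 for c in counts if c == k)
        expected = n * poisson_pmf(k, lam)
        if expected > 0:
            stat += (observed - expected) ** 2 / expected
    return lam, stat

# Number of M >= 6 events in each of 12 spatial partitions (illustrative).
counts = [2, 3, 1, 4, 2, 2, 3, 1, 2, 3, 2, 3]
lam_hat, chi2 = chi_square_poisson(counts)
```

The statistic is then compared against a chi-square critical value with the appropriate degrees of freedom to accept or reject the Poisson hypothesis for each partition.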
Procedia PDF Downloads 300
43381 Using Non-Negative Matrix Factorization Based on Satellite Imagery for the Collection of Agricultural Statistics
Authors: Benyelles Zakaria, Yousfi Djaafar, Karoui Moussa Sofiane
Abstract:
Agriculture is fundamental and remains an important sector of the Algerian economy; based on traditional techniques and structures, it generally serves consumption purposes. The collection of agricultural statistics in Algeria is done using traditional methods, which consist of investigating land use through field surveys. These statistics suffer from problems such as poor data quality, a long delay between collection and final availability, and a high cost compared to their limited use. The objective of this work is to develop a processing chain for a reliable inventory of agricultural land by developing and implementing a new method of information extraction. This methodology allowed us to combine remote sensing data with field data to collect statistics on different land areas. The contribution of remote sensing to the improvement of agricultural statistics, in terms of area, has been studied in the wilaya of Sidi Bel Abbes. In this context, we applied a method for extracting information from satellite images called non-negative matrix factorization (NMF), which does not treat the pixel as a single entity but instead looks for the components within the pixel itself. The results obtained by applying NMF were compared with field data and with the results obtained by the maximum likelihood method. The most important NMF results agree closely with the field data. We believe that this method of extracting information from satellite data leads to interesting results for different types of land use.Keywords: blind source separation, hyper-spectral image, non-negative matrix factorization, remote sensing
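The NMF idea of decomposing each pixel into constituent components can be sketched with the classic Lee-Seung multiplicative updates: a nonnegative pixel-by-band matrix X is factored as X ≈ W·H, where H holds endmember spectra and W the per-pixel abundances. The random data below stand in for a hyperspectral scene; this is not the authors' exact unmixing pipeline:

```python
import numpy as np

# Minimal NMF via Lee-Seung multiplicative updates (Frobenius norm):
# H <- H * (W^T X) / (W^T W H),  W <- W * (X H^T) / (W H H^T).
# Synthetic data; illustrative of spectral unmixing, not the paper's setup.

def nmf(x, rank, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    n, m = x.shape
    w = rng.random((n, rank)) + 0.1
    h = rng.random((rank, m)) + 0.1
    eps = 1e-9
    for _ in range(iters):
        h *= (w.T @ x) / (w.T @ w @ h + eps)
        w *= (x @ h.T) / (w @ h @ h.T + eps)
    return w, h

rng = np.random.default_rng(42)
# Synthetic scene: 100 "pixels" mixing 3 endmembers over 8 spectral bands.
true_w = rng.random((100, 3))
true_h = rng.random((3, 8))
x = true_w @ true_h
w, h = nmf(x, rank=3)
recon_error = np.linalg.norm(x - w @ h) / np.linalg.norm(x)
```

Thresholding or classifying the abundance rows of `w` then yields a per-pixel land-use label, which is the step compared with field data in the study.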
Procedia PDF Downloads 423
43380 The Economic Limitations of Defining Data Ownership Rights
Authors: Kacper Tomasz Kröber-Mulawa
Abstract:
This paper addresses the topic of data ownership from an economic perspective and provides examples of the economic limitations of data property rights, identified using the methods and approaches of the economic analysis of law. To build a proper background for the economic focus, a short overview of data and data ownership in the EU's legal system is provided first. It includes a short introduction to their political and social importance and highlights relevant viewpoints. This stresses the importance of a Single Market for data, but also the far-reaching regulation of data governance and privacy (including the distinction between personal and non-personal data, and between data held by public bodies and by private businesses). The main discussion of the paper builds upon this legal basis as well as the methods and approaches of the economic analysis of law.Keywords: antitrust, data, data ownership, digital economy, property rights
Procedia PDF Downloads 82
43379 Prediction and Reduction of Cracking Issue in Precision Forging of Engine Valves Using Finite Element Method
Authors: Xi Yang, Bulent Chavdar, Alan Vonseggern, Taylan Altan
Abstract:
Fracture in the hot precision forging of engine valves was investigated in this paper. The entire valve forging procedure is described, and a possible cause of the fracture is proposed. Finite Element simulation was conducted for the forging process with the commercial Finite Element code DEFORM™. The effects of material properties, strain rate, and temperature were considered in the FE simulation. Two fracture criteria were discussed and compared based on the accuracy and reliability of the FE simulation results. The selected criterion predicted the fracture location and showed the trend of increasing damage with good accuracy, matching the experimental observations. Additional modifications of the punch shapes were proposed to further reduce the tendency to fracture in forging. The Finite Element comparison shows great potential for such applications in mass production.Keywords: hot forging, engine valve, fracture, tooling
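The abstract does not name its two fracture criteria; a damage measure commonly used in forging FE codes such as DEFORM is the normalized Cockcroft-Latham integral, D = integral of max(sigma_1, 0) / sigma_bar over effective strain, so it serves here as a hedged illustration of how such a criterion accumulates along a load path. The stress and strain values are illustrative, not simulation output:

```python
# Normalized Cockcroft-Latham damage along a load path, integrated with
# the trapezoid rule. Chosen as a representative forging criterion; the
# paper does not state which criteria it compared. Values illustrative.

def cockcroft_latham(eff_strain, max_principal_stress, eff_stress):
    """Accumulate normalized Cockcroft-Latham damage along a load path."""
    damage = 0.0
    for i in range(1, len(eff_strain)):
        d_eps = eff_strain[i] - eff_strain[i - 1]
        # Tensile work only: compressive principal stress adds no damage.
        f0 = max(max_principal_stress[i - 1], 0.0) / eff_stress[i - 1]
        f1 = max(max_principal_stress[i], 0.0) / eff_stress[i]
        damage += 0.5 * (f0 + f1) * d_eps
    return damage

# Illustrative load path at one tracked material point.
eps = [0.0, 0.2, 0.4, 0.6, 0.8]                  # effective strain [-]
sigma1 = [-50.0, 20.0, 80.0, 120.0, 140.0]       # max principal stress [MPa]
sigma_bar = [100.0, 110.0, 120.0, 130.0, 135.0]  # effective stress [MPa]
d = cockcroft_latham(eps, sigma1, sigma_bar)
```

Fracture is predicted wherever the accumulated damage exceeds a material-specific critical value, which is how the criterion flags the fracture location in the simulation.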
Procedia PDF Downloads 280
43378 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores
Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay
Abstract:
Automated product recognition in retail stores is an important real-world application in the domain of Computer Vision and Pattern Recognition. In this paper, we consider the problem of automatically identifying the classes of the products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirements by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function, following the strategy of online hard negative mining for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.Keywords: retail stores, faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition
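The triplet loss with online hard-negative mining can be sketched in plain numpy: for each anchor embedding, pick the hardest (closest) negative in the batch and penalize it unless it is at least `margin` farther away than the positive. The random embeddings below are placeholders for ResNet-18 features, and the exact mining strategy is an assumption about the general technique, not the paper's precise training loop:

```python
import numpy as np

# Triplet loss with online hard-negative mining: per anchor, the closest
# negative in the batch is selected as the hardest example.
# loss = max(d(anchor, positive) - d(anchor, hardest_negative) + margin, 0)

def triplet_loss_hard(anchors, positives, negatives, margin=0.2):
    """Mean triplet loss, choosing the hardest negative per anchor."""
    losses = []
    for a, p in zip(anchors, positives):
        d_pos = np.linalg.norm(a - p)
        # Online hard-negative mining: closest negative to the anchor.
        d_neg = min(np.linalg.norm(a - n) for n in negatives)
        losses.append(max(d_pos - d_neg + margin, 0.0))
    return float(np.mean(losses))

rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 16))
positives = anchors + 0.01 * rng.normal(size=(4, 16))  # same class, nearby
negatives = rng.normal(size=(8, 16)) + 5.0             # other classes, far
loss = triplet_loss_hard(anchors, positives, negatives)
```

Well-separated embeddings drive the loss to zero, so the gradient concentrates on the violating (hard) triplets, which is what makes the mining strategy effective.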
Procedia PDF Downloads 157