Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18288

Search results for: data logger

18288 A Novel Technological Approach to Maintaining the Cold Chain during Transportation

Authors: Philip J. Purnell

Abstract:

Innovators propose to use the Internet of Things to solve the problem of maintaining the cold chain during the transport of biopharmaceutical products. A data logger sent with refrigerated goods can only inform the recipient, after the fact, that the cold chain has been breached (and the goods are therefore potentially spoiled) or that it has not (and the goods are assumed to be in good condition). Connecting the data logger to the Internet of Things means that the supply chain manager is informed in real time of the exact location and precise temperature of the material at any point on earth. Through a simple online interface, the supply chain manager can watch the progress of the material on a Google map together with accurate and, crucially, real-time temperature readings. The data logger will also send an alarm to the supply chain manager if a cold chain breach becomes imminent, allowing time to contact the transporter and restore the cold chain before the material is affected. This development is expected to save billions of dollars in wasted biologics that currently arrive either spoiled or in an uncertain condition.
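The alert logic described above can be sketched in a few lines. The temperature limits below are illustrative assumptions for refrigerated biologics, not values taken from the abstract:

```python
# Hypothetical sketch of real-time cold chain alerting from logger readings.
# COLD_CHAIN_MAX_C and WARNING_MARGIN_C are invented for illustration.

COLD_CHAIN_MAX_C = 8.0      # assumed upper limit for the refrigerated product
WARNING_MARGIN_C = 1.5      # alarm before the limit is actually breached

def classify_reading(temp_c: float) -> str:
    """Classify one logger reading for the supply chain manager."""
    if temp_c > COLD_CHAIN_MAX_C:
        return "BREACH"
    if temp_c > COLD_CHAIN_MAX_C - WARNING_MARGIN_C:
        return "IMMINENT"   # early alarm: time to contact the transporter
    return "OK"

readings = [4.2, 5.0, 6.8, 7.1, 8.4]
print([classify_reading(t) for t in readings])
# → ['OK', 'OK', 'IMMINENT', 'IMMINENT', 'BREACH']
```

The "IMMINENT" state is what distinguishes the connected logger from a conventional one: the alarm fires while corrective action is still possible.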

Keywords: internet of things, cold chain, data logger, transportation

Procedia PDF Downloads 325
18287 Impact of External Temperature on the Speleothem Growth in the Moravian Karst

Authors: Frantisek Odvarka

Abstract:

Based on data from the Moravian Karst, the influence of selected meteorological factors on calcite speleothem growth was evaluated. External temperature was identified as one of the main factors influencing speleothem growth. This factor significantly influences the CO₂ concentration in the soil/epikarst and in the cave atmosphere, and thereby contributes to changes in the CO₂ partial pressure difference between the soil/epikarst and the cave atmosphere, which determines the drip water supersaturation with respect to calcite and the quantity of calcite precipitated in the cave environment. External and cave air temperatures were measured using a COMET S3120 data logger, which measures temperatures in the range from -30 to +80 °C with an accuracy of ±0.4 °C. CO₂ concentrations in the cave and soils were measured with an FT A600 CO₂H Ahlborn probe (range 0 to 10,000 ppmv, accuracy 1 ppmv) connected to an ALMEMO 2290-4 V5 Ahlborn data logger. Soil temperature was measured with an FHA646E1 Ahlborn probe (range -20 to 70 °C, accuracy ±0.4 °C) connected to an ALMEMO 2290-4 V5 Ahlborn data logger. Airflow velocities into and out of the cave were monitored by an FVA395 TH4 Thermo anemometer (range 0.05 to 2 m s⁻¹, accuracy ±0.04 m s⁻¹) connected to an ALMEMO 2590-4 V5 Ahlborn data logger. Flow was measured at the lower and upper entrances of the Imperial Cave. The data were analyzed in MS Office Excel 2019 and PHREEQC.

Keywords: speleothem growth, carbon dioxide partial pressure, Moravian Karst, external temperature

Procedia PDF Downloads 40
18286 A Study of Fatigue Life Estimation of a Modular Unmanned Aerial Vehicle by Developing a Structural Health Monitoring System

Authors: Zain Ul Hassan, Muhammad Zain Ul Abadin, Muhammad Zubair Khan

Abstract:

Unmanned aerial vehicles (UAVs) have become predominant in various operations, and an immense amount of work is ongoing in this category. The structural stability and life of these UAVs are key factors to consider before deploying them on intelligent operations, since their failure leads to loss of sensitive real-time data and of the investment. This paper presents applied research on the development of a structural health monitoring (SHM) system for a UAV designed and fabricated using a modular approach. First, a modular UAV was designed whose components can be dismantled and reassembled without affecting the whole assembly. This novel approach makes the vehicle very sustainable and significantly decreases its maintenance cost, since only the failing part needs to be replaced. The SHM system for this architecture was then specified as a combination of wings instrumented with strain gauges, an on-board data logger, bridge circuitry, and a ground station. For this research, sensors were attached only to the wings, which analysis in ANSYS showed to be the most heavily loaded part. On the basis of the load-time spectrum recorded by the data logger during flight, the fatigue life of the component was predicted using the fracture mechanics techniques of the rainflow counting method and Miner's rule, allowing the health of the component to be monitored over time and failure to be avoided.
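Miner's rule, named above, sums fractional damage over the stress levels produced by rainflow counting. A minimal sketch follows; the cycle counts and S-N values are invented for illustration, not taken from the paper:

```python
# Miner's rule: cumulative fatigue damage D = sum(n_i / N_i).
# Failure is predicted when D reaches 1. All numbers below are illustrative.

def miners_damage(cycle_counts, cycles_to_failure):
    """Sum n_i / N_i over all stress amplitudes."""
    return sum(n / N for n, N in zip(cycle_counts, cycles_to_failure))

# n_i: cycles observed at each stress amplitude (from rainflow counting)
n = [12000, 3000, 400]
# N_i: cycles to failure at the same amplitudes (from the material's S-N curve)
N = [1e6, 2e5, 5e4]

D = miners_damage(n, N)
print(round(D, 3))   # → 0.035 (0.012 + 0.015 + 0.008)
```

Under a repeating load spectrum, 1/D such blocks can be sustained before predicted failure, which is how a monitored load-time history translates into a remaining-life estimate.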

Keywords: fracture mechanics, rain flow method, structural health monitoring system, unmanned aerial vehicle

Procedia PDF Downloads 188
18285 Mobile and Hot Spot Measurement with Optical Particle Counting Based Dust Monitor EDM264

Authors: V. Ziegler, F. Schneider, M. Pesch

Abstract:

With the EDM264, GRIMM offers a solution for mobile short- and long-term measurements in outdoor areas and at production sites, suitable for research as well as for permanent areal observation at near-reference quality. The EDM264 features a powerful and robust measuring cell based on the optical particle counting (OPC) principle, with all the advantages familiar to users of GRIMM's portable aerosol spectrometers. The system is embedded in a compact weather-protection housing with all-weather sampling, a heated inlet system, a data logger, and a meteorological sensor. With TSP, PM10, PM4, PM2.5, PM1, and PMcoarse, the EDM264 provides all fine dust fractions in real time, valid for outdoor applications and calculated with the proven GRIMM enviro-algorithm, as well as six additional dust mass fractions (pm10, pm2.5, pm1, inhalable, thoracic, and respirable) for IAQ and workplace measurements. This highly versatile instrument performs real-time monitoring of particle number and particle size and provides information on particle surface distribution as well as dust mass distribution. The EDM264 has 31 equidistant size channels, which are PSL traceable. A high-end data logger enables data acquisition and wireless communication via LTE or WLAN, or wired communication via Ethernet; backup copies of the measurement data are stored in the device itself. A rinsing-air function protects the laser and detector in the optical cell and further increases the reliability and long-term stability of the EDM264 under different environmental and climatic conditions. The entire sample volume flow of 1.2 L/min is analyzed (100%) in the optical cell, which ensures excellent counting efficiency at low and high concentrations and complies with the ISO 21501-1 standard for OPCs. With all these features, the EDM264 is a world-leading dust monitor for precise monitoring of particulate matter and particle number concentration. This highly reliable instrument is an indispensable tool for users who need to measure aerosol levels and air quality outdoors, on construction sites, or at production facilities.

Keywords: aerosol research, aerial observation, fence line monitoring, wild fire detection

Procedia PDF Downloads 48
18284 A Building Structure Health Monitoring Device Based on Cost Effective 1-Axis Accelerometers

Authors: Chih Hsing Lin, Wen-Ching Chen, Ssu-Ying Chen, Chih-Chyau Yang, Chien-Ming Wu, Chun-Ming Huang

Abstract:

Critical structures such as buildings, bridges, and dams require periodic inspections to ensure safe operation. Reliable inspection of structures can be achieved by combining a temperature sensor with accelerometers. In this work, we propose a building structure health monitoring device (BSHMD) that uses three 1-axis accelerometers, a gateway, an analog-to-digital converter (ADC), and a data logger to monitor the building structure. The proposed BSHMD achieves low cost by using three 1-axis accelerometers, with the data synchronization problem solved, and offers easy installation and removal. Furthermore, we develop a packet acquisition program to receive the sensed data and classify it by time and date. Compared with a 3-axis accelerometer, our 1-axis accelerometer based device achieves 64.3% cost savings. Compared with a previous structural monitoring device, the BSHMD achieves 89% area savings. Therefore, with the proposed device, a real-time diagnosis system for building damage monitoring can be implemented effectively.
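The data synchronization problem mentioned above arises because three independently clocked 1-axis loggers produce samples at slightly different times. One common approach (sketched here with invented timestamps; the paper's actual method is not specified) is to resample each stream onto a common time base by nearest timestamp:

```python
# Sketch: align samples from three independently clocked 1-axis accelerometers
# onto a common time base by nearest-timestamp lookup. Data are invented.

import bisect

def nearest_sample(timestamps, values, t):
    """Return the value whose timestamp is closest to t (timestamps sorted)."""
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - t))
    return values[best]

def synchronize(streams, t_common):
    """streams: list of (timestamps, values); returns rows of aligned values."""
    return [[nearest_sample(ts, vs, t) for ts, vs in streams] for t in t_common]

x = ([0.00, 0.11, 0.20], [1, 2, 3])   # axis X logger
y = ([0.01, 0.10, 0.21], [4, 5, 6])   # axis Y logger
z = ([0.00, 0.12, 0.19], [7, 8, 9])   # axis Z logger

print(synchronize([x, y, z], [0.0, 0.1, 0.2]))
# → [[1, 4, 7], [2, 5, 8], [3, 6, 9]]
```

Each output row is a pseudo-3-axis sample assembled from the three 1-axis streams, which is what makes the low-cost configuration usable for vibration analysis.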

Keywords: building structure health monitoring, cost effective, 1-axis accelerometers, real-time diagnosis

Procedia PDF Downloads 208
18283 Establishment of Landslide Warning System Using Surface or Sub-Surface Sensors Data

Authors: Neetu Tyagi, Sumit Sharma

Abstract:

This study presents the results of an integrated investigation of the Tangni landslide, located on NH-58 at Chamoli, Uttarakhand. Geological, geomorphological, and geotechnical investigations were carried out to understand the mechanism of the landslide and to plan further investigation and monitoring. The movements were favored by continuous infiltration of rainfall water from the zones where the phyllites/slates and dolomites outcrop. The site investigations, including monitoring of landslide movements and of water level fluctuations due to rainfall, give a better understanding of the landslide dynamics that have been causing soil instability at the Tangni landslide site. For the early warning system (EWS), different types of sensors were installed; all sensors were connected directly to a data logger, and the raw data were transferred to the Defence Terrain Research Laboratory (DTRL) server room via the File Transfer Protocol (FTP). A geophysical survey located the slip surfaces at depths ranging from 8 to 10 m, and sensors were therefore installed to a depth of 15 m at various locations on the landslide. Rainfall is the main triggering factor of the landslide, so a model of unsaturated soil slope stability was developed. Analysis of one year of sensor data indicated a sliding surface at depths between 6 and 12 m, with total displacement of up to 6 cm per year recorded in the body of the landslide. The aim of this study is to set thresholds and generate early warnings, so that local people can be alerted in advance when such a warning system is in place.
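Threshold-based warning generation of the kind the study aims at can be sketched as follows. The threshold values are invented for illustration (the abstract reports ~6 cm/year total displacement but does not state the thresholds actually adopted):

```python
# Illustrative sketch (not the authors' code): map cumulative displacement
# from logger readings onto warning levels. Thresholds are assumptions.

WATCH_CM, ALERT_CM = 2.0, 5.0   # hypothetical cumulative-displacement thresholds

def warning_level(displacements_cm):
    """Classify a series of incremental displacement readings."""
    total = sum(displacements_cm)
    if total >= ALERT_CM:
        return "ALERT"
    if total >= WATCH_CM:
        return "WATCH"
    return "NORMAL"

monthly = [0.3, 0.4, 0.2, 0.8, 1.1, 0.9, 1.4]   # invented movement per month, cm
print(warning_level(monthly))   # → "ALERT" (total 5.1 cm)
```

In practice the thresholds would be calibrated against the monitored displacement and rainfall records before being used to alert local people.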

Keywords: early warning system, file transfer protocol, geo-morphological, geotechnical, landslide

Procedia PDF Downloads 67
18282 Validation of Solar PV Inverter Harmonics Behaviour at Different Power Levels in a Test Network

Authors: Wilfred Fritz

Abstract:

Grid-connected solar PV inverters need to comply with standard regulations regarding unwanted harmonic generation. This paper gives an introduction to harmonics, solar PV inverter voltage regulation, and balancing through compensation, and investigates the behaviour of harmonic generation at different power levels. Practical measurements of harmonics and power levels were made with a power quality data logger on a test network at a university in Germany. The test setup and test results are discussed. The major finding was that between the morning and afternoon load peak windows, when the PV inverters operate under low solar insolation and at low power levels, more unwanted harmonics are generated. This has a considerable impact on the power quality of the grid as well as on capital and maintenance costs. The design of a single-tuned harmonic filter for harmonic mitigation is presented.
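The quantity such a power quality logger reports is typically the total harmonic distortion (THD). A short sketch of the standard formula, with invented harmonic magnitudes:

```python
# THD = sqrt(sum of squared harmonic RMS magnitudes) / fundamental RMS.
# The magnitudes below are illustrative, not measured values from the paper.

import math

def thd(fundamental, harmonics):
    """Total harmonic distortion as a fraction of the fundamental."""
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental

# RMS magnitudes: 230 V fundamental plus 3rd, 5th, 7th harmonics
v1, vh = 230.0, [11.5, 6.9, 4.6]
print(f"THD = {thd(v1, vh) * 100:.1f} %")   # → THD = 6.2 %
```

A rising THD at low inverter power levels is exactly the behaviour the measurements above set out to quantify.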

Keywords: harmonics, power quality, pulse width modulation, total harmonic distortion

Procedia PDF Downloads 90
18281 Using Log Files to Improve Work Efficiency

Authors: Salman Hussam

Abstract:

This paper proposes a monitoring system (logger) to help manage employees' time and the employer's business. The system monitors employees at work and alerts them if they spend too much time on social media (it will detect this even if they are using a proxy). In this way, people will spend less time at work and more time with family.

Keywords: clients, employees, employers, family, monitoring, systems, social media, time

Procedia PDF Downloads 368
18280 Processing Big Data: An Approach Using Feature Selection

Authors: Nikat Parveen, M. Ananthi

Abstract:

Big data is one of the emerging technologies; it collects data from various sensors, and those data are used in many fields. Data retrieval is a major issue, since there is a need to extract exactly the data required. In this paper, a large data set is processed using feature selection. Feature selection helps to choose the data that are actually needed to process and execute the task. The key value is what points to the exact data available in the storage space. Here the available data are streamed, and R-Center is proposed to achieve this task.
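To make the feature-selection idea concrete, here is a minimal sketch that keeps only informative columns. A simple variance threshold stands in for the paper's method (the R-Center approach itself is not specified in the abstract), and the data are invented:

```python
# Sketch: drop near-constant sensor columns before further processing.
# The variance threshold and sample data are illustrative assumptions.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def select_features(rows, names, min_var=0.01):
    """Return the names of columns whose variance exceeds min_var."""
    cols = list(zip(*rows))
    return [n for n, c in zip(names, cols) if variance(c) > min_var]

rows = [
    (1.0, 0.5, 7.0),
    (2.0, 0.5, 7.0),
    (3.0, 0.5, 7.1),
]
print(select_features(rows, ["temp", "const_flag", "pressure"]))
# → ['temp'] — near-constant columns are dropped
```

Reducing the column set first is what lets the retrieval step touch only the data actually needed for the task.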

Keywords: big data, key value, feature selection, retrieval, performance

Procedia PDF Downloads 152
18279 Applications of Big Data in Education

Authors: Faisal Kalota

Abstract:

Big Data and analytics have gained huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA), which may allow academic institutions to better understand learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in education. Additionally, it discusses some of the concerns related to Big Data and current research trends. While Big Data can provide big benefits, it is important that institutions understand their own needs, infrastructure, resources, and limitations before jumping on the Big Data bandwagon.

Keywords: big data, learning analytics, analytics, big data in education, Hadoop

Procedia PDF Downloads 250
18278 Performance Investigation of Thermal Insulation Materials for Walls: A Case Study in Nicosia (Turkish Republic of North Cyprus)

Authors: L. Vafaei, McDominic Eze

Abstract:

The thermal performance of homes and buildings is a significant factor in a building's energy efficiency. Broadly, thermal performance depends on many factors, of which the amount of thermal insulation is a considerable one, alongside the thermal mass, the wall thickness, and the thermal resistance of the wall material. This study illustrates the different wall systems used in the Turkish Republic of North Cyprus (TRNC) and suggests a solution by comparing the effect of thermal radiation on two model rooms set up for experimentation: L1 (Ytong wall) and L2 (heat-insulated wall using stone wool). Each model room has four face walls. The study consists of two stages: the first assesses the effect of solar radiation on the south-facing wall, and the second tests the thermal performance of the Ytong and heat-insulated walls under winter climatic conditions. The heat-insulated wall consists of hollow brick, stone wool, and gypsum, while the Ytong wall consists of cement concrete on the outer and inner surfaces and Ytong stone. The total heat through each wall was determined; seven T-type thermocouples were used with a data logger system to record the data, with temperature changes recorded at 10-minute intervals. The result obtained was that the Ytong wall saves more energy than the heat-insulated wall at night, while the heat-insulated wall saves energy during the day, when solar intensity is at its maximum.

Keywords: heat insulation, hollow bricks, south facing, Ytong bricks wall

Procedia PDF Downloads 161
18277 Analysis of Big Data

Authors: Sandeep Sharma, Sarabjit Singh

Abstract:

With growing user demand and the growth of large volumes of free data, storage solutions now face greater challenges in protecting, storing, and retrieving data. The day is not far off when storage companies and organizations will start saying 'no' to storing our valuable data, or will start charging huge amounts for its storage and protection. On the other hand, environmental conditions make it challenging to establish and maintain new data warehouses and data centers in the face of global warming threats. The challenge of small data is over; the big challenge now is how to manage the exponential growth of data. In this paper, we analyze the growth trend of big data and its future implications. We also focus on the impact of unstructured data on various concerns and suggest some possible remedies to streamline big data.

Keywords: big data, unstructured data, volume, variety, velocity

Procedia PDF Downloads 395
18276 Research of Data Cleaning Methods Based on Dependency Rules

Authors: Yang Bao, Shi Wei Deng, WangQun Lin

Abstract:

This paper introduces the concept and principle of data cleaning, analyzes the types and causes of dirty data, and proposes the key steps of a typical cleaning process. It puts forward a data cleaning framework with good scalability and versatility. For data with attribute dependency relations, it designs several violation-data discovery algorithms expressed as formal formulas, which can find data inconsistent with conditional attribute dependencies across all target columns, whether the data are structured (SQL) or unstructured (NoSQL), and gives six data cleaning methods based on these algorithms.
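The core of violation-data discovery can be illustrated with the simplest kind of dependency rule, a functional dependency: if two records agree on the left-hand attribute they must agree on the right-hand one. The relation and the dependency below are invented for illustration; the paper's own algorithms and rules are more general:

```python
# Sketch: flag rows violating a functional dependency (here city -> zip).
# Data and the dependency are illustrative assumptions.

from collections import defaultdict

def fd_violations(rows, lhs, rhs):
    """Return lhs values that map to more than one rhs value."""
    groups = defaultdict(set)
    for row in rows:
        groups[row[lhs]].add(row[rhs])
    return {k: v for k, v in groups.items() if len(v) > 1}

rows = [
    {"city": "Beijing", "zip": "100000"},
    {"city": "Beijing", "zip": "100001"},   # violates city -> zip
    {"city": "Shanghai", "zip": "200000"},
]
print(fd_violations(rows, "city", "zip"))
# Beijing maps to two zip codes, so its group is flagged as dirty data
```

A repair step would then choose, for each flagged group, which right-hand value to keep, which is where the six cleaning methods mentioned above come in.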

Keywords: data cleaning, dependency rules, violation data discovery, data repair

Procedia PDF Downloads 309
18275 Mining Big Data in Telecommunications Industry: Challenges, Techniques, and Revenue Opportunity

Authors: Hoda A. Abdel Hafez

Abstract:

Mining big data represents a big challenge nowadays. Much research is concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance, and privacy. In the telecommunication industry, mining big data is like mining for gold: it represents a big opportunity for maximizing revenue streams. This paper discusses the characteristics of big data (volume, variety, velocity, and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunication, and the benefits and opportunities gained from them.

Keywords: mining big data, big data, machine learning, telecommunication

Procedia PDF Downloads 268
18274 Multi-Source Data Fusion for Urban Comprehensive Management

Authors: Bolin Hua

Abstract:

City governance involves various data, including city component data, demographic data, housing data, and all kinds of business data. These data reflect different aspects of people, events, and activities. Data generated by various systems differ in form and in source, because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data collected from different sources in different ways raise several issues that need to be resolved: data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining, and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis, and public opinion monitoring. The result of multi-source data fusion is a uniform central database that includes people data, location data, object data, institution data, business data, and space data. Metadata must be referenced and read whenever an application needs to access, manipulate, or display the data; uniform metadata management ensures the effectiveness and consistency of data in the processes of data exchange, modeling, cleansing, loading, storing, analysis, search, and delivery.
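One elementary fusion step described above is merging records about the same entity from different source systems into one central record, keeping source metadata for traceability. The field names, merge policy (first non-null value wins), and records below are invented for illustration:

```python
# Sketch: fuse per-entity records from multiple city systems into one
# central record. Merge policy and data are illustrative assumptions.

def fuse(records):
    """Merge records sharing an 'id'; each source fills missing fields."""
    fused = {}
    for rec in records:
        entry = fused.setdefault(rec["id"], {"sources": []})
        entry["sources"].append(rec["source"])
        for k, v in rec.items():
            if k not in ("id", "source") and v is not None:
                entry.setdefault(k, v)         # first non-null value wins
    return fused

housing = {"id": "P1", "source": "housing", "address": "5 Elm St", "owner": None}
census  = {"id": "P1", "source": "census",  "address": None, "owner": "Li Wei"}
print(fuse([housing, census]))
# → {'P1': {'sources': ['housing', 'census'], 'address': '5 Elm St', 'owner': 'Li Wei'}}
```

A production system would add the conflict resolution, deduplication, and catalogue steps the abstract lists, but the shape of the central record is the same.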

Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data

Procedia PDF Downloads 265
18273 Reviewing Privacy Preserving Distributed Data Mining

Authors: Sajjad Baghernezhad, Saeideh Baghernezhad

Abstract:

Nowadays, given the human role in ever-increasing data growth, methods such as data mining for extracting knowledge are unavoidable. One issue in data mining is the inherent distribution of the data: the parties creating or receiving such data, whether corporate or not, usually do not give their information freely to others, yet there is no guarantee that someone can mine particular data without intruding on the owner's privacy. Sending data and then gathering them, whether partitioned vertically or horizontally, depends on the type of privacy preservation applied and is carried out so as to improve data privacy. This study attempts a comprehensive comparison of privacy-preserving data mining methods, examining general methods such as data randomization and encoding, together with the strong and weak points of each.

Keywords: data mining, distributed data mining, privacy protection, privacy preserving

Procedia PDF Downloads 363
18272 The Right to Data Portability and Its Influence on the Development of Digital Services

Authors: Roman Bieda

Abstract:

The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller in a structured, commonly used, and machine-readable format, and to transmit these data to another data controller. The right to data portability, by facilitating the transfer of personal data between IT environments (e.g., applications), will also facilitate changing service providers (e.g., changing a bank or a cloud computing provider). Therefore, it will contribute to the development of competition and of the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.

Keywords: data portability, digital market, GDPR, personal data

Procedia PDF Downloads 333
18271 The Design and Implementation of a Calorimeter for Evaluation of the Thermal Performance of Materials: The Case of Phase Change Materials

Authors: Ebrahim Solgi, Zahra Hamedani, Behrouz Mohammad Kari, Ruwan Fernando, Henry Skates

Abstract:

The use of thermal energy storage (TES) as part of a passive design strategy can reduce a building's energy demand. TES materials do this by increasing the lag between energy consumption and energy supply, absorbing, storing, and releasing energy in a controlled manner. The increase of lightweight construction in the building industry has made it harder to utilize thermal mass. Consequently, phase change materials (PCMs) are a promising alternative, as they can be manufactured in thin layers and used with lightweight construction to store latent heat. This research investigates utilizing PCMs, with the first step being to measure their performance under experimental conditions. Doing so requires three components: a calorimeter for measuring indoor thermal conditions, a pyranometer for recording the solar conditions (global, diffuse, and direct radiation), and a data logger for recording temperature and humidity over the studied period. This paper reports on the design and implementation of an experimental setup used to measure the thermal characteristics of PCMs as part of a wall construction. The experimental model has been simulated in EnergyPlus to create a reliable simulation model that warrants further investigation.

Keywords: phase change materials, EnergyPlus, experimental evaluation, night ventilation

Procedia PDF Downloads 166
18270 Comparison of On-Site Stormwater Detention Real Performance and Theoretical Simulations

Authors: Pedro P. Drumond, Priscilla M. Moura, Marcia M. L. P. Coelho

Abstract:

The purpose of an on-site stormwater detention (OSD) system is to detain the additional stormwater runoff caused by impervious areas, in order to keep the peak flow the same as in the pre-urbanization condition. In recent decades, these systems have been built in many cities around the world. However, their real efficiency remains largely unknown due to the lack of research, especially with regard to monitoring their actual performance. Thus, this study compares water level monitoring data from an OSD built in Belo Horizonte, Brazil, with the results of the theoretical methods usually adopted in OSD design. Two theoretical simulations were made: one using the Rational Method with the Modified Puls method, and another using the Soil Conservation Service (SCS) method with the Modified Puls method. The monitoring data were obtained with a water level sensor installed inside the reservoir and connected to a data logger. OSD performance was compared for 48 rainfall events recorded from April 2015 to March 2017. The comparison of maximum water levels in the OSD showed that the results of the Rational/Puls and SCS/Puls simulations were, on average, 33% and 73% lower, respectively, than those monitored. The Rational/Puls results were significantly higher than the SCS/Puls results only in the more frequent events; in the events with average recurrence intervals of 5, 10, and 200 years, the maximum water heights were similar in both simulations. The results also showed that the duration of the rainfall events was close to the duration of the monitored hydrographs, and that the rising and recession times of the hydrographs calculated with the Rational Method represented the monitored hydrographs better than those of the SCS method. The comparison indicates that the real discharge coefficient could be higher than the value of 0.61 adopted in the Puls simulations. Further research evaluating the real performance of OSDs should be developed: to verify the peak flow damping efficiency and the value of the discharge coefficient, it is necessary to monitor the inflow and outflow of an OSD in addition to the water level inside it.
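The Modified Puls (level-pool) routing used in both simulations can be sketched compactly. For illustration the reservoir's storage-outflow relation is taken as linear, S = K·O, so the storage-indication step inverts in closed form; a real OSD design would use the tabulated stage-storage-discharge curve, and the hydrograph below is invented:

```python
# Modified Puls routing: (I1 + I2) + (2*S1/dt - O1) = 2*S2/dt + O2.
# With the illustrative assumption S = K*O, 2*S/dt + O = O*(2K/dt + 1),
# so the new outflow O2 can be solved directly.

def modified_puls(inflow, dt, K, O0=0.0):
    """Route an inflow hydrograph through a linear reservoir; returns outflow."""
    c = 2.0 * K / dt
    out = [O0]
    for I1, I2 in zip(inflow, inflow[1:]):
        O1 = out[-1]
        O2 = (I1 + I2 + O1 * (c - 1.0)) / (c + 1.0)
        out.append(O2)
    return out

inflow = [0.0, 2.0, 4.0, 3.0, 1.0, 0.0]        # m3/s at 10-min steps (invented)
out = modified_puls(inflow, dt=600.0, K=1200.0)
print(max(inflow), round(max(out), 2))          # → 4.0 2.26
```

The routed peak is lower and later than the inflow peak, which is precisely the damping effect whose real-world magnitude the monitoring campaign set out to check.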

Keywords: best management practices, on-site stormwater detention, source control, urban drainage

Procedia PDF Downloads 103
18269 Recent Advances in Data Warehouse

Authors: Fahad Hanash Alzahrani

Abstract:

This paper describes recent advances in the quickly developing area of data storage and processing based on data warehouses and data mining techniques. These advances are associated with software, hardware, data mining algorithms, and visualisation techniques that share common features across the specific problems and tasks of their implementation.

Keywords: data warehouse, data mining, knowledge discovery in databases, on-line analytical processing

Procedia PDF Downloads 280
18268 How to Use Big Data in Logistics Issues

Authors: Mehmet Akif Aslan, Mehmet Simsek, Eyup Sensoy

Abstract:

Big Data is today's cutting-edge technology. As the technology becomes widespread, so does the data. Utilizing massive data sets enables companies to gain competitive advantages over their adversaries. Among the many areas of Big Data usage, logistics plays a significant role in both the commercial sector and the military. This paper lays out what Big Data is and how it is used in both military and commercial logistics.

Keywords: big data, logistics, operational efficiency, risk management

Procedia PDF Downloads 401
18267 Implementation of an IoT Sensor Data Collection and Analysis Library

Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee

Abstract:

Due to the development of information technology and wireless Internet technology, various data are being generated in many fields. These data are advantageous in that they provide real-time information to users. However, when the data are accumulated and analyzed, much more information can be extracted. In addition, the development and dissemination of boards such as the Arduino and Raspberry Pi have made it easy to test various sensors, and sensor data can be collected directly using database tools such as MySQL. These directly collected data can be used for various research purposes and are useful as input for data mining. However, there are many difficulties in using such boards to collect data, especially for users who are not programmers or who are using them for the first time. Even when data are collected, a lack of expert knowledge or experience may cause difficulties in data analysis and visualization. In this paper, we construct a library for sensor data collection and analysis to overcome these problems.
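The keywords name k-means among the analyses such a library would bundle. A small, self-contained sketch of k-means on 1-D sensor readings (the readings and the choice of 1-D are illustrative; the library's actual API is not described in the abstract):

```python
# Sketch: plain k-means on scalar sensor readings, no external libraries.
# Data are invented: two obvious temperature regimes (heater off / on).

def kmeans_1d(xs, k, iters=20):
    """Cluster scalar readings; returns the sorted centroids."""
    centroids = sorted(xs)[:: max(1, len(xs) // k)][:k]   # spread initial seeds
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for x in xs:
            i = min(range(len(centroids)), key=lambda j: abs(x - centroids[j]))
            clusters[i].append(x)
        centroids = [sum(c) / len(c) if c else m
                     for c, m in zip(clusters, centroids)]
    return sorted(centroids)

readings = [20.1, 20.4, 19.8, 20.0, 35.2, 34.8, 35.5, 35.0]
print([round(c, 1) for c in kmeans_1d(readings, k=2)])   # → [20.1, 35.1]
```

Wrapping routines like this behind a simple collection-plus-analysis interface is the kind of convenience the proposed library aims to give non-programmer users.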

Keywords: clustering, data mining, DBSCAN, k-means, k-medoids, sensor data

Procedia PDF Downloads 267
18266 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing across the entire government in order to deliver integrated (big) data related services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services such as data storage and hosting to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of its actors and their roles. We present a graphical view of the actors, their roles, and their relationships in the government (big) data ecosystem, and discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles; therefore, we drew ideas for the government (big) data ecosystem from related areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.

Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review

Procedia PDF Downloads 40
18265 An Investigation on the Effect of Window Tinting on Thermal Comfort inside Office Buildings

Authors: S. El-Azzeh, A. Al-Aqqad, M. Salem, H. Al-Khaldi, S. Thaher

Abstract:

Thermal comfort studies are very important during the early stages of a building's design; if such a study is ignored, problems will later occur for the occupants. In hot climates, where solar radiation enters buildings all year long, occupants' thermal comfort in office buildings needs to be examined. This study investigates thermal comfort in an existing office building at the Australian College of Kuwait, tests its validity, and improves occupants' thermal satisfaction by covering the windows with a heat-rejecting tint that lets sunlight pass into the office while reflecting solar heat outside. Environmental variables were measured using an INNOVA 1221 thermal comfort data logger to find the predicted mean vote (PMV) in the selected location. Subjective variables were also measured to find the actual mean vote (AMV), through surveys distributed among the occupants of the case study office. All the variables collected were analyzed and classified according to the international standards ISO 7730 and ASHRAE 55. The results of this study showed improvement in both PMV and AMV. The mean PMV for the original design was 0.691, which dropped to 0.32 after installation, within the comfort zone. The mean AMV also improved: for the first occupant it changed from -0.46 to -1, which is cooler, and for the other occupant, who previously felt slightly warm with a mean value of 0.9, it improved to a cooler -0.25, based on the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) seven-point scale.
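The PMV values above sit on the ASHRAE seven-point thermal sensation scale. A tiny sketch of that mapping, using the conventional -0.5 to +0.5 comfort band (the exact band the study applied is not stated in the abstract):

```python
# ASHRAE seven-point thermal sensation scale (-3 cold ... +3 hot) and the
# conventional comfort band -0.5 <= PMV <= +0.5 (an assumption here).

SCALE = {-3: "cold", -2: "cool", -1: "slightly cool", 0: "neutral",
         1: "slightly warm", 2: "warm", 3: "hot"}

def sensation(pmv):
    """Nearest point on the seven-point scale for a PMV value."""
    return SCALE[max(-3, min(3, round(pmv)))]

def comfortable(pmv):
    return -0.5 <= pmv <= 0.5

for pmv in (0.691, 0.32, -0.46):       # mean values reported in the abstract
    print(pmv, sensation(pmv), comfortable(pmv))
```

Run on the reported means, this shows the original design's 0.691 landing at "slightly warm" outside the band, and the post-tint 0.32 at "neutral" within it.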

Keywords: thermal comfort, office buildings, indoor environments, predicted mean vote

Procedia PDF Downloads 67
18264 Government Big Data Ecosystem: A Systematic Literature Review

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Data that is high in volume, velocity, and veracity and that comes from a variety of sources is generated in all sectors, including government. Globally, public administrations are pursuing (big) data as a new technology and trying to adopt data-centric architectures for hosting and sharing data. Properly executed, big data and data analytics in the government (big) data ecosystem can lead to data-driven government and have a direct impact on how policymakers work and how citizens interact with governments. In this research paper, we conduct a systematic literature review. The main aims of this paper are to highlight essential aspects of the government (big) data ecosystem and to explore the most critical socio-technical factors that contribute to its successful implementation. These essential aspects include its definition, data types, data lifecycle models, and actors and their roles. We also discuss the potential impact of (big) data on public administration and gaps in the government data ecosystems literature. As this is a new topic, we did not find articles specifically on the government (big) data ecosystem and therefore focused our search on related areas such as humanitarian data, open government data, scientific research data, and industry data.

Keywords: applications of big data, big data, big data types, big data ecosystem, critical success factors, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, literature review, public administration, systematic review

Procedia PDF Downloads 40
18263 A Machine Learning Decision Support Framework for Industrial Engineering Purposes

Authors: Anli Du Preez, James Bekker

Abstract:

Data is currently one of the most critical and influential emerging technologies. However, its true potential is yet to be exploited, since only about 1% of generated data is ever actually analyzed for value creation. There is a data gap: data goes unexplored for lack of data analytics infrastructure and the required data analytics skills. This study developed a decision support framework for data analytics by following Jabareen's framework development methodology. The study focused on machine learning algorithms, a subset of data analytics. The developed framework is designed to assist data analysts with little experience in choosing the appropriate machine learning algorithm for the purpose of their application.
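A hypothetical sketch of the framework's core idea: guide an inexperienced analyst from the purpose of the application to a candidate family of machine learning algorithms. The rules below are invented for illustration and are not the framework's actual decision logic.

```python
# Hypothetical decision-support sketch: map the stated purpose and data
# properties of an application to a candidate machine learning family.
# These rules are illustrative only, not the published framework.

def suggest_family(labelled, output=None, goal=None):
    """Return a candidate algorithm family for the stated purpose."""
    if not labelled:                       # no labels -> unsupervised learning
        if goal == "group similar items":
            return "clustering (e.g. k-means)"
        return "dimensionality reduction (e.g. PCA)"
    if output == "category":               # labelled data, discrete target
        return "classification (e.g. decision trees, SVM)"
    if output == "number":                 # labelled data, continuous target
        return "regression (e.g. linear regression)"
    return "unclear -- gather more requirements first"

print(suggest_family(labelled=True, output="category"))
```

A real framework would of course weigh many more factors (data volume, interpretability needs, skill level); the sketch only shows the shape of such a decision aid.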

Keywords: Data analytics, Industrial engineering, Machine learning, Value creation

Procedia PDF Downloads 44
18262 Providing Security to Private Cloud Using Advanced Encryption Standard Algorithm

Authors: Annapureddy Srikant Reddy, Atthanti Mahendra, Samala Chinni Krishna, N. Neelima

Abstract:

In our present world, we generate a lot of data and need devices to store it. Generally, we store data on pen drives, hard drives, etc., but we may sometimes lose data when these devices become corrupted. To overcome these issues, we implemented a cloud space for storing data, which provides more security; the data can be accessed from anywhere in the world using just the internet. We implemented the system in Java using the NetBeans IDE. Once a user uploads data, they have no rights to change it. Uploaded files are stored in the cloud with the system time as the file name, inside a directory created with a random name. The cloud accepts a file only if its size is less than 2 MB.
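A minimal sketch of the storage rules the abstract describes (the paper's own implementation is in Java): reject files over 2 MB, name the stored file after the system time, and place it in a randomly named directory. AES encryption of the payload would happen before the write; it is omitted here because it requires a cryptography library.

```python
# Sketch of the described upload rules (not the authors' code): 2 MB limit,
# system-time file name, randomly named directory. The AES encryption step
# is noted but omitted (it needs an external crypto library).
import os
import secrets
import time

MAX_BYTES = 2 * 1024 * 1024  # 2 MB upload limit

def store_upload(data: bytes, root: str) -> str:
    """Store an uploaded blob under root; return the path written."""
    if len(data) > MAX_BYTES:
        raise ValueError("file exceeds 2 MB limit")
    dirname = secrets.token_hex(4)           # random directory name
    filename = str(int(time.time() * 1000))  # system time as file name
    directory = os.path.join(root, dirname)
    os.makedirs(directory, exist_ok=True)
    path = os.path.join(directory, filename)
    with open(path, "wb") as f:              # AES ciphertext would go here
        f.write(data)
    return path
```

Because the stored name is derived from the clock and the directory from random bytes, an uploader cannot predict or overwrite another user's file path.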

Keywords: cloud space, AES, FTP, NetBeans IDE

Procedia PDF Downloads 72
18261 Business Intelligence for Profiling of Telecommunication Customer

Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro

Abstract:

Business intelligence is a methodology that systematically exploits data to produce information and knowledge, and it can support the decision-making process. Two of its methods are the data warehouse and data mining. A data warehouse stores historical data drawn from transactional data; for data modelling in the warehouse, we apply Kimball's dimensional modelling. Data mining is used to extract patterns from the data and gain insight from it; among its many techniques is segmentation. To profile telecommunication customers, we segment them according to their usage of services, their invoices, and their payments. Customers can thus be grouped by their characteristics, and the profitable customers identified. We apply the K-Means clustering algorithm for segmentation, with the RFM (recency, frequency, and monetary) model as its input variables. For the whole data mining process, we use IBM SPSS Modeler.
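A minimal pure-Python sketch of the segmentation step: K-Means clustering on RFM (recency, frequency, monetary) vectors. The customer data below is invented for illustration; the paper itself performs this step in IBM SPSS Modeler.

```python
# Minimal K-Means sketch on invented RFM vectors (recency in days,
# frequency in purchases, monetary in currency units).
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Cluster RFM tuples into k groups; returns (centroids, clusters)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)           # initial centroids
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                        # assign to nearest centroid
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):       # recompute centroid means
            if cl:
                centroids[i] = tuple(sum(v) / len(cl) for v in zip(*cl))
    return centroids, clusters

# Invented RFM data: three low-value and three high-value customers.
rfm = [(30, 1, 10), (28, 2, 12), (31, 1, 11),
       (2, 40, 900), (3, 38, 950), (1, 42, 880)]
centroids, clusters = kmeans(rfm, k=2)
```

On this toy data the algorithm separates the frequent, high-spending customers from the infrequent, low-spending ones, which is exactly the kind of profitable-segment identification the abstract describes. In practice the RFM variables would first be scaled, since the monetary axis otherwise dominates the distance.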

Keywords: business intelligence, customer segmentation, data warehouse, data mining

Procedia PDF Downloads 299
18260 Imputation Technique for Feature Selection in Microarray Data Set

Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam

Abstract:

Analysing DNA microarray data sets is a great challenge that bioinformaticians face, owing to the complexity of applying statistical and machine learning techniques. The challenge is doubled when the microarray data sets contain missing data, which happens regularly, because these techniques cannot handle missing values. One of the most important analysis processes on a microarray data set is feature selection, which finds the most important genes affecting a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
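An illustrative sketch of the two steps involved, not the authors' technique: fill each missing expression value with the per-gene column mean, then rank genes by variance as a simple stand-in for feature selection. `None` marks a missing value; the expression matrix is invented.

```python
# Illustrative two-step sketch: mean imputation per gene (column), then a
# simple variance-based gene ranking as a placeholder for feature selection.

def impute_mean(matrix):
    """Replace each None with the mean of the known values in its column."""
    cols = list(zip(*matrix))
    filled = []
    for col in cols:
        known = [v for v in col if v is not None]
        mean = sum(known) / len(known)
        filled.append([mean if v is None else v for v in col])
    return [list(row) for row in zip(*filled)]

def rank_genes_by_variance(matrix):
    """Return column indices sorted from most to least variable."""
    cols = list(zip(*matrix))
    def variance(col):
        m = sum(col) / len(col)
        return sum((v - m) ** 2 for v in col) / len(col)
    return sorted(range(len(cols)), key=lambda i: variance(cols[i]),
                  reverse=True)

# Rows are samples, columns are genes; one expression value is missing.
expressions = [[1.0, 10.0], [None, 30.0], [3.0, 20.0]]
complete = impute_mean(expressions)
ranking = rank_genes_by_variance(complete)
```

Real microarray pipelines use stronger imputers (e.g. k-nearest-neighbour imputation) and disease-aware selection criteria (e.g. t-statistics against class labels); the sketch only shows how imputation must precede selection so that every gene has a complete value vector.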

Keywords: DNA microarray, feature selection, missing data, bioinformatics

Procedia PDF Downloads 382
18259 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework

Authors: Lutful Karim, Mohammed S. Al-kahtani

Abstract:

Sensors are used in various applications such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data, so aggregating and filtering sensor data is significantly important to the design of an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. This paper therefore introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation scheme (PDDA) for the lowest layer, at the sensors. Simulation results show that PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay.
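A hypothetical sketch of the priority-based idea at the sensor layer (not the paper's PDDA algorithm): urgent, high-priority readings are forwarded immediately, while routine low-priority readings are buffered and averaged in batches to cut the redundant traffic the abstract mentions.

```python
# Hypothetical priority-based aggregation sketch: forward HIGH-priority
# readings at once; buffer LOW-priority readings and emit one averaged
# value per full batch, flushing any remainder at the end.

HIGH, LOW = 0, 1

def aggregate(readings, batch=3):
    """readings: iterable of (priority, value); returns forwarded values."""
    out, buffer = [], []
    for priority, value in readings:
        if priority == HIGH:
            out.append(value)            # urgent data bypasses aggregation
        else:
            buffer.append(value)
            if len(buffer) == batch:     # collapse a full batch into its mean
                out.append(sum(buffer) / batch)
                buffer.clear()
    if buffer:                           # flush the leftover partial batch
        out.append(sum(buffer) / len(buffer))
    return out
```

With a batch of three, five readings shrink to three forwarded values while the high-priority reading keeps its original value and position, which is the energy/latency trade-off the scheme targets.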

Keywords: big data, clustering, tree topology, data aggregation, sensor networks

Procedia PDF Downloads 211