Search results for: cavitation cloud
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 775

325 Advancing Sustainable Futures: A Study on Low Carbon Ventures

Authors: Gaurav Kumar Sinha

Abstract:

As the world grapples with climate challenges, this study highlights the instrumental role of AWS services in amplifying the impact of low carbon ventures (LCVs). The ability to harness the cloud, data analytics, and scalable infrastructure offered by AWS empowers LCVs to innovate, scale, and drive meaningful change in the quest for a sustainable future. This study serves as a rallying cry, urging stakeholders to recognize, embrace, and maximize the potential of AWS-powered solutions in advancing sustainable and resilient global initiatives.

Keywords: low carbon ventures, sustainability solutions, AWS services, data analytics

Procedia PDF Downloads 64
324 Simple Ways to Enhance the Security of Web Services

Authors: Majid Azarniush, Soroush Mokallaei

Abstract:

Although robust security software, including antivirus, anti-spyware, anti-spam, and firewall products, has been combined with new technologies such as safe zones, hybrid cloud, and sandboxing, and such tools could be said to have provided the highest level of protection against viruses, spyware, and other malware as of 2012, attacks by hackers on websites are nevertheless becoming more and more sophisticated. Given the pace of developments on both sides of the security landscape, this escalation was to be expected. In this work, we point out some practical and vital measures for enhancing security on the web, enabling users to browse the boundless web world safely and to use virtual space securely.

Keywords: firewalls, security, web services, software

Procedia PDF Downloads 511
323 Predictive Maintenance: Machine Condition Real-Time Monitoring and Failure Prediction

Authors: Yan Zhang

Abstract:

Predictive maintenance is a technique to predict when an in-service machine will fail so that maintenance can be planned in advance. Analytics-driven predictive maintenance is gaining increasing attention in many industries such as manufacturing, utilities, and aerospace, along with the emerging demand for Internet of Things (IoT) applications and the maturity of technologies that support Big Data storage and processing. This study aims to build an end-to-end analytics solution that includes both real-time machine condition monitoring and machine learning based predictive analytics capabilities. The goal is to showcase a general predictive maintenance solution architecture, which suggests how the data generated from field machines can be collected, transmitted, stored, and analyzed. We use a publicly available aircraft engine run-to-failure dataset to illustrate the streaming analytics component and the batch failure prediction component. We outline the contributions of this study from four aspects. First, we compare predictive maintenance problems from the view of the traditional reliability-centered maintenance field and from the view of IoT applications. When evolving into the IoT era, predictive maintenance has shifted its focus from ensuring reliable machine operations to improving production/maintenance efficiency via any maintenance-related task. It covers a variety of topics, including but not limited to: failure prediction, fault forecasting, failure detection and diagnosis, and recommendation of maintenance actions after failure. Second, we review the state-of-the-art technologies that enable a machine/device to transmit data all the way to the Cloud for storage and advanced analytics. These technologies vary drastically, mainly depending on the power source and functionality of the devices. For example, a consumer machine such as an elevator uses completely different data transmission protocols compared to the sensor units in an environmental sensor network. The former may transfer data into the Cloud directly via WiFi. The latter usually uses radio communication inherent to the network, and the data is stored in a staging data node before it can be transmitted into the Cloud when necessary. Third, we illustrate how to formulate a machine learning problem to predict machine faults/failures. By showing a step-by-step process of data labeling, feature engineering, model construction and evaluation, we share the following experiences: (1) which specific data quality issues have a crucial impact on predictive maintenance use cases; (2) how to train and evaluate a model when the training data contains inter-dependent records. Fourth, we review the tools available to build such a data pipeline that digests the data and produces insights. We show the tools we use, including data ingestion, streaming data processing, machine learning model training, and the tool that coordinates/schedules different jobs. In addition, we show the visualization tool that creates rich data visualizations for both real-time insights and prediction results. To conclude, there are two key takeaways from this study. (1) It summarizes the landscape and challenges of predictive maintenance applications. (2) It takes an example in aerospace with publicly available data to illustrate each component in the proposed data pipeline and showcases how the solution can be deployed as a live demo.
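
As an illustration of the labeling and failure-prediction step described in the abstract, the sketch below labels a run-to-failure table with a remaining-useful-life-based "fails soon" flag and trains a classifier, splitting by engine unit so that inter-dependent records never leak between train and test sets. Column names, the horizon value, and the file name are assumptions for illustration, not the authors' exact pipeline.

```python
# Sketch of run-to-failure labeling and batch failure prediction.
# Assumes a hypothetical table with columns: unit, cycle, s1..s21 (sensor readings).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupShuffleSplit
from sklearn.metrics import classification_report

HORIZON = 30  # label a cycle as "failing soon" if failure occurs within 30 cycles

def label_rul(df: pd.DataFrame) -> pd.DataFrame:
    """Compute remaining useful life (RUL) per unit and a binary 'fails soon' label."""
    max_cycle = df.groupby("unit")["cycle"].transform("max")
    df = df.assign(rul=max_cycle - df["cycle"])
    df["fails_soon"] = (df["rul"] <= HORIZON).astype(int)
    return df

df = label_rul(pd.read_csv("engine_run_to_failure.csv"))  # hypothetical file name
features = [c for c in df.columns if c.startswith("s")]

# Split by unit so records from the same engine never appear in both sets,
# avoiding leakage from inter-dependent records.
splitter = GroupShuffleSplit(n_splits=1, test_size=0.3, random_state=0)
train_idx, test_idx = next(splitter.split(df, groups=df["unit"]))

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(df.iloc[train_idx][features], df.iloc[train_idx]["fails_soon"])
print(classification_report(df.iloc[test_idx]["fails_soon"],
                            model.predict(df.iloc[test_idx][features])))
```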

Keywords: Internet of Things, machine learning, predictive maintenance, streaming data

Procedia PDF Downloads 386
322 Determination of Various Properties of Biodiesel Produced from Different Feedstocks

Authors: Faisal Anwar, Dawar Zaidi, Shubham Dixit, Nafees Ahmedii

Abstract:

This paper analyzes various properties of biodiesel, such as pour point, cloud point, viscosity, and calorific value, produced from different feedstocks. The aim of the work is to analyze the change in these properties after converting the feedstocks to biodiesel and then to compare them with the ASTM D6751-02 standard to check whether they are suitable for diesel engines. The conversion of the feedstocks is carried out by a process called transesterification, performed mainly to reduce viscosity, pour point, etc. It has been observed that there is a remarkable change in the properties of the oil after conversion.

Keywords: biodiesel, ethyl ester, free fatty acid, production

Procedia PDF Downloads 366
321 Biobased Toughening Filler for Polylactic Acid from Ultrafine Fully Vulcanized Powder Natural Rubber Grafted with Polymethylmethacrylate

Authors: Panyawutthi Rimdusit, Krittapas Charoensuk, Sarawut Rimdusit

Abstract:

A biobased toughening filler for polylactic acid (PLA) based on natural rubber is developed in this work. Deproteinized natural rubber (DPNR) was modified by graft polymerization with methyl methacrylate monomer (MMA) and further crosslinked by e-beam irradiation and a spray-drying process to obtain ultrafine fully vulcanized powdered natural rubber grafted with polymethylmethacrylate (UFPNR-g-PMMA), addressing the challenge of incompatibility between natural rubber and PLA. Intriguingly, UFPNR-g-PMMA revealed outstanding and unique properties with minimal particle aggregation. The average particle size of the rubber powder obtained from UFPNR-g-PMMA at a PMMA grafting content of 20 phr was reduced to 3.3±1.2 µm, compared to 5.3±2.3 µm for neat UFPNR, which also showed partial particle aggregation. It was also found that the impact strength of the filled PLA was enhanced to 33.4±5.6 kJ/m2 for PLA/UFPNR-g-PMMA at 20 wt%, compared to 9.6±3 kJ/m2 for neat PLA. The thermal degradation temperature of the PLA composites was enhanced with increasing UFPNR-g-PMMA content without affecting the glass transition temperature of the composites. The fracture surface of PLA/UFPNR-g-PMMA suggested that internal cavitation and crazing are the main mechanisms of rubber toughening in PLA, with substantial interfacial interaction between the filler and the matrix.

Keywords: natural rubber, ultrafine fully vulcanized powder rubber, polylactic acid, polymer composites

Procedia PDF Downloads 11
320 Artificial Neural Network Approach for Vessel Detection Using Visible Infrared Imaging Radiometer Suite Day/Night Band

Authors: Takashi Yamaguchi, Ichio Asanuma, Jong G. Park, Kenneth J. Mackin, John Mittleman

Abstract:

In this paper, vessel detection using the artificial neural network is proposed in order to automatically construct the vessel detection model from the satellite imagery of day/night band (DNB) in visible infrared in the products of Imaging Radiometer Suite (VIIRS) on Suomi National Polar-orbiting Partnership (Suomi-NPP).The goal of our research is the establishment of vessel detection method using the satellite imagery of DNB in order to monitor the change of vessel activity over the wide region. The temporal vessel monitoring is very important to detect the events and understand the circumstances within the maritime environment. For the vessel locating and detection techniques, Automatic Identification System (AIS) and remote sensing using Synthetic aperture radar (SAR) imagery have been researched. However, each data has some lack of information due to uncertain operation or limitation of continuous observation. Therefore, the fusion of effective data and methods is important to monitor the maritime environment for the future. DNB is one of the effective data to detect the small vessels such as fishery ships that is difficult to observe in AIS. DNB is the satellite sensor data of VIIRS on Suomi-NPP. In contrast to SAR images, DNB images are moderate resolution and gave influence to the cloud but can observe the same regions in each day. DNB sensor can observe the lights produced from various artifact such as vehicles and buildings in the night and can detect the small vessels from the fishing light on the open water. However, the modeling of vessel detection using DNB is very difficult since complex atmosphere and lunar condition should be considered due to the strong influence of lunar reflection from cloud on DNB. Therefore, artificial neural network was applied to learn the vessel detection model. For the feature of vessel detection, Brightness Temperature at the 3.7 μm (BT3.7) was additionally used because BT3.7 can be used for the parameter of atmospheric conditions.
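
To make the learning step concrete, a minimal sketch of a per-pixel vessel/background classifier over DNB-derived features (DNB radiance, BT3.7, and lunar/atmospheric descriptors) is shown below. The feature set, network size, and synthetic labels are illustrative assumptions; the authors' actual network architecture and training data are not specified here.

```python
# Minimal per-pixel vessel-detection classifier sketch: a small feed-forward
# neural network over assumed features [dnb_radiance, bt_3p7, lunar_fraction, cloud_flag].
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))                       # synthetic stand-in features
y = (X[:, 0] + 0.5 * X[:, 1] > 1.0).astype(int)      # 1 = vessel light, 0 = background

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0),
)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```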

Keywords: artificial neural network, day/night band, remote sensing, Suomi National Polar-orbiting Partnership, vessel detection, Visible Infrared Imaging Radiometer Suite

Procedia PDF Downloads 235
319 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality

Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan

Abstract:

Currently, the content entertainment industry is dominated by mobile devices. As trends slowly shift towards Augmented/Virtual Reality applications, the computational demands on these devices are increasing exponentially, and we are already reaching the limits of hardware optimizations. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing, we can offload the work from mobile devices to dedicated rendering servers that are far more powerful. But this introduces the problem of latency. This paper introduces a protocol that can achieve a high-performance, low-latency Augmented/Virtual Reality experience. There are two parts to the protocol. 1) In-flight compression: The main cause of latency in the system is the time required to transmit the camera frame from client to server. The round-trip time is directly proportional to the amount of data transmitted, so it can be reduced by compressing the frames before sending. Using standard compression algorithms like JPEG results in only a minor size reduction. Since the images to be compressed are consecutive camera frames, there won't be many changes between two consecutive images, so inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL, but the WebGL implementation limits the precision of floating point numbers to 16 bits on most devices. This can introduce noise to the image due to rounding errors, which add up eventually. This can be solved using an improved inter-frame compression algorithm. The algorithm detects changes between frames and reuses unchanged pixels from the previous frame. This eliminates the need for floating point subtraction, thereby cutting down on noise. The change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference. The kernel weights for this comparison can be fine-tuned to match the type of image to be compressed. 2) Dynamic load distribution: Conventional cloud computing architectures work by offloading as much work as possible to the servers, but this approach can take a toll on bandwidth and server costs. The most optimal solution is obtained when the device utilizes 100% of its resources and the rest is done by the server. The protocol balances the load between the server and the client by doing a fraction of the computing on the device, depending on the power of the device and network conditions. The protocol is responsible for dynamically partitioning the tasks. Special flags are used to communicate the workload fraction between the client and the server and are updated at a constant interval of time (or frames). The whole protocol is designed to be client-agnostic. Flags are available to the client for resetting the frame, indicating latency, switching mode, etc. The server can react to client-side changes on the fly and adapt accordingly by switching to different pipelines. The server is designed to effectively spread the load and thereby scale horizontally. This is achieved by isolating client connections into different processes.
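
The sketch below illustrates the inter-frame change detection and pixel-reuse idea from part 1: the per-pixel difference is smoothed with a weighting kernel, only pixels above a threshold are transmitted, and the decoder reuses the rest from the previous frame. The kernel weights and threshold are illustrative assumptions, and this CPU/NumPy version only demonstrates the logic, not the paper's WebGL implementation.

```python
# Kernel-weighted inter-frame change detection with reuse of unchanged pixels.
import numpy as np
from scipy.ndimage import convolve

KERNEL = np.array([[1, 2, 1],
                   [2, 4, 2],
                   [1, 2, 1]], dtype=float)
KERNEL /= KERNEL.sum()  # normalized weighting kernel (assumed values)

def changed_mask(prev: np.ndarray, curr: np.ndarray, threshold: float = 4.0) -> np.ndarray:
    """Weighted-average difference between frames; True where a pixel is considered changed."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    weighted = convolve(diff, KERNEL, mode="nearest")
    return weighted > threshold

def encode_frame(prev, curr):
    """Send only the changed pixels; the decoder reuses unchanged pixels from prev."""
    mask = changed_mask(prev, curr)
    return mask, curr[mask]

def decode_frame(prev, mask, values):
    out = prev.copy()
    out[mask] = values
    return out
```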

Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application

Procedia PDF Downloads 72
318 Water Monitoring Sentinel Cloud Platform: Water Monitoring Platform Based on Satellite Imagery and Modeling Data

Authors: Alberto Azevedo, Ricardo Martins, André B. Fortunato, Anabela Oliveira

Abstract:

Water is under severe threat today because of the rising population, increased agricultural and industrial needs, and the intensifying effects of climate change. Due to sea-level rise, erosion, and demographic pressure, coastal regions are of significant concern to the scientific community. The Water Monitoring Sentinel Cloud platform (WORSICA) service is focused on providing new tools for monitoring water in coastal and inland areas, taking advantage of remote sensing, in situ, and tidal modeling data. WORSICA is a service that can be used to determine the coastline, coastal inundation areas, and the limits of inland water bodies using remote sensing (satellite and Unmanned Aerial Vehicles - UAVs) and in situ data (from field surveys). It applies to various purposes, from determining flooded areas (from rainfall, storms, hurricanes, or tsunamis) to detecting large water leaks in major water distribution networks. This service was built on components developed in national and European projects, integrated to provide a one-stop-shop service for remote sensing information, integrating data from Copernicus satellites and drones/unmanned aerial vehicles, validated against existing online in situ data. Since WORSICA operates on the European Open Science Cloud (EOSC) computational infrastructures, the service can be accessed via a web browser and is freely available to all European public research groups without additional costs. In addition, the private sector will be able to use the service, but some usage costs may be applied, depending on the type of computational resources needed by each application/user. Although the service has three main sub-services, i) coastline detection, ii) inland water detection, and iii) water leak detection in irrigation networks, the present study shows an application of the service to the Óbidos lagoon in Portugal, where the user can monitor the evolution of the lagoon inlet and estimate the topography of the intertidal areas without any additional costs. The service has several distinct methodologies implemented based on the computation of water indexes (e.g., NDWI, MNDWI, AWEI, and AWEIsh) retrieved from satellite image processing. In conjunction with the tidal data obtained from the FES model, the system can estimate a coastline with the corresponding water level, or even the topography of the intertidal areas, based on the Flood2Topo methodology. The outcomes of the WORSICA service can be helpful for several intervention areas, such as i) emergency, by providing fast access to inundated areas to support emergency rescue operations; ii) support of management decisions on hydraulic infrastructure operation to minimize damage downstream; iii) climate change mitigation, by minimizing water losses and reducing water mains operation costs; and iv) early detection of water leakages in difficult-to-access water irrigation networks, promoting their fast repair.
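
As a minimal illustration of the water-index step, the sketch below computes two of the indexes named above (NDWI and MNDWI) from reflectance bands and thresholds them into a water mask. The band naming follows a Sentinel-2-like convention and the zero threshold is an illustrative default, not the service's actual tuning.

```python
# Water-index sketch: NDWI = (green - NIR)/(green + NIR), MNDWI = (green - SWIR)/(green + SWIR).
import numpy as np

def ndwi(green: np.ndarray, nir: np.ndarray) -> np.ndarray:
    return (green - nir) / (green + nir + 1e-9)

def mndwi(green: np.ndarray, swir: np.ndarray) -> np.ndarray:
    return (green - swir) / (green + swir + 1e-9)

def water_mask(green, nir, swir, threshold=0.0):
    """Pixels whose index exceeds the threshold are flagged as water."""
    return (ndwi(green, nir) > threshold) | (mndwi(green, swir) > threshold)
```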

Keywords: remote sensing, coastline detection, water detection, satellite data, sentinel, Copernicus, EOSC

Procedia PDF Downloads 125
317 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels

Authors: Joshua Buli, David Pietrowski, Samuel Britton

Abstract:

Processing SAR data usually requires constraints on extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground plane projection, with or without terrain as a component, all to better view SAR data in an image domain comparable to what a human would view and to ease interpretation. An alternate but computationally heavy method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data is then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time-history data for the reflectivity values of each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes any interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
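
A simplified sketch of the backprojection sum is given below: every 3D reference point accumulates, over all pulses, the range-compressed sample at its round-trip range with a carrier-wavelength phase correction. Because each point is independent, the same loop maps naturally to one GPU thread per point; NumPy broadcasting stands in here for the GPU kernel. The array shapes, interpolation by nearest range bin, and phase convention are simplifying assumptions, not the authors' exact implementation.

```python
# Per-point SAR backprojection sketch (CPU/NumPy stand-in for a GPU kernel).
import numpy as np

def backproject(points, antenna_pos, rc_data, range_bins, wavelength):
    """
    points:       (P, 3) 3D reference voxels / point cloud
    antenna_pos:  (N, 3) antenna position for each pulse
    rc_data:      (N, R) complex range-compressed samples per pulse
    range_bins:   (R,)   range (m) of each sample bin
    """
    image = np.zeros(len(points), dtype=complex)
    for n in range(len(antenna_pos)):                       # each pulse contributes to every point
        r = np.linalg.norm(points - antenna_pos[n], axis=1)
        idx = np.clip(np.searchsorted(range_bins, r), 0, len(range_bins) - 1)
        image += rc_data[n, idx] * np.exp(1j * 4 * np.pi * r / wavelength)
    return np.abs(image)                                    # per-point reflectivity magnitude
```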

Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization

Procedia PDF Downloads 84
316 University Clusters Using ICT for Teaching and Learning

Authors: M. Roberts Masillamani

Abstract:

There is a phenomenal difference between the teaching methodologies adopted at urban and rural colleges, even though bright and talented students may well come from rural backgrounds. There is a huge dearth of digitization in rural areas and in lesser developed countries. Today's students need new skills to compete and be successful in the future, and education should be a combination of practical, intellectual, and social skills. What does this mean for rural classrooms, and how can it be achieved? Rural colleges are not able to hire the best resources, since the best teachers tend to move towards the city; if city-level facilities were provided everywhere, there would be no disadvantaged rural areas. This can be made possible by forming university clusters (UC). A university cluster is a group of renowned and accredited universities coming together to bridge this gap. The UC will deliver live lectures and allow students from remote areas to participate actively in the classroom. This paper presents a plan of action for providing a better live classroom teaching and learning system from the city to rural areas and lesser developed countries. This paper, titled "University Clusters using ICT for teaching and learning", presents a concept of opening live digital classroom windows for rural colleges where resources are not available, thus reducing the digital divide. This is different from podcasting a lecture, distance learning, or e-learning: the live lecture can be streamed through digital equipment to another classroom. Rural students can collaborate with their peers and critics, be assessed, collect information, and acquire different techniques in the assessment and learning process. This system will benefit rural students and teachers and improve their socio-economic status, and it will also increase the degree of confidence of rural students and teachers, thus bringing the concept of 'Train the Trainee' into reality. An educational university cloud with remote infrastructure facilities (RIF) will be built for each cluster for the above program. Users may be informed about the available lecture schedules through the RIF service. The RIF with an educational cloud can be set up by the universities within one cluster. This paper also discusses university clusters and the methodology to be adopted in more detail, as well as some extended features such as tutorial classes, library grids, remote laboratory login, and research and development.

Keywords: lesser developed countries, digital divide, digital learning, education, e-learning, ICT, library grids, live classroom windows, RIF, rural, university clusters and urban

Procedia PDF Downloads 471
315 Landsat Data from Pre Crop Season to Estimate the Area to Be Planted with Summer Crops

Authors: Valdir Moura, Raniele dos Anjos de Souza, Fernando Gomes de Souza, Jose Vagner da Silva, Jerry Adriani Johann

Abstract:

The estimate of the area of land to be planted with annual crops and its stratification by municipality are important variables in crop forecasting. Nowadays in Brazil, this information is obtained by the Brazilian Institute of Geography and Statistics (IBGE) and published in the report Assessment of the Agricultural Production. Due to the high cloud cover in the main crop growing season (October to March), it is difficult to acquire good orbital images. Thus, one alternative is to work with remote sensing data from dates before the crop growing season. This work presents the use of multitemporal Landsat data gathered in July and September (before the summer growing season) in order to estimate the area of land to be planted with summer crops in an area of São Paulo State, Brazil. Geographic Information Systems (GIS) and digital image processing techniques were applied for the treatment of the available data. Supervised and unsupervised classifications were used for data in digital number and reflectance formats and for the multitemporal Normalized Difference Vegetation Index (NDVI) images. The objective was to discriminate the tracts with a higher probability of being planted with summer crops. Classification accuracies were evaluated using a sampling system developed specifically for this study region. The estimated areas were corrected using the error matrix derived from these evaluations. The classification techniques presented an excellent level of agreement according to the kappa index. The proportion of crops stratified by municipality was derived from field work during the crop growing season. These proportion coefficients were applied to the area of land to be planted with summer crops (derived from the Landsat data). Thus, it was possible to derive the area of each summer crop by municipality. The discrepancies between official statistics and our results were attributed to the sampling and stratification procedures. Nevertheless, this methodology can be improved in order to provide good crop area estimates using remote sensing data, despite the cloud cover during the growing season.
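
The two quantitative steps mentioned above can be sketched as follows: NDVI from the red and near-infrared bands, the kappa index from a confusion matrix, and an error-matrix-based correction of mapped class areas. The correction shown is a standard stratified area estimator and the numbers are illustrative; the paper's exact correction procedure may differ.

```python
# NDVI, kappa index, and error-matrix area correction (illustrative values).
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    return (nir - red) / (nir + red + 1e-9)

def kappa(conf: np.ndarray) -> float:
    n = conf.sum()
    po = np.trace(conf) / n                                   # observed agreement
    pe = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / n**2   # chance agreement
    return (po - pe) / (1 - pe)

def corrected_areas(conf: np.ndarray, mapped_areas: np.ndarray) -> np.ndarray:
    """conf[i, k]: samples mapped as class i and observed as class k in the field."""
    w = mapped_areas / mapped_areas.sum()                     # mapped area proportions
    p = (w[:, None] * conf / conf.sum(axis=1, keepdims=True)).sum(axis=0)
    return p * mapped_areas.sum()                             # adjusted area per class

conf = np.array([[45, 5], [8, 42]])                           # e.g. classes: summer crop / other
print(kappa(conf), corrected_areas(conf, np.array([12000.0, 30000.0])))
```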

Keywords: area intended for summer culture, estimated area planted, agriculture, Landsat, planting schedule

Procedia PDF Downloads 150
314 Cleaning Performance of High-Frequency, High-Intensity 360 kHz Frequency Operating in Thickness Mode Transducers

Authors: R. Vetrimurugan, Terry Lim, M. J. Goodson, R. Nagarajan

Abstract:

This study investigates the cleaning performance of high-intensity 360 kHz ultrasound in removing nano-dimensional and sub-micron particles from various surfaces, the uniformity of the cleaning tank, and the run-to-run variation of the cleaning process. The uniformity of the cleaning tank was measured by two different methods, i.e., (1) a ppb™ meter and (2) the liquid particle counting (LPC) technique. In the second method, aluminium metal spacer components were placed at various locations of the cleaning tank (such as the centre, top left corner, bottom left corner, top right corner, and bottom right corner), and the resultant particles removed by the 360 kHz frequency were measured. The results indicate that the energy was distributed more uniformly throughout the entire cleaning vessel, even at the corners and edges of the tank, when megasonic sweeping technology was applied. The results also show that rinsing the parts with the 360 kHz frequency at the final rinse gives lower particle counts, hence higher cleaning efficiency, as compared to other frequencies. When megasonic sweeping technology is applied, each piezoelectric transducer operates at its optimum resonant frequency and generates a stronger acoustic cavitation force and a higher acoustic streaming velocity. These combined forces help to enhance particle removal and, at the same time, improve the overall cleaning performance. A multiple-extraction study was also carried out for various frequencies to measure the cleaning potential and asymptote value.

Keywords: power distribution, megasonic sweeping, cavitation intensity, particle removal, laser particle counting, nano, submicron

Procedia PDF Downloads 418
313 Big Data: Concepts, Technologies and Applications in the Public Sector

Authors: A. Alexandru, C. A. Alexandru, D. Coardos, E. Tudora

Abstract:

Big Data (BD) is associated with a new generation of technologies and architectures which can harness the value of extremely large volumes of very varied data through real-time processing and analysis. It involves changes in (1) data types, (2) accumulation speed, and (3) data volume. This paper presents the main concepts related to the BD paradigm and introduces architectures and technologies for BD and BD sets. The integration of BD with the Hadoop framework is also underlined. BD has attracted a lot of attention in the public sector due to newly emerging technologies that make network access widely available, and the volume of different types of data has increased exponentially. Some applications of BD in the public sector in Romania are briefly presented.

Keywords: big data, big data analytics, Hadoop, cloud

Procedia PDF Downloads 310
312 Numerical Approach for Characterization of Flow Field in Pump Intake Using Two Phase Model: Detached Eddy Simulation

Authors: Rahul Paliwal, Gulshan Maheshwari, Anant S. Jhaveri, Channamallikarjun S. Mathpati

Abstract:

Large pumping facilities are a necessary requirement of cooling water systems for power plants, process and manufacturing facilities, flood control, and water or wastewater treatment plants. With large capacities of a few hundred to 50,000 m3/hr, care must be taken to ensure uniform flow to the pump to limit vibration, flow-induced cavitation, and performance problems due to the formation of air-entraining vortices and swirl flow. Successful prediction of these phenomena requires a numerical method and a turbulence model able to characterize the dynamics of these flows. In past years, single-phase Reynolds-averaged Navier-Stokes models, such as k-ε, k-ω shear stress transport (SST), and RSM, were used to predict the flow behavior. A literature study showed that a two-phase model is more accurate than a single-phase model. In this paper, a 3D geometry simulated using detached eddy simulation (DES) is used to predict the behavior of the fluid, and the results are compared with experimental results. The effects of different grid structures and boundary conditions are also studied. It is observed that the two-phase flow model can predict the mean flow and turbulence statistics more accurately than the steady SST model. This validated model will be used for further analysis of the vortex structures in a lab-scale model, to generate their frequency plots and intensities at different locations in the set-up. This study will help in minimizing the ill effects of vortices on pump performance.

Keywords: grid structure, pump intake, simulation, vibration, vortex

Procedia PDF Downloads 175
311 Causal Inference Engine between Continuous Emission Monitoring System Combined with Air Pollution Forecast Modeling

Authors: Yu-Wen Chen, Szu-Wei Huang, Chung-Hsiang Mu, Kelvin Cheng

Abstract:

This paper develops a data-driven model to address the causality between Continuous Emission Monitoring System data (CEMS, operated by the Environmental Protection Administration, Taiwan) from industrial factories and the air quality of the surrounding environment. Compared to the heavy computational burden of traditional numerical models for regional weather and air pollution simulation, the lightweight proposed model can provide hourly forecasts from current observations of weather, air pollution, and emissions from factories. The observational data include wind speed, wind direction, relative humidity, temperature, and others. The observations can be collected in real time from the open APIs of Civil IoT Taiwan, which are sourced from 439 weather stations, 10,193 qualitative air stations, 77 national quantitative stations, and 140 CEMS quantitative industrial factories. This study completed a causal inference engine and provides an air pollution forecast for the next 12 hours related to local industrial factories. The outcomes of the pollution forecasting are produced hourly with a grid resolution of 1 km x 1 km on the IIoTC (Industrial Internet of Things Cloud) and saved in netCDF4 format. The elaborated procedures to generate forecasts comprise data recalibration, outlier elimination, Kriging interpolation, and particle tracking with random walk techniques for the mechanisms of diffusion and advection. The solution of these equations reveals the causality between factory emissions and the associated air pollution. Further, with the aid of installed real-time flue emission (total suspended particulates, TSP) sensors and the forecasted air pollution map, this study also discloses the conversion mechanism between TSP and PM2.5/PM10 for different regional and industrial characteristics, based on long-term data observation and calibration. These different qualitative and quantitative time-series data successfully enable a practicable cloud-based causal inference engine for factory management control. Once the forecasted air quality for a region is marked as harmful, the correlated factories are notified and asked to curtail their operations and reduce emissions in advance.
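
The particle-tracking/random-walk step for advection and diffusion can be sketched as follows: each emitted particle is advected by the local wind and perturbed by a Gaussian random walk, and particle counts per grid cell give a proxy for the pollutant field on the 1 km grid. Wind field, diffusivity, time step, and grid extent are illustrative assumptions.

```python
# Random-walk particle tracking sketch for advection and diffusion.
import numpy as np

def step(positions, wind_uv, D=50.0, dt=60.0, rng=None):
    """positions: (N, 2) in metres; wind_uv: (N, 2) m/s sampled at the particles."""
    if rng is None:
        rng = np.random.default_rng(0)
    advection = wind_uv * dt
    diffusion = rng.normal(scale=np.sqrt(2.0 * D * dt), size=positions.shape)
    return positions + advection + diffusion

def gridded_concentration(positions, cell=1000.0, shape=(50, 50)):
    """Histogram particles onto a 1 km x 1 km grid (matching the abstract's resolution)."""
    ij = np.floor(positions / cell).astype(int)
    grid = np.zeros(shape)
    for i, j in ij:
        if 0 <= i < shape[0] and 0 <= j < shape[1]:
            grid[i, j] += 1
    return grid
```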

Keywords: continuous emission monitoring system, total suspension particulates, causal inference, air pollution forecast, IoT

Procedia PDF Downloads 86
310 An Online 3D Modeling Method Based on a Lossless Compression Algorithm

Authors: Jiankang Wang, Hongyang Yu

Abstract:

This paper proposes a portable online 3D modeling method. The method first utilizes a depth camera to collect data and compresses the depth data using a frame-by-frame lossless data compression method. The color image is encoded using the H.264 encoding format. After the cloud obtains the color image and depth image, a 3D modeling method based on BundleFusion is used to complete the 3D modeling. The results of this study indicate that this method is portable, works online, and is highly efficient, and that it has a wide range of prospective applications.
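
One simple way to realize frame-by-frame lossless depth compression, sketched below, is to difference each 16-bit depth frame against the previous one and entropy-code the residual; the round trip is exact. The actual codec used in the paper may differ; this only illustrates the idea.

```python
# Lossless frame-by-frame depth compression sketch: inter-frame delta + zlib.
import numpy as np
import zlib

def compress_depth(prev: np.ndarray, curr: np.ndarray) -> bytes:
    residual = (curr.astype(np.int32) - prev.astype(np.int32))
    return zlib.compress(residual.tobytes(), level=6)

def decompress_depth(prev: np.ndarray, blob: bytes) -> np.ndarray:
    residual = np.frombuffer(zlib.decompress(blob), dtype=np.int32).reshape(prev.shape)
    return (prev.astype(np.int32) + residual).astype(np.uint16)

prev = np.zeros((480, 640), dtype=np.uint16)
curr = prev.copy(); curr[100:200, 100:200] = 1500            # a synthetic moving object
assert np.array_equal(curr, decompress_depth(prev, compress_depth(prev, curr)))  # exact round trip
```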

Keywords: 3D reconstruction, bundlefusion, lossless compression, depth image

Procedia PDF Downloads 82
309 A Framework of Virtualized Software Controller for Smart Manufacturing

Authors: Pin Xiu Chen, Shang Liang Chen

Abstract:

A virtualized software controller is developed in this research to replace traditional hardware control units. This virtualized software controller transfers motion interpolation calculations from the motion control units of end devices to edge computing platforms, thereby reducing the end devices' computational load and hardware requirements and making maintenance and updates easier. The study also applies the concept of microservices, dividing the control system into several small functional modules that are then deployed to a cloud data server. This reduces the interdependency among modules and enhances the overall system's flexibility and scalability. Finally, with containerization technology, the system can be deployed and started in a matter of seconds, which is more efficient than traditional virtual machine deployment methods. Furthermore, this virtualized software controller communicates with end control devices via wireless networks, making the placement of production equipment or the redesign of processes more flexible and no longer limited by physical wiring. To handle the large data flow and maintain low-latency transmission, this study integrates 5G technology, fully utilizing its high speed, wide bandwidth, and low latency features to achieve rapid and stable remote machine control. An experimental setup is designed to verify the feasibility and test the performance of this framework. This study designs a smart manufacturing site with a 5G communication architecture, serving as a field for experimental data collection and performance testing. The smart manufacturing site includes one robotic arm, three Computer Numerical Control machine tools, several Input/Output ports, and an edge computing architecture. All machinery information is uploaded to edge computing servers and cloud servers via 5G communication and the Internet of Things framework. After analysis and computation, this information is converted into motion control commands, which are transmitted back to the relevant machinery for motion control through 5G communication. The communication time intervals at each stage are measured using the C++ chrono library by computing the time difference for each command transmission. The relevant test results will be organized and presented in the full text.
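
The abstract measures per-stage communication intervals with the C++ chrono library; below is a language-agnostic sketch of the same per-stage timestamping idea using Python's perf_counter. The stage names and sleep-based stand-ins are illustrative assumptions, not the paper's actual pipeline.

```python
# Per-stage latency measurement sketch (Python analog of timing with C++ chrono).
import time

def timed_pipeline(stages):
    """stages: list of (name, callable). Returns elapsed seconds per stage."""
    timings = {}
    for name, fn in stages:
        t0 = time.perf_counter()
        fn()
        timings[name] = time.perf_counter() - t0
    return timings

timings = timed_pipeline([
    ("upload_to_edge",   lambda: time.sleep(0.002)),   # stand-in for 5G uplink
    ("interpolation",    lambda: time.sleep(0.001)),   # motion interpolation on the edge
    ("command_downlink", lambda: time.sleep(0.002)),   # command back to the machine
])
print({k: f"{v * 1e3:.2f} ms" for k, v in timings.items()})
```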

Keywords: 5G, MEC, microservices, virtualized software controller, smart manufacturing

Procedia PDF Downloads 82
308 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning

Authors: Yangzhi Li

Abstract:

Network transfer of information and performance customization are now viable methods of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to robot autonomous recognition, which include high-performance computing, physical system modeling, extensive sensor coordination, and dataset deep learning, have not yet been fully explored in intelligent construction, and relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous recognition visual guidance technologies improves the robot's grasp of the scene and its capacity for autonomous operation. Intelligent vision guidance technology for industrial robots has a serious issue with camera calibration, and the use of intelligent visual guidance and identification technologies in industrial production has strict accuracy requirements; visual recognition systems can therefore be considered to face precision challenges. Precision directly impacts the effectiveness and standard of industrial production, necessitating stronger study of positioning precision in visual guidance and recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study identifies the position of target components by detecting the information at the boundary and corners of a dense point cloud and determining the aspect ratio in accordance with the guidelines for the modularization of building components. To collect and use components, the operational processing system assigns them to the same coordinate system based on their locations and postures. Inclination detection in the RGB image and verification in the depth image are used to determine the component's current posture. Finally, a virtual environment model for the robot's obstacle-avoidance route is constructed using the point cloud information.
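
As a simplified illustration of the aspect-ratio check on a dense point cloud, the sketch below reduces a component's points to a PCA-aligned bounding box and compares the ratio of its two largest extents against a modularization guideline. The target ratio and tolerance are illustrative assumptions; the boundary and corner detection used in the study is not reproduced here.

```python
# Point-cloud aspect-ratio sketch: PCA-oriented bounding box extents.
import numpy as np

def oriented_extents(points: np.ndarray) -> np.ndarray:
    """points: (N, 3). Returns the sorted extents of the PCA-aligned bounding box."""
    centered = points - points.mean(axis=0)
    _, _, axes = np.linalg.svd(centered, full_matrices=False)   # principal directions
    projected = centered @ axes.T
    return np.sort(projected.max(axis=0) - projected.min(axis=0))[::-1]

def matches_module(points: np.ndarray, target_ratio: float = 3.0, tol: float = 0.2) -> bool:
    """True when the component's length-to-width ratio is close to the assumed guideline."""
    ext = oriented_extents(points)
    ratio = ext[0] / max(ext[1], 1e-9)
    return abs(ratio - target_ratio) / target_ratio < tol
```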

Keywords: robotic construction, robotic assembly, visual guidance, machine learning

Procedia PDF Downloads 86
307 Dynamics of Understanding Earthquake Precursors-A Review

Authors: Sarada Nivedita Bhuyan

Abstract:

An earthquake is the sudden, rapid movement of the earth's crust and is a natural means of releasing stress. Tectonic plates play a major role in earthquakes, as the tectonic plates form the crust of the planet, and the boundaries between tectonic plates are usually known as fault lines. To understand an earthquake before its occurrence, different types of earthquake precursors are studied by different researchers. Surface temperature, strange cloud cover, the earth's electric field, geomagnetic phenomena, groundwater level, active faults, ionospheric anomalies, and tectonic movements are taken as parameters for earthquake study by different researchers. In this paper, we have tried to gather complete and helpful information on the earthquake precursors that have been studied to date.

Keywords: earthquake precursors, earthquake, tectonic plates, fault

Procedia PDF Downloads 380
306 High Temperature Deformation Behavior of Al0.2CoCrFeNiMo0.5 High Entropy Alloy

Authors: Yasam Palguna, Rajesh Korla

Abstract:

The efficiency of thermally operated systems can be improved by increasing the operating temperature, thereby decreasing fuel consumption and the carbon footprint. Hence, there is a continuous need to replace existing materials with new alloys with higher temperature working capabilities. During the last decade, multi-principal-element alloys, commonly known as high entropy alloys, have been getting more attention because of their superior high-temperature strength along with good high-temperature corrosion and oxidation resistance. The present work focuses on the microstructure and high-temperature tensile behavior of the Al0.2CoCrFeNiMo0.5 high entropy alloy (HEA). Wrought Al0.2CoCrFeNiMo0.5 high entropy alloy, produced by vacuum induction melting followed by thermomechanical processing, is tested in the temperature range of 200 to 900°C. It exhibits very good resistance to softening with increasing temperature up to 700°C; thereafter, there is a rapid decrease in strength, especially beyond 800°C, which may be due to the simultaneous occurrence of recrystallization and precipitate coarsening. Further, it exhibits superplastic-like behavior with a uniform elongation of ~275% at a temperature of 900°C and a strain rate of 1 x 10-3 s-1, which may be due to the presence of fine, stable, equiaxed grains. A strain rate sensitivity of 0.3 was observed, suggesting that solute-drag dislocation glide might be the active mechanism during superplastic-like deformation. The post-deformation microstructure suggests that cavitation at the sigma phase/matrix interface is the failure mechanism during high-temperature deformation. Finally, the high-temperature properties of the present alloy will be compared with contemporary high-temperature materials such as ferritic and austenitic steels and superalloys.

Keywords: high entropy alloy, high temperature deformation, super plasticity, post-deformation microstructures

Procedia PDF Downloads 164
305 Geomorphology of Karst Features of Shiraz City and Arjan Plain and Development Limitations

Authors: Meysam Jamali, Ebrahim Moghimi, Zean Alabden Jafarpour

Abstract:

The term karst denotes a variety of areas, landforms, and unique landscapes that have formed as a result of the dissolution of the constituent rock by natural waters. The Shiraz area, with an area of 5322 km2, is located in the Simply Folded Belt in the southern part of the Zagros Mountains of Fars and is surrounded by limestone mountains (Asmari Formation). Shiraz is located on calcareous terrain: the infrastructure of the city rests on limestone, and the absorbing wells that the city has can influence limestone dissolution, accelerate its rate, and increase cavitation below the surface. Dasht-e Arjan is a graben, created as the result of the activity of two normal faults on its east and west sides. It is a complete example of a karst plain (polje), created with the help of tectonic forces (faulting) and the dissolution of the Asmari limestone formation by water. It is located 60 km southwest of Shiraz (on the Kazeroon-Shiraz road). In 1971, UNESCO recognized this plain as a biosphere reserve. It is considered one of the world's most beautiful geological phenomena, such that many of the world's geologists are interested in visiting the place. The purpose of this paper is to identify and introduce the karst landscapes of Shiraz city and Dasht-e Arjan, including karst dissolution features (lapies, karst springs, dolines, caves, underground cavities, ponors, and karst valleys), anticlines and synclines, and Arjan Lake.

Keywords: Dasht-eArjan, fault, Karst features, polje, Shiraz city, Zagros

Procedia PDF Downloads 420
304 Cryptosystems in Asymmetric Cryptography for Securing Data on Cloud at Various Critical Levels

Authors: Sartaj Singh, Amar Singh, Ashok Sharma, Sandeep Kaur

Abstract:

With upcoming threats in a digital world, we need to work continuously on security in all aspects, from hardware to software as well as data modelling. The rise in social media activity and the hunger for data by various entities lead to cybercrime and more attacks on the privacy and security of persons. Cryptography has always been employed to prevent access to important data, using many different processes. Symmetric-key and asymmetric-key cryptography have been used for keeping data secret at rest as well as in transit. Various cryptosystems have evolved over time to make data more secure. In this research article, we study various cryptosystems in asymmetric cryptography, their applications and usefulness, with much emphasis given to elliptic curve cryptography and the algebraic mathematics involved.
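
To illustrate the algebraic mathematics behind elliptic curve cryptography, the toy sketch below implements point addition and double-and-add scalar multiplication on a small curve over a prime field and uses them for a Diffie-Hellman-style shared secret. The curve parameters and private keys are tiny teaching values and are in no way secure.

```python
# Toy elliptic-curve arithmetic: y^2 = x^3 + 2x + 3 over F_97 (insecure teaching values).
P_MOD, A = 97, 2
G = (3, 6)                         # a point on the curve (6^2 = 36 = 3^3 + 2*3 + 3 mod 97)

def point_add(p, q):
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                            # point at infinity
    if p == q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, point):
    result, addend = None, point                               # double-and-add
    while k:
        if k & 1:
            result = point_add(result, addend)
        addend = point_add(addend, addend)
        k >>= 1
    return result

alice_pub, bob_pub = scalar_mult(7, G), scalar_mult(11, G)
assert scalar_mult(7, bob_pub) == scalar_mult(11, alice_pub)   # both sides derive the same secret
```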

Keywords: cryptography, symmetric key cryptography, asymmetric key cryptography

Procedia PDF Downloads 124
303 Analysis of the Strategic Value of the Usage of Green IT Applications for the Organizational Product or Service in Order to Gain Competitive Advantage; Case: E-Money of a Telecommunication Firm in Indonesia

Authors: I Putu Deny Arthawan Sugih Prabowo, Eko Nugroho, Rudy Hartanto

Abstract:

Green IT is a concept of how to use technology (IT) wisely, efficiently, and in an environmentally friendly way; it exists as a consequence of the current rapid growth of technology (especially IT). Beyond the environmental benefits, the usage of Green IT applications, e.g., cloud computing (cloud storage) and e-money (e-cash), also benefits the organizational business strategy (especially the organizational product/service strategy) in order to gain organizational competitive advantage (to be the market leader). This paper takes the case of e-money as a value-added service (VAS) of a telecommunication firm (company) in Indonesia, which also competes with competitors' similar products (services). Although it has been a popular product/service of the telecommunication firm, its strategic value for the organization (firm) is still unknown, and therefore the aim of this paper is to analyze its strategic value for gaining organizational competitive advantage. In this paper, its strategic value is analyzed by assessing (considering) its strategic benefits and also by managing the challenges or risks of its implementation in the organization as an organizational product/service. The paper then uses a research model for investigating the influences of both perceived risks and organizational culture on the usage of Green IT applications at the organization, as well as the influences of both the usage of Green IT applications at the organization and the threats and challenges of the organizational products/services on the competitive advantage of the organizational products/services. The paper uses a quantitative research method (collecting information from field respondents by using research questionnaires), and the primary data are analyzed with both descriptive and inferential statistics; SmartPLS is used for this quantitative analysis. Besides the quantitative research method, the paper also uses qualitative research methods, such as interviewing field respondents and/or direct field observation, to confirm in depth the quantitative analysis results in certain domains, e.g., the organizational culture and the internal processes that support the usage of Green IT applications for the organizational product/service (e-money in this paper's case). The paper is still at an early, in-progress stage of research. Its results may be used as a reference for the organization (firm or company) in developing organizational business strategies, especially for organizational products/services related to Green IT applications. The paper may also motivate future studies, e.g., on the influence of knowledge transfer about e-money and/or other Green IT application-based products/services on the organizational service performance related to the product (service) in order to gain competitive advantage.

Keywords: Green IT, competitive advantage, strategic value, organization (firm or company), organizational product (service)

Procedia PDF Downloads 305
302 Working Mode and Key Technology of Thermal Vacuum Test Software for Spacecraft Test

Authors: Zhang Lei, Zhan Haiyang, Gu Miao

Abstract:

A universal software platform is developed to remedy the defects of the platform used in practice. This software platform has distinct advantages in modularization, information management, and its interfaces. Several technologies, such as computer technology, virtualization technology, and network technology, are combined in this software platform, and four working modes are introduced in this article: single mode, distributed mode, cloud mode, and centralized mode. The application area of the software platform is extended through switching between these working modes. The software platform can arrange the thermal vacuum test process automatically, and this function can improve the reliability of the thermal vacuum test.

Keywords: software platform, thermal vacuum test, control and measurement, work mode

Procedia PDF Downloads 414
301 Activity Data Analysis for Status Classification Using Fitness Trackers

Authors: Rock-Hyun Choi, Won-Seok Kang, Chang-Sik Son

Abstract:

Physical activity is important for healthy living. Recently, wearable devices that motivate physical activity have been developing quickly, becoming cheaper and more comfortable. In particular, fitness trackers provide a variety of information and need to deliver well-analyzed, user-friendly results. In this study, frequency analysis was performed to classify various Fitbit data sets into simple activity statuses. The data from the Fitbit cloud server consist of 263 subjects who were healthy factory and office workers in Korea, covering March 7th to April 30th, 2016. In the results, we found the assumptions of the activity status classification to be sufficient and reasonable.
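
A simple rule-based classification of per-minute tracker data into activity statuses can be sketched as below. The thresholds, status labels, and resting heart rate are illustrative assumptions, not the values derived in the study.

```python
# Rule-based activity-status classification from per-minute steps and heart rate.
import pandas as pd

def classify_minute(steps: int, heart_rate: float, resting_hr: float = 65.0) -> str:
    if steps == 0 and heart_rate < resting_hr + 5:
        return "sedentary"
    if steps < 60:
        return "light"
    if steps < 100 or heart_rate < resting_hr + 40:
        return "moderate"
    return "vigorous"

minutes = pd.DataFrame({"steps": [0, 20, 85, 130], "heart_rate": [62, 74, 98, 132]})
minutes["status"] = [classify_minute(s, h) for s, h in zip(minutes["steps"], minutes["heart_rate"])]
print(minutes)
```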

Keywords: activity status, fitness tracker, heart rate, steps

Procedia PDF Downloads 383
300 Towards a Proof Acceptance by Overcoming Challenges in Collecting Digital Evidence

Authors: Lilian Noronha Nassif

Abstract:

Cybercrime investigation demands an appropriate evidence collection mechanism. If the investigator does not acquire digital proofs in a forensically sound manner, important information can be lost, and judges can discard case evidence because the acquisition was inadequate. Correct digital forensic seizure involves the preparation of professionals from the fields of law, police work, and computer science. This paper presents important challenges faced during evidence collection from the perspectives of different kinds of locations. The crime scene can be virtual or real, and technical obstacles and privacy concerns must be considered. All the challenges pointed out here highlight the precautions to be taken in digital evidence collection, and the suggested procedures contribute to best practices in the digital forensics field.

Keywords: digital evidence, digital forensics process and procedures, mobile forensics, cloud forensics

Procedia PDF Downloads 406
299 A Fast and Robust Protocol for Reconstruction and Re-Enactment of Historical Sites

Authors: Sanaa I. Abu Alasal, Madleen M. Esbeih, Eman R. Fayyad, Rami S. Gharaibeh, Mostafa Z. Ali, Ahmed A. Freewan, Monther M. Jamhawi

Abstract:

This research proposes a novel reconstruction protocol for restoring missing surfaces and low-quality edges and shapes in photos of artifacts at historical sites. The protocol starts with the extraction of a cloud of points. This extraction process is based on four subordinate algorithms, which differ in their robustness and in the amount of resulting data. Moreover, they apply different, but complementary, levels of accuracy to related features and to the way they build a quality mesh. The performance of our proposed protocol is compared with other state-of-the-art algorithms and toolkits. The statistical analysis shows that our algorithm significantly outperforms its rivals in the resultant quality of the object files used to reconstruct the desired model.

Keywords: meshes, point clouds, surface reconstruction protocols, 3D reconstruction

Procedia PDF Downloads 456
298 Evaluation of SDS (Software Defined Storage) Controller (CoprHD) for Various Storage Demands

Authors: Shreya Bokare, Sanjay Pawar, Shika Nema

Abstract:

Growth in cloud applications is generating a tremendous amount of data, building load on traditional storage management systems. Software Defined Storage (SDS) is a new storage management concept that is becoming popular for handling this large amount of data. CoprHD is one of the open-source SDS controllers available for experimentation and development in the storage industry. In this paper, the storage management techniques provided by CoprHD to manage heterogeneous storage platforms are tested experimentally and analyzed. Various storage management parameters, such as time to provision, storage capacity measurement, and heterogeneity, are experimentally evaluated, along with a theoretical expression, to demonstrate the completeness of the CoprHD controller for storage management.

Keywords: software defined storage, SDS, CoprHD, open source, SMI-S simulator, clarion, Symmetrix

Procedia PDF Downloads 313
297 Topographic Coast Monitoring Using UAV Photogrammetry: A Case Study in Port of Veracruz Expansion Project

Authors: Francisco Liaño-Carrera, Jorge Enrique Baños-Illana, Arturo Gómez-Barrero, José Isaac Ramírez-Macías, Erik Omar Paredes-JuáRez, David Salas-Monreal, Mayra Lorena Riveron-Enzastiga

Abstract:

Topographic changes in coastal areas are usually assessed with airborne LIDAR and conventional photogrammetry. In recent times, Unmanned Aerial Vehicles (UAVs) have been used in several photogrammetric applications, including coastline evolution. However, their use goes further: the associated point clouds can be used to generate beach Digital Elevation Models (DEMs). We present a methodology for monitoring coastal topographic changes along a 50 km coastline in Veracruz, Mexico, using high-resolution images (less than 10 cm ground resolution) and dense point clouds captured with a UAV. This monitoring takes place in the context of the Port of Veracruz expansion project, whose construction began in 2015, and intends to characterize coastal evolution and to prevent and mitigate project impacts on coastal environments. The monitoring began with a historical coastline reconstruction from 1979 to 2015 using aerial photography and Landsat imagery. We could define some patterns: the northern part of the study area showed accretion, while the southern part of the study area showed erosion. The study area lies off the port of Veracruz, a touristic and economically important Mexican city where coastal development structures have been built continuously since 1979, and the local beaches of the touristic area are refilled constantly. Those areas were not described as accretion, since sand-filled trucks refill the sand beaches located in front of the hotel area every month. The marinas and the commercial port of Veracruz, both the old port and the new expansion, were built in the eroding part of the area. Northward from the city of Veracruz the beaches were described as accretion areas, while southward from the city the beaches were described as erosion areas. One of the problems is the expansion of new development in the southern area of the city, using the beach view as an incentive to buy beachfront houses. We assessed coastal changes between seasons using high-resolution images and point clouds during 2016, and preliminary results confirm that UAVs can be used in permanent coast monitoring programs with excellent performance and detail.
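
The seasonal comparison step can be sketched as the differencing of two UAV-derived DEMs gridded to a common resolution, mapping erosion (negative change) and accretion (positive change) and summing volumes. The detection threshold and cell size below are illustrative assumptions, not the study's calibration values.

```python
# DEM-of-difference sketch for seasonal erosion/accretion mapping.
import numpy as np

def dem_change(dem_before: np.ndarray, dem_after: np.ndarray, min_detectable: float = 0.10):
    """Elevation change in metres; changes below the detection threshold are masked to zero."""
    diff = dem_after - dem_before
    diff[np.abs(diff) < min_detectable] = 0.0
    return diff

def volumes(diff: np.ndarray, cell_size: float = 0.10):
    """Eroded and accreted volumes (m^3), given the grid cell size in metres."""
    cell_area = cell_size ** 2
    eroded = -diff[diff < 0].sum() * cell_area
    accreted = diff[diff > 0].sum() * cell_area
    return eroded, accreted
```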

Keywords: digital elevation model, high-resolution images, topographic coast monitoring, unmanned aerial vehicle

Procedia PDF Downloads 270
296 Ultrasonic Treatment of Baker’s Yeast Effluent

Authors: Emine Yılmaz, Serap Fındık

Abstract:

The baker's yeast industry uses molasses as a raw material. Molasses is an end product of the sugar industry. Wastewater from molasses processing contains a large amount of coloured substances that give a dark brown colour and a high organic load to the effluents. The main coloured compounds are known as melanoidins. Melanoidins are products of the Maillard reaction between amino and carbonyl groups in molasses. The dark colour prevents sunlight penetration and reduces the photosynthetic activity and dissolved oxygen level of surface waters. Various methods such as biological processes (aerobic and anaerobic), ozonation, wet air oxidation, and coagulation/flocculation are used for the treatment of baker's yeast effluent. Adequate treatment is imperative before the effluent is discharged. In addition, increasingly stringent environmental regulations are forcing distilleries to improve existing treatment and also to find alternative methods of effluent management or combinations of treatment methods. Sonochemical oxidation is one of the alternative methods. Sonochemical oxidation employs ultrasound, resulting in cavitation phenomena. In this study, the decolorization of baker's yeast effluent was investigated by using ultrasound. The baker's yeast effluent was supplied by a factory located in the north of Turkey. An ultrasonic homogenizer with an operating frequency of 20 kHz was used for this study. A TiO2-ZnO catalyst was used as the sonocatalyst. The effects of the TiO2-ZnO molar proportion, the calcination temperature and time, and the catalyst amount on the decolorization of baker's yeast effluent were investigated. The results showed that the composite TiO2-ZnO prepared with a 4:1 molar proportion and treated at 700°C for 90 min provides the better result. The initial decolorization at 15 min is 3% without catalyst and 14.5% with the catalyst treated at 700°C for 90 min.

Keywords: baker’s yeast effluent, decolorization, sonocatalyst, ultrasound

Procedia PDF Downloads 474