Search results for: photogrammetric point cloud
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5550

5130 Study on Errors in Estimating the 3D Gaze Point for Different Pupil Sizes Using Eye Vergences

Authors: M. Pomianek, M. Piszczek, M. Maciejewski

Abstract:

Binocular eye tracking technology is increasingly used in industry, entertainment, and marketing analysis. In the case of virtual reality, eye tracking systems are already the basis for user interaction with the environment. In such systems, high accuracy in determining the user's eye fixation point is very important due to the specificity of the virtual reality head-mounted display (HMD). Often, however, unknown errors occur in the eye tracking technology used, as well as errors resulting from the positioning of the devices relative to the user's eyes. But can the virtual environment itself influence estimation errors? The paper presents mathematical analyses and empirical studies of the determination of the fixation point and of the errors resulting from the change in pupil size in response to the intensity of the displayed scene. The article covers both static laboratory tests and tests on a real user. Based on the research results, optimization solutions were proposed to reduce gaze estimation errors. The studies show that errors in estimating the fixation point can be minimized both by improving the pupil positioning algorithm in the video image and by using more precise methods to calibrate the eye tracking system in three-dimensional space.
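
The abstract does not publish the estimation algorithm itself; as a minimal illustration of the common vergence-based approach, the sketch below intersects the two gaze rays and takes the midpoint of their closest-approach segment. All geometry, including the 64 mm interpupillary distance, is an assumed example, not taken from the paper.

```python
# Hedged sketch: estimate the 3D fixation point from two (generally skew) gaze rays.
import numpy as np

def gaze_point_from_vergence(left_eye, left_dir, right_eye, right_dir):
    """Return the midpoint of the shortest segment connecting the two gaze rays."""
    d1 = left_dir / np.linalg.norm(left_dir)
    d2 = right_dir / np.linalg.norm(right_dir)
    w0 = left_eye - right_eye
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:                      # near-parallel rays: vergence undefined
        return None
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = left_eye + t1 * d1                    # closest point on the left gaze ray
    p2 = right_eye + t2 * d2                   # closest point on the right gaze ray
    return 0.5 * (p1 + p2)

# Example: eyes 64 mm apart, both looking at a target 0.5 m ahead.
ipd = 0.064
target = np.array([0.0, 0.0, 0.5])
left, right = np.array([-ipd / 2, 0.0, 0.0]), np.array([ipd / 2, 0.0, 0.0])
print(gaze_point_from_vergence(left, target - left, right, target - right))
```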

Keywords: eye tracking, fixation point, pupil size, virtual reality

Procedia PDF Downloads 132
5129 Stagnation Point Flow Over a Stretching Cylinder with Variable Thermal Conductivity and Slip Conditions

Authors: M. Y. Malik, Farzana Khan

Abstract:

In this article, we discuss the behavior of a viscous fluid near the stagnation point over a stretching cylinder with variable thermal conductivity. The effects of slip conditions are also considered. Thermal conductivity is taken as a linear function of temperature. Using the homotopy analysis method and the Fehlberg method, we compare the graphical results for both the momentum and energy equations. The effects of different parameters on the velocity and temperature fields are shown graphically.
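
The paper's full similarity equations (with cylinder curvature, slip, and variable conductivity) are not given in the abstract. As a hedged illustration of the numerical side, the sketch below solves the classical flat-geometry stagnation-point (Hiemenz) equation f''' + f f'' + 1 - (f')² = 0 with a stretching wall using SciPy's boundary-value solver; the stretching ratio lam is an assumed value.

```python
# Hedged sketch: classical planar stagnation-point similarity equation with a
# stretching wall, a simplified analogue of the paper's cylinder problem.
import numpy as np
from scipy.integrate import solve_bvp

lam = 0.5                                  # assumed stretching-to-straining velocity ratio

def odes(eta, y):                          # y = [f, f', f'']
    f, fp, fpp = y
    return np.vstack([fp, fpp, -(f * fpp + 1.0 - fp**2)])

def bcs(ya, yb):                           # f(0) = 0, f'(0) = lam, f'(inf) -> 1
    return np.array([ya[0], ya[1] - lam, yb[1] - 1.0])

eta = np.linspace(0.0, 10.0, 200)
y0 = np.empty((3, eta.size))
y0[0] = eta - (1.0 - lam) * (1.0 - np.exp(-eta))   # consistent initial guess
y0[1] = 1.0 - (1.0 - lam) * np.exp(-eta)
y0[2] = (1.0 - lam) * np.exp(-eta)

sol = solve_bvp(odes, bcs, eta, y0)
print("converged:", sol.success, " wall shear f''(0) =", float(sol.sol(0.0)[2]))
```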

Keywords: slip conditions, stretching cylinder, heat generation/absorption, stagnation point flow, variable thermal conductivity

Procedia PDF Downloads 423
5128 Open Source, Open Hardware Ground Truth for Visual Odometry and Simultaneous Localization and Mapping Applications

Authors: Janusz Bedkowski, Grzegorz Kisala, Michal Wlasiuk, Piotr Pokorski

Abstract:

Ground-truth data is essential for the quantitative evaluation of VO (Visual Odometry) and SLAM (Simultaneous Localization and Mapping) using, e.g., ATE (Absolute Trajectory Error) and RPE (Relative Pose Error). Many open-access data sets provide raw and ground-truth data for benchmark purposes. The issue arises when one would like to validate Visual Odometry and/or SLAM approaches on data captured with the device the algorithm is targeted at, for example a mobile phone, and disseminate the data to other researchers. For this reason, we propose an open source, open hardware ground-truth system that provides an accurate and precise trajectory with a 3D point cloud. It is based on the LiDAR Livox Mid-360 with a non-repetitive scanning pattern, an on-board Raspberry Pi 4B computer, a battery, and software for off-line calculations (camera-to-LiDAR calibration, LiDAR odometry, SLAM, georeferencing). We show how this system can be used for the evaluation of various state-of-the-art algorithms (Stella SLAM, ORB SLAM3, DSO) in typical indoor monocular VO/SLAM.
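
For reference, a minimal sketch of the ATE metric mentioned above: the estimated trajectory is rigidly aligned to the ground truth (Horn/Umeyama) and the RMSE of the remaining position differences is reported. The trajectories here are generic NumPy arrays, not the system's data.

```python
# Hedged sketch: Absolute Trajectory Error (RMSE after rigid alignment).
import numpy as np

def ate_rmse(gt, est):
    """gt, est: (N, 3) arrays of time-synchronized positions."""
    mu_g, mu_e = gt.mean(axis=0), est.mean(axis=0)
    G, E = gt - mu_g, est - mu_e
    U, _, Vt = np.linalg.svd(E.T @ G)                        # cross-covariance SVD
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # avoid reflections
    R = Vt.T @ S @ U.T                                       # rotation mapping est -> gt
    t = mu_g - R @ mu_e
    aligned = est @ R.T + t
    return float(np.sqrt(np.mean(np.sum((gt - aligned) ** 2, axis=1))))

# Toy usage: a circular ground-truth path and a slightly noisy estimate.
s = np.linspace(0, 2 * np.pi, 200)
gt = np.column_stack([np.cos(s), np.sin(s), 0.1 * s])
est = gt + np.random.default_rng(0).normal(0, 0.02, gt.shape)
print("ATE RMSE:", ate_rmse(gt, est), "m")
```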

Keywords: SLAM, ground truth, navigation, LiDAR, visual odometry, mapping

Procedia PDF Downloads 69
5127 Realization of Soliton Phase Characteristics in 10 Gbps, Single Channel, Uncompensated Telecommunication System

Authors: A. Jawahar

Abstract:

In this paper, the dependence of soliton pulses on phase in a 10 Gbps, single-channel, dispersion-uncompensated telecommunication system is studied. The characteristic feature of periodic soliton interaction was noted at the interaction point (I = 6202.5 km) within one collision length of L = 12405.1 km. The interaction point is located for the 10 Gbps system with an initial relative soliton spacing (q0) of 5.28 using perturbation theory. It is shown that when two in-phase solitons are launched, they interact at the point I = 6202.5 km, but the interaction can be suppressed by introducing an initial phase difference. As the phase of the input solitons increases, the deviation of the soliton pulses at the interaction point also increases. We have successfully demonstrated this effect in a telecommunication set-up in terms of the quality factor (Q), where Q = 0 for in-phase solitons. The Q was noted to be 125.9, 38.63, 47.53, 59.60, 161.37, and 78.04 for phases of 10°, 20°, 30°, 45°, 60°, and 90°, respectively, at the interaction point I.
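
A hedged sketch of the underlying effect, in normalized nonlinear-Schrödinger units rather than the paper's 10 Gbps fiber parameters: two fundamental solitons are launched with a relative phase and propagated with a basic split-step Fourier method, and the minimum pulse separation recorded along the way shows how the phase difference suppresses the in-phase attraction. The half-spacing q0 = 2 is an assumed demo value, not the paper's 5.28.

```python
# Hedged sketch: phase-dependent soliton interaction via split-step Fourier NLSE.
import numpy as np

N, T = 1024, 40.0
t = np.linspace(-T / 2, T / 2, N, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(N, d=T / N)

def min_separation(theta_deg, q0=2.0, z_end=25.0, dz=0.005):
    u = 1 / np.cosh(t - q0) + np.exp(1j * np.deg2rad(theta_deg)) / np.cosh(t + q0)
    sep = 2 * q0
    for step in range(int(z_end / dz)):
        u = np.fft.ifft(np.fft.fft(u) * np.exp(-0.5j * w**2 * dz))  # dispersion step
        u *= np.exp(1j * np.abs(u)**2 * dz)                          # Kerr (SPM) step
        if step % 20 == 0:
            p = np.abs(u)**2
            left = t[:N // 2][np.argmax(p[:N // 2])]                 # peak in t < 0
            right = t[N // 2:][np.argmax(p[N // 2:])]                # peak in t >= 0
            sep = min(sep, right - left)
    return sep

for theta in (0, 45, 90):
    print(f"relative phase {theta:2d} deg -> minimum pulse separation {min_separation(theta):.2f}")
```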

Keywords: soliton interaction, initial relative spacing, phase, perturbation theory, telecommunication system

Procedia PDF Downloads 472
5126 A Monocular Measurement for 3D Objects Based on Distance Area Number and New Minimize Projection Error Optimization Algorithms

Authors: Feixiang Zhao, Shuangcheng Jia, Qian Li

Abstract:

High-precision measurement of a target's position and size is one of the hotspots in the field of vision inspection. This paper proposes a three-dimensional object positioning and measurement method using a monocular camera and GPS, namely Distance Area Number-New Minimize Projection Error (DAN-NMPE). Our algorithm contains two parts, DAN and NMPE; specifically, DAN is a picture-sequence algorithm and NMPE is a projection-error optimization algorithm, which greatly improves the measurement accuracy of the target's position and size. Comprehensive experiments validate the effectiveness of our proposed method on a self-made traffic sign dataset. The results show that, with the laser point cloud as the ground truth, the size and position errors of the traffic signs measured by this method are ±5% and 0.48 ± 0.3 m, respectively. In addition, we also compared it with the current mainstream method, which uses a monocular camera to locate and measure traffic signs. DAN-NMPE attains significant improvements compared to existing state-of-the-art methods, improving the measurement accuracy of size and position by 50% and 15.8%, respectively.
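
The DAN-NMPE formulation itself is not given in the abstract; as a hedged illustration of the generic "minimize projection error" step, the sketch below recovers a 3D point by nonlinear least squares on pinhole reprojection residuals over two camera poses. The intrinsics, poses, and noise level are assumed example values.

```python
# Hedged sketch: recover a 3D point by minimizing pinhole reprojection error.
import numpy as np
from scipy.optimize import least_squares

K = np.array([[800.0, 0.0, 320.0],          # assumed pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def project(X, R, t):
    """Project world point X to pixel coordinates for camera pose (R, t)."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]

def residuals(X, poses, observations):
    return np.concatenate([project(X, R, t) - uv
                           for (R, t), uv in zip(poses, observations)])

poses = [(np.eye(3), np.zeros(3)),
         (np.eye(3), np.array([-2.0, 0.0, 0.0]))]   # second camera shifted 2 m along x
X_true = np.array([1.0, -0.5, 12.0])                 # e.g. one traffic-sign corner
obs = [project(X_true, R, t) + np.random.normal(0.0, 0.5, 2) for R, t in poses]

sol = least_squares(residuals, x0=np.array([0.0, 0.0, 5.0]), args=(poses, obs))
print("estimated 3D point:", np.round(sol.x, 2))
```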

Keywords: monocular camera, GPS, positioning, measurement

Procedia PDF Downloads 144
5125 On Cloud Computing: A Review of the Features

Authors: Assem Abdel Hamed Mousa

Abstract:

The Internet of Things probably already influences your life, and if it doesn't, it soon will, say computer scientists. Ubiquitous computing names the third wave in computing, just now beginning. First were mainframes, each shared by many people. Now we are in the personal computing era, person and machine staring uneasily at each other across the desktop. Next comes ubiquitous computing, or the age of calm technology, when technology recedes into the background of our lives. Alan Kay of Apple calls this "Third Paradigm" computing. Ubiquitous computing is essentially the term for human interaction with computers in virtually everything. Ubiquitous computing is roughly the opposite of virtual reality: where virtual reality puts people inside a computer-generated world, ubiquitous computing forces the computer to live out here in the world with people. Virtual reality is primarily a horsepower problem; ubiquitous computing is a very difficult integration of human factors, computer science, engineering, and social sciences. The approach: activate the world. Provide hundreds of wireless computing devices per person per office, of all scales (from 1" displays to wall-sized). This has required new work in operating systems, user interfaces, networks, wireless, displays, and many other areas. We call our work "ubiquitous computing". This is different from PDAs, dynabooks, or information at your fingertips. It is invisible, everywhere computing that does not live on a personal device of any sort but is in the woodwork everywhere. The initial incarnation of ubiquitous computing was in the form of "tabs", "pads", and "boards" built at Xerox PARC in 1988-1994. Several papers describe this work, and there are web pages for the tabs and for the boards (which are a commercial product now). Ubiquitous computing will drastically reduce the cost of digital devices and tasks for the average consumer. With labor-intensive components such as processors and hard drives housed in the remote data centers powering the cloud, and with pooled resources giving individual consumers the benefits of economies of scale, consumers will pay monthly fees, similar to a cable bill, for services that feed into their phones.

Keywords: internet, cloud computing, ubiquitous computing, big data

Procedia PDF Downloads 382
5124 Internet of Things Based Patient Health Monitoring System

Authors: G. Yoga Sairam Teja, K. Harsha Vardhan, A. Vinay Kumar, K. Nithish Kumar, Ch. Shanthi Priyag

Abstract:

The emergence of the Internet of Things (IoT) has facilitated better device control and monitoring in the modern world. The constant monitoring of a patient would be drastically altered by the usage of IoT in healthcare. As we have seen in the case of the COVID-19 pandemic, it is important to avoid physical contact while continuously checking the patient's heart rate and temperature. Additionally, patients with paralysis should be closely watched, especially if they are elderly and in need of special care. Our "IoT Based Patient Health Monitoring System" project uses IoT to track patient health conditions in an effort to address these issues. In this project, the main board is an 8051 microcontroller that connects a number of sensors, including a heart rate sensor, a temperature sensor (LM-35), and a saline water measuring circuit. These sensors are connected via an ESP8266 (WiFi) module, which enables the recorded data to be sent directly to the cloud so that the patient's health status can be regularly monitored. An LCD is used to monitor the data in offline mode, and a buzzer sounds if any reading deviates from its normal range. The data in the cloud may be viewed as a graph, making it simple for a user to spot any unusual conditions.
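
A minimal sketch of the alerting logic described above, written in Python for illustration rather than as the 8051/ESP8266 firmware; the vital-sign limits and the cloud endpoint are placeholders, not values from the paper.

```python
# Hedged sketch: threshold check plus a placeholder cloud upload.
import json
import urllib.request

LIMITS = {"heart_rate": (60, 100),       # beats per minute (assumed normal range)
          "temperature": (36.0, 38.0)}   # degrees Celsius (assumed normal range)

def abnormal_vitals(reading):
    """Return the vitals that fall outside their normal range."""
    return [name for name, (lo, hi) in LIMITS.items()
            if not lo <= reading[name] <= hi]

def push_to_cloud(reading, url="http://example.invalid/api/vitals"):  # placeholder endpoint
    req = urllib.request.Request(url, data=json.dumps(reading).encode(),
                                 headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req, timeout=5)

reading = {"heart_rate": 112, "temperature": 37.2}
alerts = abnormal_vitals(reading)
if alerts:
    print("BUZZER ON - abnormal:", ", ".join(alerts))   # stands in for the buzzer
```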

Keywords: IoT, ESP8266, 8051 microcontrollers, sensors

Procedia PDF Downloads 87
5123 Statistical Shape Analysis of the Human Upper Airway

Authors: Ramkumar Gunasekaran, John Cater, Vinod Suresh, Haribalan Kumar

Abstract:

The main objective of this project is to develop a statistical shape model using principal component analysis that can be used for analyzing the shape of the human airway. The ultimate goal is to identify geometric risk factors for the diagnosis and management of Obstructive Sleep Apnoea (OSA). Anonymous CBCT scans of 25 individuals were obtained from the Otago Radiology Group. The airways were segmented between the hard palate and the aryepiglottic fold using snake active contour segmentation. The point cloud of the segmented images was then fitted with a bi-cubic mesh, and pseudo-landmarks were placed to perform PCA on the segmented airway, in order to analyze the shape of the airway and to find the relationship between the shape and OSA risk factors. From the PCA results, the first four modes of variation were found to be significant. Mode 1 was interpreted as the overall length of the airway, Mode 2 was related to the anterior-posterior width of the retroglossal region, Mode 3 was related to the lateral dimension of the oropharyngeal region, and Mode 4 was related to the anterior-posterior width of the oropharyngeal region. All of these regions are associated with OSA risk factors.
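
A minimal sketch of the shape-model step, assuming the pseudo-landmarks are already in correspondence: flatten each subject's landmarks into one row, subtract the mean shape, and take the PCA modes from an SVD. The data below are random stand-ins, not the CBCT set.

```python
# Hedged sketch: PCA-based statistical shape model over pseudo-landmarks.
import numpy as np

n_subjects, n_landmarks = 25, 300
rng = np.random.default_rng(0)
shapes = rng.normal(size=(n_subjects, n_landmarks * 3))   # rows: x1, y1, z1, x2, ...

mean_shape = shapes.mean(axis=0)
X = shapes - mean_shape
U, s, Vt = np.linalg.svd(X, full_matrices=False)          # rows of Vt are the modes

explained = s**2 / np.sum(s**2)
print("variance explained by the first 4 modes:", np.round(explained[:4], 3))

# Synthesize a new airway shape by walking +2 standard deviations along mode 1:
b1 = 2.0 * s[0] / np.sqrt(n_subjects - 1)
synthetic_shape = mean_shape + b1 * Vt[0]
```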

Keywords: medical imaging, image processing, FEM/BEM, statistical modelling

Procedia PDF Downloads 514
5122 Analysis of Chatterjea Type F-Contraction in F-Metric Space and Application

Authors: Awais Asif

Abstract:

This article investigates fixed point theorems of Chatterjea type F-contraction in the setting of F-metric space. We relax the conditions of F-contraction and define modified F-contraction for two mappings. The study provides fixed point results for both single-valued and multivalued mappings. The results are further extended to common fixed point theorems for two mappings. Moreover, to discuss the applicability of our results, an application is provided, which shows the role of our results in finding the solution to functional equations in dynamic programming. Our results generalize and extend the existing results in the literature.

Keywords: Chatterjea type F-contraction, F-Cauchy sequence, F-convergent, multivalued mappings

Procedia PDF Downloads 143
5121 Photovoltaic Maximum Power-Point Tracking Using Artificial Neural Network

Authors: Abdelazziz Aouiche, El Moundher Aouiche, Mouhamed Salah Soudani

Abstract:

Renewable energy sources now contribute significantly to the replacement of traditional fossil fuel energy sources. One of the most potent types of renewable energy, which has developed quickly in recent years, is photovoltaic energy. Solar energy, which is sustainable and non-depleting, is the best-known form of energy at our disposal. The primary drawback of conventional solar PV systems is their inability to track the maximum power point under changing weather conditions. In this study, we apply artificial neural networks (ANN) to automatically track and measure the maximum power point (MPP) of solar panels. The complete system is simulated in MATLAB, and the results are evaluated under varying external conditions. The results show better performance than traditional MPPT methods and demonstrate the advantages of using neural networks in solar PV systems.
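
A hedged sketch of the idea: a small neural network learns the mapping from measured irradiance and cell temperature to the maximum-power-point voltage. The training targets below come from a crude stand-in relation rather than the paper's MATLAB simulation, and all parameter values are assumptions.

```python
# Hedged sketch: ANN regression from (irradiance, temperature) to V_mpp.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
G = rng.uniform(200, 1000, 2000)        # irradiance, W/m^2
T = rng.uniform(15, 60, 2000)           # cell temperature, deg C
V_mpp = 30.0 - 0.08 * (T - 25.0) + 1.5 * np.log(G / 1000.0)   # stand-in target, V

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16),
                                   max_iter=3000, random_state=0))
model.fit(np.column_stack([G, T]), V_mpp)

print("predicted V_mpp at 800 W/m^2, 40 C:",
      round(float(model.predict([[800.0, 40.0]])[0]), 2), "V")
# A converter controller would then drive the panel toward this predicted voltage.
```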

Keywords: modeling, photovoltaic panel, artificial neural networks, maximum power point tracking

Procedia PDF Downloads 88
5120 Lubricating Grease from Waste Cooking Oil and Waste Motor Sludge

Authors: Aseem Rajvanshi, Pankaj Kumar Pandey

Abstract:

Population growth has increased the demand for energy to fulfill all of its needs, which places a burden on fossil fuels, especially crude oil. Waste oil, because of its disposal problem, causes environmental degradation. In this context, this paper studies the utilization of waste cooking oil and waste motor sludge for making lubricating grease. Experimental studies were performed by varying the time and the concentration of the mixture of waste cooking oil and waste motor sludge. The samples were analyzed using the penetration test (ASTM D-217), dropping point (ASTM D-566), worked penetration (ASTM D-217), and copper strip test (ASTM D-408). Among the 6 samples, sample 6 gives the best results, with a good dropping point and a fine penetration value. The dropping point and penetration test values were found to be 205 °C and 315, respectively. The penetration value falls under the category of NLGI (National Lubricating Grease Institute) consistency number 1.
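
For reference, the grade reported above follows directly from the standard NLGI consistency ranges for worked penetration; the small lookup sketch below uses the commonly published NLGI table, quoted from memory rather than from the paper.

```python
# Hedged sketch: worked penetration (0.1 mm units) -> NLGI consistency number.
NLGI_RANGES = [("000", 445, 475), ("00", 400, 430), ("0", 355, 385),
               ("1", 310, 340), ("2", 265, 295), ("3", 220, 250),
               ("4", 175, 205), ("5", 130, 160), ("6", 85, 115)]

def nlgi_grade(worked_penetration):
    for grade, lo, hi in NLGI_RANGES:
        if lo <= worked_penetration <= hi:
            return grade
    return "between / outside standard NLGI ranges"

print(nlgi_grade(315))   # the reported penetration of sample 6 -> grade "1"
```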

Keywords: crude oil, copper strip corrosion test, dropping point, penetration test

Procedia PDF Downloads 295
5119 Crow Search Algorithm-Based Task Offloading Strategies for Fog Computing Architectures

Authors: Aniket Ganvir, Ritarani Sahu, Suchismita Chinara

Abstract:

The rapid digitization of various aspects of life is leading to the creation of smart IoT ecosystems, where interconnected devices generate significant amounts of valuable data. However, these IoT devices face constraints such as limited computational resources and bandwidth. Cloud computing emerges as a solution by offering ample resources for offloading tasks efficiently, but it introduces latency issues, especially for time-sensitive applications. Fog computing (FC) addresses these latency concerns by bringing computation and storage closer to the network edge, minimizing data travel distance and enhancing efficiency. Offloading tasks to fog nodes or the cloud can conserve energy and extend IoT device lifespan. The offloading process is intricate, with tasks categorized as full or partial, and its optimization presents an NP-hard problem. Traditional greedy search methods struggle to address the complexity of task offloading efficiently. To overcome this, the efficient crow search algorithm (ECSA) is proposed as a meta-heuristic optimization algorithm. ECSA aims to effectively optimize computation offloading, providing solutions to this challenging problem.
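
The ECSA variant itself is not described in the abstract; as a hedged baseline, the sketch below runs the standard crow search algorithm on a toy partial-offloading cost (weighted local energy versus transmission latency), with all costs and CSA parameters chosen arbitrarily for illustration.

```python
# Hedged sketch: standard crow search algorithm on a toy offloading cost.
import numpy as np

rng = np.random.default_rng(0)
n_tasks, n_crows, iters = 8, 20, 200
fl, ap = 2.0, 0.1                             # flight length and awareness probability

local_cost = rng.uniform(1.0, 3.0, n_tasks)   # energy cost if executed on the device
tx_cost = rng.uniform(0.5, 2.5, n_tasks)      # latency cost if offloaded to a fog node

def cost(x):
    """x[k] in [0, 1]: fraction of task k offloaded (partial offloading)."""
    return np.sum((1 - x) * local_cost + x * tx_cost)

pos = rng.uniform(0, 1, (n_crows, n_tasks))   # current positions
mem = pos.copy()                              # best position each crow remembers
mem_cost = np.array([cost(m) for m in mem])

for _ in range(iters):
    for i in range(n_crows):
        j = rng.integers(n_crows)             # crow i follows a random crow j
        if rng.random() > ap:
            new = pos[i] + rng.random() * fl * (mem[j] - pos[i])
        else:                                  # crow j notices and flies randomly
            new = rng.uniform(0, 1, n_tasks)
        pos[i] = np.clip(new, 0.0, 1.0)
        c = cost(pos[i])
        if c < mem_cost[i]:                   # update memory with the better find
            mem[i], mem_cost[i] = pos[i], c

best = mem[np.argmin(mem_cost)]
print("best offloading fractions:", np.round(best, 2), " cost:", round(float(mem_cost.min()), 3))
```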

Keywords: IoT, fog computing, task offloading, efficient crow search algorithm

Procedia PDF Downloads 58
5118 Full-Spectrum Photo-thermal Conversion of Point-mode Cu₂O/TiN Plasmonic Nanofluids

Authors: Xiaoxiao Yu, Guodu He, Zihua Wu, Yuanyuan Wang, Huaqing Xie

Abstract:

A core-shell composite structure is a common way to regulate the spectral absorption of nanofluids, but it involves complex preparation processes, which limit its application in some fields, such as photothermal utilization and catalysis. This work proposes point-mode Cu₂O/TiN plasmonic nanofluids to regulate the spectral capturing ability and simplify the preparation process. Non-noble TiN nanoparticles, which exhibit the localized surface plasmon resonance effect, are dispersed among Cu₂O nanoparticles to form a multi-point resonance source that enhances the spectral absorption performance. The experimental results indicate that the multiple resonance effect of TiN effectively improves the optical absorption and expands the absorption region. When the radius of the Cu₂O nanoparticles is 150 nm, the optical absorption of the point-mode Cu₂O/TiN plasmonic nanoparticles is at its best. Moreover, the photothermal conversion efficiency of the Cu₂O/TiN plasmonic nanofluid can reach 97.5% at a volume fraction of 0.015% and an optical depth of 10 mm. The point-mode nanostructure effectively enhances the optical absorption properties and greatly simplifies the preparation process of the composite nanoparticles, which can promote the application of multi-component photonic nanoparticles in the field of solar energy.

Keywords: solar energy, nanofluid, point-mode structure, Cu₂O/TiN, localized surface plasmon resonance effect

Procedia PDF Downloads 61
5117 Modelling Causal Effects from Complex Longitudinal Data via Point Effects of Treatments

Authors: Xiaoqin Wang, Li Yin

Abstract:

Background and purpose: In many practices, one estimates causal effects arising from a complex stochastic process, where a sequence of treatments is assigned to influence a certain outcome of interest and time-dependent covariates exist between treatments. When covariates are plentiful and/or continuous, statistical modeling is needed to reduce the huge dimensionality of the problem and allow for the estimation of causal effects. Recently, Wang and Yin (Annals of Statistics, 2020) derived a new general formula, which expresses these causal effects in terms of the point effects of treatments in single-point causal inference. As a result, it is possible to conduct the modeling via point effects. The purpose of this work is to study the modeling of these causal effects via point effects. Challenges and solutions: The time-dependent covariates are often influenced by earlier treatments and in turn influence subsequent treatments. Consequently, the standard parameters, i.e., the means of the outcome given all treatments and covariates, are essentially all different (null paradox). Furthermore, the dimension of the parameters is huge (curse of dimensionality). Therefore, it can be difficult to conduct the modeling in terms of standard parameters. Instead of standard parameters, we use point effects of treatments to develop a likelihood-based parametric approach to the modeling of these causal effects, and we are able to model the causal effects of a sequence of treatments by modeling a small number of point effects of individual treatments. Achievements: We are able to conduct the modeling of the causal effects from a sequence of treatments in the familiar framework of single-point causal inference. The simulation shows that our method achieves not only an unbiased estimate of the causal effect but also the nominal level of type I error and a low level of type II error in hypothesis testing. We have applied this method to a longitudinal study of COVID-19 mortality among Scandinavian countries and found that the Swedish approach performed far worse than the other countries' approaches, largely due to its early measures during the initial period of the pandemic.

Keywords: causal effect, point effect, statistical modelling, sequential causal inference

Procedia PDF Downloads 205
5116 Microfluidic Lab on Chip Platform for the Detection of Arthritis Markers from Synovial Organ on Chip by Miniaturizing Enzyme-Linked ImmunoSorbent Assay Protocols

Authors: Laura Boschis, Elena D. Ozzello, Enzo Mastromatteo

Abstract:

Point-of-care diagnostics find growing interest in medicine and agri-food because they allow faster intervention and prevention. EliChip is a microfluidic platform for point-of-care immunoenzymatic assays, based on ready-to-use kits and a portable instrument that manages the fluidics and reads reliable quantitative results. Thanks to miniaturization, analyses are faster and more sensitive than conventional ELISA. EliChip is one of the crucial assets of the European-funded Flamingo project for the in-line measurement of inflammatory markers.

Keywords: lab on chip, point of care, immunoenzymatic analysis, synovial arthritis

Procedia PDF Downloads 186
5115 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning

Authors: Kaushik Sathupadi, Sandesh Achar

Abstract:

Human action recognition modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors to identify complex action patterns efficiently. Still, there are considerable gaps and challenges between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing and detection techniques for the human body and its outer shape. Next, we extract valuable information in terms of cues. We extract two distinct features: fuzzy local binary patterns and a sequence representation. Then, we apply a greedy randomized adaptive search procedure for data optimization and dimension reduction, and for classification we use a random forest. We tested our model on two benchmark datasets, AAMAZ and the KTH Multi-view Football dataset. Our HMR framework significantly outperforms the other state-of-the-art approaches and achieves better recognition rates of 91% and 89.6% on the AAMAZ and KTH Multi-view Football datasets, respectively.
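
A minimal sketch of the classification stage alone: a random forest trained on pre-extracted feature vectors. The features below are random stand-ins, not the fuzzy-LBP/sequence cues, and the dataset sizes are arbitrary.

```python
# Hedged sketch: random forest classification of action feature vectors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_clips, n_features, n_actions = 600, 128, 6
X = rng.normal(size=(n_clips, n_features))
y = rng.integers(0, n_actions, n_clips)
X[np.arange(n_clips), y] += 3.0            # make the toy classes separable for the demo

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("recognition rate:", accuracy_score(y_te, clf.predict(X_te)))
```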

Keywords: computer vision, human motion analysis, random forest, machine learning

Procedia PDF Downloads 36
5114 Cloud-Based Multiresolution Geodata Cube for Efficient Raster Data Visualization and Analysis

Authors: Lassi Lehto, Jaakko Kahkonen, Juha Oksanen, Tapani Sarjakoski

Abstract:

The use of raster-formatted data sets in geospatial analysis is increasing rapidly. At the same time, geographic data are being introduced into disciplines outside the traditional domain of geoinformatics, like climate change, intelligent transport, and immigration studies. These developments call for better methods to deliver raster geodata in an efficient and easy-to-use manner. Data cube technologies have traditionally been used in the geospatial domain for managing Earth Observation data sets that have strict requirements for effective handling of time series. The same approach and methodologies can also be applied in managing other types of geospatial data sets. A cloud service-based geodata cube, called GeoCubes Finland, has been developed to support online delivery and analysis of the most important geospatial data sets with national coverage. The main target group of the service is the academic research institutes in the country. The most significant aspects of the GeoCubes data repository include the use of multiple resolution levels, a cloud-optimized file structure, and a customized, flexible content access API. Input data sets are pre-processed while being ingested into the repository to bring them into a harmonized form in aspects like georeferencing, sampling resolutions, spatial subdivision, and value encoding. All the resolution levels are created using an appropriate generalization method, selected depending on the nature of the source data set. Multiple pre-processed resolutions enable new kinds of online analysis approaches to be introduced. Analysis processes based on interactive visual exploration can be carried out effectively, as the resolution level closest to the visual scale can always be used. In the same way, statistical analysis can be carried out on resolution levels that best reflect the scale of the phenomenon being studied. Access times remain close to constant, independent of the scale applied in the application. The cloud service-based approach, applied in the GeoCubes Finland repository, enables analysis operations to be performed on the server platform, thus making high-performance computing facilities easily accessible. The developed GeoCubes API supports this kind of approach for online analysis. The use of cloud-optimized file structures in data storage enables the fast extraction of subareas. The access API allows for the use of vector-formatted administrative areas and user-defined polygons as definitions of subareas for data retrieval. Administrative areas of the country at four levels are readily available from the GeoCubes platform. In addition to direct delivery of raster data, the service also supports a so-called virtual file format, in which only a small text file is downloaded first. The text file contains links to the raster content on the service platform. The actual raster data is downloaded on demand, for the spatial area and resolution level required at each stage of the application. Through the geodata cube approach, pre-harmonized geospatial data sets are made accessible to new categories of inexperienced users in an easy-to-use manner. At the same time, the multiresolution nature of the GeoCubes repository enables expert users to introduce new kinds of interactive online analysis operations.
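
A hedged sketch of the access pattern described above; the endpoint and query parameters are illustrative placeholders and are not the documented GeoCubes API.

```python
# Hedged sketch: requesting a clipped raster subarea at a chosen resolution level.
import requests

BASE = "https://geocubes.example/api"      # placeholder base URL, not the real service

def fetch_subarea(layer, bbox, resolution_m, out_format="GeoTIFF"):
    """Request a clipped raster for one layer at roughly the given ground resolution."""
    params = {"layer": layer,
              "bbox": ",".join(str(v) for v in bbox),   # minx, miny, maxx, maxy
              "resolution": resolution_m,
              "format": out_format}
    r = requests.get(f"{BASE}/clip", params=params, timeout=60)
    r.raise_for_status()
    return r.content                        # raster bytes, ready to write to disk

# Example: a coarse (100 m) elevation extract for interactive visualisation.
# data = fetch_subarea("elevation", (380000, 6670000, 400000, 6690000), 100)
```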

Keywords: cloud service, geodata cube, multiresolution, raster geodata

Procedia PDF Downloads 135
5113 Integrating Building Information Modeling into Facilities Management Operations

Authors: Mojtaba Valinejadshoubi, Azin Shakibabarough, Ashutosh Bagchi

Abstract:

Facilities such as residential buildings, office buildings, and hospitals house a high density of occupants. Therefore, a low-cost facility management program (FMP) should be used to provide a satisfactory built environment for these occupants. Facility management (FM) has recently been treated as a critical task in building projects and has been effective in reducing the operation and maintenance costs of these facilities. Issues of information integration and visualization capabilities are critical for reducing the complexity and cost of FM. Building information modeling (BIM) can be used as a strong visual modeling tool and database in FM. The main objective of this study is to examine the applicability of BIM in the FM process during a building's operational phase. For this purpose, a seven-storey office building was modeled in Autodesk Revit software. The authors integrated a cloud-based environment using the visual programming tool Dynamo in order to provide real-time cloud-based communication between the facility managers and the participants involved in the project. An appropriate and effective integrated data source and visual model such as BIM can reduce a building's operational and maintenance costs by managing the building life cycle properly.

Keywords: building information modeling, facility management, operational phase, building life cycle

Procedia PDF Downloads 154
5112 Creating Energy Sustainability in an Enterprise

Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala

Abstract:

As we enter the new era of Artificial Intelligence (AI) and cloud computing, we rely mostly on the machine learning and natural language processing capabilities of AI, and on energy-efficient hardware and software devices, in almost every industry sector. In these sectors, much emphasis is placed on developing new and innovative methods for producing and conserving energy and on mitigating the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is a growing focus on, and demand for, renewable energy. With this growing demand, there is also a growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In this paper, we discuss the driving forces, such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms, that influence the three main pillars of sustainability (3 P's). Through this paper, we aim to provide an overall perspective on enterprise strategies, with a primary focus on bringing about cultural shifts toward adopting energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization), with the vision of a sustainable environment.

Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure

Procedia PDF Downloads 111
5111 An Experiment of Three-Dimensional Point Clouds Using GoPro

Authors: Jong-Hwa Kim, Mu-Wook Pyeon, Yang-dam Eo, Ill-Woong Jang

Abstract:

The construction of geospatial information is recently developing toward multi-dimensional geospatial information. The community constructing spatial information is also expanding from a small group of experts to the general public. In addition, studies are in progress using a variety of devices, with the aim of near real-time updates. In this paper, stereo images are acquired using a GoPro device, which is widely available to the general public as well as to experts. After correcting the distortion of the images, point clouds are acquired using SIFT and DLT. Based on this experiment, it is possible to create a real-time digital map using a video device that is readily available in everyday life.
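
A hedged sketch of the processing chain described above, using OpenCV: SIFT feature matching between the undistorted stereo frames, followed by linear (DLT-style) triangulation into a sparse point cloud. The image filenames, projection matrices, and baseline are placeholders, not the paper's calibration.

```python
# Hedged sketch: SIFT matching plus triangulation into a sparse point cloud.
import cv2
import numpy as np

img1 = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # undistorted stereo pair (placeholders)
img2 = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.7 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good]).T   # 2 x N pixel coordinates
pts2 = np.float32([kp2[m.trainIdx].pt for m in good]).T

# P1, P2: 3x4 projection matrices from prior calibration (placeholder values here).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])   # assumed baseline

X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)   # homogeneous 4 x N
cloud = (X_h[:3] / X_h[3]).T                       # N x 3 point cloud
np.savetxt("pointcloud.xyz", cloud)
```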

Keywords: GoPro, SIFT, DLT, point clouds

Procedia PDF Downloads 469
5110 Study on 3D FE Analysis on Normal and Osteoporosis Mouse Models Based on 3-Point Bending Tests

Authors: Tae-min Byun, Chang-soo Chon, Dong-hyun Seo, Han-sung Kim, Bum-mo Ahn, Hui-suk Yun, Cheolwoong Ko

Abstract:

In this study, a 3-point bending computational analysis of normal and osteoporosis mouse models was performed based on the micro-CT image information of the femurs. The finite element analysis (FEA) gave average maximum forces of 1.68 N (normal group) and 1.39 N (osteoporosis group) and average stiffnesses of 4.32 N/mm (normal group) and 3.56 N/mm (osteoporosis group). When compared with the 3-point bending test results, the maximum force and the stiffness differed by a factor of about 9.4 in the normal group and about 11.2 in the osteoporosis group. The difference between the analysis and the test was highly significant, and this result points to the material properties applied in the computational analysis as the main candidate for improvement. In the next study, the material properties of the mouse femur will be refined through additional computational analysis and testing.
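
For context, the stiffness reported above relates to an apparent elastic modulus through the standard beam-theory expression for 3-point bending, k = 48EI/L³. The sketch below rearranges it; the span and the idealized hollow-circular cross-section are illustrative assumptions, not the paper's femur geometry.

```python
# Hedged sketch: back out an apparent modulus from 3-point bending stiffness.
import math

stiffness = {"normal": 4.32, "osteoporosis": 3.56}   # FEA stiffness, N/mm (from the abstract)
L = 6.0                                               # assumed support span, mm
r_out, r_in = 0.85, 0.45                              # assumed outer/inner radii, mm
I = math.pi * (r_out**4 - r_in**4) / 4.0              # second moment of area, mm^4

for group, k in stiffness.items():
    E = k * L**3 / (48.0 * I)                         # apparent modulus, N/mm^2 (MPa)
    print(f"{group}: apparent E ~ {E:.0f} MPa")
```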

Keywords: 3-point bending test, mouse, osteoporosis, FEA

Procedia PDF Downloads 351
5109 Modeling Bessel Beams and Their Discrete Superpositions from the Generalized Lorenz-Mie Theory to Calculate Optical Forces over Spherical Dielectric Particles

Authors: Leonardo A. Ambrosio, Carlos. H. Silva Santos, Ivan E. L. Rodrigues, Ayumi K. de Campos, Leandro A. Machado

Abstract:

In this work, we propose an algorithm developed in the Python language for the modeling of ordinary scalar Bessel beams and their discrete superpositions, and for the subsequent calculation of the optical forces exerted on dielectric spherical particles. The mathematical formalism, based on the generalized Lorenz-Mie theory, is implemented in Python because of its large number of free mathematical (such as SciPy and NumPy), data visualization (Matplotlib and PyJamas), and multiprocessing libraries. We also propose an approach, provided by synchronized Software as a Service (SaaS) in cloud computing, to develop a user interface embedded in a mobile application, thus providing users with the means to easily introduce the desired unknowns and parameters and see the graphical outcomes of the simulations right on their mobile devices. Initially proposed as a free Android-based application, such an app enables data post-processing in cloud-based architectures and visualization of results, figures, and numerical tables.
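
A minimal sketch of the kind of field the paper models, in the language it names: an ideal scalar zero-order Bessel beam and a small discrete superposition of such beams evaluated on the optical axis. The wavelength, axicon angles, and weights are illustrative, and no GLMT force calculation is attempted here.

```python
# Hedged sketch: scalar Bessel beam and a discrete (frozen-wave-style) superposition.
import numpy as np
from scipy.special import j0

wavelength = 632.8e-9                    # assumed He-Ne wavelength, m
k = 2 * np.pi / wavelength

def bessel_beam(rho, z, axicon_angle):
    """Ideal scalar Bessel beam psi = J0(k_rho * rho) * exp(i * k_z * z)."""
    k_z = k * np.cos(axicon_angle)
    k_rho = k * np.sin(axicon_angle)
    return j0(k_rho * rho) * np.exp(1j * k_z * z)

# Discrete superposition: a few beams with slightly different axicon angles.
angles = np.deg2rad([0.50, 0.55, 0.60])
weights = [1.0, 0.8, 0.6]

z = np.linspace(0.0, 0.5, 1000)          # propagation distance, m
psi = sum(w * bessel_beam(0.0, z, a) for w, a in zip(weights, angles))
print("on-axis intensity range:", np.abs(psi).min()**2, "to", np.abs(psi).max()**2)
```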

Keywords: Bessel beams and frozen waves, generalized Lorenz-Mie theory, numerical methods, optical forces

Procedia PDF Downloads 380
5108 Common Fixed Point Results and Stability of a Modified Jungck Iterative Scheme

Authors: Hudson Akewe

Abstract:

In this study, we introduce a modified Jungck (dual Jungck) iterative scheme and use the scheme to approximate the unique common fixed point of a pair of generalized contractive-like operators in a Banach space. The iterative scheme is also shown to be stable with respect to the maps (S,T). An example is given to justify the convergence of the scheme. Our result is a generalization and improvement of several results in the literature involving a single map T.
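
For orientation, a hedged sketch of the classical (unmodified) Jungck iteration S(xₙ₊₁) = T(xₙ) on a simple pair of real maps with a common fixed point; the paper's dual scheme and stability analysis are not reproduced here, and the maps are chosen purely for illustration.

```python
# Hedged sketch: classical Jungck iteration on a toy pair of real maps.
def S(x): return 6.0 * x - 5.0          # invertible, so the preimage is explicit
def T(x): return 2.0 * x - 1.0          # |Tx - Ty| = (1/3)|Sx - Sy|: contraction w.r.t. S

def S_inverse(y): return (y + 5.0) / 6.0

x = 10.0                                # arbitrary starting point
for n in range(30):
    x = S_inverse(T(x))                 # enforce S(x_{n+1}) = T(x_n)

# S(1) = T(1) = 1, so 1 is the common fixed point the iteration approaches.
print("limit:", x, " S(limit):", S(x), " T(limit):", T(x))
```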

Keywords: generalized contractive-like operators, modified Jungck iterative scheme, stability results, weakly compatible maps, unique common fixed point

Procedia PDF Downloads 348
5107 A Comparison of Single Point Incremental Forming Formability between Carbon Steel and Stainless Steel

Authors: Kittiphat Rattanachan

Abstract:

In the sheet metal forming process, the mechanical properties of the raw material are important parameters. This paper compares the wall incline angle, or formability, of SS 400 steel and SUS 304 stainless steel in single point incremental forming. The two materials are ferrous-based alloys with different unit cells, mechanical properties, and chemical compositions. They were formed into cone-shaped specimens of 100 mm diameter with different wall incline angles: 90°, 75°, and 60°. In the investigation, the specimens were formed until surface fracture occurred. The experimental results showed that, for both materials, the smaller the wall incline angle, the higher the formability. The formability limit of the ferrous-based alloys was approximately a 60° wall incline angle. By nature, SS 400 showed higher formability than SUS 304. This result could be used as initial data in designing single point incremental forming parts.

Keywords: NC incremental forming, single point incremental forming, wall incline angle, formability

Procedia PDF Downloads 344
5106 Perception of Nursing Care of Patients in a University Hospital

Authors: Merve Aydin, Mağfiret Kara Kaşikçi

Abstract:

Aim: To determine the perceptions of inpatients about nursing care at the Farabi Hospital of KTU. Material and Method: This research was conducted with 277 patients, selected by probability sampling using the formula for a known universe, who had been hospitalized for at least 14 days in the internal medicine and surgical clinics (excluding the pediatric, psychiatry, and intensive care unit services) of KTU Farabi Hospital between January and March 2014. The data were collected using a patient characteristics form and the nursing care perception scale for patients. In the evaluation of the data, percentages, means, and Mann-Whitney U, Student's t, and Kruskal-Wallis tests were applied. Results: The average score of the patients on the nursing care perception scale is 62.64 ± 10.08. 48.7% of patients regard nursing care as good and 36.8% of them regard it as very good; 19% of the patients regard nursing care as bad. When age, sex, occupation, marital status, educational background, place of residence, income level, hospitalization period, hospitalization clinic, and having a hospital attendant were compared with the average nursing care perception score, the differences among the average scores were not statistically significant (p > 0.05). The average nursing care perception score was higher in those having a chronic disease (p < 0.05). Conclusion: The patients' perception score for nursing care is above the midpoint of the lowest and highest possible scores. The great majority of patients regard nursing care as good or very good.

Keywords: hospital, patient, perception of nursing care, nursing care

Procedia PDF Downloads 396
5105 Performance Evaluation of Distributed Deep Learning Frameworks in Cloud Environment

Authors: Shuen-Tai Wang, Fang-An Kuo, Chau-Yi Chou, Yu-Bin Fang

Abstract:

2016 became the year of the Artificial Intelligence explosion. AI technologies are becoming so mature that most of the world's well-known tech giants are making large investments to increase their capabilities in AI. Machine learning is the science of getting computers to act without being explicitly programmed, and deep learning is a subset of machine learning that uses deep neural networks to train a machine to learn features directly from data. Deep learning realizes many machine learning applications that expand the field of AI. At the present time, deep learning frameworks have been widely deployed on servers for deep learning applications in both academia and industry. In training deep neural networks, there are many standard processes and algorithms, but the performance of different frameworks may differ. In this paper, we evaluate the running performance of two state-of-the-art distributed deep learning frameworks that run training calculations in parallel over multiple GPUs and multiple nodes in our cloud environment. We evaluate the training performance of the frameworks with the ResNet-50 convolutional neural network, and we analyze the factors that determine the performance of both distributed frameworks. Through the experimental analysis, we identify the overheads that could be further optimized. The main contribution is that the evaluation results provide further optimization directions in both performance tuning and algorithmic design.
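
A hedged sketch of the basic measurement behind such a benchmark: training throughput (images/s) of ResNet-50 on synthetic data with PyTorch on a single device. The paper's two distributed frameworks and the multi-GPU, multi-node setup are not reproduced, and the batch size and step counts are arbitrary.

```python
# Hedged sketch: single-device ResNet-50 training throughput on synthetic data.
import time
import torch
import torchvision

device = "cuda" if torch.cuda.is_available() else "cpu"
model = torchvision.models.resnet50().to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
criterion = torch.nn.CrossEntropyLoss()

batch = 32
images = torch.randn(batch, 3, 224, 224, device=device)   # synthetic ImageNet-sized batch
labels = torch.randint(0, 1000, (batch,), device=device)

for _ in range(3):                                         # warm-up iterations
    optimizer.zero_grad(); criterion(model(images), labels).backward(); optimizer.step()

if device == "cuda":
    torch.cuda.synchronize()
start, steps = time.time(), 20
for _ in range(steps):
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
if device == "cuda":
    torch.cuda.synchronize()

print(f"throughput: {steps * batch / (time.time() - start):.1f} images/s on {device}")
```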

Keywords: artificial intelligence, machine learning, deep learning, convolutional neural networks

Procedia PDF Downloads 211
5104 Modeling the Elastic Mean Free Path of Electron Collision with Pyrimidine: The Screen Corrected Additivity Rule Method

Authors: Aouina Nabila Yasmina, Chaoui Zine El Abiddine

Abstract:

This study presents a comprehensive investigation into the elastic mean free path (EMFP) of electrons colliding with pyrimidine, a precursor of the pyrimidine bases in DNA, employing the Screen Corrected Additivity Rule (SCAR) method. The SCAR method is introduced as a novel approach that combines classical and quantum mechanical principles to elucidate the interaction of electrons with pyrimidine. One of the most fundamental properties characterizing the propagation of a particle in a medium is its mean free path. Knowledge of the elastic mean free path is essential to accurately predict the effects of radiation on biological matter, as it governs the distances between collisions. Additionally, the mean free path plays a role in the interpretation of almost all experiments in which an excited electron moves through a solid. Pyrimidine, the precursor of the pyrimidine bases of DNA, has notable physicochemical properties that make it an interesting molecule to study from a fundamental point of view. These include a relatively large dipole polarizability and dipole moment and an electronic charge cloud with significant spatial extension, which justify its choice in the present study.
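
For orientation, the EMFP follows from the number density and the total elastic cross section via λ = 1/(nσ). The quick numerical sketch below uses the molar mass and approximate liquid density of pyrimidine together with an assumed cross section, not the SCAR results of the paper.

```python
# Hedged sketch: elastic mean free path from lambda = 1 / (n * sigma).
AVOGADRO = 6.022e23
molar_mass = 80.09          # pyrimidine C4H4N2, g/mol
density = 1.016             # liquid pyrimidine, g/cm^3 (approximate)

n = density * AVOGADRO / molar_mass * 1e6      # molecules per m^3
sigma = 40e-20                                  # assumed total elastic cross section, m^2
emfp = 1.0 / (n * sigma)
print(f"number density: {n:.2e} m^-3, elastic mean free path: {emfp * 1e9:.2f} nm")
```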

Keywords: elastic mean free path, elastic collision, pyrimidine, SCAR

Procedia PDF Downloads 64
5103 Comparative Study of IC and Perturb and Observe Method of MPPT Algorithm for Grid Connected PV Module

Authors: Arvind Kumar, Manoj Kumar, Dattatraya H. Nagaraj, Amanpreet Singh, Jayanthi Prattapati

Abstract:

The purpose of this paper is to study and compare two maximum power point tracking (MPPT) algorithms in a photovoltaic simulation system, namely the perturb and observe algorithm and the incremental conductance algorithm. Maximum power point tracking plays an important role in photovoltaic systems because it maximizes the power output from a PV system for a given set of conditions, and therefore maximizes the array efficiency and minimizes the overall system cost. Since the maximum power point (MPP) varies with irradiation and cell temperature, appropriate algorithms must be utilized to track the MPP and keep the system operating at it. MATLAB/Simulink is used to establish a model of a photovoltaic system with an MPPT function. This system is developed by combining the established models of a solar PV module and a DC-DC boost converter. The system is simulated under different climate conditions. The simulation results show that the photovoltaic simulation system can track the maximum power point accurately.
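
A hedged sketch of the perturb and observe loop on a toy single-diode PV model; the diode parameters and step size are stand-ins, not the Simulink model of the paper. The characteristic steady-state oscillation of P&O around the MPP is visible in the final operating voltage.

```python
# Hedged sketch: perturb and observe MPPT on a toy PV model.
import math

def pv_current(v, irradiance=1000.0):
    i_sc = 5.0 * irradiance / 1000.0      # short-circuit current scales with irradiance
    i_0, a = 1e-8, 1.2                    # assumed diode saturation current and thermal factor
    return max(i_sc - i_0 * (math.exp(v / a) - 1.0), 0.0)

def perturb_and_observe(v=15.0, dv=0.1, steps=200):
    p_prev, direction = 0.0, +1
    for _ in range(steps):
        p = v * pv_current(v)
        if p < p_prev:                    # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
        v += direction * dv               # perturb the operating voltage
    return v, p_prev

v_mpp, p_mpp = perturb_and_observe()
print(f"P&O settled near V = {v_mpp:.2f} V, P = {p_mpp:.1f} W")
```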

Keywords: incremental conductance algorithm, perturb and observe algorithm, photovoltaic system, simulation results

Procedia PDF Downloads 556
5102 The Role of Knowledge Management in Global Software Engineering

Authors: Samina Khalid, Tehmina Khalil, Smeea Arshad

Abstract:

Knowledge management is an essential ingredient of successful coordination in globally distributed software engineering. Various frameworks, KMSs, and tools have been proposed to foster coordination and communication between virtual teams, but a practical implementation of these solutions has not been found. Organizations face challenges in implementing knowledge management systems. For this purpose, a literature review is first conducted to investigate the challenges that prevent organizations from implementing a KMS; then, taking these challenges into account, the need for an integrated solution, in the form of a standardized KMS that can easily store tacit and explicit knowledge, is traced in order to facilitate coordination and collaboration among virtual teams. The literature review has already shown that knowledge is a complex concept with profound meanings and one of the most important resources contributing to the competitive advantage of an organization. In order to meet the different challenges caused by not properly managing project-related knowledge among virtual teams in GSE, we suggest making use of the cloud computing model. In this research, a distributed architecture to support KM storage is proposed, called a conceptual framework of KM as a service in the cloud. The presented framework is enhanced, and the conceptual framework of KM is embedded into it to store project-related knowledge for future use.

Keywords: management, global software development, global software engineering

Procedia PDF Downloads 526
5101 Bayesian Analysis of Change Point Problems Using Conditionally Specified Priors

Authors: Golnaz Shahtahmassebi, Jose Maria Sarabia

Abstract:

In this talk, we introduce a new class of conjugate prior distributions obtained from the conditional specification methodology. We illustrate the application of such distributions to Bayesian change point detection in Poisson processes. We obtain the posterior distribution of the model parameters using a general bivariate distribution with gamma conditionals. Simulation from the posterior is readily implemented using a Gibbs sampling algorithm. The Gibbs sampling can be implemented even when using conditional densities that are incompatible or only compatible with an improper joint density. The application of these methods is demonstrated using examples of simulated and real data.
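
A hedged sketch of the Gibbs sampler for the simplest version of this setting, a single change point in Poisson counts with independent gamma priors; the paper's conditionally specified bivariate prior with gamma conditionals generalizes this setup, and the data below are simulated stand-ins.

```python
# Hedged sketch: Gibbs sampler for one change point in Poisson count data.
import numpy as np

rng = np.random.default_rng(2)
n, true_k = 100, 40
y = np.concatenate([rng.poisson(2.0, true_k), rng.poisson(6.0, n - true_k)])

a1 = b1 = a2 = b2 = 1.0                      # gamma hyperparameters
ks = np.arange(1, n)                         # candidate change points
cum = np.cumsum(y)
k, draws = n // 2, []

for it in range(5000):
    # 1) rate parameters given the change point (gamma full conditionals)
    lam1 = rng.gamma(a1 + cum[k - 1], 1.0 / (b1 + k))
    lam2 = rng.gamma(a2 + cum[-1] - cum[k - 1], 1.0 / (b2 + n - k))
    # 2) change point given the rates (discrete full conditional, uniform prior)
    logp = (cum[ks - 1] * np.log(lam1) - ks * lam1
            + (cum[-1] - cum[ks - 1]) * np.log(lam2) - (n - ks) * lam2)
    p = np.exp(logp - logp.max())
    k = int(rng.choice(ks, p=p / p.sum()))
    if it >= 1000:                           # discard burn-in
        draws.append(k)

print("posterior mean change point:", round(float(np.mean(draws)), 1), "(true:", true_k, ")")
```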

Keywords: change point, Bayesian inference, Gibbs sampler, conditional specification, gamma conditional distributions

Procedia PDF Downloads 189