Search results for: real time acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21291

20331 Exploring the Potential of Phase Change Materials in Construction Environments

Authors: A. Ait Ahsene F., B. Boughrara S.

Abstract:

The buildings sector accounts for a significant portion of global energy consumption, with much of this energy used to heat and cool indoor spaces. In this context, the integration of innovative technologies such as phase change materials (PCM) holds promising potential to improve the energy efficiency and thermal comfort of buildings. This research topic explores the benefits and challenges associated with the use of PCMs in buildings, focusing on their ability to store and release thermal energy to regulate indoor temperature. We investigated the different types of PCM available, their thermal properties, and their potential applications in various climate zones and building types. To evaluate and compare the performance of PCMs, our methodology includes a series of laboratory and field experiments. In the laboratory, we measure the thermal storage capacity, melting and solidification temperatures, latent heat, and thermal conductivity of various PCMs. These measurements make it possible to quantify the capacity of each PCM to store and release thermal energy, as well as its capacity to transfer this energy through the construction materials. Additionally, field studies are conducted to evaluate the performance of PCMs in real-world environments. We install PCM systems in real buildings and monitor their operation over time, measuring energy savings, occupant thermal comfort, and material durability. These empirical data allow us to compare the effectiveness of different types of PCMs under real-world use conditions. By combining the results of laboratory and field experiments, we provide a comprehensive analysis of the advantages and limitations of PCMs in buildings, as well as recommendations for their effective application in practice.

Keywords: energy saving, phase change materials, material sustainability, buildings sector

Procedia PDF Downloads 35
20330 Experiences of Timing Analysis of Parallel Embedded Software

Authors: Muhammad Waqar Aziz, Syed Abdul Baqi Shah

Abstract:

Execution time analysis is fundamental to the successful design and execution of real-time embedded software. In such analysis, the Worst-Case Execution Time (WCET) of a program is a key measure, on the basis of which system tasks are scheduled. The WCET analysis of embedded software is also needed for system understanding and to guarantee its behavior. WCET analysis can be performed statically (without executing the program) or dynamically (through measurement). Traditionally, research on WCET analysis assumes sequential code running on single-core platforms. However, as computation steadily moves towards a combination of parallel programs and multi-core hardware, new challenges in WCET analysis need to be addressed. In this article, we report our experiences of performing the WCET analysis of Parallel Embedded Software (PES) running on a multi-core platform. The primary purpose was to investigate how WCET estimates of PES can be computed statically and how they can be derived dynamically. Our experiences, as reported in this article, include the challenges we faced, possible solutions to these challenges, and the workarounds that were developed. This article also provides observations on the benefits and drawbacks of deriving WCET estimates using the said methods and offers useful recommendations for further research in this area.
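
As a rough illustration of the measurement-based (dynamic) side of such an analysis, the sketch below records a high-water-mark execution time over repeated runs of a task. The task, the inputs, and the host-side Python timing are hypothetical stand-ins; a real WCET measurement campaign runs instrumented code on the target multi-core hardware under controlled interference.

```python
import time

def measure_wcet(task, inputs, runs_per_input=1000):
    """Measurement-based high-water-mark estimate of execution time.

    Note: this host-side sketch only illustrates the bookkeeping; the observed
    maximum is a lower bound on the true WCET, never a guaranteed bound.
    """
    observed_max = 0.0
    for x in inputs:
        for _ in range(runs_per_input):
            start = time.perf_counter()
            task(x)
            observed_max = max(observed_max, time.perf_counter() - start)
    return observed_max

# Hypothetical task: sum of squares over lists of growing size
wcet_estimate = measure_wcet(lambda xs: sum(v * v for v in xs),
                             inputs=[list(range(n)) for n in (10, 100, 1000)])
print(f"Observed high-water mark: {wcet_estimate * 1e6:.1f} us")
```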

Keywords: embedded software, worst-case execution-time analysis, static flow analysis, measurement-based analysis, parallel computing

Procedia PDF Downloads 319
20329 Cytotoxicological Evaluation of a Folate Receptor Targeting Drug Delivery System Based on Cyclodextrins

Authors: Caroline Mendes, Mary McNamara, Orla Howe

Abstract:

For chemotherapy, a drug delivery system should be able to specifically target cancer cells and deliver the therapeutic dose without affecting normal cells. Folate receptors (FR) can be considered key targets since they are commonly over-expressed in cancer cells, and they are the molecular marker used in this study. Here, cyclodextrin (CD) has been studied as a vehicle for delivering the chemotherapeutic drug methotrexate (MTX). CDs have the ability to form inclusion complexes, in which molecules of suitable dimensions are included within the CD cavity. In this study, β-CD has been modified using folic acid so as to specifically target the FR molecular marker. Thus, the system studied here for drug delivery consists of β-CD, folic acid and MTX (CDEnFA:MTX). Cellular uptake of folic acid is mediated with high affinity by folate receptors, while the cellular uptake of antifolates, such as MTX, is mediated with high affinity by the reduced folate carriers (RFCs). This study addresses the gene (mRNA) and protein expression levels of FRs and RFCs in the cancer cell lines CaCo-2, SKOV-3, HeLa, MCF-7 and A549 and the normal cell line BEAS-2B, quantified by real-time polymerase chain reaction (real-time PCR) and flow cytometry, respectively. From that, four cell lines with different levels of FRs were chosen for cytotoxicity assays of MTX and CDEnFA:MTX using the MTT assay. Real-time PCR and flow cytometry data demonstrated that all cell lines ubiquitously express moderate levels of RFC. These experiments have also shown that levels of FR protein in CaCo-2 cells are high, while levels in SKOV-3, HeLa and MCF-7 cells are moderate. A549 and BEAS-2B cells express low levels of FR protein. FRs are highly expressed in all the cancer cell lines analysed when compared to the normal cell line BEAS-2B. The cell lines CaCo-2, MCF-7, A549 and BEAS-2B were used in the cell viability assays. A 48-hour treatment with the free drug and the complex resulted in IC50 values of 93.9 µM ± 9.2 and 56.0 µM ± 4.0 for CaCo-2 for free MTX and CDEnFA:MTX, respectively; 118.2 µM ± 10.8 and 97.8 µM ± 12.3 for MCF-7; 36.4 µM ± 6.9 and 75.0 µM ± 8.5 for A549; and 132.6 µM ± 12.1 and 288.1 µM ± 16.3 for BEAS-2B. These results demonstrate that MTX is more toxic towards cell lines expressing low levels of FR, such as BEAS-2B. More importantly, these results demonstrate that the inclusion complex CDEnFA:MTX showed greater cytotoxicity than the free drug towards the high-FR-expressing CaCo-2 cells, indicating that it has the potential to target this receptor, enhancing the specificity and the efficiency of the drug.

Keywords: cyclodextrins, cancer treatment, drug delivery, folate receptors, reduced folate carriers

Procedia PDF Downloads 298
20328 Elastic and Plastic Collision Comparison Using Finite Element Method

Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier

Abstract:

The prediction of post-impact conditions and of the behavior of bodies during impact has been the object of several collision models. The formulation from Hertz's theory, dating from the 19th century, is generally used. These models consider the repulsive force to be proportional to the deformation of the bodies in contact and may also consider it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of the bodies during impact using the Finite Element Method (FEM) with elastic and plastic material models. The main parameters to evaluate are the contact force, the contact time, and the deformation of the bodies. An advantage of the FEM approach is the possibility of applying a plastic material model: the Johnson–Cook plasticity model is used, whose parameters are obtained through empirical tests on real materials. This model allows analyzing the permanent deformation caused by impact, a phenomenon observed in the real world depending on the forces applied to the body. These results are compared with each other and with the Hertz-based model.
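
For reference, the classical Hertzian contact law the abstract mentions can be written down directly. The short sketch below evaluates the normal contact force between two elastic spheres; the radii and material constants are chosen purely for illustration and are not the study's FEM inputs.

```python
import numpy as np

def hertz_contact_force(delta, R1, R2, E1, E2, nu1, nu2):
    """Hertzian normal contact force between two elastic spheres.

    F = (4/3) * E_eff * sqrt(R_eff) * delta**1.5, with delta the indentation depth.
    """
    E_eff = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)
    R_eff = 1.0 / (1.0 / R1 + 1.0 / R2)
    return (4.0 / 3.0) * E_eff * np.sqrt(R_eff) * np.asarray(delta) ** 1.5

# Illustrative values (steel-like spheres, radii in m, moduli in Pa)
delta = np.linspace(0.0, 1e-4, 5)
print(hertz_contact_force(delta, R1=0.05, R2=0.05,
                          E1=210e9, E2=210e9, nu1=0.3, nu2=0.3))
```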

Keywords: collision, impact models, finite element method, Hertz Theory

Procedia PDF Downloads 170
20327 Implementation of an IoT Sensor Data Collection and Analysis Library

Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee

Abstract:

Due to the development of information technology and wireless Internet technology, various data are being generated in many fields. These data are advantageous in that they provide real-time information to the users themselves, and when the data are accumulated and analyzed, much richer information can be extracted. In addition, the development and dissemination of boards such as the Arduino and Raspberry Pi have made it easy to test various sensors, and sensor data can be collected directly by using database application tools such as MySQL. These directly collected data can be used for various research purposes and are useful as input for data mining. However, collecting data with such boards involves many difficulties, especially for users who are not computer programmers or who are doing it for the first time. Even when data are collected, a lack of expert knowledge or experience may make data analysis and visualization difficult. In this paper, we aim to construct a library for sensor data collection and analysis to overcome these problems.
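
As a hedged illustration of the kind of analysis such a library could expose, the sketch below clusters a handful of hypothetical sensor readings with k-means and DBSCAN, two of the algorithms listed in the keywords, using scikit-learn; the readings and parameter values are invented for the example.

```python
import numpy as np
from sklearn.cluster import DBSCAN, KMeans

# Hypothetical sensor readings: columns = (temperature, humidity)
readings = np.array([[21.0, 40], [21.5, 42], [22.0, 41],
                     [30.0, 70], [30.5, 72], [29.8, 69],
                     [45.0, 10]])  # last row: a likely outlier

kmeans_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(readings)
dbscan_labels = DBSCAN(eps=3.0, min_samples=2).fit_predict(readings)  # -1 marks noise

print("k-means:", kmeans_labels)
print("DBSCAN :", dbscan_labels)
```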

Keywords: clustering, data mining, DBSCAN, k-means, k-medoids, sensor data

Procedia PDF Downloads 371
20326 Statistical Model to Examine the Impact of the Inflation Rate and Real Interest Rate on the Bahrain Economy

Authors: Ghada Abo-Zaid

Abstract:

Introduction: Oil is one of the main sources of income in Bahrain, and low oil prices influence economic growth and the investment rate in Bahrain. For example, economic growth was 3.7% in 2012 and fell to 2.9% in 2015. The investment rate was 9.8% in 2012 and dropped to 5.9% and -12.1% in 2014 and 2015, respectively. The inflation rate peaked in 2013 at 3.3%. Objectives: The objective is to build statistical models to examine the effect of the real interest rate and the inflation rate on economic growth in Bahrain from 2000 to 2018. Methods: This study is based on 18 years of data, and a multiple regression model is used for the analysis. All missing data are omitted from the analysis. Results: A regression model is used to examine the association between Gross National Product (GNP), the inflation rate, and the real interest rate. We found that (i) an increase in the real interest rate decreases GNP; (ii) an increase in the inflation rate does not affect economic growth in Bahrain, since the average inflation rate was almost 2%, which is considered low. Conclusion: There is an impact of the real interest rate on GNP in Bahrain, while the inflation rate does not show any negative influence on GNP, as it was not large enough to affect the economic growth rate in Bahrain negatively.
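
A minimal sketch of the multiple regression described above, using statsmodels; the yearly figures below are invented placeholders, not the study's actual data.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical yearly observations (illustrative only)
gnp_growth = np.array([3.7, 4.5, 5.4, 3.2, 2.9, 3.5])   # % growth
inflation  = np.array([2.0, 2.8, 3.3, 2.7, 1.8, 2.1])   # % per year
real_rate  = np.array([1.5, 1.2, 0.9, 1.8, 2.4, 2.0])   # % per year

X = sm.add_constant(np.column_stack([inflation, real_rate]))
model = sm.OLS(gnp_growth, X).fit()
print(model.params)    # intercept, inflation coefficient, real-interest-rate coefficient
print(model.pvalues)   # significance of each regressor
```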

Keywords: gross national product, Bahrain, regression model, interest rate

Procedia PDF Downloads 159
20325 Multimodal Employee Attendance Management System

Authors: Khaled Mohammed

Abstract:

This paper presents novel face recognition and identification approaches for the real-time attendance management problem in large companies/factories and government institutions. The proposed system uses the Minimum Ratio (MR) approach for employee identification. Capturing the authentic face variability from a sequence of video frames has been considered for the recognition of faces and resulted in system robustness against the variability of facial features. Experimental results indicated an improvement in the performance of the proposed system compared to previous approaches at a rate between 2% and 5%. In addition, it halved the processing time compared with previous techniques, such as the Extreme Learning Machine (ELM) and the Multi-Scale Structural Similarity index (MS-SSIM). Finally, it achieved an accuracy of 99%.

Keywords: attendance management system, face detection and recognition, live face recognition, minimum ratio

Procedia PDF Downloads 152
20324 An Enhanced SAR-Based Tsunami Detection System

Authors: Jean-Pierre Dubois, Jihad S. Daba, H. Karam, J. Abdallah

Abstract:

Tsunami early detection and warning systems have proved to be of utmost importance, especially after the destructive tsunami that hit Japan in March 2011. Such systems are crucial to inform the authorities of any risk of a tsunami and of the degree of its danger, in order to make the right decision and notify the public of the actions they need to take to save their lives. The purpose of this research is to enhance existing tsunami detection and warning systems. We first propose an automated and miniaturized model of an early tsunami detection and warning system. The operation of the tsunami warning system is simulated using the data acquisition toolbox of Matlab and measurements acquired from specified internet pages, due to the lack of the required real-life seismic and hydrologic sensors, and a graphical user interface is built for the system. In the second phase of this work, we implement various satellite image filtering schemes to enhance the acquired synthetic aperture radar images of the tsunami-affected region that are masked by speckle noise. This enables us to conduct a post-tsunami damage extent study and calculate the percentage damage. We conclude by proposing improvements to the telecommunication infrastructure of existing tsunami warning systems through a migration to IP-based networks and fiber optic links.
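
The speckle-filtering step can be sketched with SciPy's adaptive Wiener filter; the synthetic "SAR-like" image and the gamma-distributed speckle below are illustrative assumptions, not the study's actual imagery.

```python
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(0)

# Hypothetical SAR-like intensity image corrupted by multiplicative speckle
clean = np.ones((128, 128))
clean[40:90, 40:90] = 4.0                      # a bright "affected region"
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)

filtered = wiener(speckled, mysize=(5, 5))     # adaptive Wiener filtering

print("noisy    std in flat area:", speckled[:30, :30].std().round(3))
print("filtered std in flat area:", filtered[:30, :30].std().round(3))
```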

Keywords: detection, GIS, GSN, GTS, GPS, speckle noise, synthetic aperture radar, tsunami, wiener filter

Procedia PDF Downloads 387
20323 Real Time Classification of Political Tendency of Twitter Spanish Users based on Sentiment Analysis

Authors: Marc Solé, Francesc Giné, Magda Valls, Nina Bijedic

Abstract:

What people say on social media has turned into a rich source of information for understanding social behavior. Specifically, the growing use of the Twitter social media platform for political communication has created great opportunities to know the opinion of large numbers of politically active individuals in real time and to predict the global political tendencies of a specific country. This has led to an increasing body of research on the topic. The majority of these studies have focused on polarized political contexts characterized by only two alternatives. Unlike them, this paper tackles the challenge of forecasting Spanish political trends, characterized by multiple political parties, by analyzing Twitter users' political tendency. Accordingly, a new strategy, named the Tweets Analysis Strategy (TAS), is proposed. It is based on analyzing users' tweets by discovering their sentiment (positive, negative or neutral) and classifying them according to the political party they support. From this individual political tendency, the global political prediction for each political party is calculated. To do this, two different strategies for performing the sentiment analysis are proposed: one is based on Positive and Negative words Matching (PNM) and the second is based on a Neural Networks Strategy (NNS). The complete TAS strategy has been run in a Big-Data environment. The experimental results presented in this paper reveal that the NNS strategy performs much better than the PNM strategy in analyzing tweet sentiment. In addition, this research analyzes the viability of the TAS strategy to obtain the global trend in a political context made up of multiple parties with an error lower than 23%.
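
The PNM idea is simple enough to show as a sketch: count lexicon hits per tweet and compare. The tiny Spanish word lists below are made-up placeholders; a real system would use curated sentiment lexicons.

```python
# Hypothetical positive/negative lexicons (illustrative only)
POSITIVE = {"apoyo", "excelente", "ganar", "mejor"}
NEGATIVE = {"corrupto", "fracaso", "peor", "mentira"}

def pnm_sentiment(tweet: str) -> str:
    """Positive and Negative words Matching: count lexicon hits and compare."""
    tokens = tweet.lower().split()
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(pnm_sentiment("excelente discurso, van a ganar"))     # positive
print(pnm_sentiment("otro fracaso del gobierno corrupto"))  # negative
```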

Keywords: political tendency, prediction, sentiment analysis, Twitter

Procedia PDF Downloads 232
20322 Clinical Training Simulation Experience of Medical Sector Students

Authors: Tahsien Mohamed Okasha

Abstract:

Simulation is one of the emerging educational strategies that depend on the creation of scenarios to imitate what could happen in real life. During COVID, we faced big obstacles in medical education, especially in the clinical part and how to deliver it, and simulation was the golden key. Simulation is a very important educational tool for medical sector students, as it creates a safe, adaptable, quiet environment with a lower anxiety level in which students can practice and make repeated attempts at their competencies. That impacts the level of practice, achievement, and the way of acting in real situations and experiences. A blind random sample of students from different specialties and colleges who came and finished their training in an integrated environment was collected and tested, and the responses were graded from 1 to 5. The results revealed that 77% of the studied subjects agreed that dealing and interacting with different medical sector candidates in the same place was beneficial, 77% agreed that simulation challenged their thinking and decision-making skills, 75% agreed that using high-fidelity manikins was helpful, and 76% agreed that working in a safe, prepared environment is helpful for realistic situations.

Keywords: simulation, clinical training, education, medical sector students

Procedia PDF Downloads 23
20321 Dimension Free Rigid Point Set Registration in Linear Time

Authors: Jianqin Qu

Abstract:

This paper proposes a rigid point set matching algorithm in arbitrary dimensions based on the idea of symmetric covariant function. A group of functions of the points in the set are formulated using rigid invariants. Each of these functions computes a pair of correspondence from the given point set. Then the computed correspondences are used to recover the unknown rigid transform parameters. Each computed point can be geometrically interpreted as the weighted mean center of the point set. The algorithm is compact, fast, and dimension free without any optimization process. It either computes the desired transform for noiseless data in linear time, or fails quickly in exceptional cases. Experimental results for synthetic data and 2D/3D real data are provided, which demonstrate potential applications of the algorithm to a wide range of problems.
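
For comparison, the classic least-squares rigid alignment with known correspondences (the Kabsch/SVD solution) can be written in a few lines; this is shown only as a reference baseline and is not the covariant-function algorithm proposed in the paper.

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rigid transform (R, t) with R @ P_i + t ≈ Q_i.

    Works in any dimension d, with P and Q of shape (n, d) and known
    correspondences; requires an SVD, unlike the paper's method.
    """
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    D = np.eye(P.shape[1])
    D[-1, -1] = np.sign(np.linalg.det(Vt.T @ U.T))  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp

P = np.random.default_rng(1).normal(size=(20, 3))
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])
R_est, t_est = rigid_align(P, Q)
print(np.allclose(R_est, R_true), t_est)
```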

Keywords: covariant point, point matching, dimension free, rigid registration

Procedia PDF Downloads 165
20320 Time and Cost Prediction Models for Language Classification Over a Large Corpus on Spark

Authors: Jairson Barbosa Rodrigues, Paulo Romero Martins Maciel, Germano Crispim Vasconcelos

Abstract:

This paper presents an investigation of the performance impacts of varying five factors (input data size, node number, cores, memory, and disks) when applying a distributed implementation of Naïve Bayes for text classification of a large corpus on the Spark big data processing framework. Problem: The algorithm's performance depends on multiple factors, and knowing the effects of each factor beforehand becomes especially critical as hardware is priced by time slice in cloud environments. Objectives: To explain the functional relationship between factors and performance and to develop linear predictor models for time and cost. Methods: The solid statistical principles of Design of Experiments (DoE), particularly the randomized two-level fractional factorial design with replications. This research involved 48 real clusters with different hardware arrangements. The metrics were analyzed using linear models for screening, ranking, and measurement of each factor's impact. Results: Our findings include prediction models and show some non-intuitive results about the small influence of cores and the neutrality of memory and disks on total execution time, and the non-significant impact of data input scale on costs, although it notably impacts the execution time.

Keywords: big data, design of experiments, distributed machine learning, natural language processing, spark

Procedia PDF Downloads 111
20319 The Vision Based Parallel Robot Control

Authors: Sun Lim, Kyun Jung

Abstract:

In this paper, we describe the control strategy of a high-speed parallel robot system with an EtherCAT network. This work deals with a parallel robot system with centralized control on a real-time operating system such as Windows with TwinCAT3. Most of the control scheme and algorithm is implemented on the master platform on the PC, while the input and output interfaces are ported to the slave side. The data are transferred within a maximum of 20 microseconds for 1000 bytes. EtherCAT is a very high-speed and stable industrial network, and the control strategy built on it is useful and robust in an Ethernet network environment. The developed parallel robot is controlled by a pre-designed nonlinear controller for a 6G/0.43 cycle time of pick-and-place motion tracking. The experiment shows the good design and validation of the controller.

Keywords: parallel robot control, etherCAT, nonlinear control, parallel robot inverse kinematic

Procedia PDF Downloads 564
20318 Impact of Audit Committee on Real Earnings Management: Cases of Netherlands

Authors: Sana Masmoudi Mardassi, Yosra Makni Fourati

Abstract:

Regulators highlight the importance of the Audit Committee (AC) as a key internal corporate governance mechanism. One of the most important roles of this committee is to oversee the financial reporting process. The purpose of this paper is to examine the link between the characteristics of an audit committee and financial reporting quality by investigating whether the characteristics of audit committees are associated with improved financial reporting quality, especially Real Earnings Management. In the current study, panel data from 80 nonfinancial companies listed on the Amsterdam Stock Exchange during the period between 2010 and 2017 were used. To measure audit committee characteristics, four proxies have been used: audit committee independence, financial expertise, gender diversity, and AC meetings. For this research, a linear regression model was used to identify the influence of this set of audit committee characteristics on real earnings management after controlling for audit committee size, leverage, firm size, loss, growth, and board size. This research provides empirical evidence of the association between audit committee independence, financial expertise, gender diversity, and meetings and Real Earnings Management (REM) as a proxy of financial reporting quality. The study finds that independence and AC gender diversity are strongly related to financial reporting quality; in fact, these two characteristics constrain REM. The results also suggest that AC financial expertise reduces, to some extent, the likelihood of engaging in REM. These conclusions provide support for the audit committee requirements under the Dutch Corporate Governance Code rules regarding gender diversity and AC meetings.

Keywords: audit committee, financial expertise, independence, real earnings management

Procedia PDF Downloads 161
20317 Enhancing Financial Security: Real-Time Anomaly Detection in Financial Transactions Using Machine Learning

Authors: Ali Kazemi

Abstract:

The digital evolution of financial services, while offering unprecedented convenience and accessibility, has also escalated the vulnerabilities to fraudulent activities. In this study, we introduce a distinct approach to real-time anomaly detection in financial transactions, aiming to fortify the defenses of banking and financial institutions against such threats. Utilizing unsupervised machine learning algorithms, specifically autoencoders and isolation forests, our research focuses on identifying irregular patterns indicative of fraud within transactional data, thus enabling immediate action to prevent financial loss. The data we used in this study included the monetary value of each transaction, a crucial feature since fraudulent transactions may follow different amount distributions than legitimate ones; timestamps indicating when transactions occurred, since analyzing transactions' temporal patterns can reveal anomalies (e.g., unusual activity in the middle of the night); the sector or category of the merchant where the transaction occurred, such as retail, groceries, or online services, since specific categories may be more prone to fraud; and the type of payment used (e.g., credit, debit, online payment systems), since different payment methods have varying risk levels associated with fraud. This dataset, anonymized to ensure privacy, reflects a wide array of transactions typical of a global banking institution, ranging from small-scale retail purchases to large wire transfers, embodying the diverse nature of potentially fraudulent activities. By engineering features that capture the essence of transactions, including normalized amounts and encoded categorical variables, we tailor our data to enhance model sensitivity to anomalies. The autoencoder model leverages its reconstruction error mechanism to flag transactions that deviate significantly from the learned normal pattern, while the isolation forest identifies anomalies based on their susceptibility to isolation from the dataset's majority. Our experimental results, validated through techniques such as k-fold cross-validation, are evaluated using precision, recall, and the F1 score alongside the area under the receiver operating characteristic (ROC) curve. Our models achieved an F1 score of 0.85 and a ROC AUC of 0.93, indicating high accuracy in detecting fraudulent transactions without excessive false positives. This study contributes to the academic discourse on financial fraud detection and provides a practical framework for banking institutions seeking to implement real-time anomaly detection systems. By demonstrating the effectiveness of unsupervised learning techniques in a real-world context, our research offers a pathway to significantly reduce the incidence of financial fraud, thereby enhancing the security and trustworthiness of digital financial services.
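
The isolation-forest side of such a pipeline can be sketched in a few lines with scikit-learn; the engineered features, the synthetic legitimate/fraudulent split, and the contamination setting below are illustrative assumptions, not the study's dataset or tuning.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Hypothetical engineered features: [normalized amount, hour of day, merchant-category code]
legit = np.column_stack([rng.normal(0.0, 1.0, 980),
                         rng.integers(8, 22, 980),
                         rng.integers(0, 5, 980)])
fraud = np.column_stack([rng.normal(6.0, 1.0, 20),     # unusually large amounts
                         rng.integers(0, 5, 20),       # middle-of-the-night activity
                         rng.integers(0, 5, 20)])
X = np.vstack([legit, fraud])

clf = IsolationForest(contamination=0.02, random_state=0).fit(X)
flags = clf.predict(X)              # -1 = anomaly, 1 = normal
print("flagged transactions:", int((flags == -1).sum()))
```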

Keywords: anomaly detection, financial fraud, machine learning, autoencoders, isolation forest, transactional data analysis

Procedia PDF Downloads 54
20316 Social Networks Global Impact on Protest Movements and Human Rights Activism

Authors: Marcya Burden, Savonna Greer

Abstract:

In the wake of social unrest around the world, protest movements have been captured like never before. As protest movements have evolved, so too have their visibility and sources of coverage. Long gone are the days when print media was our only glimpse into the action surrounding a protest. Now, with social networks such as Facebook, Instagram, and Snapchat, we have access to real-time video footage of protest movements and human rights activism that can reach millions of people within seconds. This research paper investigated various social media platforms' statistical usage data in the areas of human rights activism and protest movements, paralleling them with past forms of media coverage. This research demonstrates that social networks are extremely important to protest movements and human rights activism. With over 2.9 billion users across social media networks globally, these platforms are at the heart of most recent protests and human rights activism. This research shows the paradigm shift from the Selma March of 1965 to the more recent protests of Ferguson in 2014, Ni Una Menos in 2015, and End Sars in 2018. The research findings demonstrate that today, almost anyone may use their social networks to become a protest movement leader or human rights activist. From a student to an 80-year-old professor, the possibility of reaching billions of people all over the world is limitless. Findings show that 82% of the world's internet population is on social networks, and 1 in every 5 minutes spent online is spent on them. Over 65% of Americans believe social media highlights important issues. Thus, there is no need to have a formalized group of people or even be known online. A person simply needs to be engaged on their respective social networks (Facebook, Twitter, Instagram, Snapchat) regarding any cause they are passionate about. Information may be exchanged in real time around the world, and a successful protest can begin.

Keywords: activism, protests, human rights, networks

Procedia PDF Downloads 91
20315 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings

Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures, and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Different literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers. Occurrences of data gaps have also not been given an adequate span of attention in academia. The proposed methodology comprises automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are considered to be regular time series; however, in reality, sensor values are not uniformly sampled. The issue to solve is therefore: beyond which delay should each sensor be considered faulty? The use of time series is required for the detection of abnormal delays. The efficiency of the method is evaluated on measurements obtained from a real platform: an office at the Grenoble Institute of Technology equipped with 30 sensors.
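
A minimal sketch of the idea, assuming pandas and made-up timestamps: compute the inter-sample delays of one sensor's series and flag gaps whose delay exceeds an automatically derived threshold. The "median times three" rule below is an assumption for illustration, not the paper's exact thresholding method.

```python
import pandas as pd

# Hypothetical irregular sensor samples (timestamp, value)
samples = pd.DataFrame(
    {"value": [21.2, 21.3, 21.1, 21.4, 21.2]},
    index=pd.to_datetime(["2023-01-01 10:00", "2023-01-01 10:10",
                          "2023-01-01 10:21", "2023-01-01 12:05",
                          "2023-01-01 12:15"]),
)

delays = samples.index.to_series().diff().dropna()
# Automatic threshold: median inter-sample delay times a tolerance factor
threshold = delays.median() * 3
gaps = delays[delays > threshold]

print("estimated threshold:", threshold)
print("detected data gaps:\n", gaps)
```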

Keywords: building system, time series, diagnosis, outliers, delay, data gap

Procedia PDF Downloads 240
20314 Reducing the Computational Overhead of Metaheuristics Parameterization with Exploratory Landscape Analysis

Authors: Iannick Gagnon, Alain April

Abstract:

The performance of a metaheuristic on a given problem class depends on the class itself and the choice of parameters. Parameter tuning is the most time-consuming phase of the optimization process after the main calculations and it often nullifies the speed advantage of metaheuristics over traditional optimization algorithms. Several off-the-shelf parameter tuning algorithms are available, but when the objective function is expensive to evaluate, these can be prohibitively expensive to use. This paper presents a surrogate-like method for finding adequate parameters using fitness landscape analysis on simple benchmark functions and real-world objective functions. The result is a simple compound similarity metric based on the empirical correlation coefficient and a measure of convexity. It is then used to find the best benchmark functions to serve as surrogates. The near-optimal parameter set is then found using fractional factorial design. The real-world problem of NACA airfoil lift coefficient maximization is used as a preliminary proof of concept. The overall aim of this research is to reduce the computational overhead of metaheuristics parameterization.
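
The compound similarity metric can be sketched roughly as follows: sample both objective functions at the same random points, take the empirical correlation of the values, and combine it with a crude convexity score. The weighting and the convexity estimate below are assumptions made for illustration, not the paper's exact metric.

```python
import numpy as np

def similarity(f, g, lo, hi, n=200, seed=0):
    """Rough compound similarity between two objective functions on [lo, hi]^d."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, size=(n, len(lo)))
    Y = rng.uniform(lo, hi, size=(n, len(lo)))
    corr = np.corrcoef([f(x) for x in X], [g(x) for x in X])[0, 1]

    def convexity(h):
        # Fraction of random midpoint tests satisfying the convexity inequality
        mid = (X + Y) / 2
        return np.mean([h(m) <= (h(a) + h(b)) / 2 for m, a, b in zip(mid, X, Y)])

    return 0.5 * abs(corr) + 0.5 * (1 - abs(convexity(f) - convexity(g)))

sphere = lambda x: float(np.sum(x**2))
rastrigin = lambda x: float(10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))
lo, hi = np.array([-5.0, -5.0]), np.array([5.0, 5.0])
print(similarity(sphere, rastrigin, lo, hi))
```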

Keywords: metaheuristics, stochastic optimization, particle swarm optimization, exploratory landscape analysis

Procedia PDF Downloads 149
20313 Low Cost LiDAR-GNSS-UAV Technology Development for PT Garam’s Three Dimensional Stockpile Modeling Needs

Authors: Mohkammad Nur Cahyadi, Imam Wahyu Farid, Ronny Mardianto, Agung Budi Cahyono, Eko Yuli Handoko, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan

Abstract:

Unmanned aerial vehicle (UAV) technology has cost-efficiency and data-retrieval-time advantages. Technologies such as UAV, GNSS, and LiDAR are combined here into an integrated system designed to cover each other's deficiencies. This integration aims to increase the accuracy of calculating the volume of the land stockpile of PT. Garam (a salt company). UAV applications are used to obtain geometric data and capture textures that characterize the structure of objects. This study uses the Tarot 650 Iron Man drone with four propellers, which can fly for 15 minutes. Image acquisitions processed in the software can be classified utilizing photogrammetry principles and Structure from Motion point cloud technology. LiDAR can perform data acquisition that enables the creation of point clouds, three-dimensional models, Digital Surface Models, contours, and orthomosaics with high accuracy. LiDAR has a drawback in the form of coordinate data positions that have only local references. Therefore, the researchers use GNSS, LiDAR, and drone multi-sensor technology to map the stockpile of salt on open land and in warehouses, carried out twice a year by PT. Garam, where the previous process used terrestrial methods and manual calculations with sacks. Research with LiDAR needs to be combined with UAV to overcome data acquisition limitations, because a ground-based scan only passes along the right and left sides of the object, particularly when applied to a salt stockpile. The UAV is flown to assist data acquisition with wide coverage, with the help of the integrated 200-gram LiDAR system, so that the flying angle taken can be optimal during the flight process. Using LiDAR for low-cost mapping surveys will make it easier for surveyors and academics to obtain fairly accurate data at a more economical price. As a survey tool, this LiDAR is in the low-price range, around 999 USD, and the device can produce detailed data. Therefore, to minimize the operational costs of using LiDAR, surveyors can use the low-cost LiDAR, GNSS, and UAV at a price of around 638 USD. The data generated by this sensor take the form of a three-dimensional visualization of an object's shape. This study aims to combine low-cost GPS measurements with low-cost LiDAR, processed using free user software. The low-cost GPS generates position data in the form of latitude and longitude coordinates, providing X, Y, and Z values to help georeference the detected object. This research also produces LiDAR data that capture objects, including the height of the entire environment at that location. The results obtained are calibrated with pitch, roll, and yaw to get the vertical height of the existing contours. This study conducted an experimental process on the roof of a building with a radius of approximately 30 meters.
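
Once a georeferenced point cloud exists, the stockpile volume calculation can be approximated with a simple grid method; the sketch below is only an illustration of that workflow on a made-up cone-shaped pile, not PT. Garam's production procedure.

```python
import numpy as np

def stockpile_volume(points, cell=0.5, base_z=0.0):
    """Estimate stockpile volume from a point cloud (N x 3, meters).

    Bins points into a horizontal grid, takes the maximum height per cell,
    and sums (height - base) * cell_area.
    """
    xy = ((points[:, :2] - points[:, :2].min(axis=0)) // cell).astype(int)
    heights = {}
    for (ix, iy), z in zip(map(tuple, xy), points[:, 2]):
        heights[(ix, iy)] = max(heights.get((ix, iy), base_z), z)
    return sum(max(z - base_z, 0.0) for z in heights.values()) * cell * cell

# Hypothetical cone-shaped salt pile sampled as a point cloud (radius 10 m, height 5 m)
rng = np.random.default_rng(0)
r, theta = 10 * np.sqrt(rng.random(5000)), 2 * np.pi * rng.random(5000)
pts = np.column_stack([r * np.cos(theta), r * np.sin(theta), 5 * (1 - r / 10)])
print("estimated volume (m^3):", round(stockpile_volume(pts), 1))
```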

Keywords: LiDAR, unmanned aerial vehicle, low-cost GNSS, contour

Procedia PDF Downloads 87
20312 Towards Printed Green Time-Temperature Indicator

Authors: Mariia Zhuldybina, Ahmed Moulay, Mirko Torres, Mike Rozel, Ngoc-Duc Trinh, Chloé Bois

Abstract:

To reduce the global waste of perishable goods, a solution for monitoring and tracing their environmental conditions is needed. Temperature is the most controllable environmental parameter determining the kinetics of physical, chemical, and microbial spoilage in food products. To store the time-temperature information, a time-temperature indicator (TTI) is a promising solution. Printed electronics (PE) has shown great potential to produce customized electronic devices using flexible substrates and inks with different functionalities. We propose to fabricate a hybrid printed TTI using environmentally friendly materials. The real-time TTI profile can be stored and transmitted to a smartphone via Near Field Communication (NFC). To ensure environmental performance, the Canadian Green Electronics NSERC Network is developing green materials for the formulation of inks with different functionalities. In terms of substrates, paper-based electronics has gained great interest for use in a wide range of electronic systems because of its low setup and methodology costs, as well as its eco-friendly fabrication technologies. The main objective is to deliver a prototype of the TTI using small-scale printing techniques under typical printing conditions. All sub-components of the smart label, including a memristor, a battery, an antenna compatible with the NFC protocol, and a circuit compatible with integration performed by an offsite supplier, will be fully printed with flexography or flat-bed screen printing.

Keywords: NFC, printed electronics, time-temperature indicator, hybrid electronics

Procedia PDF Downloads 161
20311 Security Issues on Smart Grid and Blockchain-Based Secure Smart Energy Management Systems

Authors: Surah Aldakhl, Dafer Alali, Mohamed Zohdy

Abstract:

The next generation of electricity grid infrastructure, known as the "smart grid," integrates smart ICT (information and communication technology) into existing grids in order to alleviate the drawbacks of existing one-way grid systems. Future power systems' efficiency and dependability are anticipated to significantly increase thanks to the Smart Grid, especially given the desire for renewable energy sources. The security of the Smart Grid's cyber infrastructure is a growing concern, though, as a result of the interconnection of significant power plants through communication networks. Since cyber-attacks can destroy energy data, beginning with personal information leaking from grid members, they can result in serious incidents like huge outages and the destruction of power network infrastructure. We shall thus propose a secure smart energy management system based on the Blockchain as a remedy for this problem. The power transmission and distribution system may undergo a transformation as a result of the inclusion of optical fiber sensors and blockchain technology in smart grids. While optical fiber sensors allow real-time monitoring and management of electrical energy flow, Blockchain offers a secure platform to safeguard the smart grid against cyberattacks and unauthorized access. Additionally, this integration makes it possible to see how energy is produced, distributed, and used in real time, increasing transparency. This strategy has advantages in terms of improved security, efficiency, dependability, and flexibility in energy management. An in-depth analysis of the advantages and drawbacks of combining blockchain technology with optical fiber is provided in this paper.

Keywords: smart grids, blockchain, fiber optic sensor, security

Procedia PDF Downloads 111
20310 The Positive Effects of Processing Instruction on the Acquisition of French as a Second Language: An Eye-Tracking Study

Authors: Cecile Laval, Harriet Lowe

Abstract:

Processing Instruction is a psycholinguistic pedagogical approach drawing insights from the Input Processing Model which establishes the initial innate strategies used by second language learners to connect form and meaning of linguistic features. With the ever-growing use of technology in Second Language Acquisition research, the present study uses eye-tracking to measure the effectiveness of Processing Instruction in the acquisition of French and its effects on learner’s cognitive strategies. The experiment was designed using a TOBII Pro-TX300 eye-tracker to measure participants’ default strategies when processing French linguistic input and any cognitive changes after receiving Processing Instruction treatment. Participants were drawn from lower intermediate adult learners of French at the University of Greenwich and randomly assigned to two groups. The study used a pre-test/post-test methodology. The pre-tests (one per linguistic item) were administered via the eye-tracker to both groups one week prior to instructional treatment. One group received full Processing Instruction treatment (explicit information on the grammatical item and on the processing strategies, and structured input activities) on the primary target linguistic feature (French past tense imperfective aspect). The second group received Processing Instruction treatment except the explicit information on the processing strategies. Three immediate post-tests on the three grammatical structures under investigation (French past tense imperfective aspect, French Subjunctive used for the expression of doubt, and the French causative construction with Faire) were administered with the eye-tracker. The eye-tracking data showed the positive change in learners’ processing of the French target features after instruction with improvement in the interpretation of the three linguistic features under investigation. 100% of participants in both groups made a statistically significant improvement (p=0.001) in the interpretation of the primary target feature (French past tense imperfective aspect) after treatment. 62.5% of participants made an improvement in the secondary target item (French Subjunctive used for the expression of doubt) and 37.5% of participants made an improvement in the cumulative target feature (French causative construction with Faire). Statistically there was no significant difference between the pre-test and post-test scores in the cumulative target feature; however, the variance approximately tripled between the pre-test and the post-test (3.9 pre-test and 9.6 post-test). This suggests that the treatment does not affect participants homogenously and implies a role for individual differences in the transfer-of-training effect of Processing Instruction. The use of eye-tracking provides an opportunity for the study of unconscious processing decisions made during moment-by-moment comprehension. The visual data from the eye-tracking demonstrates changes in participants’ processing strategies. Gaze plots from pre- and post-tests display participants fixation points changing from focusing on content words to focusing on the verb ending. This change in processing strategies can be clearly seen in the interpretation of sentences in both primary and secondary target features. This paper will present the research methodology, design and results of the experimental study using eye-tracking to investigate the primary effects and transfer-of-training effects of Processing Instruction. 
It will then provide evidence of the cognitive benefits of Processing Instruction in Second Language Acquisition and offer suggestions for the teaching of grammar in a second language.

Keywords: eye-tracking, language teaching, processing instruction, second language acquisition

Procedia PDF Downloads 278
20309 Optical Properties of TlInSe₂ Single Crystals

Authors: Gulshan Mammadova

Abstract:

This paper presents the results of studying the surface microrelief in 2D and 3D models and analyzing the spectroscopy of the ternary compound TlInSe₂. Analysis of the results obtained showed that with a change in the composition of the TlInSe₂ crystal, sharp changes occur in the microrelief of its surface. An X-ray diffraction analysis of the TlInSe₂ crystal was carried out experimentally. Based on ellipsometric data, the optical functions were determined: the real and imaginary parts of the dielectric permittivity of the crystals, the coefficients of optical absorption and reflection, the dependence of energy losses and electric field power on the effective density, and the spectral dependences of the real (σᵣ) and imaginary (σᵢ) parts of the optical electrical conductivity were experimentally studied. The fluorescence spectra of the ternary compound TlInSe₂ were recorded and analyzed under excitation by light with a wavelength of 532 nm. X-ray studies of TlInSe₂ showed that this phase crystallizes in a tetragonal system. Ellipsometric measurements showed that the real (ε₁) and imaginary (ε₂) parts of the dielectric constant are components of the dielectric constant tensor of the uniaxial compounds under consideration and do not depend on the angle. Analysis of the dependence of the real and imaginary parts of the refractive index of the TlInSe₂ crystal on photon energy showed that the nature of the change in the real and imaginary parts of the dielectric constant does not differ significantly. When analyzing the spectral dependences of the real (σᵣ) and imaginary (σᵢ) parts of the optical electrical conductivity, it was noticed that the real part increases exponentially in the energy range 0.894-3.505 eV. In the energy range of 0.654-2.91 eV, the imaginary part of the optical electrical conductivity increases linearly, reaches a maximum value, and decreases above 2.91 eV. At 3.6 eV, an inversion of the imaginary part of the optical electrical conductivity of the TlInSe₂ compound is observed. The graphs of the effective power density versus electric field energy losses show that the effective power density increases significantly in the energy range of 0.805-3.52 eV. The fluorescence spectrum of the ternary compound TlInSe₂ under excitation with light at a wavelength of 532 nm has been studied, and it has been established that this phase has luminescent properties.

Keywords: optical properties, dielectric permittivity, real and imaginary dielectric permittivity, optical electrical conductivity

Procedia PDF Downloads 60
20308 Irradion: Portable Small Animal Imaging and Irradiation Unit

Authors: Josef Uher, Jana Boháčová, Richard Kadeřábek

Abstract:

In this paper, we present a multi-robot imaging and irradiation research platform referred to as Irradion, with full capabilities of portable arbitrary path computed tomography (CT). Irradion is an imaging and irradiation unit entirely based on robotic arms for research on cancer treatment with ion beams on small animals (mice or rats). The platform comprises two subsystems that combine several imaging modalities, such as 2D X-ray imaging, CT, and particle tracking, with precise positioning of a small animal for imaging and irradiation. Computed Tomography: The CT subsystem of the Irradion platform is equipped with two 6-joint robotic arms that position a photon counting detector and an X-ray tube independently and freely around the scanned specimen and allow image acquisition utilizing computed tomography. Irradiation measures nearly all conventional 2D and 3D trajectories of X-ray imaging with precisely calibrated and repeatable geometrical accuracy leading to a spatial resolution of up to 50 µm. In addition, the photon counting detectors allow X-ray photon energy discrimination, which can suppress scattered radiation, thus improving image contrast. It can also measure absorption spectra and recognize different materials (tissue) types. X-ray video recording and real-time imaging options can be applied for studies of dynamic processes, including in vivo specimens. Moreover, Irradion opens the door to exploring new 2D and 3D X-ray imaging approaches. We demonstrate in this publication various novel scan trajectories and their benefits. Proton Imaging and Particle Tracking: The Irradion platform allows combining several imaging modules with any required number of robots. The proton tracking module comprises another two robots, each holding particle tracking detectors with position, energy, and time-sensitive sensors Timepix3. Timepix3 detectors can track particles entering and exiting the specimen and allow accurate guiding of photon/ion beams for irradiation. In addition, quantifying the energy losses before and after the specimen brings essential information for precise irradiation planning and verification. Work on the small animal research platform Irradion involved advanced software and hardware development that will offer researchers a novel way to investigate new approaches in (i) radiotherapy, (ii) spectral CT, (iii) arbitrary path CT, (iv) particle tracking. The robotic platform for imaging and radiation research developed for the project is an entirely new product on the market. Preclinical research systems with precision robotic irradiation with photon/ion beams combined with multimodality high-resolution imaging do not exist currently. The researched technology can potentially cause a significant leap forward compared to the current, first-generation primary devices.

Keywords: arbitrary path CT, robotic CT, modular, multi-robot, small animal imaging

Procedia PDF Downloads 85
20307 A Step Towards Automating the Synthesis of a Scene Script

Authors: Americo Pereira, Ricardo Carvalho, Pedro Carvalho, Luis Corte-Real

Abstract:

Generating 3D content is a task mostly done by hand. It requires specific knowledge not only of how to use the tools for the task but also of the fundamentals of a 3D environment. In this work, we show that automatic generation of content can be achieved from a scene script by leveraging existing tools, so that non-experts can easily engage in 3D content generation without spending vast amounts of time exploring and learning how to use specific tools. This proposal carries several benefits, including flexible scene synthesis with different levels of detail. Our preliminary results show that the automatically generated content is comparable to content generated by users with little experience in 3D modeling, while vastly reducing the amount of time required for the generation, and it adds support for implementing flexible scenarios for scene visualization.

Keywords: 3D virtualization, multimedia, scene script, synthesis

Procedia PDF Downloads 260
20306 Development of Web Application for Warehouse Management System: A Case Study of Ceramics Factory

Authors: Thanaphat Suwanaklang, Supaporn Suwannarongsri

Abstract:

Presently, there are many industries in Thailand producing various products for both domestic distribution and export to foreign countries. The warehouse is one of the most important areas for businesses needing to store their products. Such businesses need a suitable warehouse management system to reduce storage time and use the available space as efficiently as possible. This paper proposes the development of a web application for a warehouse management system. One of the ceramics factories in Thailand is used as a case study. By applying ABC analysis, fixed-location and commodity-system storage, ECRS, and 7-waste theories and principles, the web application for the warehouse management system of the selected ceramics factory is developed to design the optimal storage areas for groups of products and the optimal routes for forklifts. Experimental results show that the warehouse management system developed as a web application can reduce the travel distance of forklifts and the time spent searching for storage areas by 100% compared with the conventional method. In addition, the entire storage area can be monitored online in real time.
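
The ABC analysis step is simple enough to show in a few lines; the sketch below classifies hypothetical products by cumulative annual usage value with pandas. The 80%/95% class boundaries are common defaults used here for illustration, not the factory's actual cut-offs.

```python
import pandas as pd

# Hypothetical annual usage value per product (quantity x unit cost)
items = pd.DataFrame({
    "sku": ["P1", "P2", "P3", "P4", "P5", "P6"],
    "annual_value": [120000, 65000, 30000, 9000, 4000, 2000],
}).sort_values("annual_value", ascending=False)

share = items["annual_value"].cumsum() / items["annual_value"].sum()
items["class"] = pd.cut(share, bins=[0, 0.8, 0.95, 1.0], labels=["A", "B", "C"])
print(items)  # class A items get the most accessible storage locations
```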

Keywords: warehouse management system, warehouse design method, logistics system, web application

Procedia PDF Downloads 132
20305 Face Recognition Using Eigen Faces Algorithm

Authors: Shweta Pinjarkar, Shrutika Yawale, Mayuri Patil, Reshma Adagale

Abstract:

Face recognition is a technique that can be applied to a wide variety of problems such as image and film processing, human-computer interaction, and criminal identification. This has motivated researchers to develop computational models to identify faces that are easy and simple to implement. This paper demonstrates a face recognition system on an Android device using eigenfaces. The system can be used as the base for the development of human identity recognition. Test images and training images are taken directly with the camera of the Android device. The test results showed that the system produces high accuracy. The goal is to implement a model for a particular face and distinguish it from a large number of stored faces. The face recognition system detects faces in pictures taken by a web camera or digital camera, and these images are then checked against a training image dataset based on descriptive features. Furthermore, this algorithm can be extended to recognize a person's facial expressions. Recognition can be carried out under widely varying conditions such as frontal view, scaled frontal view, and subjects with spectacles. The algorithm models real-time varying lighting conditions. The implemented system is able to perform real-time face detection and face recognition, and it can give feedback by showing a window with the subject's information from the database and sending an e-mail notification to interested institutions through the Android application.
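
The eigenface pipeline the abstract relies on can be prototyped quickly on a desktop; the sketch below uses scikit-learn's Olivetti faces dataset and a PCA-plus-nearest-neighbor classifier as a stand-in for the Android implementation (the dataset, component count, and classifier choice are illustrative assumptions, not the authors' setup).

```python
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Olivetti faces stand in for images captured on the device
faces = fetch_olivetti_faces()
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, test_size=0.25, stratify=faces.target, random_state=0)

pca = PCA(n_components=50, whiten=True).fit(X_train)        # the eigenfaces
clf = KNeighborsClassifier(n_neighbors=1).fit(pca.transform(X_train), y_train)
print("accuracy:", clf.score(pca.transform(X_test), y_test))
```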

Keywords: face detection, face recognition, eigen faces, algorithm

Procedia PDF Downloads 355
20304 Bringing Design Science Research Methodology into Real World Applications

Authors: Maya Jaber

Abstract:

In today's ever-changing world, organizational leaders will need to transform their organizations to meet the demands they face from employees, consumers, local and federal governments, and the global market. Change agents and leaders will need a new paradigm of thinking for creative problem solving and innovation in a time of uncertainty. A new framework has been developed through Dr. Jaber's research, drawing on Design Science Research foundations together with holistic design thinking methodologies (HTDM) and action research approaches. It combines these philosophies into a three-step process that can be utilized in practice for any sustainability, change, or project management application. This framework was developed to assist in the pedagogy for implementing her formalized holistic strategy framework, Integral Design Thinking (IDT). Her work focuses on real-world application for streamlining and adopting initiatives into organizational culture transformation. This paper discusses the foundations of this philosophy and the methods for utilizing it in practice developed in Dr. Jaber's research.

Keywords: design science research, action research, critical thinking, design thinking, organizational transformation, sustainability management, organizational culture change

Procedia PDF Downloads 178
20303 Effect of Three Instructional Strategies on Pre-service Teachers’ Learning Outcomes in Practical Chemistry in Niger State, Nigeria

Authors: Akpokiere Ugbede Roseline

Abstract:

Chemistry is an activity-oriented subject in which many students' achievements over the years have not been encouraging. Among the reasons found to be responsible for students' poor performance in chemistry are ineffective teaching strategies. This study, therefore, sought to determine the effect of guided inquiry, guided inquiry with demonstration, and demonstration with a conventional approach on pre-service teachers' cognitive attainment and practical skills acquisition in stoichiometry and chemical reactions in practical chemistry. Two research questions were answered and two hypotheses were tested. The study was quasi-experimental research involving 50 students in each of the experimental groups and 50 students in the control group. Out of the five instruments used for the study, three were stimulus instruments and two were response instruments (Test of Cognitive Attainment and Test of Practical Skills in Chemistry); the data obtained were analyzed with t-tests and Analysis of Variance. Findings revealed, among others, that there was a significant effect of treatments on students' cognitive attainment and on practical skills acquisition. Students exposed to guided inquiry (with or without demonstration) strategies achieved better than those exposed to demonstration with the conventional strategy. It is therefore recommended, among others, that lecturers in Colleges of Education should utilize the guided inquiry strategy for teaching concepts in chemistry.

Keywords: instructional strategy, practical chemistry, learning outcomes, pre-service teachers

Procedia PDF Downloads 100
20302 Time Travel Testing: A Mechanism for Improving Renewal Experience

Authors: Aritra Majumdar

Abstract:

While organizations strive to expand their new customer base, retaining existing relationships is a key aspect of improving overall profitability and also showcasing how successful an organization is in holding on to its customers. It is an experimentally proven fact that the lion’s share of profit always comes from existing customers. Hence seamless management of renewal journeys across different channels goes a long way in improving trust in the brand. From a quality assurance standpoint, time travel testing provides an approach to both business and technology teams to enhance the customer experience when they look to extend their partnership with the organization for a defined phase of time. This whitepaper will focus on key pillars of time travel testing: time travel planning, time travel data preparation, and enterprise automation. Along with that, it will call out some of the best practices and common accelerator implementation ideas which are generic across verticals like healthcare, insurance, etc. In this abstract document, a high-level snapshot of these pillars will be provided. Time Travel Planning: The first step of setting up a time travel testing roadmap is appropriate planning. Planning will include identifying the impacted systems that need to be time traveled backward or forward depending on the business requirement, aligning time travel with other releases, frequency of time travel testing, preparedness for handling renewal issues in production after time travel testing is done and most importantly planning for test automation testing during time travel testing. Time Travel Data Preparation: One of the most complex areas in time travel testing is test data coverage. Aligning test data to cover required customer segments and narrowing it down to multiple offer sequencing based on defined parameters are keys for successful time travel testing. Another aspect is the availability of sufficient data for similar combinations to support activities like defect retesting, regression testing, post-production testing (if required), etc. This section will talk about the necessary steps for suitable data coverage and sufficient data availability from a time travel testing perspective. Enterprise Automation: Time travel testing is never restricted to a single application. The workflow needs to be validated in the downstream applications to ensure consistency across the board. Along with that, the correctness of offers across different digital channels needs to be checked in order to ensure a smooth customer experience. This section will talk about the focus areas of enterprise automation and how automation testing can be leveraged to improve the overall quality without compromising on the project schedule. Along with the above-mentioned items, the white paper will elaborate on the best practices that need to be followed during time travel testing and some ideas pertaining to accelerator implementation. To sum it up, this paper will be written based on the real-time experience author had on time travel testing. While actual customer names and program-related details will not be disclosed, the paper will highlight the key learnings which will help other teams to implement time travel testing successfully.
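
At the unit- or service-test level, the "time travel" itself is often simulated by freezing or shifting the clock rather than editing database dates. The sketch below is only an illustration of that idea, using the assumed freezegun library and a made-up renewal rule, not a description of any particular enterprise setup.

```python
from datetime import date
from freezegun import freeze_time   # assumed available; pip install freezegun

def renewal_offer(policy_end: date, today: date) -> str:
    """Toy renewal rule: the offer opens 30 days before the policy end date."""
    return "renewal_offer_open" if (policy_end - today).days <= 30 else "too_early"

# Time travel the system clock forward without touching any stored data
with freeze_time("2025-06-10"):
    assert renewal_offer(policy_end=date(2025, 6, 30), today=date.today()) == "renewal_offer_open"

with freeze_time("2025-01-15"):
    assert renewal_offer(policy_end=date(2025, 6, 30), today=date.today()) == "too_early"

print("renewal journey behaves correctly at both simulated dates")
```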

Keywords: time travel planning, time travel data preparation, enterprise automation, best practices, accelerator implementation ideas

Procedia PDF Downloads 155