Search results for: digital signal process
17443 Particle Size Distribution Estimation of a Mixture of Regular and Irregular Sized Particles Using Acoustic Emissions
Authors: Ejay Nsugbe, Andrew Starr, Ian Jennions, Cristobal Ruiz-Carcel
Abstract:
This work investigates the possibility of using Acoustic Emissions (AE) to estimate the Particle Size Distribution (PSD) of a mixture comprising particles of different densities and geometries. The experiments involved a mixture of glass and polyethylene particles, ranging from 150-212 microns and 150-250 microns respectively, and an experimental rig that allowed the free fall of a continuous stream of particles onto a target plate on which the AE sensor was placed. Using a time-domain multiple-threshold method, it was observed that the PSD of the particles in the mixture could be estimated.
Keywords: acoustic emissions, particle sizing, process monitoring, signal processing
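The time-domain multiple-threshold idea can be sketched roughly as follows: detect AE bursts, bin their peak amplitudes between successive thresholds, and take the relative counts as an estimate of the size-class fractions. This is a minimal illustration under assumed rules, not the authors' implementation; the burst-detection criterion and the threshold values are assumptions.

```python
def threshold_band_counts(signal, thresholds):
    """Bin AE burst peaks between successive amplitude thresholds;
    the relative counts approximate the size-class fractions of the PSD."""
    thresholds = sorted(thresholds)
    # crude burst detection: local maxima above the lowest threshold
    peaks = [signal[i] for i in range(1, len(signal) - 1)
             if signal[i] > thresholds[0]
             and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]]
    edges = thresholds + [float("inf")]
    counts = [sum(1 for p in peaks if lo <= p < hi)
              for lo, hi in zip(edges[:-1], edges[1:])]
    total = sum(counts) or 1  # avoid division by zero on an empty record
    return [c / total for c in counts]
```

A real AE record would first be rectified and envelope-detected; here a plain sample sequence stands in for the processed signal.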
Procedia PDF Downloads 353
17442 Active Treatment of Water Chemistry for Swimming Pools Using Novel Automated System (NAS)
Authors: Saeed Asiri
Abstract:
The Novel Automated System (NAS) controls the levels of chlorine and acid (i.e., the pH level) through feedback delivered as three synchronous alerts: a voice alert, a visible color, and a message on a digital screen. In addition, NAS contains a slide-in container holding the chemicals used to treat chlorine and acid-level problems independently. Moreover, NAS has a net in front of it, operated by remote control, to clear leaves and other waste from the water surface. The body is lightweight aluminum with integrated mechanical and electrical parts. In fact, NAS is qualified to serve as an assistant security guard for swimming pools because of the characteristics that make it unique and smart.
Keywords: novel automated system, pool safety, maintenance, pH level, digital screen
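The feedback logic described here can be sketched as a simple threshold check that produces the three alert forms. The target ranges below (pH 7.2-7.8, free chlorine 1-3 ppm) and the dosing messages are assumptions for illustration, not values from the abstract.

```python
def pool_alerts(ph, chlorine_ppm,
                ph_range=(7.2, 7.8), cl_range=(1.0, 3.0)):
    """Return NAS-style feedback (voice, colour, message) for the
    current water-chemistry readings. Ranges are assumed defaults."""
    problems = []
    if ph < ph_range[0]:
        problems.append("pH low: dose base")
    elif ph > ph_range[1]:
        problems.append("pH high: dose acid")
    if chlorine_ppm < cl_range[0]:
        problems.append("chlorine low: dose chlorine")
    elif chlorine_ppm > cl_range[1]:
        problems.append("chlorine high: stop dosing")
    status = "OK" if not problems else "; ".join(problems)
    colour = "green" if not problems else "red"
    # the same status text drives all three synchronous alert channels
    return {"voice": status, "colour": colour, "message": status}
```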
Procedia PDF Downloads 71
17441 The Disposable Identities: Enabling Trust-by-Design to Build Sustainable Data-Driven Value
Authors: Lorna Goulden, Kai M. Hermsen, Jari Isohanni, Mirko Ross, Jef Vanbockryck
Abstract:
This article introduces disposable identities, presents reference use cases, and explores possible technical approaches. The proposed approach, when fully developed as an open-source toolkit, enables developers of mobile or web apps to employ a self-sovereign identity and data-privacy framework, in order to rebuild trust in digital services by providing greater transparency, decentralized control, and GDPR compliance. With a user interface for the management of self-sovereign identity, digital authorizations, and associated data-driven transactions, the advantage of disposable identities is that they may also contain verifiable data such as the owner's photograph and official or even biometric identifiers, for more proactive prevention of identity abuse. These disposable identities, designed for decentralized privacy management, can also be time-, purpose-, and context-bound through a secure digital contract, with verification functionalities based on tamper-proof technology.
Keywords: identity, trust, self-sovereign, disposable identity, privacy toolkit, decentralised identity, verifiable credential, cybersecurity, data-driven business, PETs, GDPR
Procedia PDF Downloads 218
17440 The Impact of Artificial Intelligence on Legislation and Laws
Authors: Keroles Akram Saed Ghatas
Abstract:
The near future will bring significant changes to modern organizations and management due to the growing role of intangible assets and knowledge workers. The area of copyright, intellectual property, digital (intangible) assets, and media redistribution appears to be one of the greatest challenges facing business and society in general, and management sciences and organizations in particular. The proposed article examines views and perceptions of fairness in digital media sharing among Harvard Law School LL.M. students, based on 50 qualitative interviews and 100 surveys. The researcher took an ethnographic approach, joining the Harvard LL.M. 2016 Facebook group, which allows people to connect naturally and attend in-person and private events more easily. After listening to numerous students, the researcher conducted a quantitative survey among 100 respondents to assess their perceptions of fairness in digital file sharing in various contexts (based on media price, availability, regional licenses, copyright-holder status, etc.). Based on the survey results, the researcher then conducted long, open-ended, and loosely structured ethnographic interviews (50 interviews) to further deepen the understanding of the results. The most important finding of the study is that Harvard lawyers generally support digital piracy in certain contexts, despite having the best possible legal and professional knowledge. Interestingly, they are also more accepting of working for the government than for the private sector.
The results of this study provide a better understanding of how "fairness" is perceived by the younger generation of lawyers and pave the way for a more rational application of licensing laws.
Keywords: piracy, digital sharing, perception of fairness, legal profession
Procedia PDF Downloads 65
17439 The Impact of Professional Development in the Area of Technology Enhanced Learning on Higher Education Teaching Practices Across Atlantic Technological University – Research Methodology and Preliminary Findings
Authors: Annette Cosgrove
Abstract:
The objective of this research study is to examine the impact of professional development in Technology Enhanced Learning (TEL) and the digitisation of learning on teaching communities across multiple higher education sites in the ATU (Atlantic Technological University) over 2020-2025, including the proposal of an evidence-based digital teaching model for use in a future pandemic. The research strategy undertaken for this PhD study is a multi-site study using mixed methods; qualitative and quantitative methods are being used to collect data. A pilot study was carried out initially, feedback was collected, and the research instrument was edited to reflect this feedback before being administered. The purpose of the staff questionnaire is to evaluate the impact of professional development in the area of TEL and to capture practitioners' views on the perceived impact on their teaching practice in the higher education sector across ATU (five higher education locations in the west of Ireland). The phenomenon being explored is the impact of professional development in the area of technology enhanced learning on teaching practice in a higher education institution. The research methodology chosen for this study is action-based research; the researcher has chosen this approach as it is a prime strategy for developing educational theory and enhancing educational practice. The study includes quantitative and qualitative methods to elicit data that will quantify the impact that continuous professional development in digital teaching practice and technologies has on practitioners' teaching practice in higher education. The research instruments / data collection tools include a lecturer survey with a targeted TEL practice group (pre- and post-Covid experience) and semi-structured interviews with lecturers.
This research is currently being conducted across the ATU multi-site campus, targeting higher education lecturers who have completed formal CPD in the area of digital teaching. The research questionnaire has been deployed, with 75 respondents to date across the ATU; the primary questionnaire and semi-structured interviews are ongoing, the purpose being to evaluate the impact of formal professional development in the area of TEL and its perceived impact on practitioners' teaching practice in digital teaching and learning. This paper will present initial findings, reflections, and data from this ongoing research study.
Keywords: TEL, DTL, digital teaching, digital assessment
Procedia PDF Downloads 70
17438 The Development of a Digitally Connected Factory Architecture to Enable Product Lifecycle Management for the Assembly of Aerostructures
Authors: Nicky Wilson, Graeme Ralph
Abstract:
Legacy aerostructure assembly is defined by large components, low build rates, and manual assembly methods. With increasing demand for commercial aircraft and emerging markets such as the eVTOL (electric vertical take-off and landing) market, current manufacturing methods cannot efficiently meet these higher-rate demands. This project looks at how legacy manufacturing processes can be rate-enabled by taking a holistic view of data usage, focusing on how data can be collected to enable fully integrated digital factories and supply chains. The study focuses on how data flows both up and down the supply chain to create a digital thread specific to each part and assembly, while enabling machine learning through real-time, closed-loop feedback systems. The study also develops a bespoke architecture to enable connectivity both within the factory and with the wider PLM (product lifecycle management) system, moving away from the traditional point-to-point systems used to connect IO devices towards a hub-and-spoke architecture that exploits report-by-exception principles. This paper outlines the key issues facing legacy aircraft manufacturers, focusing on what future manufacturing will look like when adopting Industry 4.0 principles. The research also defines the data architecture of a PLM system to enable the transfer and control of a digital thread within the supply chain, and proposes a standardised communications protocol as a scalable solution for connecting IO devices within a production environment. This research comes at a critical time for aerospace manufacturers, who are seeing a shift towards the integration of digital technologies within legacy production environments while build rates continue to grow.
It is vital that manufacturing processes become more efficient in order to meet these demands while also securing future work for many manufacturers.
Keywords: Industry 4.0, digital transformation, IoT, PLM, automated assembly, connected factories
Procedia PDF Downloads 79
17437 Knowledge, Technology and Empowerment in Contemporary Scenario
Authors: Samir Roy
Abstract:
This paper investigates the relationship among knowledge, technology, and empowerment. In physics, power is defined as the rate of doing work. In everyday use, the meaning of the word power is related to the capacity to bring about change of value in the world. It appears that the popular aphorism “Knowledge is power” should be revisited in the context of contemporary states of affairs. For instance, classical mechanics is a system of knowledge, and so is thermodynamics; but neither of them, per se, is sufficient to produce automobiles. Boolean algebra, the logical foundation of digital electronic computers, was introduced by George Boole in 1847, but that knowledge was practically unused for almost one hundred years, until digital electronics was developed in the early twentieth century and eventually led to the invention of digital electronic computers. Empowerment of women is a burning issue in the arena of social justice; however, if we carefully analyze the functional elements of women’s empowerment, we find them to be highly technology-driven as well as technology-dependent in real life. On the other hand, technology has empowered modern states to maintain social order and promote democracy in an effective manner. This paper includes a few case studies to establish the close correspondence between knowledge, especially scientific knowledge, technology, and empowerment. It appears that in the contemporary scenario, “Technology is power” is a more appropriate statement than the traditional aphorism “Knowledge is power”.
Keywords: knowledge, science, technology, empowerment, change, social justice
Procedia PDF Downloads 42
17436 Design and Assessment of Traffic Management Strategies for Improved Mobility on Major Arterial Roads in Lahore City
Authors: N. Ali, S. Nakayama, H. Yamaguchi, M. Nadeem
Abstract:
Traffic congestion is a matter of prime concern in developing countries. This can be primarily attributed to poor design practices and biased allocation of resources based on political will, neglecting technical feasibility in infrastructure design. During the last decade, Lahore has expanded at an unprecedented rate compared to surrounding cities, due to greater funding and resource allocation by previous governments. As a result, people from surrounding cities and areas moved to Lahore for better opportunities and quality of life. This migration inflow left the city with an increased population that the existing infrastructure cannot efficiently accommodate, leading to traffic congestion on the major arterial roads of the city. In this simulation study, a major arterial road was selected to evaluate the performance of five intersections by changing the geometry of the intersections or the signal control type. Simulations were done in two software packages: Highway Capacity Software (HCS), and Synchro Studio with SimTraffic. The traffic management strategies employed include actuated signal control, semi-actuated signal control, fixed-time signal control, and roundabouts. For each intersection, the most feasible of the above-mentioned strategies was selected on the basis of the least delay time (seconds) and improved Level of Service (LOS). The results showed that the Jinnah Hospital and Akbar Chowk intersections achieved delay-time reductions of 92.97% and 92.67%, respectively. These results can be used by traffic planners and policy makers when deciding on the expansion of these intersections, keeping in mind traffic demand in future years.
Keywords: traffic congestion, traffic simulation, traffic management, congestion problems
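The two selection criteria used here, delay time and LOS, can be illustrated with a small sketch. The LOS thresholds below follow the familiar HCM-style bands for signalized intersections (A ≤ 10 s/veh up to F > 80 s/veh); treat them as an assumed grading scheme rather than the exact values used in the study.

```python
def signalized_los(avg_delay_s):
    """Map average control delay (s/veh) at a signalized intersection
    to a Level of Service letter, using HCM-style thresholds."""
    for limit, grade in [(10, "A"), (20, "B"), (35, "C"),
                         (55, "D"), (80, "E")]:
        if avg_delay_s <= limit:
            return grade
    return "F"

def delay_reduction_pct(before_s, after_s):
    """Percent reduction in delay after a treatment is applied."""
    return 100.0 * (before_s - after_s) / before_s
```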
Procedia PDF Downloads 470
17435 LTE Performance Analysis in the City of Bogota Northern Zone for Two Different Mobile Broadband Operators over Qualipoc
Authors: Víctor D. Rodríguez, Edith P. Estupiñán, Juan C. Martínez
Abstract:
The evolution of mobile broadband technologies has increased the download rates available to users of current services. The evaluation of technical parameters at the link level is of vital importance to validate the quality and veracity of the connection, thus avoiding large losses of data, time, and productivity. Some failures occur between the eNodeB (Evolved Node B) and the user equipment (UE), so the link between the end device and the base station must be observed. LTE (Long Term Evolution) is considered one of the IP-oriented mobile broadband technologies that works stably for data and, on devices that support it, VoIP (Voice over IP). This research presents a technical analysis of the connection and channeling processes between UE and eNodeB using the TAC (Tracking Area Code) variables, together with an analysis of performance variables (throughput, Signal to Interference and Noise Ratio (SINR)). Three measurement scenarios were proposed in the city of Bogotá using QualiPoc, where two operators were evaluated (Operator 1 and Operator 2). Once the data were obtained, an analysis of the variables was performed, determining that the data obtained in the transmission modes vary depending on the parameters BLER (Block Error Rate), throughput, and SNR (Signal-to-Noise Ratio). For both operators, differences in transmission modes were detected, and this is reflected in the quality of the signal. In addition, because the two operators work at different frequencies, it can be seen that Operator 1, despite having spectrum in Band 7 (2600 MHz) like Operator 2, is reassigning traffic to a lower band, AWS (1700 MHz); the difference in signal quality relative to the data connection established by Operator 2, and the difference in the transmission modes determined by the eNodeB for Operator 1, are remarkable.
Keywords: BLER, LTE, network, QualiPoc, SNR
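The link between SINR and achievable throughput mentioned in this abstract is commonly bounded by the Shannon capacity, C = B·log₂(1 + SINR). A minimal sketch of that upper-bound estimate, with the 20 MHz LTE channel bandwidth used below as an assumed example value:

```python
import math

def shannon_throughput_mbps(bandwidth_hz, sinr_db):
    """Upper-bound throughput estimate from channel bandwidth and SINR
    via the Shannon capacity C = B * log2(1 + SINR)."""
    sinr_linear = 10 ** (sinr_db / 10)  # convert dB to linear ratio
    return bandwidth_hz * math.log2(1 + sinr_linear) / 1e6
```

Real LTE throughput also depends on the transmission mode, MCS, and BLER that QualiPoc reports, so measured rates sit well below this bound.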
Procedia PDF Downloads 115
17434 Multicasting Characteristics of All-Optical Triode Based on Negative Feedback Semiconductor Optical Amplifiers
Authors: S. Aisyah Azizan, M. Syafiq Azmi, Yuki Harada, Yoshinobu Maeda, Takaomi Matsutani
Abstract:
We introduce the all-optical multicasting characteristics, with wavelength conversion, of a novel all-optical triode based on a negative-feedback semiconductor optical amplifier. The scheme was demonstrated at a transfer speed of 10 Gb/s using a non-return-to-zero 2³¹-1 pseudorandom bit sequence. This multi-wavelength converter device can simultaneously provide three channels of output signal, supporting both non-inverted and inverted conversion. All-optical multicasting and wavelength conversion by cross-gain modulation are effective in a semiconductor optical amplifier, which provides an inverted conversion and thus negative feedback. The relationship between the bit error rate and the received power of the back-to-back signal and of the output signals at wavelengths 1535 nm, 1540 nm, 1545 nm, 1550 nm, and 1555 nm was investigated. The output signal wavelengths were successfully converted and modulated with a power penalty of less than 8.7 dB; the highest was 8.6 dB and the lowest 4.4 dB. The all-optical triode with negative feedback, multicasting and converting three channels simultaneously at 10 Gb/s, is thus a promising device for the new wavelength conversion technology.
Keywords: cross gain modulation, multicasting, negative feedback optical amplifier, semiconductor optical amplifier
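The power penalty reported here is the extra received power a converted channel needs, relative to the back-to-back signal, to reach the same BER, expressed in dB. A one-line sketch of that ratio (the example powers are assumptions, not values from the experiment):

```python
import math

def power_penalty_db(p_converted_w, p_back_to_back_w):
    """Receiver power penalty of a converted channel relative to the
    back-to-back signal at the same BER, in dB."""
    return 10 * math.log10(p_converted_w / p_back_to_back_w)
```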
Procedia PDF Downloads 684
17433 Motor Controller Implementation Using Model Based Design
Authors: Cau Tran, Tu Nguyen, Tien Pham
Abstract:
Model-based design (MBD) is a mathematical and visual technique for addressing design issues in the fields of communications, signal processing, and complicated control systems, and is utilized in several automotive, aerospace, industrial, and motion control applications; virtual models are at the center of the model-based software development process, and MBD is widely used in the creation of embedded software. In this study, the LAT motor is modeled in a simulation environment, and the LAT motor control is designed with a cascade structure comprising a speed control loop and a current control loop, each closed by a PID controller. Based on techniques and motor parameters that match the design goals, the PID controllers are created for the model using traditional design principles, and the MBD approach is then used to build the embedded software for motor control. The paper is divided into three sections: the first introduces the design process and the benefits and drawbacks of the MBD technique; the second covers the design of the control software for LAT motors; the last presents the experimental results.
Keywords: model based design, limited angle torque, intellectual property core, hardware description language, controller area network, user datagram protocol
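The cascade structure described here, where a speed loop produces the current reference for an inner current loop, can be sketched with a plain discrete PID. The gains and sample time below are illustrative assumptions, not the study's tuned values.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# cascade: the speed loop's output is the current loop's setpoint
speed_loop = PID(kp=2.0, ki=0.5, kd=0.0, dt=0.001)
current_loop = PID(kp=5.0, ki=1.0, kd=0.0, dt=0.001)

def cascade_step(speed_ref, speed_meas, current_meas):
    """One control period: outer speed loop feeds the inner current loop;
    the inner loop's output is the voltage command to the motor."""
    current_ref = speed_loop.step(speed_ref, speed_meas)
    return current_loop.step(current_ref, current_meas)
```

In an MBD workflow this controller would live as a simulation model and be auto-generated into embedded C, rather than hand-coded as here.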
Procedia PDF Downloads 94
17432 Collaborative Economy in Developing Countries: Perspectives from the Philippines
Authors: Ivy Jessen Galvan
Abstract:
Over the past decade, a phenomenon has emerged at the frontier of the digital economy: a wave of 'disruptive' technologies that offer digital solutions to a variety of everyday problems, challenging the way traditional industries operate. Most of these disruptive technologies are applications ('apps') that rely on the Internet to connect people for sharing, selling, renting, or lending, creating a unique economic model wherein users provide for other users' demand, called the 'collaborative economy.' Although the collaborative economy is spreading to every part of the world, this phenomenon may unfold differently across developing countries. In this study, the characteristics of the collaborative economy in the Philippines are highlighted and compared with observations from the developed world. The paper looks at two leading collaborative economy ventures in the Philippines, Grab and Shopee, probing into how these smartphone-based platforms place technology into the 'micro-frictions' of the Philippine developing context. Using framing analysis on interviews conducted among Grab and Shopee users in Metro Manila, three frames have been identified: 1) metropolitan solution; 2) financial inclusion; and 3) formalization of labor. This research contextualizes the Fourth Industrial Revolution in ASEAN by analyzing the effect of a digital economy on everyday life.
Keywords: ASEAN unicorns, collaborative economy, developing countries, fourth industrial revolution
Procedia PDF Downloads 118
17431 Preparation of Silver and Silver-Gold, Universal and Repeatable, Surface Enhanced Raman Spectroscopy Platforms from SERSitive
Authors: Pawel Albrycht, Monika Ksiezopolska-Gocalska, Robert Holyst
Abstract:
Surface Enhanced Raman Spectroscopy (SERS) is a technique of growing importance, not only in purely scientific research related to analytical chemistry: it finds more and more applications in broadly understood testing, including medical, forensic, pharmaceutical, and food testing, and works well everywhere on one condition: that the SERS substrates used for testing give adequate enhancement, repeatability, and homogeneity of the SERS signal. This is a problem that has existed since the invention of the technique. Some laboratories use colloids with silver or gold nanoparticles as SERS amplifiers; others form rough silver or gold surfaces; but the results are generally either weak or unrepeatable. Furthermore, these structures are very often highly specific, amplifying the signal of only a small group of compounds: they work with the kinds of analytes used in the developer's laboratory, and research on different compounds requires completely new SERS substrates. That motivated our decision to develop universal substrates for SERS spectroscopy. Each compound has a different affinity for silver and for gold, the two metals with the best SERS properties, and this determines the signal we get in the SERS spectrum. Our task was to create a platform that gives a characteristic 'fingerprint' of the largest possible number of compounds with very high repeatability, even at the expense of the enhancement factor (EF), since the possibility to repeat research results is of the utmost importance. Such SERS substrates are offered by the SERSitive company. The applied method is based on cyclic potentiodynamic electrodeposition of silver or silver-gold nanoparticles on the conductive surface of ITO-coated glass at a controlled temperature of the reaction solution.
Silver nanoparticles are supplied in the form of silver nitrate (AgNO₃, 10 mM), gold nanoparticles are derived from tetrachloroauric acid (10 mM), and sodium sulfite (Na₂SO₃, 5 mM) is used as a reductant. To limit and standardize the size of the SERS surface on which nanoparticles are deposited, photolithography is used: we protect the desired ITO-coated glass surface and then etch the unprotected ITO layer, which prevents nanoparticles from settling at these sites. On the prepared surface, we carry out the process described above, obtaining a SERS surface with nanoparticles of sizes 50-400 nm. The SERSitive platforms present high sensitivity (EF = 10⁵-10⁶), homogeneity, and repeatability (70-80%).
Keywords: electrodeposition, nanoparticles, Raman spectroscopy, SERS, SERSitive, SERS platforms, SERS substrates
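The enhancement factor quoted above (EF = 10⁵-10⁶) is conventionally computed by comparing the per-molecule SERS intensity with the per-molecule normal Raman intensity. A minimal sketch of that standard ratio, with the molecule counts below as assumed example values:

```python
def sers_enhancement_factor(i_sers, n_sers, i_raman, n_raman):
    """Analytical SERS enhancement factor:
    EF = (I_SERS / N_SERS) / (I_Raman / N_Raman),
    where N_* are the numbers of probed molecules in each measurement."""
    return (i_sers / n_sers) / (i_raman / n_raman)
```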
Procedia PDF Downloads 155
17430 Serious Digital Video Game for Solving Algebraic Equations
Authors: Liliana O. Martínez, Juan E González, Manuel Ramírez-Aranda, Ana Cervantes-Herrera
Abstract:
A mobile application in the serious-game category, called Math Dominoes, is presented. The main objective of this application is to strengthen the teaching-learning process of solving algebraic equations; it is based on the "Double 6" dominoes board game. Math Dominoes allows the practice of solving first-, second-, and third-degree algebraic equations. The application is aimed at students who seek to strengthen their skills in solving algebraic equations in a dynamic, interactive, and fun way, to reduce the risk of failure in subsequent courses that require mastery of this algebraic tool.
Keywords: algebra, equations, dominoes, serious games
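For the second-degree case the app exercises, the answer a player must match can be checked with the quadratic formula. A small sketch of such a checker (not taken from the app itself), using complex square roots so that negative discriminants are also handled:

```python
import cmath

def solve_quadratic(a, b, c):
    """Roots of a*x**2 + b*x + c = 0 via the quadratic formula."""
    d = cmath.sqrt(b * b - 4 * a * c)  # complex sqrt handles d < 0
    return ((-b + d) / (2 * a), (-b - d) / (2 * a))
```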
Procedia PDF Downloads 131
17429 L2 Learning and Teaching through Digital Tools
Authors: Bâlc Denisa-Maria
Abstract:
This paper aims to present some ways of preserving a language's heritage in the global era. Teaching a second language to foreign students does not mean only teaching the grammar and the vocabulary needed to reach the four skills; it means constant work on developing strategies to make the students aware of the heritage carried by the language they learn. Teachers and professors need to be aware that language is in constant change; they need to adjust their techniques to the digital era, but they also have to be aware of the changes, and the good and bad parts, of globalization. How is it possible to preserve the patrimony of a certain language in a globalized era? What transformations does a language face in time? What does it mean to preserve the heritage of a language through L2 teaching? What makes a language special? What impact does it have on foreign students? How can we, as teachers, preserve the heritage of our language? Is it all about books, films, music, and cultural events, or something else? How is it possible to include digital programs in one's teaching while preserving the patrimony of a language at the same time? How does computational linguistics help us in teaching a certain language? All these questions will be tackled in the essay, with special emphasis on the definition of a language's heritage and the new perspectives for teachers and professors, presented in a multimodal and complex way. The objectives of this research are: to present some ways of preserving the heritage of a certain language against globalization; to illustrate what preservation means for L2 teaching; and to encourage teachers to be aware of their language's patrimony. The main contribution of my research is to move the discussion of preserving a language's patrimony into the context of L2 teaching.
Keywords: preservation, globalization, language heritage, L2 teaching
Procedia PDF Downloads 62
17428 The Essence and Attribution of Intellectual Property Rights Generated in the Digitization of Intangible Cultural Heritage
Authors: Jiarong Zhang
Abstract:
Digitizing intangible cultural heritage is a complex and comprehensive process from which various intellectual property rights may be generated. Digitizing may be a repackaging process of cultural heritage, which creates copyrights; recording folk songs and indigenous performances can create 'related rights.' At the same time, digitizing intangible cultural heritage may unintentionally infringe the intellectual property rights of others: recording the religious rituals of indigenous communities without authorization can violate the moral rights of the ceremony participants, and making digital copies of rock paintings may infringe the right of reproduction. In addition, several parties are involved in the digitization process: indigenous peoples, museums, and archives can be holders of cultural heritage; companies and research institutions can be technology providers; internet platforms can be promoters and sellers; and the public, as well as the groups above, can be beneficiaries. When diverse intellectual property rights meet various parties, problems and disputes can arise easily. What types of intellectual property rights are generated in the digitization process? What is the essence of these rights? Who should these rights belong to? How can intellectual property be used to protect the digitization of cultural heritage? How can infringing the intellectual property rights of others be avoided? While digitization has been regarded as an effective approach to preserving intangible cultural heritage, the related intellectual property issues have not received attention and full discussion; thus, parties involved in the digitization process may face intellectual property infringement lawsuits. The article explores these problems from the intersecting perspectives of intellectual property law and cultural heritage. Taking a comparative approach, the paper analyzes related legal documents and cases and sheds some light on the questions listed.
The findings show that, although most countries have no intellectual property laws targeting cultural heritage, the stakeholders involved can seek protection from existing intellectual property rights by following the suggestions of the article. The research will contribute to the digitization of intangible cultural heritage from a legal and policy aspect.
Keywords: copyright, digitization, intangible cultural heritage, intellectual property, Internet platforms
Procedia PDF Downloads 146
17427 Block Mining: Block Chain Enabled Process Mining Database
Authors: James Newman
Abstract:
Process mining is an emerging technology that serializes enterprise data as time-series data. It has been used by many companies and has been the subject of a variety of research papers. However, the majority of current efforts have looked at how best to perform process mining from standard relational databases. This paper is a first pass at outlining a database custom-built for a minimum viable product of process mining. We present Block Miner, a blockchain protocol to store process mining data across a distributed network, and demonstrate the feasibility of storing process mining data on the blockchain. We present a proof of concept and show how the intersection of these two technologies helps to solve a variety of issues, including but not limited to ransomware attacks, tax documentation, and conflict resolution.
Keywords: blockchain, process mining, memory optimization, protocol
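The core idea, storing process-mining event logs in hash-chained blocks so tampering is detectable, can be sketched in a few lines. This is a generic hash-chain illustration, not the Block Miner protocol itself; the event fields (case id, activity) are assumed for the example.

```python
import hashlib
import json
import time

def make_block(events, prev_hash):
    """A minimal block holding process-mining events, chained to its
    predecessor via SHA-256 over the canonical JSON of the body."""
    body = {"events": events, "prev_hash": prev_hash,
            "created": time.time()}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def chain_is_valid(chain):
    """Recompute every hash and verify the prev_hash links."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False  # block contents were altered
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # link to the previous block is broken
    return True
```

Because each hash covers the previous block's hash, editing any past event (e.g. for a ransomware or audit scenario) invalidates every later block.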
Procedia PDF Downloads 103
17426 Monitoring the Drying and Grinding Process during Production of Celitement through a NIR-Spectroscopy Based Approach
Authors: Carolin Lutz, Jörg Matthes, Patrick Waibel, Ulrich Precht, Krassimir Garbev, Günter Beuchle, Uwe Schweike, Peter Stemmermann, Hubert B. Keller
Abstract:
Online measurement of product quality is a challenging task in cement production, especially in the production of Celitement, a novel environmentally friendly hydraulic binder. The mineralogy and chemical composition of clinker in ordinary Portland cement production are measured by X-ray diffraction (XRD) and X-ray fluorescence (XRF), where only crystalline constituents can be detected. But only a small part of the Celitement components can be measured via XRD, because most constituents have an amorphous structure. This paper describes the development of algorithms suitable for online monitoring of the final processing step of Celitement based on NIR data. For calibration, intermediate products were dried at different temperatures and ground for variable durations. The products were analyzed using XRD and thermogravimetric analyses together with NIR spectroscopy to investigate the dependency between the drying and milling processes on one side and the NIR signal on the other. As a result, different characteristic parameters have been defined. A short overview of the Celitement process and the challenging tasks of the online measurement and evaluation of product quality will be presented. Subsequently, methods for the systematic development of near-infrared calibration models and the determination of the final calibration model will be introduced. The application of the model to experimental data illustrates that NIR spectroscopy allows a quick and sufficiently exact determination of crucial process parameters.
Keywords: calibration model, Celitement, cementitious material, NIR spectroscopy
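The calibration step, fitting a model that maps NIR spectra to reference values from XRD or thermogravimetry, can be sketched with ordinary least squares (NIR calibration in practice usually uses PLS on preprocessed spectra; plain least squares is used here only to keep the illustration self-contained).

```python
import numpy as np

def fit_calibration(spectra, reference_values):
    """Least-squares linear calibration mapping NIR spectra (one row per
    sample) to a reference property, e.g. from thermogravimetric analysis."""
    X = np.hstack([spectra, np.ones((spectra.shape[0], 1))])  # add intercept
    coeffs, *_ = np.linalg.lstsq(X, reference_values, rcond=None)
    return coeffs

def predict(spectra, coeffs):
    """Apply a fitted calibration model to new spectra."""
    X = np.hstack([spectra, np.ones((spectra.shape[0], 1))])
    return X @ coeffs
```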
Procedia PDF Downloads 500
17425 Analysis of Fixed Beamforming Algorithms for Smart Antenna Systems
Authors: Muhammad Umair Shahid, Abdul Rehman, Mudassir Mukhtar, Muhammad Nauman
Abstract:
The smart antenna is a prominent technology that has emerged in recent years to meet the growing demands of wireless communications. In an overcrowded spectrum, its application is growing steadily. A methodical evaluation of the performance of fixed beamforming algorithms for smart antennas, namely the Multiple Sidelobe Canceller (MSC), Maximum Signal-to-Interference Ratio (MSIR) and Minimum Variance Distortionless Response (MVDR) beamformers, is comprehensively presented in this paper. Simulation results show that beamforming is helpful in providing an optimized response towards desired directions, and that the MVDR beamformer provides the most optimal solution.
Keywords: fixed weight beamforming, array pattern, signal to interference ratio, power efficiency, element spacing, array elements, optimum weight vector
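The MVDR weight computation evaluated in the paper can be illustrated for a toy two-element array: the formula w = R⁻¹a / (aᴴ R⁻¹ a) enforces unit gain in the look direction while minimizing output power everywhere else. The covariance matrix below is plain white noise, an assumption chosen only to keep the example checkable by hand.

```python
import cmath
import math

def steering(theta_deg, n=2, d=0.5):
    """Steering vector of a uniform linear array; d is element spacing in wavelengths."""
    th = math.radians(theta_deg)
    return [cmath.exp(-2j * math.pi * d * k * math.sin(th)) for k in range(n)]

def inv2(m):
    """Inverse of a 2x2 complex matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mvdr_weights(R, a):
    """w = R^{-1} a / (a^H R^{-1} a): distortionless response toward the look direction."""
    Ri = inv2(R)
    Ra = [Ri[0][0] * a[0] + Ri[0][1] * a[1],
          Ri[1][0] * a[0] + Ri[1][1] * a[1]]
    denom = a[0].conjugate() * Ra[0] + a[1].conjugate() * Ra[1]
    return [Ra[0] / denom, Ra[1] / denom]

a = steering(20.0)                     # desired signal arriving from 20 degrees
R = [[1.0 + 0j, 0j], [0j, 1.0 + 0j]]   # toy covariance: white noise only
w = mvdr_weights(R, a)
response = w[0].conjugate() * a[0] + w[1].conjugate() * a[1]  # unit gain at look direction
```

With an interference-plus-noise covariance in place of the identity, the same formula steers nulls toward interferers, which is why MVDR gives the most optimal response in the paper's comparison.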
Procedia PDF Downloads 185
17424 ePA-Coach: Design of the Intelligent Virtual Learning Coach for Senior Learners in Support of Digital Literacy in the Context of Electronic Patient Record
Authors: Ilona Buchem, Carolin Gellner
Abstract:
Over the last few years, the call for the support of senior learners in the development of their digital literacy has become prevalent, mainly due to the progression towards ageing societies paired with advances in digitalisation in all spheres of life, including e-health and the electronic patient record (EPA). While major research efforts in supporting senior learners in developing digital literacy have so far been invested in e-learning focusing on knowledge acquisition and cognitive tasks, little research exists on learning models that target virtual mentoring and coaching with the help of pedagogical agents and address the social dimensions of learning. Research from studies with students in the context of formal education has already provided methods for designing intelligent virtual agents in support of personalised learning. However, this research has mostly focused on cognitive skills and has not yet been applied to the context of mentoring/coaching of senior learners, who have different characteristics and learn in different contexts. In this paper, we describe how insights from previous research can be used to develop an intelligent virtual learning coach (agent) for senior learners, with a focus on building the social relationship between the agent and the learner and on the agent's key task of socialising learners into the larger context of digital literacy, with a focus on electronic health records. Following current approaches to mentoring and coaching, the agent is designed not to enhance and monitor the cognitive performance of the learner but to serve as a trusted friend and advisor, whose role is to provide one-to-one guidance and support the sharing of experiences among learners (peers).
Based on a literature review and a synopsis of research on virtual agents and current coaching/mentoring models, taking into account the specific characteristics and requirements of senior learners, we describe the design framework that was applied to design an intelligent virtual learning coach as part of the e-learning system for digital literacy of senior learners in the ePA-Coach project funded by the German Ministry of Education and Research. This paper also presents the results of the evaluation study, which compared the use of the first prototype of the virtual learning coach, designed according to the design framework, with a voice narration in a multimedia learning environment with senior learners. The focus of the study was to validate the agent design in the context of the persona effect (Lester et al., 1997). Since the persona effect is related to the hypothesis that animated agents are perceived as more socially engaging, the study evaluated possible impacts of agent coaching in comparison with voice coaching on motivation, engagement, experience, and digital literacy.
Keywords: virtual learning coach, virtual mentor, pedagogical agent, senior learners, digital literacy, electronic health records
Procedia PDF Downloads 117
17423 Data-Mining Approach to Analyzing Industrial Process Information for Real-Time Monitoring
Authors: Seung-Lock Seo
Abstract:
This work presents a data-mining empirical monitoring scheme for industrial processes with partially unbalanced data. Measurement data from good operations are relatively easy to gather, but during unusual events or faults it is generally difficult to collect process information, and the noisy data of industrial processes are often almost impossible to analyze. In such cases, noise filtering techniques can be used to enhance process monitoring performance on a real-time basis. In addition, pre-processing of raw process data helps to eliminate unwanted variation in industrial process data. In this work, the performance of various monitoring schemes was tested and demonstrated on discrete batch process data. The results show that monitoring performance improved significantly in terms of the monitoring success rate for the given process faults.
Keywords: data mining, process data, monitoring, safety, industrial processes
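A minimal sketch of the noise-filtering-plus-monitoring idea, assuming a simple moving-average filter and a fixed control limit around the normal-operation mean (the abstract does not specify these particular choices):

```python
def moving_average(signal, window=3):
    """Simple noise filter for raw process measurements (causal, growing window at the start)."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i - lo + 1))
    return out

def monitor(signal, mean, limit, window=3):
    """Flag samples whose filtered deviation from the normal-operation mean
    exceeds the control limit (a minimal fault detector)."""
    return [abs(v - mean) > limit for v in moving_average(signal, window)]

# hypothetical batch readings: a fault appears mid-batch
readings = [50.1, 49.8, 50.2, 50.0, 55.3, 55.8, 56.1]
flags = monitor(readings, mean=50.0, limit=2.0)
```

Note that filtering introduces a small detection lag (the first faulty sample is averaged with good ones), which is the usual trade-off against false alarms from noise.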
Procedia PDF Downloads 401
17422 Experimental Characterization of Composite Material with Non Contacting Methods
Authors: Nikolaos Papadakis, Constantinos Condaxakis, Konstantinos Savvakis
Abstract:
The aim of this paper is to determine the elastic properties (elastic modulus and Poisson ratio) of a composite material based on non-contacting imaging methods. More specifically, the significantly reduced cost of digital cameras has opened the opportunity for reliable, low-cost strain measurement. The open source platform Ncorr, which implements digital image correlation (DIC), is used in this paper. Measuring strain with digital image correlation involves random speckle preparation on the surface of the gauge area, image acquisition, and post-processing of the image correlation to obtain the displacement and strain fields on the surface under study. Technical issues relating to the quality of the obtained results are also discussed. [0]8 fabric glass/epoxy composite specimens were prepared and tested at orientations of 0°, 30°, 45°, 60°, and 90°. Each test was recorded with the camera at a constant frame rate and under constant lighting conditions. The recorded images were processed with the image processing software, and the parameters of each test are reported. The strain map obtained through strain measurement with Ncorr is validated by a) comparing the elastic properties with the values expected from classical laminate theory, and b) finite element analysis.
Keywords: composites, Ncorr, strain map, videoextensometry
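The core DIC step that Ncorr performs, matching a speckle subset between the reference and deformed images by maximizing a correlation criterion, can be sketched at integer-pixel resolution as follows. Real DIC adds subpixel interpolation and shape functions; the images here are tiny invented speckle patterns, with the deformed image being the reference shifted by one pixel in each direction.

```python
import math

def ncc(a, b):
    """Zero-normalized cross-correlation between two equal-size subsets."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def subset(img, r, c, size):
    """Flatten a size x size window starting at (r, c)."""
    return [img[i][j] for i in range(r, r + size) for j in range(c, c + size)]

def match(ref_img, def_img, r, c, size, search):
    """Find the integer displacement (du, dv) of a reference subset that
    maximizes correlation in the deformed image (the core DIC step)."""
    ref = subset(ref_img, r, c, size)
    best = max(((ncc(ref, subset(def_img, r + du, c + dv, size)), du, dv)
                for du in range(-search, search + 1)
                for dv in range(-search, search + 1)),
               key=lambda t: t[0])
    return best[1], best[2]

# invented 6x6 speckle pattern and its copy shifted by (1, 1)
ref_img = [[3, 7, 1, 9, 4, 6],
           [8, 2, 5, 0, 7, 3],
           [1, 9, 4, 6, 2, 8],
           [5, 0, 3, 7, 9, 1],
           [6, 4, 8, 2, 0, 5],
           [9, 1, 7, 5, 3, 2]]
def_img = [[0] * 6] + [[0] + row[:5] for row in ref_img[:5]]

du, dv = match(ref_img, def_img, r=2, c=2, size=2, search=1)
```

Repeating this match over a grid of subsets yields the full-field displacement map, from which the strain field is obtained by differentiation.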
Procedia PDF Downloads 144
17421 Fractional-Order Modeling of GaN High Electron Mobility Transistors for Switching Applications
Authors: Anwar H. Jarndal, Ahmed S. Elwakil
Abstract:
In this paper, a fractional-order model for the pad parasitic effect of a GaN HEMT on a Si substrate is developed and validated. An open de-embedding structure is used to characterize and de-embed the substrate loading parasitic effects. Unbiased device measurements are used to extract the parasitic inductances and resistances. The model shows very good agreement with S-parameter measurements under different bias conditions. It has been found that this approach can improve the simulation of the intrinsic part of the transistor, which is very important for the small- and large-signal modeling process.
Keywords: fractional-order modeling, GaN HEMT, Si substrate, open de-embedding structure
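A fractional-order (constant-phase) element is the basic building block of such models: its impedance phase is a frequency-independent −α·90°, so α = 1 recovers an ideal capacitor and fractional α captures lossy, distributed substrate effects. The component values below are hypothetical, chosen only for illustration.

```python
import cmath
import math

def cpe_impedance(omega, alpha, c):
    """Impedance of a constant-phase (fractional-order) element:
    Z(w) = 1 / ((j*w)**alpha * C). alpha = 1 is an ideal capacitor."""
    return 1.0 / ((1j * omega) ** alpha * c)

# hypothetical pad parasitic: alpha = 0.9, pseudo-capacitance 1 pF at 1 GHz
z = cpe_impedance(omega=2 * math.pi * 1e9, alpha=0.9, c=1e-12)
phase_deg = math.degrees(cmath.phase(z))   # constant -alpha*90 degrees
```

Fitting α and C (together with the extracted parasitic inductances and resistances) against measured S-parameters is the kind of procedure the abstract describes.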
Procedia PDF Downloads 356
17420 Multiscale Process Modeling of Ceramic Matrix Composites
Authors: Marianna Maiaru, Gregory M. Odegard, Josh Kemppainen, Ivan Gallegos, Michael Olaya
Abstract:
Ceramic matrix composites (CMCs) are typically used in applications that require long-term mechanical integrity at elevated temperatures. CMCs are usually fabricated using a polymer precursor that is initially polymerized in situ with fiber reinforcement, followed by a series of cycles of pyrolysis to transform the polymer matrix into a rigid glass or ceramic. The pyrolysis step typically generates volatile gasses, which creates porosity within the polymer matrix phase of the composite. Subsequent cycles of monomer infusion, polymerization, and pyrolysis are often used to reduce the porosity and thus increase the durability of the composite. Because of the significant expense of such iterative processing cycles, new generations of CMCs with improved durability and manufacturability are difficult and expensive to develop using standard Edisonian approaches. The goal of this research is to develop a computational process-modeling-based approach that can be used to design the next generation of CMC materials with optimized material and processing parameters for maximum strength and efficient manufacturing. The process modeling incorporates computational modeling tools, including molecular dynamics (MD), to simulate the material at multiple length scales. Results from MD simulation are used to inform the continuum-level models to link molecular-level characteristics (material structure, temperature) to bulk-level performance (strength, residual stresses). Processing parameters are optimized such that process-induced residual stresses are minimized and laminate strength is maximized. The multiscale process modeling method developed with this research can play a key role in the development of future CMCs for high-temperature and high-strength applications. 
By combining multiscale computational tools and process modeling, new manufacturing parameters can be established for optimal fabrication and performance of CMCs for a wide range of applications.
Keywords: digital engineering, finite elements, manufacturing, molecular dynamics
Procedia PDF Downloads 98
17419 The Role of Digital Technology in Crime Prevention
Authors: Muhammad Ashfaq
Abstract:
Main theme: The prime focus of this study is the role of digital technology in crime prevention, with special focus on the Cellular Forensic Unit of the Capital City Police Peshawar, Khyber Pakhtunkhwa, Pakistan. Objective(s) of the study: The prime objective of this study is to provide statistics on, and describe the strategies and patterns of analysis used for, crime prevention in the Cellular Forensic Unit of the Capital City Police Peshawar, Khyber Pakhtunkhwa, Pakistan. Research Method and Procedure: A qualitative research method was used to obtain secondary data from the research wing and the Information Technology (IT) section of the Peshawar police. Content analysis was the method used to conduct the study. The study is delimited to the Capital City Police and the Cellular Forensic Unit Peshawar, KP, Pakistan. Major finding(s): It is evident that the old traditional approach will never provide solutions for better management in controlling crime. The best way to control crime and promote proactive policing is to adopt new technologies. The study reveals that technology has made the police more effective and vigilant compared to traditional policing. Heinous crimes such as abduction, missing persons, snatching, burglaries, and blind murder cases are now traceable with the help of technology. Recommendation(s): The analysis of the data suggests that IT experts should be recruited along with research analysts to assist and facilitate the operational as well as investigation units of the police in a timely manner. A mobile locator should be provided to the Cellular Forensic Unit to apprehend criminals promptly, and the latest digital analysis software should be provided to equip the Cellular Forensic Unit.
Keywords: crime prevention, digital technology, Pakistan, police
Procedia PDF Downloads 65
17418 A Stepwise Approach for Piezoresistive Microcantilever Biosensor Optimization
Authors: Amal E. Ahmed, Levent Trabzon
Abstract:
Due to the low concentration of analytes in biological samples, the use of Biological Microelectromechanical System (Bio-MEMS) biosensors for biomolecule detection results in a minuscule output signal that is not good enough for practical applications. In response to this, a need has arisen for an optimized biosensor capable of giving a high output signal in response to the detection of few analytes in the sample; the ultimate goal is being able to convert the attachment of a single biomolecule into a measurable quantity. For this purpose, MEMS microcantilever-based biosensors have emerged as a promising sensing solution because they are simple, cheap, very sensitive and, more importantly, do not need optical labeling of the analytes (label-free operation). Among the different microcantilever transducing techniques, piezoresistive microcantilever biosensors are prominent because they work well in liquid environments and have an integrated readout system. However, the design of piezoresistive microcantilevers is not a straightforward problem, due to coupling between the design parameters, constraints, process conditions, and performance. The parameters that can be optimized to enhance the sensitivity of piezoresistive microcantilever-based sensors are: cantilever dimensions, cantilever material, cantilever shape, piezoresistor material, piezoresistor doping level, piezoresistor dimensions, piezoresistor position, and the shape and position of the Stress Concentration Region (SCR). After a systematic analysis of the effect of each design and process parameter on the sensitivity, a stepwise optimization approach was developed in which almost all of these parameters were varied one at each step while fixing the others, to reach the maximum possible sensitivity at the end. At each step, the goal was to optimize the parameter so that it maximizes and concentrates the stress in the piezoresistor region for the same applied force, and thus yields higher sensitivity.
Using this approach, an optimized sensor was obtained with 73.5 times higher electrical sensitivity (ΔR/R) than the starting sensor. In addition, this piezoresistive microcantilever biosensor is more sensitive than similar sensors previously reported in the open literature. The mechanical sensitivity of the final sensor is −1.5×10⁻⁸ (Ω/Ω)/pN, which means that for each 1 pN (the weight of roughly 10⁻¹⁰ g) of biomolecules attached to this biosensor, the piezoresistor resistance decreases by a relative amount of 1.5×10⁻⁸. Throughout this work, COMSOL Multiphysics 5.0, a commercial Finite Element Analysis (FEA) tool, was used to simulate the sensor performance.
Keywords: biosensor, microcantilever, piezoresistive, stress concentration region (SCR)
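The underlying transduction chain, tip force to surface strain to relative resistance change, can be sketched with textbook beam-bending formulas. The cantilever dimensions and gauge factor below are hypothetical, chosen only to show that a ΔR/R on the order of 10⁻⁸ per pN is plausible; the paper's own figure comes from detailed FEA, not from this simple model.

```python
def surface_strain(force, length, width, thickness, youngs_modulus):
    """Maximum surface strain at the fixed end of a rectangular cantilever
    under a tip load F: eps = 6*F*L / (E*w*t^2)."""
    return 6.0 * force * length / (youngs_modulus * width * thickness ** 2)

def resistance_change(strain, gauge_factor):
    """Piezoresistive transduction: dR/R = GF * eps."""
    return gauge_factor * strain

# hypothetical silicon cantilever: 200 um x 40 um x 1 um, E = 170 GPa, GF = 100
eps = surface_strain(force=1e-12, length=200e-6, width=40e-6,
                     thickness=1e-6, youngs_modulus=170e9)
dr_over_r = resistance_change(eps, gauge_factor=100.0)   # per 1 pN of tip load
```

This back-of-the-envelope value lands in the same order of magnitude as the sensitivity reported above, which is why thinning the cantilever and concentrating stress at the piezoresistor (the t² and SCR terms) dominate the optimization.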
Procedia PDF Downloads 571
17417 Damage Analysis in Open Hole Composite Specimens by Digital Image Correlation: Experimental Investigation
Authors: Faci Youcef
Abstract:
In the present work, an experimental study is carried out using the digital image correlation (DIC) technique to analyze the damage and behavior of a woven carbon/epoxy composite under tensile loading. The tension mechanisms associated with failure modes of bolted joints in advanced composites are studied, along with the displacement and strain distributions. The evolution of the bolt inclination angle during the tensile tests was also studied. Image maps are presented to compare the distribution of displacements and strains along the surface. Several factors responsible for the failure of fiber-reinforced polymer composite materials are observed. It was found that the strain concentrations observed in the specimens can be used to identify full-field damage onset and to monitor damage progression during loading. Moreover, there is an interaction between laminate pattern, laminate thickness, fastener size and type, surface strain concentrations, and out-of-plane displacement. Conclusions include a failure analysis associated with the bolt inclination angles, supported by microscopic visualizations of the composite specimen. The DIC results can be used to develop and accurately validate numerical models.
Keywords: carbon, woven, damage, digital image, bolted joint, inclination angle
Procedia PDF Downloads 80
17416 Operator Optimization Based on Hardware Architecture Alignment Requirements
Authors: Qingqing Gai, Junxing Shen, Yu Luo
Abstract:
Due to hardware architecture characteristics, some operators tend to achieve better performance if the input/output tensor dimensions are aligned to a certain minimum granularity; examples include the convolution and deconvolution operators commonly used in deep learning. If the requirements are not met, the general strategy is to pad with 0 to satisfy them, potentially leading to under-utilization of the hardware resources. Therefore, for convolutions and deconvolutions whose input and output channels do not meet the minimum granularity alignment, we propose to transfer the W-dimensional data to the C-dimension for computation (W2C) so that the C-dimension meets the hardware requirements. This scheme also reduces the number of computations in the W-dimension. Although this scheme substantially increases computation, the operator's speed can improve significantly. It achieves remarkable speedups on multiple hardware accelerators, including Nvidia Tensor Cores, Qualcomm digital signal processors (DSPs), and Huawei neural processing units (NPUs). All that is needed is to modify the network structure and rearrange the operator weights offline, without retraining. At the same time, for some operators, such as ReduceMax, we observe that transferring the C-dimensional data to the W-dimension (C2W) and replacing the ReduceMax with a MaxPool can accomplish acceleration under certain circumstances.
Keywords: convolution, deconvolution, W2C, C2W, alignment, hardware accelerator
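The W2C data rearrangement can be sketched as a pure-Python reshape. The folding rule used here (interleaving w mod factor into the channel index) is one plausible layout, labeled as an assumption, since the abstract does not spell out the exact mapping.

```python
def w2c(tensor, factor):
    """Fold a factor of the W dimension into C: (C, H, W) -> (C*factor, H, W//factor).
    Element (c, h, w) maps to (c*factor + w % factor, h, w // factor)."""
    c_dim, h_dim, w_dim = len(tensor), len(tensor[0]), len(tensor[0][0])
    assert w_dim % factor == 0, "W must be divisible by the folding factor"
    out = [[[0] * (w_dim // factor) for _ in range(h_dim)]
           for _ in range(c_dim * factor)]
    for c in range(c_dim):
        for h in range(h_dim):
            for w in range(w_dim):
                out[c * factor + w % factor][h][w // factor] = tensor[c][h][w]
    return out

x = [[[1, 2, 3, 4], [5, 6, 7, 8]]]   # shape (1, 2, 4): C=1 violates an alignment of 2
y = w2c(x, 2)                         # shape (2, 2, 2): C now meets the granularity
```

Because the rearrangement is a fixed permutation, the convolution weights can be permuted the same way once, offline, which is what makes the scheme retraining-free.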
Procedia PDF Downloads 104
17415 Online Learning for Modern Business Models: Theoretical Considerations and Algorithms
Authors: Marian Sorin Ionescu, Olivia Negoita, Cosmin Dobrin
Abstract:
This scientific communication reports on and discusses learning models adaptable to modern business problems, and models specific to digital concepts and paradigms. In the PAC (probably approximately correct) learning model approach, the learning process begins by receiving a batch of learning examples; this batch is used to acquire a hypothesis, and once learning is complete, the hypothesis is used to predict new operational examples. For complex business models, many models must be introduced and evaluated to estimate the induced results, and the totality of the results is then used to develop a predictive rule that anticipates the choice of new models. In contrast, in online learning processes there is no separation between the learning (training) and the prediction phase. Every time a business model is approached, a test example is considered, from the beginning until the prediction of the appearance of a model considered correct from the point of view of the business decision. After choosing part of the business model, the label with the logical value "true" becomes known. Some of the business models are then used as learning (training) examples, which helps to improve the prediction mechanisms for future business models.
Keywords: machine learning, business models, convex analysis, online learning
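The contrast with batch learning can be made concrete with the classic online perceptron, which predicts each incoming example before updating on it, so training and prediction are interleaved exactly as described above. The toy stream below is invented for illustration.

```python
def perceptron_online(stream, lr=1.0, dim=2):
    """Online learning loop: predict each example as it arrives, observe the
    true label, and update immediately (mistake-driven updates)."""
    w = [0.0] * dim
    b = 0.0
    mistakes = 0
    for x, label in stream:
        pred = 1 if b + sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1
        if pred != label:                       # learn only from mistakes
            mistakes += 1
            w = [wi + lr * label * xi for wi, xi in zip(w, x)]
            b += lr * label
    return w, b, mistakes

# toy stream: the label is the sign of the first feature (linearly separable)
stream = [([2.0, 1.0], 1), ([-1.5, 0.5], -1), ([1.0, -1.0], 1),
          ([-2.0, -1.0], -1), ([3.0, 0.0], 1), ([-0.5, 2.0], -1)]
w, b, mistakes = perceptron_online(stream)
```

The mistake count is the natural performance measure in online learning, playing the role that generalization error plays in the batch PAC setting.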
Procedia PDF Downloads 141
17414 The Selection of the Nearest Anchor Using Received Signal Strength Indication (RSSI)
Authors: Hichem Sassi, Tawfik Najeh, Noureddine Liouane
Abstract:
Localization information is crucial for the operation of a WSN. There are principally two types of localization algorithms. Range-based localization algorithms have strict hardware requirements and are thus expensive to implement in practice. Range-free localization algorithms reduce the hardware cost but can only achieve high accuracy in ideal scenarios. In this paper, we locate unknown nodes by combining the advantages of these two types of methods. The proposed algorithm makes each unknown node select the nearest anchor using the Received Signal Strength Indicator (RSSI) and then choose the two other anchors that are the most accurate to compute the estimated location. Our algorithm improves the localization accuracy compared with previous algorithms, as demonstrated by the simulation results.
Keywords: WSN, localization, DV-Hop, RSSI
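The nearest-anchor selection via RSSI can be sketched with the standard log-distance path-loss model: the strongest received power corresponds to the closest anchor, and inverting the model gives a range estimate. The reference power and path-loss exponent below are typical textbook values, not parameters from the paper.

```python
import math

def rssi(distance, p0=-40.0, n=2.0, d0=1.0):
    """Log-distance path-loss model: received power (dBm) at `distance` metres,
    with reference power p0 at d0 and path-loss exponent n."""
    return p0 - 10.0 * n * math.log10(distance / d0)

def nearest_anchor(readings):
    """Select the anchor with the strongest RSSI (higher dBm means closer)."""
    return max(readings, key=readings.get)

def estimate_distance(r, p0=-40.0, n=2.0, d0=1.0):
    """Invert the path-loss model to estimate range from an RSSI sample."""
    return d0 * 10.0 ** ((p0 - r) / (10.0 * n))

# hypothetical anchors at 12 m, 4 m, and 25 m from the unknown node
readings = {"A1": rssi(12.0), "A2": rssi(4.0), "A3": rssi(25.0)}
best = nearest_anchor(readings)
```

In practice RSSI is noisy, which is why the proposed algorithm uses it only to pick the nearest anchor and then refines the position estimate with two further anchors.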
Procedia PDF Downloads 361