Search results for: tool holder interface
4645 A Novel Combined Finger Counting and Finite State Machine Technique for ASL Translation Using Kinect
Authors: Rania Ahmed Kadry Abdel Gawad Birry, Mohamed El-Habrouk
Abstract:
This paper presents a brief survey of the techniques used for sign language recognition, along with the types of sensors used to perform the task. It presents a modified method for identifying an isolated sign language gesture using Microsoft Kinect with the OpenNI framework. It shows how to extract robust features from the depth image provided by Microsoft Kinect and the OpenNI interface and how to use them to create a robust and accurate gesture recognition system for the purpose of ASL translation. PrimeSense's Natural Interaction Technology for End-user (NITE™) was also used in the C++ implementation of the system. The algorithm presents a simple finger counting algorithm for static signs as well as a directional finite state machine (FSM) description of the hand motion in order to help translate a sign language gesture. This includes both letters and numbers performed by a user, which in turn may be used as input for voice pronunciation systems.
Keywords: American sign language, finger counting, hand tracking, Microsoft Kinect
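The directional FSM idea above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the gesture table and the two-direction quantization are hypothetical, standing in for whatever motion vocabulary the real system uses over Kinect hand tracks.

```python
# Sketch of a directional FSM gesture classifier (hypothetical vocabulary).

def direction(p0, p1):
    """Quantize the motion between two hand positions into a direction."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Hypothetical gesture table: each gesture is a sequence of FSM states.
GESTURES = {
    ("down", "left"): "J",
    ("down", "right", "up"): "U-turn",
}

def classify(track):
    """Walk the FSM over a track of (x, y) hand positions."""
    states = []
    for p0, p1 in zip(track, track[1:]):
        d = direction(p0, p1)
        if not states or states[-1] != d:   # collapse repeated states
            states.append(d)
    return GESTURES.get(tuple(states), "unknown")
```

In the real system the FSM states would be driven by the hand position reported by NITE, and the static-sign path would instead count extended fingers in the depth image.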
Procedia PDF Downloads 296
4644 Numerical Investigation on Tsunami Suppression by Submerged Breakwater
Authors: Tasuku Hongo, Hiroya Mamori, Naoya Fukushima, Makoto Yamamoto
Abstract:
A tsunami induced by an earthquake causes severe disasters in coastal areas. As is well known, the huge 2011 earthquake in Japan induced a huge tsunami, and the tsunami caused serious damage in the Tohoku and Kanto areas. Although breakwaters were constructed on the coast to suppress the tsunami, they collapsed, resulting in severe disasters. In order to reduce tsunami damage, we propose submerged breakwaters and investigate their effect on tsunami behavior by means of numerical simulations. In order to reproduce the tsunami and capture its interface, we employed a moving particle method, which is one of the Lagrangian methods. Unlike ordinary breakwaters, the present breakwater is located under the sea. An effective installation condition is investigated by a parametric study. The results show that the submerged breakwater can decrease the wave force of the tsunami. Moreover, a combination of two submerged breakwaters can reduce the tsunami safely and effectively. Therefore, the present results give an effective condition for the installation of under-sea breakwaters and its mechanism.
Keywords: coastal area, tsunami force reduction, MPS method, submerged breakwater
Procedia PDF Downloads 164
4643 A Novel Technological Approach to Maintaining the Cold Chain during Transportation
Authors: Philip J. Purnell
Abstract:
Innovators propose to use the Internet of Things to solve the problem of maintaining the cold chain during the transport of biopharmaceutical products. Sending a data logger with refrigerated goods is only useful to inform the recipient that the goods have either breached the cold chain and are therefore potentially spoiled, or that they have not breached it and are therefore assumed to be in good condition. Connecting the data logger to the Internet of Things means that the supply chain manager will be informed in real time of the exact location and the precise temperature of the material at any point on earth. Using a simple online interface, the supply chain manager can watch the progress of their material on a Google map together with accurate and, crucially, real-time temperature readings. The data logger will also send alarms to the supply chain manager if a cold chain breach becomes imminent, allowing them time to contact the transporter and restore the cold chain before the material is affected. This development is expected to save billions of dollars in wasted biologics that currently arrive either spoiled or in an unreliable condition.
Keywords: internet of things, cold chain, data logger, transportation
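The "imminent breach" alarm described above amounts to extrapolating the recent temperature trend and warning before the limit is crossed. A minimal sketch, assuming a hypothetical 8 °C cold-chain limit and a simple one-step linear trend (the real device's logic and thresholds are not specified in the abstract):

```python
# Sketch of imminent-breach detection via linear extrapolation.
# The limit and horizon values are illustrative assumptions.

def imminent_breach(readings, limit=8.0, horizon=3):
    """readings: chronological temperature samples. Returns True when the
    current value is still within the limit but the one-step trend,
    projected `horizon` samples ahead, would cross it."""
    if len(readings) < 2:
        return False
    slope = readings[-1] - readings[-2]      # simple one-step trend
    projected = readings[-1] + slope * horizon
    return readings[-1] < limit and projected >= limit
```

A real logger would likely fit the trend over a longer window and debounce the alarm, but the principle — alert on the projection, not the breach — is the same.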
Procedia PDF Downloads 442
4642 Surface-Enhanced Raman Detection in Chip-Based Chromatography via a Droplet Interface
Authors: Renata Gerhardt, Detlev Belder
Abstract:
Raman spectroscopy has attracted much attention as a structurally descriptive and label-free detection method. It is particularly suited for chemical analysis given that it is non-destructive and molecules can be identified via the fingerprint region of the spectra. In this work, we investigate possibilities for integrating Raman spectroscopy as a detection method for chip-based chromatography, making use of a droplet interface. A demanding task in lab-on-a-chip applications is the specific and sensitive detection of low-concentration analytes in small volumes. Fluorescence detection is frequently utilized but is restricted to fluorescent molecules; furthermore, no structural information is provided. Another often-applied technique is mass spectrometry, which enables the identification of molecules based on their mass-to-charge ratio. Additionally, the obtained fragmentation pattern gives insight into the chemical structure. However, it is only applicable as end-of-line detection because analytes are destroyed during measurement. In contrast to mass spectrometry, Raman spectroscopy can be applied on-chip, and substances can be processed further downstream after detection. A major drawback of Raman spectroscopy is the inherent weakness of the Raman signal, which is due to the small cross-sections associated with the scattering process. Enhancement techniques, such as surface-enhanced Raman spectroscopy (SERS), are employed to overcome the poor sensitivity, even allowing detection at the single-molecule level. In SERS measurements, Raman signal intensity is improved by several orders of magnitude if the analyte is in close proximity to nanostructured metal surfaces or nanoparticles. The main gain of lab-on-a-chip technology is the building-block-like ability to seamlessly integrate different functionalities, such as synthesis, separation, derivatization and detection, on a single device.
We intend to utilize this powerful toolbox to realize Raman detection in chip-based chromatography. By interfacing on-chip separations with a droplet generator, the separated analytes are encapsulated into numerous discrete containers. These droplets can then be injected with a silver nanoparticle solution and investigated via Raman spectroscopy. Droplet microfluidics is a sub-discipline of microfluidics which operates with segmented rather than continuous flow. Segmented flow is created by merging two immiscible phases (usually an aqueous phase and oil), thus forming small discrete volumes of one phase in the carrier phase. The study surveys different chip designs to realize the coupling of chip-based chromatography with droplet microfluidics. With regard to maintaining a sufficient flow rate for chromatographic separation and ensuring stable eluent flow over the column, different flow rates of the eluent and oil phases are tested. Furthermore, the detection of analytes in droplets with surface-enhanced Raman spectroscopy is examined. The compartmentalization of separated compounds preserves the analytical resolution, since the continuous phase restricts dispersion between the droplets. The droplets are ideal vessels for the insertion of silver colloids, thus making use of the surface enhancement effect and improving the sensitivity of the detection. The long-term goal of this work is the first realization of coupling chip-based chromatography with droplet microfluidics to employ surface-enhanced Raman spectroscopy as the means of detection.
Keywords: chip-based separation, chip LC, droplets, Raman spectroscopy, SERS
Procedia PDF Downloads 245
4641 Mitigation of Electromagnetic Interference Generated by GPIB Control-Network in AC-DC Transfer Measurement System
Authors: M. M. Hlakola, E. Golovins, D. V. Nicolae
Abstract:
The field of instrumentation electronics is undergoing explosive growth due to its wide range of applications. The proliferation of electrical devices in close working proximity can negatively influence each other's performance. The degradation in performance is due to electromagnetic interference (EMI). This paper investigates the negative effects of electromagnetic interference originating in the General Purpose Interface Bus (GPIB) control network of the ac-dc transfer measurement system. Remedial measures for reducing measurement errors and the failure of a range of industrial devices due to EMI have been explored. The ac-dc transfer measurement system was analyzed for common-mode (CM) EMI effects. Further investigation of the coupling path, as well as more accurate identification of the noise propagation mechanism, has been outlined. To prevent the occurrence of common-mode ground loops, which were identified between the GPIB system control circuit and the measurement circuit, a microcontroller-driven GPIB switching isolator device was designed, prototyped, programmed and validated. This mitigation technique has been explored to reduce EMI effectively.
Keywords: CM, EMI, GPIB, ground loops
Procedia PDF Downloads 288
4640 Research on the Aero-Heating Prediction Based on Hybrid Meshes and Hybrid Schemes
Authors: Qiming Zhang, Youda Ye, Qinxue Jiang
Abstract:
Accurate prediction of the external flowfield and aero-heating at the wall of a hypersonic vehicle is crucial for aircraft design. Unstructured/hybrid meshes have more powerful advantages than structured meshes in terms of pre-processing, parallel computing and mesh adaptation, so it is imperative to develop high-resolution numerical methods for the calculation of the aerothermal environment on unstructured/hybrid meshes. The inviscid flux scheme is one of the most important factors affecting the accuracy of heat flux calculation on unstructured/hybrid meshes. Here, a new hybrid flux scheme is developed and an approach for interface type selection is proposed: i.e. 1) using the exact Riemann solution to calculate the flux on the faces parallel to the wall; 2) employing the Steger-Warming (S-W) scheme to improve the stability of the numerical scheme on the other interfaces. The computed heat flux fits experimental observations and shows little grid dependence, which demonstrates great application prospects for unstructured/hybrid meshes.
Keywords: aero-heating prediction, computational fluid dynamics, hybrid meshes, hybrid schemes
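The interface-type selection rule above — exact Riemann flux on wall-parallel faces, Steger-Warming elsewhere — can be sketched as a geometric test on face normals. This is an illustrative sketch only: the angle tolerance is a hypothetical parameter, and the returned labels stand in for the actual flux routines.

```python
# Sketch of the hybrid-scheme selection rule by face orientation.
import math

def face_is_wall_parallel(face_normal, wall_normal, tol_deg=5.0):
    """A face is treated as 'parallel to the wall' when its normal is
    (nearly) aligned with the wall normal. tol_deg is an assumption."""
    dot = sum(a * b for a, b in zip(face_normal, wall_normal))
    na = math.sqrt(sum(a * a for a in face_normal))
    nb = math.sqrt(sum(b * b for b in wall_normal))
    cosang = max(-1.0, min(1.0, dot / (na * nb)))
    angle = math.degrees(math.acos(cosang))
    return min(angle, 180.0 - angle) <= tol_deg   # either orientation

def select_scheme(face_normal, wall_normal):
    """Pick the flux scheme for one face of the hybrid mesh."""
    if face_is_wall_parallel(face_normal, wall_normal):
        return "riemann"          # exact Riemann solution near the wall
    return "steger-warming"       # S-W flux for stability elsewhere
```

In a real solver this decision would be made once per face during mesh preprocessing and cached, since face orientations do not change during the computation.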
Procedia PDF Downloads 249
4639 Building an Arithmetic Model to Assess Visual Consistency in Townscape
Authors: Dheyaa Hussein, Peter Armstrong
Abstract:
The phenomenon of visual disorder is prominent in contemporary townscapes. This paper provides a theoretical framework for the assessment of visual consistency in townscape in order to achieve more favourable outcomes for users. In this paper, visual consistency refers to the amount of similarity between adjacent components of townscape. The paper investigates parameters which relate to visual consistency in townscape, explores the relationships between them and highlights their significance. The paper uses arithmetic methods from outside the domain of urban design to enable the establishment of an objective approach to assessment which considers subjective indicators, including users' preferences. These methods involve the standard deviation, colour distance and the distance between points. The paper identifies urban space as a key representative of the visual parameters of townscape. It focuses on its two components, geometry and colour, in the evaluation of the visual consistency of townscape. Accordingly, this article proposes four measurements. The first quantifies the number of vertices, which are points in three-dimensional space that are connected by lines to represent the appearance of elements. The second evaluates the visual surroundings of urban space by assessing the location of their vertices. The last two measurements calculate the visual similarity in both vertices and colour in townscape by calculating their variation using methods including the standard deviation and colour difference. The proposed quantitative assessment is based on users' preferences towards these measurements. The paper offers a theoretical basis for a practical tool which can alter the current understanding of architectural form and its application in urban space. This tool is currently under development.
The proposed method underpins expert subjective assessment and permits the establishment of a unified framework which adds to creativity through the achievement of a higher level of consistency and satisfaction among the citizens of evolving townscapes.
Keywords: townscape, urban design, visual assessment, visual consistency
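Two of the proposed measurements — variation via the standard deviation, and colour similarity via colour distance — can be sketched numerically. The facade data below is entirely hypothetical, and Euclidean RGB distance is used as a simple stand-in for whatever colour-difference formula the model adopts:

```python
# Sketch of the standard-deviation and colour-distance measurements.
import math

def std_dev(values):
    """Population standard deviation of a list of numbers."""
    mean = sum(values) / len(values)
    return math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))

def colour_distance(c1, c2):
    """Euclidean distance between two RGB colours (a simple stand-in
    for a perceptual colour-difference metric)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

# Hypothetical adjacent facades: vertex counts and dominant colours.
vertex_counts = [8, 8, 12, 8]
facade_colours = [(200, 180, 160), (205, 178, 158), (90, 90, 200)]

vertex_variation = std_dev(vertex_counts)          # low = more consistent
colour_gaps = [colour_distance(a, b)
               for a, b in zip(facade_colours, facade_colours[1:])]
```

A low `vertex_variation` and small `colour_gaps` would indicate a visually consistent street frontage under this model; the third facade above would stand out as the inconsistent element.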
Procedia PDF Downloads 313
4638 Implementation of Building Information Modelling to Monitor, Assess, and Control the Indoor Environmental Quality of Higher Education Buildings
Authors: Mukhtar Maigari
Abstract:
The landscape of Higher Education (HE) institutions, especially following the COVID-19 pandemic, necessitates advanced approaches to manage Indoor Environmental Quality (IEQ), which is crucial for the comfort, health, and productivity of students and staff. This study investigates the application of Building Information Modelling (BIM) as a multifaceted tool for monitoring, assessing, and controlling IEQ in HE buildings, aiming to bridge the gap between traditional management practices and the innovative capabilities of BIM. Central to the study is a comprehensive literature review, which lays the foundation by examining current knowledge and technological advancements in both IEQ and BIM. This review sets the stage for a deeper investigation into the practical application of BIM in IEQ management. The methodology consists of Post-Occupancy Evaluation (POE), which encompasses physical monitoring, questionnaire surveys, and interviews under the umbrella of case studies. The physical data collection focuses on vital IEQ parameters such as temperature, humidity, and CO2 levels, conducted using different equipment, including data loggers, to ensure accurate data. Complementing this, questionnaire surveys gather perceptions and satisfaction levels from students, providing valuable insights into the subjective aspects of IEQ. The interview component, targeting facilities management teams, offers an in-depth perspective on IEQ management challenges and strategies. The research then delves into the development of a conceptual BIM-based framework, informed by the findings from the case studies and empirical data. This framework is designed to demonstrate the critical functions necessary for effective IEQ monitoring, assessment, control and automation, with real-time data handling capabilities. This BIM-based framework leads to the development and testing of a BIM-based prototype tool.
This prototype leverages software such as Autodesk Revit with its visual programming tool, Dynamo, and an Arduino-based sensor network, thereby allowing a real-time flow of IEQ data for monitoring, control and even automation. By harnessing the capabilities of BIM technology, the study presents a forward-thinking approach that aligns with current sustainability and wellness goals, particularly vital in the post-COVID-19 era. The integration of BIM in IEQ management promises not only to enhance the health, comfort, and energy efficiency of educational environments but also to transform them into more conducive spaces for teaching and learning. Furthermore, this research could influence the future of HE buildings by prompting universities and government bodies to re-evaluate and improve teaching and learning environments. It demonstrates how the synergy between IEQ and BIM can empower stakeholders to monitor IEQ conditions more effectively and make informed decisions in real time. Moreover, the developed framework has broader applications as well; it can serve as a tool for other sustainability assessments, such as energy analysis in HE buildings, leveraging measured data synchronized with the BIM model. In conclusion, this study bridges the gap between theoretical research and real-world application by demonstrating how advanced technologies like BIM can be effectively integrated to enhance environmental quality in educational institutions.
Keywords: BIM, POE, IEQ, HE-buildings
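The monitoring layer of such a prototype — classifying live sensor readings against comfort bands before pushing them into the BIM model — can be sketched as follows. The comfort ranges and parameter names here are hypothetical placeholders, not values from the study:

```python
# Sketch of the IEQ monitoring layer: flag readings outside comfort bands.

# Hypothetical comfort ranges per parameter: (minimum, maximum).
COMFORT = {
    "temperature_c": (20.0, 24.0),
    "humidity_pct": (40.0, 60.0),
    "co2_ppm": (0.0, 1000.0),
}

def assess_ieq(reading):
    """reading: dict of parameter -> measured value.
    Returns the list of parameters outside their comfort band."""
    alerts = []
    for key, (lo, hi) in COMFORT.items():
        value = reading.get(key)
        if value is not None and not (lo <= value <= hi):
            alerts.append(key)
    return alerts
```

In the described architecture, the reading would arrive from the Arduino sensor network and the alert list would drive colour-coding of rooms inside Revit via a Dynamo script.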
Procedia PDF Downloads 49
4637 Voice and Head Controlled Intelligent Wheelchair
Authors: Dechrit Maneetham
Abstract:
The aim of this paper was to design a voice- and head-controlled electric power wheelchair (EPW): a novel control system for quadriplegics activated by voice, head and neck mobility. Head movement has been used as a control interface for people with motor impairments in a range of applications. Acquiring measurements from the sensing module is simplified through synchronization with the motor; the module measures the two axis directions, namely x and y. At the same time, patients can control the motorized wheelchair using voice signals (forward, backward, turn left, turn right, and stop). The model of a DC motor is considered for speed control, with the PID parameters selected using a genetic algorithm. An experimental set-up was constructed, consisting of a microcontroller as the controller, a DC-motor-driven EPW and feedback elements. This paper covers tuning methods for the parameters of a pulse width modulation (PWM) control system. A speed controller has been designed successfully for the closed loop of the DC motor, so that the motor runs very close to the reference speed and angle. The intelligent wheelchair can be used to ensure that the person's voice and head commands set the direction of travel, as asserted by conventional direction and speed control.
Keywords: wheelchair, quadriplegia, rehabilitation, medical devices, speed control
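The closed-loop speed control described above is a standard discrete PID loop whose output is mapped to a PWM duty cycle. A minimal sketch follows; the gains here are illustrative placeholders, whereas in the paper they are selected by a genetic algorithm:

```python
# Sketch of a discrete PID speed controller (gains are placeholders,
# not GA-tuned values from the paper).

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        """One control step: returns the raw control effort, which the
        firmware would clamp and map to a PWM duty cycle."""
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

A GA tuner would evaluate candidate (kp, ki, kd) triples against a DC motor model, scoring each by overshoot and settling time, and keep the best-performing gains for the microcontroller firmware.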
Procedia PDF Downloads 540
4636 An Implementation of a Configurable UART-to-Ethernet Converter
Authors: Jungho Moon, Myunggon Yoon
Abstract:
This paper presents an implementation of a configurable UART-to-Ethernet converter using an ARM-based 32-bit microcontroller, together with a dedicated configuration program, written in Python, that runs on a PC to configure the operating parameters of the converter. Various parameters pertaining to the operation of the converter can be modified by the configuration program through the Ethernet interface of the converter. The converter supports three representative asynchronous serial communication protocols, RS-232, RS-422, and RS-485, and supports three network modes: TCP/IP server, TCP/IP client, and UDP client. The TCP/IP and UDP protocols were implemented on the microcontroller using lwIP, an open-source lightweight TCP/IP protocol stack, and FreeRTOS, a free real-time operating system for embedded systems. Due to the use of a real-time operating system, the firmware of the converter was implemented as a multi-threaded application and as a result is more modular and easier to develop. The converter provides a seamless bridge between a serial port and an Ethernet port, thereby allowing existing legacy apparatuses with no Ethernet connectivity to communicate using the Ethernet protocol.
Keywords: converter, embedded systems, ethernet, lwIP, UART
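A PC-side configuration program of this kind can be sketched in Python. The packet format, port number, and JSON encoding below are all assumptions for illustration — the paper does not specify the converter's wire protocol:

```python
# Sketch of a PC-side configuration sender (hypothetical wire format).
import json
import socket

CONFIG_PORT = 50000  # hypothetical; the real port is device-specific

def build_config(baud=115200, serial_mode="RS-232", net_mode="TCP server"):
    """Serialize converter parameters as a JSON payload (assumed format)."""
    return json.dumps({
        "baud": baud,
        "serial_mode": serial_mode,   # RS-232 / RS-422 / RS-485
        "net_mode": net_mode,         # TCP server / TCP client / UDP client
    }).encode()

def send_config(host, payload):
    """Push the configuration to the converter over UDP."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(payload, (host, CONFIG_PORT))
```

On the device side, a dedicated FreeRTOS configuration thread would listen on this port, parse the payload, store the parameters in non-volatile memory, and reinitialize the UART and lwIP interfaces accordingly.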
Procedia PDF Downloads 706
4635 Drama, a Microcosm of Life Experiences: An Analysis of Symbolic Order and Social Relationships in Olu Obafemi’s Play
Authors: Victor Ademulegun Arijeniwa
Abstract:
This is a sociolinguistic study of Olu Obafemi's Naira Has No Gender as a microcosm of life experiences. The paper assesses how Olu Obafemi's use of language in the dramatic world serves as both a map of social relationships and a symbolic order of communication, capable of yielding well-expressed and richly articulated sociolinguistic implications. Being the interface between language and social institutions, sociolinguistics and its application are highly utilitarian in linguistic analysis, especially where the language of a text appears to be deeply tensed, as found in dramatic texts. The aim of this paper has been (i) to assess the symbolic, orderly presentation of form in Olu Obafemi's Naira Has No Gender; and (ii) to find out the linguistic elements and textual organization that represent social relationships in Olu Obafemi's Naira Has No Gender. Using a qualitative research design for data generation, with insights from John Gumperz's interactional sociolinguistics theory, with particular reference to contextualization cues and miscommunication, the paper identifies the implications of the dramatic discourse for society.
Keywords: sociolinguistics, microcosm, contextualisation, miscommunication variable, identity, symbolic order
Procedia PDF Downloads 197
4634 Designing Automated Embedded Assessment to Assess Student Learning in a 3D Educational Video Game
Authors: Mehmet Oren, Susan Pedersen, Sevket C. Cetin
Abstract:
Despite the frequently criticized disadvantages of the traditionally used paper-and-pencil assessment, it remains the most frequently used method in our schools. Although such assessments provide an acceptable measurement, they are not capable of measuring all the aspects and the richness of learning and knowledge. Also, many assessments used in schools decontextualize the assessment from the learning, and they focus on learners' standing on a particular topic but do not capture how student learning changes over time. For these reasons, many scholars advocate that using simulations and games (S&G) as a tool for assessment has significant potential to overcome the problems in traditionally used methods. S&G can benefit from changes in technology and provide a contextualized medium for assessment and teaching. Furthermore, S&G can serve as an instructional tool rather than a method to test students' learning at a particular time point. To investigate the potential of using educational games as an assessment and teaching tool, this study presents the implementation and validation of an automated embedded assessment (AEA), which can constantly monitor student learning in the game and assess student performance without intervening in their learning. The experiment was conducted in an undergraduate-level engineering course (Digital Circuit Design) with 99 participating students over a period of five weeks in the Spring 2016 semester. The purpose of this research study is to examine whether the proposed AEA method is valid for assessing student learning in a 3D educational game and to present the implementation steps. To address this question, this study inspects three aspects of the AEA for validation. First, the evidence-centered design model was used to lay out the design and measurement steps of the assessment. Then, a confirmatory factor analysis was conducted to test whether the assessment can measure the targeted latent constructs.
Finally, the scores of the assessment were compared with an external measure (a validated test measuring student learning on digital circuit design) to evaluate the convergent validity of the assessment. The results of the confirmatory factor analysis showed that the fit of the model with three latent factors and one higher-order factor was acceptable (RMSEA < 0.00, CFI = 1, TLI = 1.013, WRMR = 0.390). All of the observed variables loaded significantly on the latent factors in the latent factor model. In the second analysis, a multiple regression analysis was used to test whether the external measure significantly predicts students' performance in the game. The results of the regression indicated that the two predictors explained 36.3% of the variance (R2 = .36, F(2,96) = 27.42.56, p < .00). It was found that students' posttest scores significantly predicted game performance (β = .60, p < .000). The statistical results of the analyses show that the AEA can distinctly measure three major components of the digital circuit design course. It is hoped that this study can help researchers understand how to design an AEA, and it showcases an implementation by providing an example methodology to validate this type of assessment.
Keywords: educational video games, automated embedded assessment, assessment validation, game-based assessment, assessment design
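The convergent-validity check above boils down to fitting a regression and reporting R², the share of variance explained. A minimal sketch with toy data (the numbers below are invented for illustration and are not the study's data; the study used two predictors, while this sketch uses one for brevity):

```python
# Sketch of an R² (variance explained) computation via least squares.

def fit_line(x, y):
    """Ordinary least squares for y = alpha + beta * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    return beta, my - beta * mx

def r_squared(x, y):
    """Share of variance in y explained by the fitted line."""
    beta, alpha = fit_line(x, y)
    ss_res = sum((b - (alpha + beta * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - sum(y) / len(y)) ** 2 for b in y)
    return 1 - ss_res / ss_tot

# Toy data: external posttest scores vs. in-game performance.
posttest = [50, 60, 70, 80, 90]
game_perf = [52, 61, 69, 83, 88]
```

A high R² between the external test and the in-game measure is what supports the convergent-validity claim; in the study itself the two predictors jointly explained 36.3% of the variance.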
Procedia PDF Downloads 421
4633 The Power of in situ Characterization Techniques in Heterogeneous Catalysis: A Case Study of Deacon Reaction
Authors: Ramzi Farra, Detre Teschner, Marc Willinger, Robert Schlögl
Abstract:
Introduction: The conventional approach of characterizing solid catalysts under static conditions, i.e., before and after reaction, does not provide sufficient knowledge of the physicochemical processes occurring under dynamic conditions at the molecular level. Hence, developing new in situ characterization techniques with the potential to be used under real catalytic reaction conditions is highly desirable. In situ Prompt Gamma Activation Analysis (PGAA) is a rapidly developing chemical analytical technique that enables us to experimentally assess the coverage of surface species under catalytic turnover and correlate these with the reactivity. The catalytic HCl oxidation (Deacon reaction) over bulk ceria will serve as our example. Furthermore, in situ transmission electron microscopy (TEM) is a powerful technique that can contribute to the study of atmosphere- and temperature-induced morphological or compositional changes of a catalyst at atomic resolution. The application of such techniques (PGAA and TEM) will pave the way to a greater and deeper understanding of the dynamic nature of active catalysts. Experimental/Methodology: In situ PGAA experiments were carried out to determine the Cl uptake and the degree of surface chlorination under reaction conditions by varying p(O2), p(HCl), p(Cl2), and the reaction temperature. The abundance and dynamic evolution of OH groups on the working catalyst under various steady-state conditions were studied by means of in situ FTIR with a specially designed, homemade transmission cell. For real in situ TEM, we use a commercial in situ holder with a home-built gas feeding system and gas analytics. Conclusions: Two complementary in situ techniques, namely in situ PGAA and in situ FTIR, were utilized to investigate the surface coverage of the two most abundant species (Cl and OH).
The OH density and Cl uptake were followed under multiple steady-state conditions as a function of p(O2), p(HCl), p(Cl2), and temperature. These experiments have shown that the OH density correlates positively with the reactivity, whereas the Cl uptake correlates negatively. The p(HCl) experiments give rise to increased activity accompanied by an increase in Cl coverage (the opposite trend to p(O2) and T). Cl2 strongly inhibits the reaction, but no measurable increase in the Cl uptake was found. After considering all previous observations, we conclude that only a minority of the available adsorption sites contribute to the reactivity. In addition, a mechanism for the catalysed reaction was proposed. The chlorine-oxygen competition for the available active sites renders re-oxidation the rate-determining step of the catalysed reaction. Further investigations using in situ TEM are planned and will be conducted in the near future. Such experiments allow us to monitor active catalysts at the atomic scale under the most realistic conditions of temperature and pressure. The talk will shed light on the potential and limitations of in situ PGAA and in situ TEM in the study of catalyst dynamics.
Keywords: CeO2, Deacon process, in situ PGAA, in situ TEM, in situ FTIR
Procedia PDF Downloads 291
4632 Software Quality Assurance in 5G Technology-Redefining Wireless Communication: A Comprehensive Survey
Authors: Sumbal Riaz, Sardar-un-Nisa, Mehreen Sirshar
Abstract:
5G, the 5th generation of mobile phone and data communication standards, is the next edge of innovation for the whole mobile industry. 5G is a real wireless world system that will provide totally wireless communication all over the world without limitations. 5G builds on many 4G technologies and is expected to hit the market in 2020. This research is a comprehensive survey of the quality parameters of 5G technology. 5G provides high performance, interoperability, easy roaming, fully converged services, a friendly interface and scalability at low cost. To meet future traffic demands, fifth generation wireless communication systems will include: i) higher densification of heterogeneous networks with massive deployment of small base stations supporting various Radio Access Technologies (RATs); ii) use of massive Multiple Input Multiple Output (MIMO) arrays; iii) use of millimetre-wave spectrum, where larger, wider frequency bands are available; iv) direct device-to-device (D2D) communication; v) simultaneous transmission and reception; and vi) cognitive radio technology.
Keywords: 5G, 5th generation, innovation, standard, wireless communication
Procedia PDF Downloads 444
4631 Psychometric Properties of the Social Skills Rating System: Teacher Version
Authors: Amani Kappi, Ana Maria Linares, Gia Mudd-Martin
Abstract:
Children with Attention Deficit Hyperactivity Disorder (ADHD) are more likely to develop social skills deficits that can lead to academic underachievement, peer rejection, and maladjustment. Surveying teachers about the social skills of children with ADHD is a significant factor in identifying whether the children will be diagnosed with social skills deficits. The teacher-specific version of the Social Skills Rating System scale (SSRS-T) has been used as a screening tool for children's social behaviors. The psychometric properties of the SSRS-T have been evaluated in various populations and settings, such as when used by teachers to assess the social skills of children with learning disabilities. However, few studies have examined the psychometric properties of the SSRS-T when used to assess children with ADHD. The purpose of this study was to examine the psychometric properties of the SSRS-T and two SSRS-T subscales, Social Skills and Problem Behaviors. This was a secondary analysis of longitudinal data from the Fragile Families and Child Well-Being Study. The study included a sample of 194 teachers who used the SSRS-T to assess the social skills of children aged 8 to 10 years with ADHD. Exploratory principal components factor analysis was used to assess the construct validity of the SSRS-T scale. Cronbach's alpha was used to assess the internal consistency reliability of the total SSRS-T scale and the subscales. Item analyses included item-item intercorrelations, item-to-subscale correlations, and changes in Cronbach's alpha with item deletion. The internal consistency reliability of both the total scale and the subscales was acceptable. The results of the exploratory factor analysis supported the five factors of the SSRS-T (Cooperation, Self-control, Assertion, Internalize behaviors, and Externalize behaviors) reported in the original version.
Findings indicated that the SSRS-T is a reliable and valid tool for assessing the social behaviors of children with ADHD.
Keywords: ADHD, children, social skills, SSRS-T, psychometric properties
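The internal-consistency check used above, Cronbach's alpha, can be computed directly from per-item scores. A minimal sketch with toy ratings (the data is invented for illustration, not from the study):

```python
# Sketch of Cronbach's alpha for internal consistency reliability.

def cronbach_alpha(items):
    """items: one inner list per scale item, each aligned across the
    same respondents. Uses sample variance (n - 1 denominator)."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))
```

Perfectly correlated items give alpha = 1.0; in practice a value of about 0.7 or above is conventionally read as acceptable internal consistency, which is the kind of threshold the "acceptable" finding above refers to.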
Procedia PDF Downloads 131
4630 Rotary Entrainment in Two Phase Stratified Gas-Liquid Layers: An Experimental Study
Authors: Yagya Sharma, Basanta K. Rana, Arup K. Das
Abstract:
Rotary entrainment is a phenomenon in which the interface of two immiscible fluids is subjected to external flux in the form of rotation. The present work reports an experimental study on the rotary motion of a horizontal cylinder at the interface of air and water, observing the penetration of gas into the liquid. Experiments have been performed to establish the entrainment of air mass in water alongside the cylindrical surface. The movement of tracers and seeded particles has been tracked to calculate the speed and path of the entrained air inside the water. A simplified particle image velocimetry technique has been used to trace the movement of particles/tracers at the moment they are injected into the entrainment zone, and suspended beads have been used to replicate the particle movement with respect to time in order to determine the flow dynamics of the fluid along the cylinder. The present paper establishes a thorough experimental analysis of the rotary entrainment phenomenon between air and water, with interest both in the extent to which the two can be intermixed and in the entrainment trajectories.
Keywords: entrainment, gas-liquid flow, particle image velocimetry, stratified layer mixing
Procedia PDF Downloads 339
4629 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This perfectly aligns with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the “bundle of rights” theory, this paper establishes a specific three-level structure of data rights. This paper analyzes the cases: Google v. Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello Limited, Campbell v MGN and Imerman v Tchenquiz. This paper concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property will be beneficial in establishing the tort of misuse of personal information.Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 39
4628 The Influence of Fiber Fillers on the Bonding Safety of Wood-Adhesive Interfaces: A Fracture Energetic Approach
Authors: M. H. Brandtner-Hafner
Abstract:
Adhesives have established themselves as an innovative joining technology in the wood industry. The strengths of adhesive bonding lie in the realization of lightweight designs, the avoidance of material weakening, and the joining of different types of materials. There are now a number of ways to positively influence the properties of bonded joints. One way is to add fiber fillers. This leads to an improvement in adhesion, structural integrity, and fracture toughness. In this study, the effectiveness of fiber-modified adhesives for bonding wooden joints is reviewed. A series of experimental tests was performed using the fracture-analytical GF-principle to study the adhesive bonding safety and performance of the wood-adhesive interface. Two different construction adhesives based on epoxy and PUR were modified with different fiber materials and applied to bond wooden joints. The results show that adding fibrous materials to the bonding matrix leads to significant improvements in structural material properties.Keywords: fiber-modified adhesives, bonding safety, wood-adhesive interfaces, fracture analysis
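The fracture-energetic GF approach evaluates the specific fracture energy of the interface; a hedged sketch of the usual computation (area under a measured load-displacement curve divided by the fractured ligament area; the function name and numbers are illustrative, not the author's procedure):

```python
def fracture_energy(displacement, load, ligament_area):
    """Specific fracture energy G_F: work of fracture (trapezoidal
    area under the load-displacement curve) per unit ligament area.

    With displacement in mm, load in N and area in mm^2, the result
    is in N/mm, which equals kJ/m^2."""
    work = sum((load[i] + load[i + 1]) / 2 * (displacement[i + 1] - displacement[i])
               for i in range(len(load) - 1))
    return work / ligament_area
```

A tougher, fiber-modified bond line shows up as a larger area under the softening branch of the curve, hence a larger G_F.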
Procedia PDF Downloads 97
4627 Analysis of Vapor-Phase Diffusion of Benzene from Contaminated Soil
Authors: Asma A. Parlin, K. Nakamura, N. Watanabe, T. Komai
Abstract:
Understanding the effective diffusion of benzene vapor at the soil-atmosphere interface is important, as the intrusion of benzene into the atmosphere from the soil is largely driven by diffusion. To analyze the vertical one-dimensional effective diffusion of benzene vapor in a porous medium with high water content, diffusion experiments were conducted in soil columns using Andosol soil and Toyoura silica sand with different water contents: from 0 to 30 wt.% for the soil and from 0.06 to 10 wt.% for the sand. In the soil, a linear relation was found between water content and effective diffusion coefficient, while the effective diffusion coefficient did not change in the sand with increasing water content. A numerical transport model following an unsteady-state approach based on Fick’s second law was used to match the time required for the gas-phase concentration profile of benzene to reach steady state against the gas-phase concentration profile measured experimentally in the column. The results highlighted that both the water content and porosity might increase the vertical diffusion of benzene vapor in soil.Keywords: benzene vapor-phase, effective diffusion, subsurface soil medium, unsteady state
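The unsteady-state transport model is based on Fick's second law; a minimal explicit finite-difference sketch of such a 1-D column model (grid, coefficient and boundary values are illustrative assumptions, not the authors' solver):

```python
def diffuse_1d(c, d_eff, dz, dt, steps):
    """Explicit finite-difference integration of Fick's second law,
    dc/dt = D_eff * d2c/dz2, with fixed (Dirichlet) concentrations
    at both ends of the column."""
    r = d_eff * dt / dz ** 2
    assert r <= 0.5, "explicit scheme is unstable for r > 0.5"
    c = list(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            new[i] = c[i] + r * (c[i + 1] - 2 * c[i] + c[i - 1])
        c = new
    return c
```

Run long enough, the profile relaxes to the linear steady state between the two boundary concentrations; the number of steps needed to get there is the quantity matched against the column measurements.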
Procedia PDF Downloads 143
4626 Comparison of Transparent Nickel Doped Cobalt Sulfide and Platinum Counter Electrodes Used in Quasi-Solid State Dye Sensitized Solar Cells
Authors: Dimitra Sygkridou, Dimitrios Karageorgopoulos, Elias Stathatos, Evangelos Vitoratos
Abstract:
Transparent nickel-doped cobalt sulfide was fabricated on a SnO2:F electrode and tested as an efficient electrocatalyst and an alternative to the expensive platinum counter electrode. In order to investigate how this electrode could affect the electrical characteristics of a dye-sensitized solar cell, we manufactured cells with the same TiO2 photoanode sensitized with dye (N719) and employing the same quasi-solid electrolyte, altering only the counter electrode used. The cells were characterized electrically and electrochemically, and it was observed that the cells with the Ni-doped CoS2 counter electrode outperformed those with the Pt counter electrode in efficiency (3.76% and 3.44%, respectively). In particular, the higher efficiency of the cells with the Ni-doped CoS2 counter electrode (CE) is mainly due to the enhanced photocurrent density, which is attributed to the enhanced electrocatalytic ability of the CE and the low charge transfer resistance at the CE/electrolyte interface.Keywords: nickel doped cobalt sulfide, counter electrodes, dye-sensitized solar cells, quasi-solid state electrolyte, hybrid organic-inorganic materials
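The reported efficiencies follow from the standard photovoltaic relation eta = Jsc * Voc * FF / Pin; a small sketch (the input values below are illustrative, not the paper's measured parameters):

```python
def cell_efficiency(jsc_ma_cm2, voc_v, ff, pin_mw_cm2=100.0):
    """Power conversion efficiency (%) from short-circuit current
    density (mA/cm^2), open-circuit voltage (V) and fill factor,
    under an assumed 1-sun illumination of 100 mW/cm^2."""
    # Jsc [mA/cm^2] * Voc [V] = maximum-power-scale density [mW/cm^2]
    return jsc_ma_cm2 * voc_v * ff / pin_mw_cm2 * 100.0
```

A higher photocurrent density Jsc at fixed Voc and FF, as observed for the Ni-doped CoS2 counter electrode, directly raises the efficiency.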
Procedia PDF Downloads 760
4625 Implementation of a Lattice Boltzmann Method for Multiphase Flows with High Density Ratios
Authors: Norjan Jumaa, David Graham
Abstract:
We present a Lattice Boltzmann Method (LBM) for multiphase flows with high viscosity and density ratios. The motion of the interface between fluids is modelled by solving the Cahn-Hilliard (CH) equation with LBM. Incompressibility of the velocity fields in each phase is imposed by using a pressure correction scheme. We use a unified LBM approach with separate formulations for the phase field, the pressureless Navier-Stokes (NS) equations and the pressure Poisson equation required for correction of the velocity field. The implementation has been verified for various test cases. Here, we present results for some complex flow problems, including two-dimensional single- and multiple-mode Rayleigh-Taylor instability, and we obtain good agreement with results in the literature. The main focus of our work is related to interactions between aerated or non-aerated waves and structures, so we also present results for both high-viscosity and low-viscosity waves.Keywords: lattice Boltzmann method, multiphase flows, Rayleigh-Taylor instability, waves
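The collide-and-stream structure underlying any LBM solver can be sketched with a two-population D1Q2 lattice for pure diffusion (a didactic toy, far simpler than the multiphase CH/NS scheme of the paper):

```python
def lbm_diffusion_step(f_plus, f_minus, tau):
    """One collide-and-stream step of a minimal D1Q2 lattice
    Boltzmann scheme for diffusion, with periodic boundaries.

    f_plus / f_minus: right- and left-moving populations."""
    n = len(f_plus)
    rho = [f_plus[i] + f_minus[i] for i in range(n)]
    # BGK collision: relax both populations toward the symmetric
    # equilibrium rho/2 with relaxation time tau
    fp = [f_plus[i] + (rho[i] / 2 - f_plus[i]) / tau for i in range(n)]
    fm = [f_minus[i] + (rho[i] / 2 - f_minus[i]) / tau for i in range(n)]
    # streaming: + population moves one node right, - one node left
    f_plus = [fp[(i - 1) % n] for i in range(n)]
    f_minus = [fm[(i + 1) % n] for i in range(n)]
    return f_plus, f_minus
```

Because collision conserves the local density and streaming merely permutes populations, the total mass is conserved exactly, the same bookkeeping that, with richer lattices and equilibria, yields the CH and NS solvers described above.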
Procedia PDF Downloads 234
4624 Measurement and Monitoring of Graduate Attributes via iCGPA Implementation and ACADEMIA Programming: UNIMAS Case Study
Authors: Shanti Faridah Salleh, Azzahrah Anuar, Hamimah Ujir, Rohana Sapawi, Wan Hashim Wan Ibrahim, Noraziah Abdul Wahab, Majina Sulaiman, Raudhah Ahmadi, Al-Khalid Othman, Johari Abdullah
Abstract:
Integrated Cumulative Grade Point Average or iCGPA is an evaluation and reporting system that represents a comprehensive development of students’ achievement in their academic programs. Universiti Malaysia Sarawak, UNIMAS, started its implementation of iCGPA in 2016. iCGPA is driven by the Outcome-Based Education (OBE) system that has long been integrated into higher education in Malaysia. iCGPA is not only a tool to enhance the OBE concept through constructive alignment but also an integrated mechanism to assist various stakeholders in making decisions or planning for program improvement. The outcome of this integrated system is the reporting of students’ academic performance in terms of the cognitive (knowledge), psychomotor (skills), and affective (attitude) attributes that the students acquire throughout the duration of their study. The iCGPA reporting illustrates the attainment of students’ attributes in the eight domains of learning outcomes listed in the Malaysian Qualifications Framework (MQF). This paper discusses the implementation of iCGPA in UNIMAS in terms of the policy and strategy to direct the whole university to implement the iCGPA. The steps and challenges in integrating the existing Outcome-Based Education and utilising iCGPA as a tool to quantify the students’ achievement are also highlighted in this paper. Finally, the ACADEMIA system, a dedicated centralised program developed to ensure the successful implementation of iCGPA, is presented. This paper discusses the structure and analysis of the ACADEMIA program and concludes with the improvements made to the implementation of constructive alignment in all 40 programs involved in the iCGPA implementation.Keywords: constructive alignment, holistic graduates, mapping of assessment, programme outcome
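iCGPA-style reporting aggregates assessment marks into an attainment figure per MQF learning-outcome domain; a hedged sketch of such a mapping-based aggregation (the mapping structure, names and weights are assumptions for illustration, not the ACADEMIA implementation):

```python
def domain_attainment(marks, mapping):
    """Weighted average attainment per learning-outcome domain.

    marks:   {assessment: score on a 0-100 scale}
    mapping: {assessment: {domain: weight}} -- how strongly each
             assessment contributes to each MQF domain."""
    totals, weights = {}, {}
    for assessment, score in marks.items():
        for domain, w in mapping.get(assessment, {}).items():
            totals[domain] = totals.get(domain, 0.0) + w * score
            weights[domain] = weights.get(domain, 0.0) + w
    return {d: totals[d] / weights[d] for d in totals}
```

The resulting per-domain profile, rather than a single GPA, is what an iCGPA-style report presents to stakeholders.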
Procedia PDF Downloads 208
4623 Realization of Wearable Inertial Measurement Units-Sensor-Fusion Harness to Control Therapeutic Smartphone Applications
Authors: Svilen Dimitrov, Manthan Pancholi, Norbert Schmitz, Didier Stricker
Abstract:
This paper presents the end-to-end development of a wearable motion-sensing harness consisting of a computational unit and four inertial measurement units to control three smartphone therapeutic games for children. The inertial data is processed in real time to obtain lower-body motion information like knee raises, feet taps and squats. By providing a Wi-Fi connection interface, the sensor harness acts as a wireless remote control for smartphone applications. By performing various lower-body movements, the users provoke corresponding game state changes. In contrast to current similar offerings, like the Nintendo Wii Remote, Xbox Kinect and PlayStation Move, this product, consisting of the sensor harness and the applications on top of it, is fully wearable, which means it does not rely on the user being bound to a concrete software- or hardware-equipped space.Keywords: wearable harness, inertial measurement units, smartphone therapeutic games, motion tracking, lower-body activity monitoring
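Detecting discrete lower-body events such as knee raises from a processed inertial signal can be sketched as hysteresis thresholding (a toy illustration; the thresholds and signal are made up, not the harness firmware):

```python
def count_reps(signal, high, low):
    """Count repetitions (e.g. knee raises) in a 1-D motion signal.

    A repetition is a rise above `high` followed by a return below
    `low`; the two thresholds (hysteresis) prevent sensor noise near
    a single threshold from being counted as extra reps."""
    reps, armed = 0, False
    for x in signal:
        if not armed and x > high:
            armed = True          # movement peak detected
        elif armed and x < low:
            reps += 1             # returned to rest: one full rep
            armed = False
    return reps
```

Each counted repetition would be forwarded over the Wi-Fi interface as a game-state event.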
Procedia PDF Downloads 403
4622 Understanding Student Engagement through Sentiment Analytics of Response Times to Electronically Shared Feedback
Authors: Yaxin Bi, Peter Nicholl
Abstract:
The rapid advancement of information and communication technologies (ICT) is profoundly influencing every aspect of Higher Education. It has transformed traditional teaching, learning, assessment and feedback into a new era of Digital Education. This also introduces many challenges in capturing and understanding student engagement with their studies in Higher Education. The School of Computing at Ulster University has developed a Feedback And Notification (FAN) Online tool that has been used to send students links to personalized feedback on their submitted assessments and to record students’ frequency of review of the shared feedback as well as the speed of collection. Students initially receive the feedback via a personal email directing them to it through a URL link that maps to the feedback created by the academic marker. This feedback is typically a Word or PDF report including comments and the final mark for the work submitted approximately three weeks before. When the student clicks on the link, the student’s personal feedback is viewable in the browser. The FAN tool provides the academic marker with a report that includes when and how often a student viewed the feedback via the link. This paper presents an investigation into student engagement through analyzing the interaction timestamps and frequency of review by the student. We have proposed an approach to modeling interaction timestamps and use sentiment classification techniques to analyze the data collected over the last five years for a set of modules. The data studied span a number of final-year and second-year modules in the School of Computing. The paper presents the details of the quantitative analysis methods and further describes students' interactions with the feedback over time on each module studied.
We have grouped the students into different levels of engagement based on the sentiment analysis results and then, via our proposed model, suggest early targeted interventions for the set of students seen to be under-performing.Keywords: feedback, engagement, interaction modelling, sentiment analysis
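Banding students by their response times to shared feedback can be sketched as follows (the thresholds and group labels are illustrative assumptions, not the study's model):

```python
from datetime import datetime, timedelta

def engagement_group(sent_at, first_viewed_at, views):
    """Toy banding of engagement from FAN-style logs: the delay from
    the notification email to the first view of the feedback, plus
    how often the feedback was reviewed afterwards."""
    if first_viewed_at is None:
        return "disengaged"          # feedback never collected
    delay = first_viewed_at - sent_at
    if delay <= timedelta(days=1) and views >= 3:
        return "highly engaged"
    if delay <= timedelta(days=7):
        return "engaged"
    return "at risk"
```

Students falling into the lowest band would be the candidates for the early targeted interventions discussed above.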
Procedia PDF Downloads 103
4621 Modeling of Timing in a Cyber Conflict to Inform Critical Infrastructure Defense
Authors: Brian Connett, Bryan O'Halloran
Abstract:
System assets within critical infrastructures were once seemingly safe from exploitation or attack by nefarious cyberspace actors. Now, critical infrastructure is a target, and the resources to exploit its cyber-physical systems exist. These resources are characterized in terms of patience, stealth, replication-ability and extraordinary robustness. System owners are obligated to maintain a high level of protection measures. The difficulty lies in knowing when to fortify a critical infrastructure against an impending attack. Models currently exist that demonstrate the value of knowing the attacker’s capabilities in the cyber realm and the strength of the target. The shortcoming of these models is that they are not designed to respond to the inherently fast timing of an attack, an impetus that can be derived from open-source reporting, common knowledge of exploits, and the physical architecture of the infrastructure. A useful model will inform system owners how to align infrastructure architecture in a manner that is responsive to the capability, willingness and timing of the attacker. This research group has used an existing theoretical model to estimate parameters and, through analysis, to develop a decision tool for would-be target owners. The continuation of the research develops this model further by estimating the variable parameters. Understanding these parameter estimates will uniquely position the decision maker to posture, having revealed an attacker's vulnerabilities, persistence and stealth. This research explores different approaches to improve on current attacker-defender models that focus on cyber threats. An existing foundational model takes the point of view of an attacker who must decide what cyber resource to use and when to use it to exploit a system vulnerability.
This foundational model is valuable for estimating parameters and, through analysis, for developing a decision tool for would-be target owners.Keywords: critical infrastructure, cyber physical systems, modeling, exploitation
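The attacker's decision of when to use a cyber resource can be illustrated with a toy expected-payoff model (the functional forms and parameters are assumptions for illustration, not the foundational model referenced above):

```python
def best_strike_time(value, s0, learn, patch_prob, horizon):
    """Toy attacker-timing sketch.

    Waiting improves the success probability through reconnaissance,
    s(t) = 1 - (1 - s0) * (1 - learn)**t, but the vulnerability
    remains unpatched only with probability (1 - patch_prob)**t.
    Expected payoff at period t is their product times `value`;
    return the period that maximises it."""
    def payoff(t):
        return value * (1 - (1 - s0) * (1 - learn) ** t) * (1 - patch_prob) ** t
    return max(range(horizon + 1), key=payoff)
```

The interior optimum captures the tension the abstract describes: patience and stealth raise the attacker's odds, while the defender's fortification (here, patching) erodes the window, and estimating these rates is what would let a target owner anticipate the likely strike time.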
Procedia PDF Downloads 192
4620 Finite Element Simulation of Embankment Bumps at Bridge Approaches, Comparison Study
Authors: F. A. Hassona, M. D. Hashem, R. I. Melek, B. M. Hakeem
Abstract:
A differential settlement at the end of a bridge, near the interface between the abutment and the embankment, is a persistent problem for highway agencies. The differential settlement produces the common ‘bump at the end of the bridge’. Reduced steering response, distraction to the driver, added risk and expense to maintenance operations, and a reduction in a transportation agency’s public image are all undesirable effects of these uneven and irregular transitions. This paper attempts to simulate the bump at the end of the bridge using the PLAXIS 2D finite element program. PLAXIS was used to simulate a laboratory model called the Bridge to Embankment Simulator of Transition (B.E.S.T.) device, which was built by others to investigate this problem. A total of six numerical simulations were conducted using the hardening-soil model with rational assumptions for missing soil parameters to estimate the bump at the end of the bridge. The results show good agreement between the numerical and laboratory models. Important factors influencing bumps at bridge ends are also addressed in light of the model results.Keywords: bridge approach slabs, bridge bump, hardening-soil, PLAXIS 2D, settlement
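The severity of the bump is often expressed as a differential-settlement gradient across the approach; a small sketch (the 1/200 serviceability limit used here is a commonly cited illustrative threshold, not a value from the paper):

```python
def bump_gradient(abutment_settlement, embankment_settlement, approach_length):
    """Differential settlement across the approach expressed as a
    slope, checked against an assumed serviceability limit of 1/200.

    All lengths in consistent units (e.g. metres)."""
    grad = abs(embankment_settlement - abutment_settlement) / approach_length
    return grad, grad > 1 / 200
```

A simulated settlement profile, such as the six PLAXIS runs above, can be reduced to this single gradient to judge whether the predicted bump is objectionable.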
Procedia PDF Downloads 348
4619 Virtual Reality as a Tool in Modern Education
Authors: Łukasz Bis
Abstract:
The author discusses virtual reality and its importance for new didactic methods. It has been known for years that experience-based education gives much better results in terms of long-term memory than theoretical study. However, practice is expensive - virtual reality allows the use of an empirical approach to learning, with minimized production costs. The author defines what makes a given VR experience appropriate (adequate) for the didactic and cognitive process. The article takes the form of a list of guidelines, with a discussion of their importance for the VR experience under development.Keywords: virtual reality, education, universal design, guideline
Procedia PDF Downloads 106
4618 BIM4Cult Leveraging BIM and IoT for Enhancing Fire Safety in Historical Buildings
Authors: Anastasios Manos, Despina Elisabeth Filippidou
Abstract:
Introduction: Historical buildings are an integral part of the cultural heritage of every place, and beyond the obvious need for protection against risks, they have specific requirements regarding the handling of hazards and disasters such as fire, floods, earthquakes, etc. Ensuring high levels of protection and safety for these buildings is imperative for two distinct but interconnected reasons: a) they themselves constitute cultural heritage, and b) they are often used as museums/cultural spaces, necessitating the protection of both human life (visitors and workers) and the cultural treasures they house. However, these buildings present serious constraints in implementing the necessary measures to protect them from destruction due to their unique architecture, construction methods, and/or the structural materials used in the past, which have created an existing condition that is sometimes challenging to reshape and operate within the framework of modern regulations and protection measures. One of the most devastating risks that threaten historical buildings is fire. Catastrophic fires demonstrate the need for timely evaluation of fire safety measures in historical buildings. Recognizing the criticality of protecting historical buildings from the risk of fire, the Confederation of Fire Protection Associations in Europe (CFPA E) issued specific guidelines in 2013 (CFPA-E Guideline No 30:2013 F) for the fire protection of historical buildings at the European level. However, until now, few actions have been implemented towards leveraging modern technologies in the field of construction and maintenance of buildings, such as Building Information Modeling (BIM) and the Internet of Things (IoT), for the protection of historical buildings from risks like fires, floods, etc. The project BIM4Cult has been developed in order to fill this gap.
It is a tool for timely assessing and monitoring the fire safety level of historical buildings using BIM and IoT technologies in an integrated manner. The tool serves as a decision-support expert system for improving the fire safety of historical buildings by continuously monitoring, controlling and assessing critical risk factors for fire.Keywords: IoT, fire, BIM, expert system
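The decision-support rules behind such continuous monitoring can be illustrated with a toy sensor-threshold scoring function (sensor names, limits, and the two-level escalation are assumptions for illustration, not the BIM4Cult rule base):

```python
def fire_risk_level(readings, limits):
    """Toy rule-based scoring for an IoT fire-monitoring layer.

    readings / limits: dicts keyed by sensor name, e.g. 'temp_c' or
    'smoke_ppm'. Any reading above its limit raises the risk level;
    more than one exceedance escalates to an alarm."""
    exceeded = [k for k, v in readings.items()
                if v > limits.get(k, float("inf"))]
    if not exceeded:
        return "normal", exceeded
    return ("alarm" if len(exceeded) > 1 else "warning"), exceeded
```

In an integrated BIM/IoT setting, the sensor identifiers would be linked to BIM elements so an exceedance can be located on the building model.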
Procedia PDF Downloads 71
4617 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products
Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry
Abstract:
The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification. Due to these obstacles, the diffusion of eco-design and LCA methods in the manufacturing sectors could prove impossible. This article addresses the research question: how can the LCA method be adapted so as to generalize it massively and improve its performance? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, an analysis of the literature was conducted to identify existing automation methods. Given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain methods and combining them with new ideas and improvements. In the second part, our development of automated model construction is presented (reconciliation and implementation of data). Finally, the LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating the process of data mapping and hence product modeling. This method is also able to complete the LCA process on its own within minutes. Thus, the calculations and the LCA report are generated automatically. The tool developed has shown that automation by code is a viable solution for meeting LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and to meet regulatory requirements. Moreover, this approach also demonstrates the potential of the proposed method for a wide range of applications.Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively
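Once the data mapping is done, the automated inventory calculation reduces to summing amount-times-characterization-factor products over the mapped flows; a minimal sketch (flow names and factor values are illustrative, not taken from the case study):

```python
def lci_impact(bill_of_materials, factors):
    """Aggregate a life cycle inventory into one impact score.

    bill_of_materials: {flow: amount}, e.g. kg of each material
    factors: {flow: characterization factor for one impact category}
    Flows without a known factor are skipped (a data-gap policy that
    a production tool would instead report explicitly)."""
    return sum(amount * factors[flow]
               for flow, amount in bill_of_materials.items()
               if flow in factors)
```

Looping this over many products and impact categories is what turns a per-product manual LCA into the batch report generation described above.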
Procedia PDF Downloads 90
4616 Canned Sealless Pumps for Hazardous Applications
Authors: Shuja Alharbi
Abstract:
The oil and gas industry has many applications considered toxic or hazardous, where process fluid leakage is not permitted and leads to health, safety, and environmental impacts. Caustic/acidic applications, high benzene concentrations, hydrogen-sulfide-rich oil/gas, as well as liquids operating above their auto-ignition temperatures, are examples of such fluids that pose a risk to industry operations, and for those, special arrangements are in place to allow for a safe operating environment. Pumps in the industry require special attention, specifically at the interface between the fluid and the environment, where the potential for leakage is foreseen. Mechanical seals are used to contain the fluid within the equipment, but the prices of such seals are ever increasing, along with their maintenance, design, and operating requirements. Several alternatives to seals are being employed nowadays, such as sealless systems, which are hermetically sealed from the atmosphere and do not require sealing. This technology is considered relatively new and requires more study to understand the associated limitations and factors from an owner and design perspective. Financial factors, maintenance factors, and design limitations should be studied further in order to have a mature and reliable technical solution available to end users.Keywords: pump, sealless, selection, failure
Procedia PDF Downloads 100