Search results for: sensory processing patterns
5767 Deconvolution of Anomalous Fast Fourier Transform Patterns for Tin Sulfide
Authors: I. Shuro
Abstract:
The crystal structure of tin sulfide prepared by certain chemical methods is investigated using high-resolution transmission electron microscopy (HRTEM), scanning electron microscopy (SEM), and X-ray diffraction (XRD). An anomalous HRTEM fast Fourier transform (FFT) pattern was observed, exhibiting a central scatter of diffraction spots surrounded by secondary clusters of spots arranged hexagonally around the central cluster. FFT analysis revealed a long lattice parameter, with the structure mostly viewed along a hexagonal axis along which many columns of atoms are slightly displaced from one another. This FFT analysis indicates that the metal sulfide has a long-range-ordered, interwoven chain of atoms in its crystal structure. The observed crystalline structure is inconsistent with the commonly observed FFT patterns of chemically synthesized tin sulfide nanocrystals and thin films. SEM analysis showed a myriad of multi-shaped crystals, ranging from hexagonal and cubic to spherical micro- and nanostructured crystals. This study also investigates the presence of quasi-crystals, as reflected by the presence of mixed local symmetries. Keywords: fast Fourier transform, high-resolution transmission electron microscopy, tin sulfide, crystalline structure
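The way lattice spacing is read off an HRTEM FFT can be illustrated in miniature. The sketch below (not the authors' workflow; the synthetic lattice and all names are illustrative assumptions) computes the 2D FFT power spectrum of a periodic image, suppresses the central (zero-frequency) spot, and locates the dominant diffraction spot:

```python
import numpy as np

def dominant_spatial_frequency(image):
    """Return the offset (in frequency bins) of the strongest off-centre
    peak of the 2D FFT power spectrum, relative to the spectrum centre."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    centre = np.array(spectrum.shape) // 2
    spectrum[tuple(centre)] = 0.0          # suppress the central DC spot
    peak = np.unravel_index(np.argmax(spectrum), spectrum.shape)
    return np.array(peak) - centre         # offset from centre = spatial frequency

# Synthetic "lattice": a cosine fringe with a period of 16 pixels along x
n = 128
x = np.arange(n)
image = np.cos(2 * np.pi * x / 16)[None, :] * np.ones((n, 1))
freq = dominant_spatial_frequency(image)
```

A 16-pixel period over a 128-pixel field puts the diffraction spot 128/16 = 8 bins from the centre along the x axis, which is what the function recovers.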
Procedia PDF Downloads 144
5766 Deep Learning Application for Object Image Recognition and Robot Automatic Grasping
Authors: Shiuh-Jer Huang, Chen-Zon Yan, C. K. Huang, Chun-Chien Ting
Abstract:
Since vision systems are in intense demand for autonomous applications in industrial environments, image recognition has become an important research topic. Here, a deep learning algorithm is employed in the vision system to recognize industrial objects, and the system is integrated with a 7A6 Series Manipulator for automatic gripping tasks. A PC and a graphics processing unit (GPU) are chosen to construct the 3D vision recognition system. A depth camera (Intel RealSense SR300) is employed to extract images for object recognition and coordinate derivation. The YOLOv2 scheme is adopted in a convolutional neural network (CNN) structure for object classification and center-point prediction. Additionally, an image processing strategy is used to find the object contour for calculating the object orientation angle. Then, the specified object location and orientation information are sent to the robotic controller. Finally, a six-axis manipulator can grasp the specific object in a random environment based on the user command and the extracted image information. The experimental results show that YOLOv2 successfully detects the object location and category with confidence near 0.9 and a 3D position error of less than 0.4 mm. This is useful for future intelligent robotic applications in Industry 4.0 environments. Keywords: deep learning, image processing, convolutional neural network, YOLOv2, 7A6 series manipulator
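The step from a detected centre point plus a depth reading to a 3D coordinate is, in the usual pinhole-camera formulation, a simple back-projection. A minimal sketch, assuming known camera intrinsics (fx, fy, cx, cy are placeholder values, not numbers from the paper):

```python
def pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a detected object centre (u, v) and its depth-camera
    reading into 3D camera coordinates using the pinhole model."""
    x = (u - cx) * depth_m / fx   # horizontal offset scaled by depth
    y = (v - cy) * depth_m / fy   # vertical offset scaled by depth
    return x, y, depth_m

# An object centred 300 px right of the principal point, 0.6 m away
point = pixel_to_camera_xyz(u=620, v=320, depth_m=0.6,
                            fx=600.0, fy=600.0, cx=320.0, cy=320.0)
```

The resulting camera-frame point would still need the hand-eye calibration transform before being sent to the robot controller.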
Procedia PDF Downloads 250
5765 Investigating the Sloshing Characteristics of a Liquid by Using an Image Processing Method
Authors: Ufuk Tosun, Reza Aghazadeh, Mehmet Bülent Özer
Abstract:
This study puts forward a method to analyze the sloshing characteristics of the liquid in a tuned sloshing absorber system using image processing tools. Tuned sloshing vibration absorbers have recently attracted researchers’ attention as seismic load dampers in construction due to their practical and logistical convenience. The absorber is a liquid which sloshes and applies a force in opposite phase to the motion of the structure. Experimental characterization of the sloshing behavior can be used to verify the results of numerical analysis. It can also be used to check the accuracy of assumptions related to the motion of the liquid. There are extensive theoretical and experimental studies in the literature on the dynamical and structural behavior of tuned sloshing dampers. Most of these works try to estimate the sloshing behavior of the liquid, such as the free-surface motion and the total force applied by the liquid to the wall of the container. For these purposes, sensors such as load cells and ultrasonic sensors are prevalent in experimental work. Load cells are only capable of measuring force and require tests both with and without liquid to obtain the pure sloshing force. Ultrasonic level sensors give point-wise measurements and hence are not applicable for measuring the whole free-surface motion; furthermore, in the case of liquid splashing, they may give incorrect data. In this work, a method for evaluating the sloshing wave height using camera recordings and image processing techniques is presented. In this method, the motion of the liquid and its container, made of a transparent material, is recorded by a high-speed camera aligned with the free surface of the liquid. The video captured by the camera is processed frame by frame using the MATLAB Image Processing Toolbox. The process starts with cropping the desired region.
By recognizing the regions containing liquid and eliminating noise and liquid splashing, a final picture depicting the free surface of the liquid is obtained. This picture is then used to obtain the height of the liquid along the length of the container. The process is verified against ultrasonic sensors that measure the fluid height at the free surface. Keywords: fluid structure interaction, image processing, sloshing, tuned liquid damper
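The per-frame surface extraction described above can be sketched in a few lines, shown here in Python with NumPy rather than the MATLAB toolbox the authors used; the threshold and the synthetic frame are illustrative:

```python
import numpy as np

def surface_height(frame, threshold=0.5):
    """Per-column liquid height, in pixels from the container bottom, of a
    grayscale frame; pixels brighter than `threshold` count as liquid."""
    liquid = frame > threshold               # binarise: True where liquid
    rows = frame.shape[0]
    # topmost liquid pixel per column; row 0 is the top of the image
    top = np.where(liquid.any(axis=0), liquid.argmax(axis=0), rows)
    return rows - top                        # height measured from the bottom

# 10x5 synthetic frame with liquid filling the bottom 4 rows of every column
frame = np.zeros((10, 5))
frame[6:, :] = 1.0
heights = surface_height(frame)
```

Running this over every frame of the recording yields the free-surface profile as a function of time.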
Procedia PDF Downloads 344
5764 Patient Safety of Eating Ready-Made Meals at Government Hospitals
Authors: Hala Kama Ahmed Rashwan
Abstract:
Ensuring patient safety, especially in intensive care units and for those exposed to hospital tools and equipment, is one of the most important challenges facing healthcare today. Outbreaks of food poisoning caused by food-borne pathogens have been reported in many hospitals and care homes all over the world due to hospital meals. The safety of eating hospital meals is a fundamental principle of healthcare; it is a new healthcare discipline that assures the quality of food raw materials, food storage, and meal processing, and the control of kitchen errors that often lead to adverse healthcare events. The aim of this article is to help any hospital attain hygienic practices and a better quality system during the processing of ready-to-eat meals for intensive care unit patients according to the WHO safety guidelines. Keywords: hospitals, meals, safety, intensive care
Procedia PDF Downloads 510
5763 Choice of Optimal Methods for Processing Phosphate Raw Materials into Complex Mineral Fertilizers
Authors: Andrey Norov
Abstract:
Based on a generalization of scientific and production experience and the latest developments of JSC “NIUIF”, the oldest (founded in September 1919) and the only Russian research institute for phosphorus-containing fertilizers, this paper shows the factors that determine the reasonable choice of a method for processing phosphate raw materials into complex fertilizers. These factors primarily include the composition of the phosphate raw materials and the impurities contained in them, as well as parameters of the process mode: wastelessness, eco-friendliness, energy saving, maximum use of the heat of chemical reactions, fire and explosion safety, efficiency, productive capacity, the required product range, the possibility of creating flexible technologies, compliance with BAT principles, etc. The presented data allow the right technology for complex granular fertilizers to be chosen, depending on the above-mentioned factors. Keywords: BAT, eco-friendliness, energy saving, phosphate raw materials, wastelessness
Procedia PDF Downloads 87
5762 The Subcellular Localisation of EhRRP6 and Its Involvement in Pre-Ribosomal RNA Processing in Growth-Stressed Entamoeba histolytica
Authors: S. S. Singh, A. Bhattacharya, S. Bhattacharya
Abstract:
The eukaryotic exosome complex plays a pivotal role in RNA biogenesis, maturation, surveillance, and the differential expression of various RNAs in response to varying environmental signals. The exosome is composed of nine evolutionarily conserved core subunits and the associated exonucleases Rrp6 and Rrp44. Rrp6p is crucial for the processing of rRNAs and other non-coding RNAs, the regulation of polyA tail length, and the termination of transcription. Rrp6p, a 3’-5’ exonuclease, is required for degradation of the 5’-external transcribed spacer (ETS) released from rRNA precursors during the early steps of pre-rRNA processing. In the parasitic protist Entamoeba histolytica, unprocessed pre-rRNA and the 5’ ETS subfragment accumulate in response to growth stress. To understand the processes leading to this accumulation, we looked for Rrp6 and the exosome subunits in E. histolytica by in silico approaches. Of the nine core exosomal subunits, seven had a high percentage of sequence similarity with their yeast and human counterparts. The EhRrp6 homolog contained exoribonuclease and HRDC domains like the yeast protein, but its N-terminus lacked the PMC2NT domain. EhRrp6 complemented the temperature-sensitive phenotype of yeast rrp6Δ cells, suggesting conservation of biological activity. We showed 3’-5’ exoribonuclease activity of EhRrp6p with appropriate in vitro-synthesized RNA substrates. Like the yeast enzyme, EhRrp6p degraded unstructured RNA but degraded stem-loops only slowly. Furthermore, immunolocalization revealed that EhRrp6 was nuclear-localized in normal cells but was depleted from the nucleus during serum starvation, which could explain the accumulation of the 5’ ETS during stress. Our study shows functional conservation of EhRrp6p in E. histolytica, an early-branching eukaryote, and will help in understanding the evolution of exosomal components and their regulatory function. Keywords: Entamoeba histolytica, exosome complex, rRNA processing, Rrp6
Procedia PDF Downloads 201
5761 Spatial Temporal Rainfall Trends in Australia
Authors: Bright E. Owusu, Nittaya McNeil
Abstract:
Rainfall is one of the most essential quantities in meteorology and hydrology. It has important impacts on people’s daily lives, and an excess or shortage of it can bring tremendous economic losses and cause fatalities. Population increases around the globe tend to be accompanied by corresponding increases in settlement and industrialization. Some countries are occasionally affected by flood and drought due to climate change, which disrupts most daily activities. Knowledge of trends in spatial and temporal rainfall variability and their physical explanations would be beneficial in climate change assessment and in determining erosivity. This study describes the spatial-temporal variability of daily rainfall in Australia and the corresponding long-term trends during 1950-2013. The spatial patterns were investigated using exploratory factor analysis, and the long-term trends in the rainfall time series were determined by linear regression, the Mann-Kendall rank statistic, and Sen’s slope test. The exploratory factor analysis explained most of the variation in the data and grouped Australia into eight distinct rainfall regions with different rainfall patterns. Significant increasing trends in annual rainfall were observed in the northern regions of Australia, and the northeastern part was the wettest of the eight rainfall regions. Keywords: climate change, exploratory factor analysis, Mann-Kendall and Sen’s slope test, rainfall
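The two trend estimators named above are easy to state concretely. A minimal sketch of the Mann-Kendall S statistic and Sen's slope estimator, without the variance and significance steps a full analysis would also need:

```python
import itertools

def mann_kendall_s(series):
    """Mann-Kendall S statistic: sum of sign(later - earlier) over all
    pairs; positive values indicate an increasing trend."""
    return sum((b > a) - (b < a)
               for a, b in itertools.combinations(series, 2))

def sens_slope(series):
    """Sen's slope: the median of all pairwise slopes (a robust trend rate)."""
    slopes = sorted((b - a) / (j - i)
                    for (i, a), (j, b) in itertools.combinations(enumerate(series), 2))
    mid = len(slopes) // 2
    return slopes[mid] if len(slopes) % 2 else 0.5 * (slopes[mid - 1] + slopes[mid])
```

For a strictly increasing 5-point series, every one of the 10 pairs contributes +1, so S = 10 and the Sen slope equals the common increment.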
Procedia PDF Downloads 352
5760 Combining Multiscale Patterns of Weather and Sea States into a Machine Learning Classifier for Mid-Term Prediction of Extreme Rainfall in North-Western Mediterranean Sea
Authors: Pinel Sebastien, Bourrin François, De Madron Du Rieu Xavier, Ludwig Wolfgang, Arnau Pedro
Abstract:
Heavy precipitation constitutes a major meteorological threat in the western Mediterranean. Research has investigated the relationship between the states of the Mediterranean Sea and the atmosphere and precipitation over short temporal windows. At larger temporal scales, however, the precursor signals of heavy rainfall in the sea and atmosphere have drawn little attention. Moreover, despite ongoing improvements in numerical weather prediction, medium-term forecasting of rainfall events remains a difficult task. Here, we aim to investigate the influence of early-spring environmental parameters on the following autumn's heavy precipitation. Hence, we develop a machine learning model to predict extreme autumnal rainfall with a 6-month lead time over the Spanish Catalan coastal area, based on i) the sea pattern (main current, LPC, and sea surface temperature, SST) at the mesoscale, ii) four European weather teleconnection patterns (NAO, WeMo, SCAND, MO) at the synoptic scale, and iii) the hydrological regime of the main local river (the Rhône River). The accuracy of the developed classifier is evaluated via statistical analysis based on classification accuracy, logarithmic loss, and the confusion matrix, comparing against rainfall estimates from rain gauges and satellite observations (CHIRPS-2.0). Sensitivity tests are carried out by changing the model configuration, such as the sea SST, sea LPC, river regime, and synoptic atmosphere configuration. The sensitivity analysis suggests a negligible influence of the hydrological regime, unlike SST, LPC, and specific teleconnection weather patterns. Finally, this study illustrates how public datasets can be integrated into a machine learning model for heavy rainfall prediction and may interest local policymakers for management purposes. Keywords: extreme hazards, sensitivity analysis, heavy rainfall, machine learning, sea-atmosphere modeling, precipitation forecasting
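The evaluation step, classification accuracy plus a confusion matrix, needs no ML library to state. A minimal sketch; the 0/1 labels standing in for non-extreme/extreme rainfall classes are illustrative:

```python
from collections import Counter

def confusion_and_accuracy(y_true, y_pred):
    """Confusion counts {(true, predicted): n} and overall accuracy."""
    counts = Counter(zip(y_true, y_pred))
    correct = sum(n for (t, p), n in counts.items() if t == p)
    return dict(counts), correct / len(y_true)

# 1 = extreme autumn rainfall, 0 = non-extreme (illustrative labels)
matrix, acc = confusion_and_accuracy([1, 0, 1, 1], [1, 0, 0, 1])
```

Here three of four predictions match, so the accuracy is 0.75 and the off-diagonal entry (1, 0) records the single missed extreme event.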
Procedia PDF Downloads 136
5759 Climate Change and Human Migration
Authors: Sungwoo Park
Abstract:
The paper attempts to investigate the correlation between climate change and the migration that has caused violent disputes in some regions of the world. Recently, NGOs and educational institutions have proposed claims that migratory patterns and violent uprisings are intertwined with climate change. Thus, the paper is primarily concerned with collecting evidence provided by scholars, validating this significant connection between climate change and migration, and evaluating and suggesting current and future research approaches to enhance the acknowledgment and protection of environmental refugees. To examine the linkage of environmental migration, primary sources, such as political speeches, and secondary sources, such as theses by environmental policy analysts, books, and reports, are used. More specifically, the investigation focuses on the civil war in Syria to draw a connection between environmental migration and violent disputes that threaten global security. The examination specifically analyzes examples where forced migration occurred due to climate change. In Bangladesh, Pakistan, and Kiribati, residents have been at risk of fleeing their countries because of abnormal climate patterns, such as rising sea levels or excessive heat stress. Since the brutal uprising in Syria has shown that climate change can pose a significant threat to global security, the correlation between climate change and migration is surely worth delving into. Keywords: climate change, climate migration, global security, refugee crisis
Procedia PDF Downloads 346
5758 Improving Grade Control Turnaround Times with In-Pit Hyperspectral Assaying
Authors: Gary Pattemore, Michael Edgar, Andrew Job, Marina Auad, Kathryn Job
Abstract:
As critical commodities become scarcer, significant time and resources have been devoted to better understanding complicated ore bodies and extracting their full potential. These challenging ore bodies present several pain points for geologists and engineers to overcome; poor handling of these issues flows downstream to the processing plant, affecting throughput rates and recovery. Many open-cut mines utilise blast-hole drilling to extract additional information to feed back into the modelling process. This method requires samples to be collected during or after blast-hole drilling. Samples are then sent for assay, with turnaround times varying from 1 to 12 days. This method is time-consuming, costly, requires human exposure on the bench, and collects elemental data only. To address this challenge, research has been undertaken to utilise hyperspectral imaging across a broad spectrum to scan samples and collars or take down-hole measurements for mineral and moisture content and grade abundances. Automation of this process using unmanned vehicles and on-board processing reduces human in-pit exposure to ensure ongoing safety. On-board processing allows data to be integrated into modelling workflows immediately. The preliminary results demonstrate numerous direct and indirect benefits from this new technology, including rapid and accurate grade estimates, moisture content, and mineralogy. These benefits allow for faster geological model updates, better-informed mine scheduling, and improved downstream blending and processing practices. The paper presents recommendations for implementation of the technology in open-cut mining environments. Keywords: grade control, hyperspectral scanning, artificial intelligence, autonomous mining, machine learning
Procedia PDF Downloads 113
5757 Finding the Free Stream Velocity Using Flow Generated Sound
Authors: Saeed Hosseini, Ali Reza Tahavvor
Abstract:
Sound processing is one of the subjects that has newly attracted many researchers. It is efficient and usually less expensive than other methods. In this paper, the flow-generated sound is used to estimate the speed of free streams. Many sound samples are gathered. After analyzing the data, a parameter named wave power is chosen. For all samples, the wave power is calculated and averaged for each flow speed. A curve is fitted to the averaged data, and a correlation between the wave power and the flow speed is found. Test data are used to validate the method, and the errors for all test data were under 10 percent. The speed of the flow can thus be estimated by calculating the wave power of the flow-generated sound and using the proposed correlation. Keywords: flow-generated sound, free stream, sound processing, speed, wave power
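The procedure, compute a wave-power parameter per clip, average per speed, fit a curve, then invert it, can be sketched as follows. Taking mean-square amplitude as "wave power" is an assumption here; the paper's exact definition, and the form of its fitted curve, may differ:

```python
import numpy as np

def wave_power(samples):
    """Mean-square amplitude of a sound clip (assumed definition of wave power)."""
    samples = np.asarray(samples, dtype=float)
    return float(np.mean(samples ** 2))

def fit_speed_correlation(powers, speeds, degree=1):
    """Least-squares polynomial mapping averaged wave power -> flow speed."""
    return np.polyfit(powers, speeds, degree)

# Synthetic calibration data with an exactly linear power-speed relation
speeds = [1.0, 2.0, 3.0, 4.0]
powers = [3.0, 5.0, 7.0, 9.0]          # power = 2*speed + 1 (illustrative)
coeffs = fit_speed_correlation(powers, speeds)
estimated_speed = float(np.polyval(coeffs, 7.0))   # a new clip with power 7.0
```

Evaluating the fitted polynomial at a new clip's wave power gives the speed estimate, which is the inversion step the abstract describes.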
Procedia PDF Downloads 415
5756 A Normalized Non-Stationary Wavelet Based Analysis Approach for a Computer Assisted Classification of Laryngoscopic High-Speed Video Recordings
Authors: Mona K. Fehling, Jakob Unger, Dietmar J. Hecker, Bernhard Schick, Joerg Lohscheller
Abstract:
Voice disorders originate from disturbances of the vibration patterns of the two vocal folds located within the human larynx. Consequently, the visual examination of vocal fold vibrations is an integral part of the clinical diagnostic process. For an objective analysis of the vocal fold vibration patterns, the two-dimensional vocal fold dynamics are captured during sustained phonation using an endoscopic high-speed camera. In this work, we present an approach allowing a fully automatic analysis of the high-speed video data, including a computerized classification of healthy and pathological voices. The approach is based on a wavelet analysis of so-called phonovibrograms (PVG), which are extracted from the high-speed videos and comprise the entire two-dimensional vibration pattern of each vocal fold individually. Using a principal component analysis (PCA) strategy, a low-dimensional feature set is computed from each phonovibrogram. From the PCA space, clinically relevant measures can be derived that objectively quantify vibration abnormalities. In the first part of the work, it is shown that, using a machine learning approach, the derived measures are suitable for distinguishing automatically between healthy and pathological voices. Within the approach, the formation of the PCA space, and consequently the extracted quantitative measures, depend on the clinical data used to compute the principal components. Therefore, in the second part of the work, we propose a strategy to achieve a normalization of the PCA space by registering it to a coordinate system using a set of synthetically generated vibration patterns. The results show that, owing to the normalization step, potential ambiguity of the parameter space can be eliminated.
The normalization further allows a direct comparison of research results based on PCA spaces obtained from different clinical subjects. Keywords: wavelet-based analysis, multiscale product, normalization, computer-assisted classification, high-speed laryngoscopy, vocal fold analysis, phonovibrogram
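The PCA step, projecting each sample's high-dimensional feature vector onto a low-dimensional space, can be sketched with plain NumPy. The data below are synthetic stand-ins, not the clinical phonovibrogram features:

```python
import numpy as np

def pca_features(X, k=2):
    """Project row-vector samples X onto their top-k principal components."""
    Xc = X - X.mean(axis=0)                            # centre each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt = principal axes
    return Xc @ Vt[:k].T

# Four samples varying only along the diagonal direction (1, 1):
# all variance lies on the first principal component
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
feats = pca_features(X, k=2)
```

Because the samples vary along a single direction, the second coordinate of every projected sample is zero, which is exactly the dimensionality reduction the abstract relies on.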
Procedia PDF Downloads 265
5755 CTHTC: A Convolution-Backed Transformer Architecture for Temporal Knowledge Graph Embedding with Periodicity Recognition
Authors: Xinyuan Chen, Mohd Nizam Husen, Zhongmei Zhou, Gongde Guo, Wei Gao
Abstract:
Temporal knowledge graph completion (TKGC) has attracted increasing attention for its enormous value; however, existing models lack the capability to capture both local interactions and global dependencies simultaneously with evolutionary dynamics, and the latest achievements in convolutions and Transformers have not yet been employed in this area. Moreover, periodic patterns in TKGs have not been fully explored either. To this end, a multi-stage hybrid architecture with convolution-backed Transformers is introduced to TKGC tasks for the first time, combined with the Hawkes process to model evolving event sequences in a continuous-time domain. In addition, seasonal-trend decomposition is adopted to identify periodic patterns. Experiments on six public datasets are conducted to verify model effectiveness against state-of-the-art (SOTA) methods. An extensive ablation study is carried out to evaluate architecture variants as well as the contributions of independent components, paving the way for further potential exploitation. Besides complexity analysis, input sensitivity and safety challenges are also thoroughly discussed for comprehensiveness, with novel methods. Keywords: temporal knowledge graph completion, convolution, transformer, Hawkes process, periodicity
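The seasonal-trend decomposition used for periodicity recognition can be illustrated with a minimal moving-average variant. This is a simplification, not the paper's method: real TKG timestamps would first need binning into a regular series, and the edge handling here is naive:

```python
import numpy as np

def seasonal_trend_decompose(series, period):
    """Centred moving average as the trend; per-phase means of the
    detrended series as the repeating seasonal component."""
    series = np.asarray(series, dtype=float)
    trend = np.convolve(series, np.ones(period) / period, mode="same")
    detrended = series - trend
    seasonal = np.array([detrended[i::period].mean() for i in range(period)])
    return trend, seasonal

# A series that alternates 0, 1 with period 2
trend, seasonal = seasonal_trend_decompose([0.0, 1.0] * 10, period=2)
```

On this alternating input the interior trend settles at 0.5 and the two seasonal phase means split cleanly apart, which is the periodic signal the decomposition is meant to surface.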
Procedia PDF Downloads 78
5754 Patterns of TV Simultaneous Interpreting of Emotive Overtones in Trump’s Victory Speech from English into Arabic
Authors: Hanan Al-Jabri
Abstract:
Simultaneous interpreting is deemed the most challenging mode of interpreting by many scholars. The special constraints involved in this task, including time constraints, different linguistic systems, and stress, pose a great challenge to most interpreters. These constraints are likely to be maximised when the interpreting task is done live on TV. The TV interpreter is exposed to a wide variety of audiences with different backgrounds and needs and is mostly asked to interpret high-profile tasks, which raises stress levels and further complicates the task. Under these constraints, which require fast and efficient performance, TV interpreters from four TV channels were asked to render Trump's victory speech into Arabic. They also had to deal with the burden of rendering the English emotive overtones employed by the speaker into a wholly different linguistic system. The current study aims at investigating the way TV interpreters, who worked in the simultaneous mode, handled this task; it aims at exploring and evaluating the TV interpreters’ linguistic choices and whether the original emotive effect was maintained, upgraded, downgraded, or abandoned in their renditions. It also aims at exploring the possible difficulties and challenges that emerged during this process and might have influenced the interpreters’ linguistic choices. To achieve its aims, the study analysed Trump’s victory speech, delivered on November 9, 2016, along with four Arabic simultaneous interpretations produced by four TV channels: Al-Jazeera, RT, CBC News, and France 24. The analysis relied on two frameworks: a macro framework and a micro framework. The former presents an overview of the wider context of the English speech as well as of the speaker and his political background to help understand the linguistic choices he made in the speech; the latter investigates the linguistic tools employed by the speaker to stir people’s emotions.
These tools were investigated based on Shamaa’s (1978) classification of emotive meaning according to linguistic level: the phonological, morphological, syntactic, and semantic and lexical levels. At this level, the patterns of rendition detected in the Arabic deliveries are also investigated. The results of the study identified different rendition patterns in the Arabic deliveries, including parallel rendition, approximation, condensation, elaboration, transformation, expansion, generalisation, explicitation, paraphrase, and omission. The emerging patterns, as suggested by the analysis, were influenced by factors such as the speedy and continuous delivery of some stretches and highly dense segments, among other factors. The study aims to contribute to a better understanding of TV simultaneous interpreting between English and Arabic, as well as the practices of TV interpreters when rendering emotiveness, especially since little is known about interpreting practices in the field of TV, particularly between Arabic and English. Keywords: emotive overtones, interpreting strategies, political speeches, TV interpreting
Procedia PDF Downloads 159
5753 Development of an Appropriate Method for the Determination of Multiple Mycotoxins in Pork Processing Products by UHPLC-TCFLD
Authors: Jason Gica, Yi-Hsieng Samuel Wu, Deng-Jye Yang, Yi-Chen Chen
Abstract:
Mycotoxins, harmful secondary metabolites produced by certain fungal species, pose significant risks to animals and humans worldwide. Their stability leads to contamination during grain harvesting, transportation, and storage, as well as in processed food products. The prevalence of mycotoxin contamination has attracted significant attention due to its adverse impact on food safety and global trade. The secondary contamination pathway from animal products has been identified as an important route of exposure, posing health risks for livestock and for humans consuming contaminated products. Pork, one of the most highly consumed meat products in Taiwan according to the National Food Consumption Database, plays a critical role in the nation's diet and economy. Given its substantial consumption, pork processing products are a significant component of the food supply chain and a potential source of mycotoxin contamination. This study is important for formulating effective regulations and strategies to mitigate mycotoxin-related risks in the food supply chain. By establishing a reliable analytical method, this research contributes to safeguarding public health and enhancing the quality of pork processing products. The findings will serve as valuable guidance for policymakers, food industries, and consumers in ensuring a safer food supply chain in the face of emerging mycotoxin challenges. An innovative and efficient analytical approach is proposed using ultra-high-performance liquid chromatography coupled with a temperature-controlled fluorescence detector (UHPLC-TCFLD) to determine multiple mycotoxins in pork meat samples, owing to its exceptional capacity to detect multiple mycotoxins at the lowest concentration levels, making it highly sensitive and reliable for comprehensive mycotoxin analysis.
Additionally, its ability to detect multiple mycotoxins simultaneously in a single run significantly reduces the time and resources required for analysis, making it a cost-effective solution for monitoring mycotoxin contamination in pork processing products. The research aims to optimize the efficient QuEChERS mycotoxin extraction method and rigorously validate its accuracy and precision. The results will provide crucial insights into mycotoxin levels in pork processing products. Keywords: multiple-mycotoxin analysis, pork processing products, QuEChERS, UHPLC-TCFLD, validation
Procedia PDF Downloads 69
5752 Effect of Plasma Treatment on UV Protection Properties of Fabrics
Authors: Sheila Shahidi
Abstract:
UV protection by fabrics has recently become a focus of great interest, particularly in connection with environmental degradation and ozone layer depletion. Fabrics provide simple and convenient protection against UV radiation (UVR), but not all fabrics offer sufficient UV protection. To describe the degree of UVR protection offered by clothing materials, the ultraviolet protection factor (UPF) is commonly used. UV-protective fabric can be produced by applying a chemical finish using normal wet-processing methodologies. However, traditional wet-processing techniques are known to consume large quantities of water and energy and may lead to adverse alterations of the bulk properties of the substrate. Recently, the use of plasmas to generate physicochemical surface modifications of textile substrates has become an intriguing approach to replace or enhance conventional wet-processing techniques. In this research work, the effect of plasma treatment on the UV protection properties of fabrics was investigated. DC magnetron sputtering was used, and plasma parameters such as gas type, electrodes, exposure time, and power were studied. The morphological and chemical properties of the samples were analyzed using scanning electron microscopy (SEM) and Fourier transform infrared spectroscopy (FTIR), respectively. The transmittance and UPF values of the original and plasma-treated samples were measured using a Shimadzu UV3101 PC (UV–Vis–NIR scanning spectrophotometer, 190–2,100 nm range). It was concluded that plasma, an eco-friendly, cost-effective, and dry technique, is being used in different branches of industry and will conquer the textile industry in the near future. It is also a promising method for the preparation of UV-protective textiles. Keywords: fabric, plasma, textile, UV protection
Procedia PDF Downloads 519
5751 Computational Team Dynamics in Student New Product Development Teams
Authors: Shankaran Sitarama
Abstract:
Teamwork is an extremely effective pedagogical tool in engineering education. New Product Development (NPD) has been an effective strategy of companies to streamline and bring innovative products and solutions to customers. Thus, Engineering curriculum in many schools, some collaboratively with business schools, have brought NPD into the curriculum at the graduate level. Teamwork is invariably used during instruction, where students work in teams to come up with new products and solutions. There is a significant emphasis of grade on the semester long teamwork for it to be taken seriously by students. As the students work in teams and go through this process to develop the new product prototypes, their effectiveness and learning to a great extent depends on how they function as a team and go through the creative process, come together, and work towards the common goal. A core attribute of a successful NPD team is their creativity and innovation. The team needs to be creative as a group, generating a breadth of ideas and innovative solutions that solve or address the problem they are targeting and meet the user’s needs. They also need to be very efficient in their teamwork as they work through the various stages of the development of these ideas resulting in a POC (proof-of-concept) implementation or a prototype of the product. The simultaneous requirement of teams to be creative and at the same time also converge and work together imposes different types of tensions in their team interactions. These ideational tensions / conflicts and sometimes relational tensions / conflicts are inevitable. Effective teams will have to deal with the Team dynamics and manage it to be resilient enough and yet be creative. 
This research paper provides a computational analysis of the teams' communication that is reflective of the team dynamics and, through a superimposition of latent semantic analysis with social network analysis, provides a computational methodology for arriving at visual interaction patterns. These team interaction patterns correlate clearly with the team dynamics and provide insights into the functioning, and thus the effectiveness, of the teams. Twenty-three student NPD teams, drawn from two years of a course on managing NPD that blends engineering and business school students, are considered, and the results are presented. The analysis is also correlated with the teams' detailed and tailored individual and group feedback and their self-reflection and evaluation questionnaires.Keywords: team dynamics, social network analysis, team interaction patterns, new product development teamwork, NPD teams
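The superimposition of latent semantic analysis with social network analysis described above can be sketched roughly as follows. The four toy "member messages", the two-dimensional semantic space, and the 0.7 similarity threshold are all illustrative assumptions, not the authors' data or parameters:

```python
import numpy as np

# Toy corpus: one aggregated message per team member (hypothetical data).
docs = [
    "prototype design user needs sketch",
    "prototype testing user feedback",
    "budget schedule milestones cost planning",
    "design sketch prototype iteration",
]
vocab = sorted({w for d in docs for w in d.split()})
X = np.array([[d.split().count(w) for w in vocab] for d in docs], float)

# Latent semantic analysis: truncated SVD of the member-term matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
Z = U[:, :k] * s[:k]                 # members embedded in k-dim semantic space

# Superimpose a social network: link members whose messages land close
# together in the latent space (cosine similarity above a threshold).
Zn = Z / np.linalg.norm(Z, axis=1, keepdims=True)
sim = Zn @ Zn.T
adj = (sim > 0.7) & ~np.eye(len(docs), dtype=bool)

# Network density as one simple summary of the interaction pattern.
density = adj.sum() / (len(docs) * (len(docs) - 1))
print(round(density, 2))  # → 0.5: prototype cluster fully linked, outlier isolated
```

Members whose messages share latent topics become linked, and simple graph metrics such as density then summarize the team's interaction pattern.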
Procedia PDF Downloads 1165750 High Motivational Salient Face Distractors Slowed Target Detection: Evidence from Behavioral Studies
Authors: Rashmi Gupta
Abstract:
Rewarding stimuli capture attention involuntarily as a result of an association process that develops quickly during value learning, referred to as reward- or value-driven attentional capture. It is essential to compare reward with punishment processing to get a full picture of value-based modulation in visual attention processing. Hence, the present study manipulated both valence/value (reward as well as punishment) and motivational salience (probability of an outcome: high vs. low) together. A series of experiments was conducted, each with two phases. In phase 1, participants were required to learn to associate specific face stimuli with a high or low probability of winning or losing points. In the second phase, these conditioned stimuli served as a distractor or prime in a speeded letter search task. Faces with high versus low outcome probability, regardless of valence, slowed the search for targets (specifically targets in the left visual field), suggesting that the costs to performance on non-emotional cognitive tasks were driven only by the motivational salience (high vs. low) associated with the stimuli rather than by their valence (gain vs. loss). It also suggests that the processing of motivationally salient stimuli is right-hemisphere biased. Together, the results of these studies strengthen the notion that our visual attention system is more affected by motivational salience than by valence, an effect termed here motivational salience-driven attentional capture.Keywords: attention, distractors, motivational salience, valence
Procedia PDF Downloads 2205749 An Informetrics Analysis of Research on Phishing in Scopus and Web of Science Databases from 2012 to 2021
Authors: Nkosingiphile Mbusozayo Zungu
Abstract:
The purpose of the current study is to adopt informetrics methods to analyse research on phishing from 2012 to 2021 in the selected databases in order to contribute to global cybersecurity through impactful research. The study follows a quantitative research methodology. We opted for the positivist epistemology and objectivist ontology. The analysis focuses on: (i) the productivity of individual authors, institutions, and countries; (ii) the research contributions, using co-authorship as a measure of collaboration; (iii) the altmetrics of selected research contributions; (iv) the citation patterns and research impact of research on phishing; and (v) research contributions by keywords, to discover the concepts that are related to phishing. The preliminary findings favour developed countries in terms of quantity and quality of research in the domain. There are unique research trends and patterns in developing countries, including those in Africa, that provide opportunities for research development in the domain in the region. This study explores an important research domain using a method largely unexplored in the region. The study supports the 2030 SDG Agenda, for example its aim of ending abuse, exploitation, trafficking, and all other forms of violence against and torture of children perpetrated through cyberspace (SDG 16). Further, the results from this study can inform research, teaching, and learning, largely in Africa. Invariably, the study contributes to cybersecurity awareness that will mitigate cybersecurity threats against vulnerable communities.Keywords: phishing, cybersecurity, informetrics, information security
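The co-authorship measure of collaboration mentioned in point (ii) can be sketched as follows; the author lists and the "degree of collaboration" summary are illustrative assumptions, not the study's data:

```python
from collections import Counter
from itertools import combinations

# Hypothetical bibliographic records: one author list per paper.
papers = [
    ["A. Author", "B. Author"],
    ["A. Author", "B. Author", "C. Author"],
    ["C. Author"],
]

# Count every collaborating pair across the corpus; these counts are the
# weighted edges of a co-authorship network.
pairs = Counter()
for authors in papers:
    pairs.update(combinations(sorted(authors), 2))

# A common informetrics summary: fraction of multi-authored papers.
multi = sum(len(a) > 1 for a in papers)
degree_of_collaboration = multi / len(papers)
print(pairs.most_common(1))  # → [(('A. Author', 'B. Author'), 2)]
```

The pair counts feed directly into network analysis (node = author, edge weight = shared papers), from which productivity and collaboration patterns per country or institution can be aggregated.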
Procedia PDF Downloads 1135748 Combining Laser Scanning and High Dynamic Range Photography for the Presentation of Bloodstain Pattern Evidence
Authors: Patrick Ho
Abstract:
Bloodstain Pattern Analysis (BPA) forensic evidence can be complex, requiring effective courtroom presentation to ensure clear and comprehensive understanding of the analyst’s findings. BPA witness statements can often involve reference to spatial information (such as location of rooms, objects, walls) which, when coupled with classified blood patterns, may illustrate the reconstructed movements of suspects and injured parties. However, it may be difficult to communicate this information through photography alone, despite this remaining the UK’s established method for presenting BPA evidence. Through an academic-police partnership between the University of Warwick and West Midlands Police (WMP), an integrated 3D scanning and HDR photography workflow for BPA was developed. Homicide scenes were laser scanned and, after processing, the 3D models were utilised in the BPA peer-review process. The same 3D models were made available for court but were not always utilised. This workflow has improved the ease of presentation for analysts and provided 3D scene models that assist with the investigation. However, the effects of incorporating 3D scene models in judicial processes may need to be studied before they are adopted more widely. 3D models from a simulated crime scene and West Midlands Police cases approved for conference disclosure are presented. We describe how the workflow was developed and integrated into established practices at WMP, including peer-review processes and witness statement delivery in court, and explain the impact the work has had on the Criminal Justice System in the West Midlands.Keywords: bloodstain pattern analysis, forensic science, criminal justice, 3D scanning
Procedia PDF Downloads 975747 Secure Message Transmission Using Meaningful Shares
Authors: Ajish Sreedharan
Abstract:
Visual cryptography encodes a secret image into shares of random binary patterns. If the shares are printed onto transparencies, the secret image can be visually decoded by superimposing a qualified subset of transparencies, but no secret information can be obtained from the superposition of a forbidden subset. The binary patterns of the shares, however, have no visual meaning and hinder the objectives of visual cryptography. In secret message transmission through meaningful shares, the secret message to be transmitted is first converted to a grayscale image. Then (2,2) visual cryptographic shares are generated from this grayscale image. The shares are encrypted using a chaos-based image encryption algorithm employing the wavelet transform. Two separate color images, each the same size as the shares, are taken as cover images, and the respective shares are hidden inside them. The encrypted shares are thus covered by meaningful images, so a potential eavesdropper will not know there is a message to be read. The meaningful shares are transmitted through two different transmission media. During decoding, the shares are extracted from the received meaningful images and decrypted using the same chaos-based image encryption algorithm. The shares are then combined to regenerate the grayscale image, from which the secret message is obtained.Keywords: visual cryptography, wavelet transform, meaningful shares, grey scale image
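The (2,2) share generation step can be sketched with the classical Naor–Shamir construction (a minimal illustration; the chaos-based encryption and cover-image embedding steps described above are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Naor–Shamir (2,2) scheme: each secret pixel expands to a 2x2 block of
# subpixels. White pixels get identical blocks in both shares; black pixels
# get complementary blocks, so stacking (OR-ing) turns them fully black
# while white pixels stay only half black.
PATTERNS = [np.array([[1, 0], [0, 1]]), np.array([[0, 1], [1, 0]])]

def make_shares(secret):                       # secret: 0 = white, 1 = black
    h, w = secret.shape
    s1 = np.zeros((2 * h, 2 * w), int)
    s2 = np.zeros((2 * h, 2 * w), int)
    for i in range(h):
        for j in range(w):
            p = PATTERNS[rng.integers(2)]      # random pattern per pixel
            s1[2*i:2*i+2, 2*j:2*j+2] = p
            s2[2*i:2*i+2, 2*j:2*j+2] = p if secret[i, j] == 0 else 1 - p
    return s1, s2

secret = np.array([[1, 0], [0, 1]])            # tiny 2x2 secret image
s1, s2 = make_shares(secret)
stacked = s1 | s2                              # superimposing transparencies
print(int(stacked.sum()))  # → 12: two all-black blocks (4+4) + two half-black (2+2)
```

Each share on its own is a uniformly random checkerboard and leaks nothing; only the stacked pair reveals the secret.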
Procedia PDF Downloads 4555746 A Generalization of Planar Pascal’s Triangle to Polynomial Expansion and Connection with Sierpinski Patterns
Authors: Wajdi Mohamed Ratemi
Abstract:
The very well-known stacked sets of numbers referred to as Pascal’s triangle present the coefficients of the binomial expansion of the form (x+y)^n. This paper presents an approach (the Staircase Horizontal Vertical, SHV-method) to the generalization of the planar Pascal’s triangle for polynomial expansions of the form (x+y+z+w+r+⋯)^n. The presented generalization of Pascal’s triangle is different from other generalizations of Pascal’s triangles given in the literature. The coefficients of the generalized Pascal’s triangles presented in this work are generated by inspection, using embedded Pascal’s triangles. The coefficients of an I-variable expansion are generated by horizontally laying out the Pascal’s elements of the (I-1)-variable expansion in a staircase manner and multiplying them with the relevant columns of vertically laid out classical Pascal’s elements, hence avoiding factorial calculations when generating the coefficients of the polynomial expansion. Furthermore, the classical Pascal’s triangle has a pattern built into it regarding its odd and even numbers. This pattern is known as the Sierpinski triangle. In this study, a presentation of Sierpinski-like patterns of the generalized Pascal’s triangles is given. Applications related to those coefficients of the binomial expansion (Pascal’s triangle) or polynomial expansions (generalized Pascal’s triangles) can be found in areas of combinatorics and probability.Keywords: pascal’s triangle, generalized pascal’s triangle, polynomial expansion, sierpinski’s triangle, combinatorics, probabilities
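The factorial-free spirit of building polynomial coefficients from embedded Pascal's triangles can be illustrated with a related construction (a sketch, not the SHV method itself): each multinomial coefficient is a product of binomials read directly off a classical Pascal's triangle.

```python
# Factorial-free multinomial coefficients:
# C(n; k1,...,km) = product over i of C(k1+...+ki, ki),
# with every binomial looked up in an additively built Pascal's triangle.

def pascal(rows):
    """Classical Pascal's triangle built purely by addition."""
    tri = [[1]]
    for r in range(1, rows):
        prev = tri[-1]
        tri.append([1] + [prev[i - 1] + prev[i] for i in range(1, r)] + [1])
    return tri

def multinomial(ks):
    n = sum(ks)
    tri = pascal(n + 1)
    coef, total = 1, 0
    for k in ks:
        total += k
        coef *= tri[total][k]     # binomial C(total, k) read off the triangle
    return coef

print(multinomial([2, 1, 1]))  # → 12
```

For example, the coefficient of x²yz in (x+y+z)⁴ is C(4; 2,1,1) = C(2,2)·C(3,1)·C(4,1) = 1·3·4 = 12, matching 4!/(2!·1!·1!), with no factorial ever computed.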
Procedia PDF Downloads 3675745 Contrasting Patterns of Accumulation, Partitioning, and Reallocation of DM and N Within the Maize Canopy Under Decreased N Availabilities
Authors: Panpan Fan, Bo Ming, Niels P. R. Anten, Jochem B. Evers, Yaoyao Li, Shaokun Li, Ruizhi Xie
Abstract:
The reallocation of dry matter (DM) and nitrogen (N) from vegetative tissues to the grain sinks is critical for grain yield. The objective of this study was to quantify DM and N accumulation, partitioning, and reallocation at the single-leaf, organ, and individual-plant scales and to clarify the responses to different levels of N availability. A two-year field experiment was conducted in Jilin Province, Northeast China, with three N fertilizer rates creating different N availability levels: N0 (N deficiency), N1 (low supply), and N2 (high supply). The results showed that grain N depends more on reallocation from vegetative organs than grain DM does. Moreover, vegetative organs reallocated more DM and N to grain under lower N availability, whereas under high N availability more grain DM and grain N were derived from post-silking leaf photosynthesis and post-silking N uptake from the soil. Furthermore, the reallocation amount and reallocation efficiency of leaf DM and leaf N content differed among leaf ranks and were regulated by N availability; specifically, DM reallocation occurred mainly in senesced leaves, whereas leaf N reallocation occurred in living leaves. These results provide a theoretical basis for deriving parameters in crop models for simulating the demand, uptake, partitioning, and reallocation of DM and N.Keywords: dry matter, leaf N content, leaf rank, N availability, reallocation efficiency
Procedia PDF Downloads 1275744 A Text Classification Approach Based on Natural Language Processing and Machine Learning Techniques
Authors: Rim Messaoudi, Nogaye-Gueye Gning, François Azelart
Abstract:
Automatic text classification applies mostly natural language processing (NLP) and other AI-guided techniques to classify text automatically in a faster and more accurate manner. This paper discusses the use of predictive maintenance to manage incident tickets within the company. It focuses on proposing a tool that processes and analyses the comments and notes written by administrators after resolving an incident ticket, with the goal of increasing the quality of these comments. The tool is based on NLP and machine learning techniques to perform textual analytics on the extracted data. This approach was tested using real data from the French National Railways (SNCF) company and yielded high-quality results.Keywords: machine learning, text classification, NLP techniques, semantic representation
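The comment-classification idea can be sketched with a tiny bag-of-words model; the training comments, the "good"/"poor" labels, and the multinomial Naive Bayes choice are illustrative assumptions, not the paper's actual pipeline or SNCF data:

```python
import math
from collections import Counter, defaultdict

# Hypothetical resolution comments labelled by quality.
train = [
    ("replaced faulty relay and tested circuit", "good"),
    ("checked signal box documented root cause", "good"),
    ("done", "poor"),
    ("fixed", "poor"),
]

class NaiveBayes:
    """Multinomial Naive Bayes over whitespace-tokenised bags of words."""

    def fit(self, data):
        self.word_counts = defaultdict(Counter)
        self.class_counts = Counter()
        for text, label in data:
            self.class_counts[label] += 1
            self.word_counts[label].update(text.split())
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, text):
        def log_prob(label):
            total = sum(self.word_counts[label].values())
            lp = math.log(self.class_counts[label])
            for w in text.split():
                # Laplace smoothing over the shared vocabulary
                lp += math.log((self.word_counts[label][w] + 1)
                               / (total + len(self.vocab)))
            return lp
        return max(self.class_counts, key=log_prob)

clf = NaiveBayes().fit(train)
print(clf.predict("replaced relay and documented cause"))  # → good
```

A production version would swap the toy tokeniser for proper NLP preprocessing (lemmatisation, TF-IDF or embeddings) and a stronger classifier, but the train/predict structure stays the same.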
Procedia PDF Downloads 1005743 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads
Authors: Gaurav Kumar Sinha
Abstract:
In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. 
It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies
Procedia PDF Downloads 685742 Online Teaching and Learning Processes: Declarative and Procedural Knowledge
Authors: Eulalia Torras, Andreu Bellot
Abstract:
To know whether students’ achievements are the result of online interaction and not just a consequence of individual differences, it seems essential to link teaching presence and social presence to the types of knowledge built. The research aim is to analyze social presence in relation to two types of knowledge, declarative and procedural. A qualitative methodology has been used. The analysis of the contents was based on an observation protocol that included community of enquiry indicators and procedural and declarative knowledge indicators. The research was conducted in three phases, focusing on the observational protocol and indicators, the results, and the conclusions. Results show that the teaching-learning processes have been characterized by patterns of presence and types of knowledge. Results also show the importance of the social presence support provided by the teacher and the students, not only with regard to the nature of the instructional support but also concerning how it is presented to the student and the importance attributed to it in the teaching-learning process, that is, what it is that assistance is offered on. In this study, we find that presence based on procedural guidelines and declarative reflection, the management of shared meaning on the basis of skills, and the evidence of these skills entail patterns of learning. Nevertheless, the importance that the teacher attributes to each support aspect has a bearing on the extent to which the students reflect on the given task.Keywords: education, online, teaching and learning processes, knowledge
Procedia PDF Downloads 2165741 Study on Roll Marks of Stainless Steel in Rolling Mill
Authors: Cai-Wan Chang-Jian, Han-Ting Tsai
Abstract:
In the metal-forming industry, rolling is the most widely used processing method. In a stainless steel cold-rolling plant, a product defect occurs during the temper rolling process within cold rolling. It is called 'roll marks', a phenomenon that produces an undesirable flatness problem. In this research, we performed a series of experimental measurements of the roll marks using optical sensors and compared the vibration frequency of the roll marks with the vibration frequencies of key components in the skin-pass mill. We found little correlation between these data. Finally, we took measurements on the motor drive in the rolling mill. We found that the undulation frequency of the motor matched the frequency of the roll marks, confirming that the motor’s undulation caused the roll marks.Keywords: roll mark, plane strain, rolling mill, stainless steel
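The frequency-matching step can be sketched as follows, with synthetic stand-ins for the measured roll-mark profile and the motor signal (the 37 Hz component, noise level, and sample rate are assumptions for illustration, not the study's measurements):

```python
import numpy as np

fs = 1000.0                                   # sample rate, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(1)

# Synthetic stand-ins: a noisy roll-mark surface profile and a cleaner
# motor-drive signal sharing one periodic component.
roll_marks = np.sin(2 * np.pi * 37.0 * t) + 0.3 * rng.normal(size=t.size)
motor = 0.5 * np.sin(2 * np.pi * 37.0 * t + 0.8)

def dominant_freq(x, fs):
    """Frequency of the largest non-DC peak in the windowed spectrum."""
    spec = np.abs(np.fft.rfft(x * np.hanning(x.size)))
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    return freqs[spec[1:].argmax() + 1]       # skip the DC bin

f_marks = dominant_freq(roll_marks, fs)
f_motor = dominant_freq(motor, fs)
print(round(f_marks, 1), round(f_motor, 1))   # → 37.0 37.0
```

When the two dominant frequencies coincide within the spectral resolution (0.5 Hz for a 2 s record), the motor undulation becomes the prime suspect for the roll marks.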
Procedia PDF Downloads 4545740 Embedded Electrochemistry with Miniaturized, Drone-Based, Potentiostat System for Remote Detection Chemical Warfare Agents
Authors: Amer Dawoud, Jesy Motchaalangaram, Arati Biswakarma, Wujan Mio, Karl Wallace
Abstract:
The development of an embedded miniaturized drone-based system for remote detection of Chemical Warfare Agents (CWA) is proposed. The paper focuses on the software/hardware system design of the electrochemical Cyclic Voltammetry (CV) and Differential Pulse Voltammetry (DPV) signal processing for future deployment on drones. The paper summarizes the progress made towards hardware and electrochemical signal processing for signature detection of CWA. Also, the miniature potentiostat signal is validated by comparing it with the high-end lab potentiostat signal.Keywords: drone-based, remote detection chemical warfare agents, miniaturized, potentiostat
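One element of the DPV signal processing, extracting the signature peak potential and peak current after baseline subtraction, can be sketched as follows; the Gaussian peak, the sloped baseline, and all numerical values are assumptions standing in for a real voltammogram:

```python
import numpy as np

# Synthetic differential-pulse voltammogram: a Gaussian analyte peak
# riding on a sloped capacitive background (all values assumed).
E = np.linspace(-0.2, 0.8, 501)               # applied potential, V
baseline = 0.2 + 0.1 * E                      # background current, µA
peak = 1.5 * np.exp(-((E - 0.35) / 0.05) ** 2)
i_dpv = baseline + peak

# Fit and subtract a linear baseline using the peak-free edges of the scan.
edges = np.r_[0:50, 451:501]
slope, intercept = np.polyfit(E[edges], i_dpv[edges], 1)
corrected = i_dpv - (slope * E + intercept)

E_peak = E[corrected.argmax()]                # signature peak potential
i_peak = corrected.max()                      # signature peak current
print(round(E_peak, 2), round(i_peak, 1))     # → 0.35 1.5
```

The (E_peak, i_peak) pair is the kind of compact signature a drone-borne system could telemeter instead of the full voltammogram, keeping the radio payload small.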
Procedia PDF Downloads 1365739 Regulatory and Economic Challenges of AI Integration in Cyber Insurance
Authors: Shreyas Kumar, Mili Shangari
Abstract:
Integrating artificial intelligence (AI) in the cyber insurance sector represents a significant advancement, offering the potential to revolutionize risk assessment, fraud detection, and claims processing. However, this integration introduces a range of regulatory and economic challenges that must be addressed to ensure responsible and effective deployment of AI technologies. This paper examines the multifaceted regulatory landscape governing AI in cyber insurance and explores the economic implications of compliance, innovation, and market dynamics. AI's capabilities in processing vast amounts of data and identifying patterns make it an invaluable tool for insurers in managing cyber risks. Yet, the application of AI in this domain is subject to stringent regulatory scrutiny aimed at safeguarding data privacy, ensuring algorithmic transparency, and preventing biases. Regulatory bodies, such as the European Union with its General Data Protection Regulation (GDPR), mandate strict compliance requirements that can significantly impact the deployment of AI systems. These regulations necessitate robust data protection measures, ethical AI practices, and clear accountability frameworks, all of which entail substantial compliance costs for insurers. The economic implications of these regulatory requirements are profound. Insurers must invest heavily in upgrading their IT infrastructure, implementing robust data governance frameworks, and training personnel to handle AI systems ethically and effectively. These investments, while essential for regulatory compliance, can strain financial resources, particularly for smaller insurers, potentially leading to market consolidation. Furthermore, the cost of regulatory compliance can translate into higher premiums for policyholders, affecting the overall affordability and accessibility of cyber insurance. Despite these challenges, the potential economic benefits of AI integration in cyber insurance are significant. 
AI-enhanced risk assessment models can provide more accurate pricing, reduce the incidence of fraudulent claims, and expedite claims processing, leading to overall cost savings and increased efficiency. These efficiencies can improve the competitiveness of insurers and drive innovation in product offerings. However, balancing these benefits with regulatory compliance is crucial to avoid legal penalties and reputational damage. The paper also explores the potential risks associated with AI integration, such as algorithmic biases that could lead to unfair discrimination in policy underwriting and claims adjudication. Regulatory frameworks need to evolve to address these issues, promoting fairness and transparency in AI applications. Policymakers play a critical role in creating a balanced regulatory environment that fosters innovation while protecting consumer rights and ensuring market stability. In conclusion, the integration of AI in cyber insurance presents both regulatory and economic challenges that require a coordinated approach involving regulators, insurers, and other stakeholders. By navigating these challenges effectively, the industry can harness the transformative potential of AI, driving advancements in risk management and enhancing the resilience of the cyber insurance market. This paper provides insights and recommendations for policymakers and industry leaders to achieve a balanced and sustainable integration of AI technologies in cyber insurance.Keywords: artificial intelligence (AI), cyber insurance, regulatory compliance, economic impact, risk assessment, fraud detection, cyber liability insurance, risk management, ransomware
Procedia PDF Downloads 335738 Assessment of an ICA-Based Method for Detecting the Effect of Attention in the Auditory Late Response
Authors: Siavash Mirahmadizoghi, Steven Bell, David Simpson
Abstract:
In this work, a new independent component analysis (ICA) based method for noise reduction in evoked potentials is evaluated on auditory late responses (ALR) captured with a 63-channel electroencephalogram (EEG) from 10 normal-hearing subjects. The performance of the new method is compared with a single-channel alternative in terms of signal-to-noise ratio (SNR), the number of channels with an SNR above an empirically derived statistical critical value, and an estimate of the effect of attention on the major components in the ALR waveform. The results show that the multichannel signal processing method can significantly enhance the quality of the ALR signal and also detect the effect of attention on the ALR better than the single-channel alternative.Keywords: auditory late response (ALR), attention, EEG, independent component analysis (ICA), multichannel signal processing
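The ICA idea can be illustrated in miniature with a two-channel toy (not the authors' 63-channel method): mix an evoked-response-like pulse train with sinusoidal interference, whiten, then unmix by searching for the rotation that maximizes non-Gaussianity (kurtosis). The sources, mixing matrix, and frequencies are all assumed for illustration:

```python
import numpy as np

t = np.arange(0, 1, 1 / 1000)
erp = np.exp(-((t % 0.25) - 0.08) ** 2 / (2 * 0.01 ** 2))   # repeated "ALR" pulse
noise = np.sin(2 * np.pi * 50 * t)                          # mains-like hum
S = np.vstack([erp, noise])
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S                  # two "EEG channels"

# Whitening: decorrelate the channels and scale to unit variance.
Xc = X - X.mean(axis=1, keepdims=True)
d, Evec = np.linalg.eigh(np.cov(Xc))
Z = np.diag(d ** -0.5) @ Evec.T @ Xc

# After whitening, the sources differ from Z only by a rotation; scan
# angles and keep the direction with the most non-Gaussian projection.
def kurt(y):
    return np.mean(y ** 4) / np.mean(y ** 2) ** 2 - 3

best = max(np.linspace(0, np.pi, 361),
           key=lambda a: abs(kurt(np.cos(a) * Z[0] + np.sin(a) * Z[1])))
y = np.cos(best) * Z[0] + np.sin(best) * Z[1]               # recovered component

r = abs(np.corrcoef(y, erp)[0, 1])
print(f"|corr| with ERP-like source: {r:.3f}")
```

The sparse pulse train is strongly super-Gaussian while the sine is sub-Gaussian, so the kurtosis criterion separates them and the recovered component should correlate strongly with the pulse-train source; practical methods (e.g. FastICA) replace the angle scan with fixed-point iterations that scale to many channels.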
Procedia PDF Downloads 505