Search results for: adaptive computer games
2028 Systematic Review of Associations between Interoception, Vagal Tone, and Emotional Regulation
Authors: Darren Edwards, Thomas Pinna
Abstract:
Background: Interoception and heart rate variability have been found to predict outcomes of mental health and well-being. However, these have usually been investigated independently of one another. Objectives: This review aimed to explore the associations of interoception and heart rate variability (HRV) with emotion regulation (ER) and ER strategies within the existing literature, using systematic review methodology. Methods: Article retrieval and selection followed the preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines. The databases PsycINFO, Web of Science, PubMed, CINAHL, and MEDLINE were searched for published papers. Preliminary inclusion and exclusion criteria were specified following the patient, intervention, comparison, and outcome (PICO) framework, whilst the checklist for critical appraisal and data extraction for systematic reviews of prediction modelling studies (CHARMS) framework was used to help formulate the research question and to critically assess the identified full-length articles for bias. Results: 237 studies were identified after initial database searches. Of these, eight studies were included in the final selection. Six studies explored the associations between HRV and ER, whilst three investigated the associations between interoception and ER (one of which was also included in the HRV selection). Overall, the results suggest that greater HRV and interoception are associated with better ER. Specifically, high parasympathetic activity largely predicted the use of adaptive ER strategies such as reappraisal, and better acceptance of emotions. High interoception, instead, was predictive of effective down-regulation of negative emotions and handling of social uncertainty, though there was no association with any specific ER strategy.
Conclusions: Awareness of one's own bodily feelings and vagal activation seem to be of central importance for the effective regulation of emotional responses.
Keywords: emotional regulation, vagal tone, interoception, chronic conditions, health and well-being, psychological flexibility
Procedia PDF Downloads 118
2027 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm
Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin
Abstract:
Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created using different techniques, such as a detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. The disadvantage of this technique is that the reconstructed image often has poor quality due to the limited dynamic range it is possible to record using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique. In this method, the phase of the original wavefront is recorded but the amplitude is constrained to be constant. The original object does not need to exist physically, so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper, the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with known amplitude distribution, often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane. The GS algorithm can be used in this way to enhance the reconstruction quality of the kinoform. Different images are employed as the reference object and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by using this method.
Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform
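The abstract describes the GS loop but gives no implementation; the following stdlib-only sketch illustrates the alternating-projection idea it outlines, with a naive DFT standing in for the optical Fourier transform (array size, target amplitudes and iteration count are illustrative assumptions, not the authors' setup):

```python
import cmath

def dft(x, inverse=False):
    # naive O(N^2) discrete Fourier transform (stands in for the optical transform)
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def gerchberg_saxton(target_amp, iterations=50):
    # Optimise a phase-only kinoform whose Fourier-plane amplitude
    # approximates target_amp. Returns the hologram and the per-iteration
    # reconstruction error of this error-reduction scheme.
    n = len(target_amp)
    # scale the target so its energy matches that of a unit-amplitude hologram
    energy = sum(a * a for a in target_amp) ** 0.5
    t = [a * n / energy for a in target_amp]
    field = [complex(a) for a in t]  # start with zero phase in the image plane
    errors = []
    for _ in range(iterations):
        holo = dft(field, inverse=True)
        # kinoform constraint: constant (unit) amplitude, phase retained
        holo = [cmath.exp(1j * cmath.phase(h)) for h in holo]
        recon = dft(holo)
        errors.append(sum((abs(r) - a) ** 2
                          for r, a in zip(recon, t)) ** 0.5)
        # image-plane constraint: impose the target amplitude, keep the phase
        field = [a * cmath.exp(1j * cmath.phase(r))
                 for a, r in zip(t, recon)]
    return holo, errors
```

Each iteration projects onto the two constraint sets in turn, which is why the reconstruction error of this scheme does not increase from one iteration to the next.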
Procedia PDF Downloads 539
2026 Erosion of Culture through Democratization
Authors: Mladen Milicevic
Abstract:
This paper explores how the explosion of computer technologies has allowed for the democratization of many aspects of human activity which were in the past only available through institutionalized channels of production and distribution. We use the music recording industry as an example to illustrate this process, but analogies to other activities and aspects of human life can easily be extrapolated from it.
Keywords: aura, democratization, music industry, music sharing, paradigm-shift
Procedia PDF Downloads 240
2025 The Mouth and Gastrointestinal Tract of the African Lung Fish Protopterus annectens in River Niger at Agenebode, Nigeria
Authors: Marian Agbugui
Abstract:
The West African lungfishes are fishes rich in protein and serve as an important source of food for man. The kind of food ingested by this group of fishes depends on the alimentary canal as well as the fish's digestive processes, which provide suitable modifications for maximum utilization of the food taken. A study of the alimentary canal of P. annectens will provide the best information on the anatomy and histology of the fish. Samples of P. annectens were dissected to reveal the liver, pancreas and entire gut wall. Digital pictures of the mouth, jaws and the gastrointestinal tract (GIT) were taken. The entire gut was identified, sectioned and micrographed. P. annectens was observed to possess a terminal mouth that opens up to 10% of its total body length, an adaptive feature enabling the fish to swallow its prey whole. Its dentition is made up of incisor, scissor-like teeth which also help to firmly grip, seize and tear through the skin of prey before swallowing. A short, straight and longitudinal GIT was observed in P. annectens, which is known to be a common feature in lungfishes, though it is thought to be a primitive characteristic similar to the lamprey. The oesophagus is short and distensible, similar to other predatory and carnivorous species. Food is temporarily stored in the stomach before it is passed down into the intestine. A pyloric aperture is seen at the end of the double-folded pyloric valve, which leads into an intestine that makes up 75% of the whole GIT. The intestine begins at the posterior end of the pyloric aperture, winds down in six coils through its whole length and ends at the cloaca. From this study it is concluded that P. annectens possesses a composite GIT with organs similar to other lungfishes; it is a detritivore with carnivorous abilities.
Keywords: gastrointestinal tract, incisors scissor-like teeth, intestine, mucus, Protopterus annectens, serosa
Procedia PDF Downloads 156
2024 Derivatives Balance Method for Linear and Nonlinear Control Systems
Authors: Musaab Mohammed Ahmed Ali, Vladimir Vodichev
Abstract:
This work deals with a universal control technique, a single controller for linear and nonlinear stabilization and tracking control systems. These systems may be structured as SISO or MIMO, and the parameters of the controlled plants can vary over a wide range. A novel control system design method is introduced: the construction of stable platform orbits using derivative balance, which solves the problem of preserving the transfer function stability of a linear system under partial substitution by a rational function. The universal controller is proposed as a polar system with multiple orbits to simplify the design procedure, where each orbit represents a single order of the controller transfer function. The designed controller consists of proportional, integral and derivative terms and multiple feedback and feedforward loops. A synthesis method for the controller parameters is presented. In general, the controller parameters depend on a new polynomial equation in which all parameters are related to one another and take fixed values, with no requirement for retuning. The simulation results show that the proposed universal controller can stabilize an unlimited number of linear and nonlinear plants while shaping the desired, previously specified performance. It is shown that sensor errors and poor performance are completely compensated and cannot affect system performance, and that the effects of disturbances and noise on the controller loop are fully rejected. The technical and economic effects of using the proposed controller have been investigated and compared to adaptive, predictive, and robust controllers. The economic analysis shows the advantage of a single controller with fixed parameters driving an unlimited number of plants compared to the above-mentioned control techniques.
Keywords: derivative balance, fixed parameters, stable platform, universal control
Procedia PDF Downloads 140
2023 Multi-scale Spatial and Unified Temporal Feature-fusion Network for Multivariate Time Series Anomaly Detection
Authors: Hang Yang, Jichao Li, Kewei Yang, Tianyang Lei
Abstract:
Multivariate time series anomaly detection is a significant research topic in the field of data mining, encompassing a wide range of applications across industrial sectors such as road traffic, financial logistics, and corporate production. The inherent spatial dependencies and temporal characteristics of multivariate time series introduce challenges to the anomaly detection task. Previous studies have typically been based on the assumption that all variables belong to the same spatial hierarchy, neglecting multi-level spatial relationships. To address this challenge, this paper proposes a multi-scale spatial and unified temporal feature fusion network, denoted MSUT-Net, for multivariate time series anomaly detection. The proposed model employs a multi-level modeling approach, incorporating both temporal and spatial modules. The spatial module is designed to capture the spatial characteristics of multivariate time series data, utilizing an adaptive graph structure learning model to identify the multi-level spatial relationships between data variables and their attributes. The temporal module consists of a unified temporal processing module tasked with capturing the temporal features of multivariate time series; it is capable of simultaneously identifying temporal dependencies among different variables. Extensive testing on multiple publicly available datasets confirms that MSUT-Net achieves superior performance on the majority of datasets. Our method is able to model and accurately detect system data with multi-level spatial relationships from a spatial-temporal perspective, providing a novel perspective for anomaly detection analysis.
Keywords: data mining, industrial system, multivariate time series, anomaly detection
Procedia PDF Downloads 20
2022 Design and Development of High Strength Aluminium Alloy from Recycled 7xxx-Series Material Using Bayesian Optimisation
Authors: Alireza Vahid, Santu Rana, Sunil Gupta, Pratibha Vellanki, Svetha Venkatesh, Thomas Dorin
Abstract:
Aluminum is the preferred material for lightweight applications and its alloys are constantly improving. The high strength 7xxx alloys have been extensively used for structural components in the aerospace and automobile industries for the past 50 years. In the next decade, a great number of airplanes will be retired, providing an obvious source of valuable used metal and great demand for cost-effective methods to re-use these alloys. The design of proper aerospace alloys is primarily based on optimizing strength and ductility, both of which can be improved by controlling the alloying element additions as well as the heat treatment conditions. In this project, we explore the design of high-performance alloys with 7xxx as a base material. These designed alloys have to be optimized and improved to compare with modern 7xxx-series alloys and to remain competitive for aircraft manufacturing. Aerospace alloys are extremely complex, with multiple alloying elements and numerous processing steps, making optimization often intensive and costly. In the present study, we used a Bayesian optimization algorithm, a well-known adaptive design strategy, to optimize this multi-variable system. An Al alloy was proposed and the relevant heat treatment schedules were optimized, using the tensile yield strength as the output to maximize. The designed alloy has a maximum yield strength and ultimate tensile strength of more than 730 and 760 MPa, respectively, and is thus comparable to modern high strength 7xxx-series alloys. The microstructure of this alloy was characterized by electron microscopy, indicating that the increased strength of the alloy is due to the presence of a high number density of refined precipitates.
Keywords: aluminum alloys, Bayesian optimization, heat treatment, tensile properties
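The abstract does not disclose the authors' optimisation setup; as a generic, stdlib-only sketch of the Bayesian optimisation loop it refers to, the code below fits a Gaussian-process surrogate (RBF kernel) to the points evaluated so far and picks the next sample by an upper-confidence-bound acquisition. The 1D objective, kernel length-scale and all other constants are hypothetical stand-ins for a "processing parameter vs. strength" response, not the paper's method:

```python
import math, random

def rbf(a, b, ls=0.3):
    # squared-exponential kernel on scalars
    return math.exp(-((a - b) ** 2) / (2 * ls * ls))

def solve(A, b):
    # Gaussian elimination with partial pivoting for A x = b
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, xq, noise=1e-6):
    # zero-mean GP regression: posterior mean and variance at query points xq
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)
    mean, var = [], []
    for q in xq:
        k = [rbf(q, x) for x in xs]
        mean.append(sum(ki * ai for ki, ai in zip(k, alpha)))
        v = solve(K, k)
        var.append(max(1e-12, rbf(q, q) - sum(ki * vi for ki, vi in zip(k, v))))
    return mean, var

def bayes_opt(objective, bounds, n_init=3, n_iter=10, seed=0):
    # BO loop: random initial design, then repeatedly maximise the UCB
    rng = random.Random(seed)
    xs = [bounds[0] + rng.random() * (bounds[1] - bounds[0]) for _ in range(n_init)]
    ys = [objective(x) for x in xs]
    grid = [bounds[0] + i * (bounds[1] - bounds[0]) / 200 for i in range(201)]
    for _ in range(n_iter):
        mean, var = gp_posterior(xs, ys, grid)
        ucb = [m + 2.0 * v ** 0.5 for m, v in zip(mean, var)]
        x_next = grid[ucb.index(max(ucb))]
        xs.append(x_next)
        ys.append(objective(x_next))
    best = max(range(len(ys)), key=lambda i: ys[i])
    return xs[best], ys[best]
```

The UCB trade-off (posterior mean plus a multiple of the posterior standard deviation) is what makes the strategy "adaptive": it balances exploiting promising heat-treatment settings against exploring poorly sampled ones.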
Procedia PDF Downloads 123
2021 The Mechanical and Comfort Properties of Cotton/Micro-Tencel Lawn Fabrics
Authors: Abdul Basit, Shahid Latif, Shah Mehmood
Abstract:
Lawn fabric was originally woven from linen but is at present chiefly made of cotton, and it is worn in summer. Cotton lawn is a lightweight, fine cloth that is heavier than voile; it is so fine that it is somewhat transparent. It is soft and superb to wear, making it perfect for summer clothes or for regular wear in hotter climates. Tencel (lyocell) fiber is considered the fiber of the future, as Tencel fibers are absorbent, soft, extremely strong when wet or dry, and resistant to wrinkles. The fibers are more absorbent than cotton, softer than silk and cooler than linen. High water absorption and water vapor absorption give greater heat capacity and a heat-balancing effect for thermo-regulation, analogous to the action of phase-change materials. The thermal wear properties result in a cool and dry touch that gives a cooling effect in sportswear, and in warmth when used as an insulation layer. These cooling and warming effects are adaptive to the environment, giving comfort in a broad range of climatic conditions. In this work, single yarns of Ne 80s were spun by conventional ring spinning. Different yarns of 100% cotton, 100% micro-Tencel and cotton:micro-Tencel blends (67:33, 50:50, 33:67) were made. The mechanical and comfort properties of the woven fabrics were compared. The mechanical properties include tensile and tear strength, bending length, pilling and abrasion resistance, whereas the comfort properties include air permeability, moisture management and thermal resistance. It is found that as the micro-Tencel content is increased, the mechanical and comfort properties of the woven fabric also improve.
Keywords: combed cotton, comfort properties, mechanical properties, micro-Tencel
Procedia PDF Downloads 323
2020 High Temperature Tolerance of Chironomus Sulfurosus and Its Molecular Mechanisms
Authors: Tettey Afi Pamela, Sotaro Fujii, Hidetoshi Saito, Kawaii Koichiro
Abstract:
Introduction: Organisms employ adaptive mechanisms when faced with any stressor or risk of being wiped out. This has made it possible for them to survive harsh environmental conditions such as increasing temperature, low pH, and anoxia. Some of the mechanisms they utilize include the expression of heat shock proteins, synthesis of cryoprotectants, and anhydrobiosis. Heat shock proteins (HSPs) have been widely studied to determine their involvement in stress tolerance among various organisms, of which chironomid species have been no exception. We examined the survival of Chironomus sulfurosus larvae reared from the 1st instar at 25°C, 30°C, 35°C, and 40°C, and the expression of genes encoding five heat shock proteins (HSP70, HSP67, HSP60, HSP27, and HSP23). Results: The highest survival rate was recorded at 30°C, followed by 25°C, then 35°C. Only a small percentage of C. sulfurosus survived at 40°C (14.5%). With regard to HSP expression, some HSPs responded to the increase in temperature. The relative expression levels of HSP70, HSP60, HSP27, and HSP23 were lowest at 30°C. At 25°C and 40°C, HSP70, HSP67, HSP60, HSP27, and HSP23 had the highest expression, whilst at 35°C all had the lowest expression. Discussion: The expression of heat shock proteins varies from one species to another. We designated the genes HSP70, HSP67, HSP60, HSP27, and HSP23 based on transcriptome analysis of C. sulfurosus. Our study can be termed a long heat shock study, as C. sulfurosus was reared from the first to the fourth instar, and this might have led to continuous induction of HSPs at 25°C. At 40°C, survival was lowest but HSP expression highest, as C. sulfurosus larvae had to utilize HSPs for sustenance. These results, and future high-throughput studies at both the transcriptome and proteome level, will improve the information needed to predict the future geographic distribution of these species within the context of global warming.
Keywords: chironomid, heat shock proteins, high temperature, heat shock protein expression
Procedia PDF Downloads 101
2019 Beyond the White Cube: A Study on the Site Specific Curatorial Practice of Kochi Muziris Biennale
Authors: Girish Chandran, Milu Tigi
Abstract:
Brian O'Doherty's seminal essay, Inside the White Cube, theorized and named the dominant mode of display and exhibition of modern art museums. Ever since the advent of biennales and other site-specific public art projects, we have seen a departure from the white cube mode of exhibition. The physicality, materiality and context within which an artwork is framed have a role in the production of meaning of public art. Equally, artworks contribute to the meaning and identity of a place. This to-and-fro relationship between site and artwork, and its influence on the sense of place and the production of meaning, is explored in this paper in the context of the Kochi Muziris Biennale (KMB). Known as the people's biennale, with over 500,000 visitors, it is India's first biennale and its largest exhibition of contemporary art. The paper employs place theory and contemporary curatorial theories to present the case. The KMB has an interesting mix of exhibition spaces, which includes existing galleries and halls, site-specific projects in public spaces, infill developments, and the adaptive reuse of heritage and other unused architecture. The biennale was envisioned as an event connecting to the history and socio-political peculiarities of the cultural landscape of Kerala, and more specifically Kochi. The paper explains the role of spatial elements in forming a curatorial narrative connected to the above-mentioned ambitions. The site-specific nature of the exhibition and its use of unused architecture help in the formation of exhibition spaces unique in type and materiality. The paper argues that this helps in the creation of an 'archaeology of the place'. The research elucidates how the composite nature of the experience connects with the thematic ambitions of the biennale and how it brings about an aesthetics distinct to the KMB.
Keywords: public art, curatorial practice, architecture, place, contemporary art, site specificity
Procedia PDF Downloads 162
2018 Mapping Interrelationships among Key Sustainability Drivers: A Strategic Framework for Enhanced Entrepreneurial Sustainability among MSME
Authors: Akriti Chandra, Gourav Dwivedi, Seema Sharma, Shivani
Abstract:
This study investigates the adoption of green business (GB) models within a circular economy framework for micro, small and medium enterprises (MSMEs), given the rising importance of sustainable practices. The research begins by exploring the shift from linear business models towards resource-efficient, sustainable models, emphasizing the benefits of the circular economy. The literature review identifies 60 influential factors impacting the shift to green businesses, grouped as internal and external drivers. However, there is a research gap in examining these factors' interrelationships and operationalizing them within MSMEs. To address this gap, the study employs Total Interpretive Structural Modelling (TISM) to establish a hierarchical structure of factors influencing GB and circular economy business model (CEBM) adoption. Findings reveal that factors like green innovation and market competitiveness are particularly impactful. Using systems theory, which views organizations as complex adaptive systems, the study contextualizes these drivers within MSMEs, proposing a framework for sustainable business model adoption. The study concludes with significant implications for policymakers, suggesting that the identified factors and their hierarchical relationships can guide policy formulation for a broader transition to green business practices. This work also invites further research, recommending larger, quantitative studies to empirically validate these factors and explore practical challenges in implementing CEBMs.
Keywords: green business (GB), circular economy business model (CEBM), micro small and medium enterprise (MSME), total interpretive structural modelling (TISM), systems theory
Procedia PDF Downloads 27
2017 Refined Edge Detection Network
Authors: Omar Elharrouss, Youssef Hmamouche, Assia Kamal Idrissi, Btissam El Khamlichi, Amal El Fallah-Seghrouchni
Abstract:
Edge detection is one of the most challenging tasks in computer vision, due to the complexity of detecting the edges or boundaries in real-world images that contain objects of different types and scales, such as trees and buildings, as well as various backgrounds. It is also a key task for many computer vision applications. Using a set of backbones as well as attention modules, deep-learning-based methods have improved the detection of edges compared with traditional methods like Sobel and Canny. However, images of complex scenes still represent a challenge for these methods, and the edges detected by existing approaches suffer from non-refined results, with output images containing many erroneous edges. To overcome this, in this paper, using the mechanism of residual learning, a refined edge detection network (RED-Net) is proposed. By maintaining the high resolution of edges during the training process, and conserving the resolution of the edge image at each network stage, we connect the pooling outputs at each stage with the output of the previous layer. Also, after each layer, we use an affine batch normalization layer as an erosion operation for the homogeneous regions in the image. The proposed method is evaluated using the most challenging datasets, including BSDS500, NYUD, and Multicue. The obtained results outperform existing edge detection networks in terms of performance metrics and quality of output images.
Keywords: edge detection, convolutional neural networks, deep learning, scale-representation, backbone
Procedia PDF Downloads 106
2016 Using Audio-Visual Aids and Computer-Assisted Language Instruction (CALI) to Overcome Learning Difficulties of Listening in Students of Special Needs
Authors: Sadeq Al Yaari, Muhammad Alkhunayn, Ayman Al Yaari, Montaha Al Yaari, Adham Al Yaari, Sajedah Al Yaari, Fatehi Eissa
Abstract:
Background & Aims: Audio-visual aids and computer-assisted language instruction (CALI) have been documented to improve receptive skills, namely listening, in typical students. The improved listening has been attributed to better understanding of other interlocutors' speech, but recent experiments have suggested that audio-visual aids and CALI should be tested on the listening of students with special needs to see the effects of the former on the latter. This investigation describes the effect of audio-visual aids and CALI on the performance of these students. Methods: Pre- and post-tests were administered to 40 students with special needs of both sexes at al-Malādh school for students of special needs, aged between 8 and 18 years. A comparison was made between this group of students and a similar group (control group). Whereas the former group underwent a listening course using audio-visual aids and CALI, the latter studied the same course with the same speech-language therapist (SLT) using the classical method. The outcomes of the two tests for the two groups were qualitatively and quantitatively analyzed. Results: Significant improvement in performance was found in the first group (treatment group) (post-test = 72.45% vs. pre-test = 25.55%) in comparison to the second (control) (post-test = 25.55% vs. pre-test = 23.72%). Females scored higher than males (1487 vs. 1411). These results support the necessity of using audio-visual aids and CALI in teaching listening at schools for students with special needs.
Keywords: listening, receptive skills, audio-visual aids, CALI, special needs
Procedia PDF Downloads 54
2015 Adaptation Mechanisms of the Polyextremophile Natranaerobius Thermophilus to Saline-Alkaline-Thermal Environments
Authors: Qinghua Xing, Xinyi Tao, Haisheng Wang, Baisuo Zhao
Abstract:
The first true anaerobic, halophilic alkalithermophile, Natranaerobius thermophilus DSM 18059T, serves as a valuable model for studying cellular adaptations to saline, alkaline and thermal extremes. To uncover the adaptive strategies employed by N. thermophilus in coping with these challenges, we conducted a comprehensive iTRAQ-based quantitative proteomic analysis under different conditions of salinity (3.5 M vs. 2.5 M Na+), pH (pH 9.6 vs. pH 8.6), and temperature (52°C vs. 42°C). The increased intracellular accumulation of glycine betaine, through both synthesis and transport, plays a critical role in N. thermophilus' adaptation to these combined stresses. Under all three stress conditions, up-regulation of Trk family proteins responsible for K+ transport is observed. Intracellular K+ concentration rises in response to salt and pH levels. Multiple types of Na+/H+ antiporter (NhaC family, Mrp family and CPA family) and a diverse range of FOF1-ATP synthases are identified as vital components for maintaining ionic balance under different stress conditions. Importantly, proteins involved in amino acid metabolism, carbohydrate metabolism, ABC transporters, signaling and chemotaxis, as well as biological macromolecule repair and protection, exhibited significant up-regulation in response to these extreme conditions. These metabolic pathways emerge as critical factors in N. thermophilus' adaptation mechanisms under extreme environmental stress. To validate the proteomic data, ddPCR analysis confirmed changes in mRNA expression, corroborating the up-regulation and down-regulation patterns of 19 co-up-regulated and 36 key proteins under saline, alkaline and thermal stresses. This research enriches our understanding of the complex regulatory systems that enable polyextremophiles to survive in combined extreme conditions.
Keywords: polyextremophiles, Natranaerobius thermophilus, saline-alkaline-thermal stresses, combined extremes
Procedia PDF Downloads 61
2014 Urinary Incontinence and Performance in Elite Athletes
Authors: María Barbaño Acevedo Gómez, Elena Sonsoles Rodríguez López, Sofía Olivia Calvo Moreno, Ángel Basas García, Christophe Ramírez Parenteau
Abstract:
Introduction: Urinary incontinence (UI) is defined as the involuntary leakage of urine. In persons who practice sport, its prevalence is 36.1% (95% CI 26.5%–46.8%) and varies, as it seems to depend on the intensity of exercise, movements and impact on the ground. High impact sports are likely to generate higher intra-abdominal pressures, leading to pelvic floor muscle weakness. Although physical exercise reduces the risk of many diseases, the mentality of an elite athlete is not to optimize their health; achieving their goals can put their health at risk. Furthermore, feeling or suffering discomfort during training seems to be accepted as normal within the demands of elite sport. Objective: The main objective of the present study was to determine the effects of UI on sports performance in athletes. Methods: This was an observational study conducted on 754 elite athletes. After answering questions about the pelvic floor, UI and sport-related data, participants completed the International Consultation on Incontinence Questionnaire-UI Short Form (ICIQ-SF) and the Incontinence Severity Index (ISI). Results: 48.8% of the athletes declare having leakage at rest, in preseason and/or in competition (χ² [3] = 3.64; p = 0.302), with the competition period (29.1%) being the most frequent setting for urine leakage. Of the elite athletes surveyed, 33% had UI according to the ICIQ-SF (mean age 23.75 ± 7.74 years). Elite athletes with UI (5.31 ± 1.07 days) dedicate significantly more days per week to training [M = 0.28; 95% CI = 0.08-0.48; t(752) = 2.78; p = 0.005] than those without UI. Regarding frequency, 59.7% lose urine once a week, 25.6% more than 3 times a week, and 14.7% daily. Regarding the amount, approximately 15% claim to lose a moderate to abundant amount.
Athletes with the highest number of urine leaks during training are more affected by UI in their daily life (r = 0.259; p = 0.001), present a greater number of leaks in their day to day (r = 0.341; p < 0.001), and show greater severity of UI (r = 0.341; p < 0.001). Conclusions: Athletes consider that UI affects them negatively in their daily routine: 30.9% affirm having a severity between moderate and severe in daily life, and 29.1% lose urine in the competition period. Notably, more than half of the sample were elite athletes who compete at the highest level (Olympic Games, World and European Championships), for whom dedication to sport occupies a large part of life. The period in which athletes most frequently suffer urine leakage is competition, when athletes manage many emotions to reach their best performance; urine losses at those moments may impair that performance.
Keywords: athletes, performance, prevalence, sport, training, urinary incontinence
Procedia PDF Downloads 134
2013 Development of Automatic Laser Scanning Measurement Instrument
Authors: Chien-Hung Liu, Yu-Fen Chen
Abstract:
This study used a triangulation laser probe and a three-axis mobile platform for surface measurement, programmed the system, and applied it to real-time statistical analysis of the measured data. This structure was used to design a system integration program: the triangulation laser probe performs scattering- or reflection-based non-contact measurement, the captured signals are transferred to the computer through RS-232, and RS-485 is used to control the three-axis platform for wide-range measurement. The data captured by the laser probe are formed into a 3D surface. This study constructed an optical measurement application program based on a visual programming language. First, the signals are transmitted to the computer through RS-232/RS-485, then stored and displayed in the graphical interface in real time. The program analyzes the various messages, produces appropriate presentation graphs and data processing to provide users with friendly graphical interfaces and monitoring of the data processing state, and graphically indicates whether the present data are normal. The major functions of the measurement system developed by this study are thickness measurement, statistical process control (SPC), surface smoothness analysis, and analytical calculation of trend lines. A result report can be generated and printed promptly. This study successfully measured different heights and surfaces, performed on-line data analysis and processing effectively, and developed a man-machine interface for users to operate.
Keywords: laser probe, non-contact measurement, triangulation measurement principle, statistical process control, LabVIEW
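The abstract names SPC among the system's functions without giving formulas; a minimal sketch of the usual X-bar approach (centre line and 3-sigma control limits computed from an in-control reference run, later thickness readings flagged when they fall outside the limits; the sample values below are invented for illustration):

```python
def spc_limits(reference):
    # centre line and 3-sigma control limits from an in-control reference run
    n = len(reference)
    mean = sum(reference) / n
    sd = (sum((x - mean) ** 2 for x in reference) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(values, lcl, ucl):
    # flag measurements that fall outside the control limits
    return [v for v in values if not (lcl <= v <= ucl)]
```

Computing the limits from a reference run rather than from the monitored data themselves matters: an out-of-control point included in the limit calculation would inflate the estimated spread and mask itself.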
Procedia PDF Downloads 3662012 Climate Change, Multiple Stressors, and Livelihoods: A Search for Communities Understanding, Vulnerability, and Adaptation in Zanzibar Islands
Authors: Thani R. Said
Abstract:
There is wide concern in the academic literature that the world is on course to experience "severe and pervasive" negative impacts from climate change unless it takes rapid action to slash greenhouse gas emissions. The threat, however, is most acute in developing countries, and in small island states in particular. Much of the literature holds that the livelihoods and the economic and ecological landscapes of coastal communities are in serious danger from climate change. However, focusing on climate change alone, while paying little attention to surrounding stressors that are sometimes more apparent than climate change itself, has become a major concern in academic debate. Recent studies have begun to question such narrow assessments of climate change intervention programs, on both methodological and theoretical grounds, as they relate to the livelihoods and landscapes of coastal communities. Treating climate change as the sole, overriding threat does not yield appropriate mechanisms for addressing the problem in its totality; it provides only a partial picture of the real problems confronting the majority of people living in the coastal areas of small island states, and of Zanzibar in particular. Using multiple grounded-knowledge approaches, the objective of this study is to go beyond climate change alone by analyzing the other stressors that challenge and threaten the livelihoods and the economic and ecological landscapes of coastal communities, through the communities' own understanding, vulnerability, and adaptive mechanisms in their localities. 
To sharpen its focus and capture the full picture, the study gives special attention to areas where climate change intervention programs have been in place, and further compares and contrasts the two island communities of Unguja and Pemba, taking into account their diverse economic and geographical landscapes.Keywords: climate change, multiple stressors, livelihoods, vulnerability-adaptation
Procedia PDF Downloads 4082011 Interaction between the Rio Conventions on Climate and Biodiversity: Analysis of the Integration of Ecosystem-Based Approaches and Nature-Based Solutions into the UNFCCC
Authors: Dieudonne Mevono Mvogo
Abstract:
The Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES)-Intergovernmental Panel on Climate Change (IPCC) co-sponsored workshop report suggests that climate change and biodiversity loss are two of the most pressing issues of the Anthropocene. Research establishes the interconnection between climate change and biodiversity. On the one hand, the impact of climate change on biodiversity loss – 14 % over the past century – is projected to surpass other threats – land and sea use 34 % and direct exploitation of species 23 % – during the 21st century. Response measures to climate change also affect biodiversity negatively or positively. On the other hand, actions to halt or reverse biodiversity loss can enhance land and ocean capacity for carbon sequestration. These actions can also promote adaptation by ensuring adaptive capacity. This systemic interaction between climate change and biodiversity affects the human quality of life. The United Nations Secretariat's report entitled 'Gaps in international environmental law and environment-related instruments: towards a global pact for the environment,' released in 2018, states that cooperation and mutual support among agreements dealing with climate change, the protection of the marine environment, freshwater resources and hazardous waste are indispensable for the effective implementation of the Convention on Biological Diversity (CBD). Since biodiversity is being lost at an alarming rate, this study aims to evaluate the cooperative framework for the coherence and coordination between climate change and biodiversity regimes to provide co-benefits for climate and biodiversity crises. 
It questions the potential improvement regarding integrating ecosystem-based approaches and nature-based solutions – promoted by the CBD – into the United Nations Framework Convention on Climate Change (UNFCCC).Keywords: rio conventions, climate change, biodiversity, cooperative framework, ecosystem-based approaches, nature-based solutions
Procedia PDF Downloads 1362010 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services
Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme
Abstract:
Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing
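The schema-based record-extraction step described above can be sketched in miniature. This is a toy illustration, not the authors' pipeline: the schema (company, amount, unit), the regular expression, and the sample sentence are all invented.

```python
import re

# Toy schema: each record needs a company name and a monetary amount.
RECORD_PATTERN = re.compile(
    r"(?P<company>[A-Z][A-Za-z]+(?: [A-Z][A-Za-z]+)*) reported revenue of "
    r"\$(?P<amount>[\d,]+(?:\.\d+)?) (?P<unit>million|billion)"
)

def extract_records(text: str) -> list:
    """Turn free text into machine-readable records matching the schema."""
    records = []
    for m in RECORD_PATTERN.finditer(text):
        records.append({
            "company": m.group("company"),
            "amount": float(m.group("amount").replace(",", "")),
            "unit": m.group("unit"),
        })
    return records

doc = "Acme Corp reported revenue of $1,250.5 million for the quarter."
print(extract_records(doc))
# -> [{'company': 'Acme Corp', 'amount': 1250.5, 'unit': 'million'}]
```

A production system would replace the regular expression with learned entity-detection and disambiguation models, but the output contract is the same: structured records conforming to a schema, ready for database storage.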
Procedia PDF Downloads 1182009 Adaptation of Smart City Concept in Africa: Localization, Relevance and Bottleneck
Authors: Adeleye Johnson Adelagunayeja
Abstract:
The concept of making cities, communities, and neighborhoods smart, intelligent, and responsive is relatively new to Africa and its urban renewal agencies. Relevant agencies must begin a holistic review of how infrastructural facilities and urban renewal methodologies are implemented, one that revolves around the appreciation and application of artificial intelligence. Propagating the ideals and benefits of the smart city concept is a key way to encourage the governments of African nations, the African Union, and other regional organizations in Africa to embrace the ideology. The ability of the smart city concept to curb insecurity (armed robbery, assassination, terrorism, and civil disorder) is one major reason, among others, why African governments must speedily embrace this contemporary developmental concept whose time has come. Seamless access to information, the ability to cross-pollinate ideas virtually with people living in established smart cities, and the great efficiency that smart cities bring are further reasons why Africa must draw up action plans that enable existing cities to metamorphose into smart cities. Innovation will be required for Africa to develop a smart city concept compatible with its basic patterns of livelihood, because the essence of the smart city evolution is to make life better: to help people co-exist, be productive, and enjoy standard infrastructural facilities. This research paper enumerates the multifaceted adaptive factors that have the potential to make the adoption of the smart city concept in Africa seamless. 
It also proffers solutions to potential bottlenecks capable of undermining the execution of the smart city concept in Africa.Keywords: smart city compatibility innovation Africa government evolution, Africa as a global village member, evolution in Africa, ways to make Africa adopt the smart city concept, localizing the smart city concept in Africa, bottlenecks to smart city development in Africa
Procedia PDF Downloads 902008 Highly Glazed Office Spaces: Simulated Visual Comfort vs Real User Experiences
Authors: Zahra Hamedani, Ebrahim Solgi, Henry Skates, Gillian Isoardi
Abstract:
Daylighting plays a pivotal role in promoting productivity and user satisfaction in office spaces. There is an ongoing trend toward designing office buildings with a high proportion of glazing, which increases the risk of visual discomfort. Providing a more realistic lighting analysis can be of high value at the early stages of building design, when necessary changes can be made at very low cost. This holistic approach can be achieved by incorporating subjective evaluation and user behaviour into computer simulation, providing a comprehensive lighting analysis. In this research, a detailed computer simulation model has been built using Radiance and Daysim. Afterwards, this model was validated by measurements and user feedback. The case study building is the School of Science at Griffith University, Gold Coast, Queensland, which features highly glazed office spaces. In this paper, the visual comfort predicted by the model is compared with a preliminary survey of the building users to evaluate how user behaviour, such as desk position, orientation selection, and user movement caused by daylight changes and other visual variations, can inform perceptions of visual comfort. This work supports preliminary design analysis of visual comfort incorporating the effects of gaze shift patterns and views, with the goal of designing effective layouts for office spaces.Keywords: lighting simulation, office buildings, user behaviour, validation, visual comfort
Procedia PDF Downloads 2172007 Cut-Out Animation as a Technique and Its Development within the Historical Process
Authors: Armagan Gokcearslan
Abstract:
The art of animation has developed very rapidly in terms of script, sound and music, motion, character design, techniques used, and technological tools, from its first years until today. Technical variety attracts particular attention in the art of animation. Perceived as a kind of illusion in the beginning, animation commonly used the flash-sketch technique, in which artists created scenes by drawing them on a blackboard with chalk; the technique was used by early animation artists such as Emile Cohl, Winsor McCay, and Blackton. Tools like the magic lantern, thaumatrope, phenakistiscope, and zoetrope were then developed and used intensively in the first years of the art form. Today, by contrast, the art of animation is shaped by developments in computer technology, and both three-dimensional and two-dimensional animations can be created with various computer software. The cut-out technique is among the important techniques used in animation. It is based on the art of paper cutting, and examining cut-out animations shows that they technically resemble that art. Paper cutting has a long history: the oldest samples can be seen in China in the period after the 2nd century B.C., when the Chinese invented paper. The most celebrated artist using the cut-out animation technique is the German artist Lotte Reiniger. This study, titled "Cut-Out Animation as a Technique and Its Development within the Historical Process", covers the art of paper cutting, the relationship between paper cutting and cut-out animation, its development within the historical process, animation artists producing work in this field, important cut-out animations, and their technical properties.Keywords: cut-out, paper art, animation, technique
Procedia PDF Downloads 2782006 The Effect of Second Language Listening Proficiency on Cognitive Control among Young Adult Bilinguals
Authors: Zhilong Xie, Jinwen Huang, Guofang Zeng
Abstract:
The existing body of research on bilingualism has consistently linked the use of multiple languages to enhanced cognitive control. Numerous studies have demonstrated that bilingual individuals exhibit advantages in non-linguistic tasks demanding cognitive control. However, recent investigations have challenged these findings, leading to a debate regarding the extent and nature of bilingual advantages. The adaptive control hypothesis posits that variations in bilingual experiences hold the key to resolving these controversies. This study aims to contribute to this discussion by exploring the impact of second language (L2) listening experience on cognitive control among young Chinese-English bilinguals. By examining this specific aspect of bilingualism, the study offers a perspective on the origins of bilingual advantages. This study employed a range of cognitive tasks, including the Flanker task, Wisconsin Card Sorting Test (WCST), Operation Span Task (OSPAN), and a second language listening comprehension test. After controlling for potential confounding variables such as intelligence, socioeconomic status, and overall language proficiency, independent sample t-test analysis revealed significant differences in performance between groups with high and low L2 listening proficiency in the Flanker task and OSPAN. However, no significant differences emerged between the two groups in the WCST. These findings suggest that L2 listening proficiency has a significant impact on inhibitory control and working memory but not on conflict monitoring or mental set shifting. These specific findings provide a more nuanced understanding of the origins of bilingual advantages within a specific bilingual context, highlighting the importance of considering the nature of bilingual experience when exploring cognitive benefits.Keywords: bilingual advantage, inhibitory control, L2 listening, working memory
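The group comparison reported above rests on the independent-sample t statistic; a stdlib-only Welch's version can be sketched as follows. The two score lists are fabricated stand-ins for illustration, not the study's Flanker data.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va**2 / (len(a) - 1) + vb**2 / (len(b) - 1))
    return t, df

high_l2 = [1, 2, 3, 4, 5]   # hypothetical scores, high-proficiency group
low_l2  = [2, 3, 4, 5, 6]   # hypothetical scores, low-proficiency group
t, df = welch_t(high_l2, low_l2)
print(t, df)  # -> -1.0 8.0
```

Welch's variant is shown because it does not assume equal group variances; the t value would then be compared against the t distribution with the computed degrees of freedom to obtain a p-value.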
Procedia PDF Downloads 222005 An Orphan Software Engineering Course: Supportive Ways toward a True Software Engineer
Authors: Haya Sammana
Abstract:
A well-defined curriculum must be adopted to meet the increasing complexity and diversity of software applications. In reality, some IT majors, such as computer science and computer engineering, deliver software engineering education in a single course, which is a big challenge for instructors and universities. The course also requires students to gain practical experience that simulates real work in software companies. Furthermore, we have noticed that there is no consensus on how, when, and what to teach in that introductory course so that students gain the practical experience required by software companies. Because all software engineering disciplines will not fit into just one course, the course requires reasonable choices in selecting its topics. This raises an important and essential question: can this course form a true software engineer who meets the needs of industry? This question poses a big challenge in selecting the appropriate topics, and answering it matters greatly for future undergraduate students. During the teaching of this course, feedback from undergraduate students and the keynotes of the annual meeting of an industrial advisory committee suggest a probable answer: it is impossible to build a true software engineer who possesses all the essential elements of software engineering education, such as teamwork, communication skills, project management skills, and contemporary industrial practice, from one course, and it is impossible for one course to cover all software engineering topics. Besides the teaching approach used, the author proposes and implements three supportive ways aimed at mitigating the expected risks and increasing the opportunity to build a true software engineer.Keywords: software engineering course, software engineering education, software experience, supportive approach
Procedia PDF Downloads 3632004 Convolutional Neural Networks versus Radiomic Analysis for Classification of Breast Mammogram
Authors: Mehwish Asghar
Abstract:
Breast Cancer (BC) is a common type of cancer among women. Screening is usually performed using imaging modalities such as magnetic resonance imaging, mammography, X-ray, and CT. Among these modalities, the mammogram is considered a powerful tool for the diagnosis and screening of breast cancer. Sophisticated machine learning approaches have shown promising results in complementing human diagnosis. Generally, machine learning methods can be divided into two major classes: radiomics analysis (RA), in which image features are extracted manually, and convolutional neural networks (CNNs), in which the computer learns to recognize image features on its own. This research aims to increase the rate of early detection, thus reducing the mortality caused by breast cancer, through the latest advancements in computer science in general and machine learning in particular. It also aims to ease the burden on doctors by improving and automating the process of breast cancer detection. The work presents a comparative analysis of different techniques and models for detecting and classifying breast cancer, with the main goal of providing a detailed view of the results and performance of the different techniques. Specifically, the paper explores the potential of a convolutional neural network both as a feature extractor and as a classifier, and adds a radiomics module so that its results can be compared with those of the deep learning approach.Keywords: breast cancer (BC), machine learning (ML), convolutional neural network (CNN), radiomics, magnetic resonance imaging, artificial intelligence
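The core operation by which a CNN learns its own image features, in contrast to manually extracted radiomic features, is convolution. A minimal "valid" 2-D convolution can be sketched in plain Python; the tiny image and kernel below are toy values, not mammogram data.

```python
def conv2d_valid(image, kernel):
    """Slide the kernel over the image (no padding) and return the feature map."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

image = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
kernel = [[1, 0], [0, 1]]            # toy "diagonal" kernel
print(conv2d_valid(image, kernel))   # -> [[6, 8], [12, 14]]
```

In a trained CNN the kernel weights are learned from labelled mammograms rather than fixed by hand, which is precisely the distinction drawn between the CNN and radiomics approaches above.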
Procedia PDF Downloads 2302003 Unleashing the Power of Cerebrospinal System for a Better Computer Architecture
Authors: Lakshmi N. Reddi, Akanksha Varma Sagi
Abstract:
Studies on biomimetics are largely developed, deriving inspiration from natural processes in our objective world to develop novel technologies. Recent studies are diverse in nature, making their categorization quite challenging. Based on an exhaustive survey, we developed categorizations based on either the essential elements of nature - air, water, land, fire, and space, or on form/shape, functionality, and process. Such diverse studies as aircraft wings inspired by bird wings, a self-cleaning coating inspired by a lotus petal, wetsuits inspired by beaver fur, and search algorithms inspired by arboreal ant path networks lend themselves to these categorizations. Our categorizations of biomimetic studies allowed us to define a different dimension of biomimetics. This new dimension is not restricted to inspiration from the objective world. It is based on the premise that the biological processes observed in the objective world find their reflections in our human bodies in a variety of ways. For example, the lungs provide the most efficient example for liquid-gas phase exchange, the heart exemplifies a very efficient pumping and circulatory system, and the kidneys epitomize the most effective cleaning system. The main focus of this paper is to bring out the magnificence of the cerebrospinal system (CSS) insofar as it relates to our current computer architecture. In particular, the paper uses four key measures to analyze the differences between CSS and human-engineered computational systems. These are adaptability, sustainability, energy efficiency, and resilience. We found that the cerebrospinal system reveals some important challenges in the development and evolution of our current computer architectures. 
In particular, the myriad ways in which the CSS is integrated with other systems/processes (circulatory, respiratory, etc.) offer useful insights on how human-engineered computational systems could be made more sustainable, energy-efficient, resilient, and adaptable. In our paper, we highlight the energy consumption differences between the CSS and our current computational designs. Apart from the obvious differences in materials used between the two, the systemic nature of how the CSS functions provides clues to enhance the life cycles of our current computational systems. The rapid formation and changes in the physiology of dendritic spines and their synaptic plasticity causing memory changes (e.g., long-term potentiation and long-term depression) allowed us to formulate differences in the adaptability and resilience of the CSS. In addition, the CSS is sustained by the integrative functions of various organs, and its robustness comes from its interdependence with the circulatory system. The paper documents and analyzes quantifiable differences between the two in terms of the four measures. Our analyses point out the possibilities in the development of computational systems that are more adaptable, sustainable, energy efficient, and resilient. It concludes with potential approaches for technological advancement through the creation of more interconnected and interdependent systems to replicate the effective operation of the cerebrospinal system.Keywords: cerebrospinal system, computer architecture, adaptability, sustainability, resilience, energy efficiency
Procedia PDF Downloads 1042002 Computer Countenanced Diagnosis of Skin Nodule Detection and Histogram Augmentation: Extracting System for Skin Cancer
Authors: S. Zith Dey Babu, S. Kour, S. Verma, C. Verma, V. Pathania, A. Agrawal, V. Chaudhary, A. Manoj Puthur, R. Goyal, A. Pal, T. Danti Dey, A. Kumar, K. Wadhwa, O. Ved
Abstract:
Background: Skin cancer is now a pressing topic in medical science; its rising incidence is taking a drastic toll on health and well-being worldwide. Methods: The extracted image of a skin tumor cannot be used directly for diagnosis, since the stored image contains artifacts such as noise around its center. The approach first locates the region of interest in the extracted skin image, and image-partitioning models are then applied to remove the disturbance from the picture. Results: After partitioning, feature extraction is performed using a genetic algorithm (GA), and finally classification is performed between the trained and test data so that images can be evaluated at scale, helping doctors reach the right prediction. To improve on the existing system, we set our objectives around this analysis: the efficiency of the selection process and histogram enhancement are essential in that respect, and the GA is tuned for accuracy to reduce the false-positive rate. Conclusions: The objective of this work is to improve effectiveness, and the GA accomplishes this by driving down the false-positive rate. The approach combines deep learning with medical image processing, which together provide superior accuracy, and the modular design of the processing stages supports reuse without errors.Keywords: computer-aided system, detection, image segmentation, morphology
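The genetic-algorithm stage can be sketched in miniature. This is a hedged illustration only: the fitness function below is a stand-in that peaks at an assumed "ideal" grey-level threshold of 128, not the paper's actual histogram-based segmentation criterion.

```python
import random

random.seed(42)

def fitness(threshold):
    """Stand-in score: peaks at an assumed ideal threshold of 128."""
    return -(threshold - 128) ** 2

def evolve(generations=60, pop_size=20):
    """Minimal GA: tournament selection plus bounded integer mutation."""
    pop = [random.randint(0, 255) for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: keep the fitter of two random individuals.
        parents = [max(random.sample(pop, 2), key=fitness) for _ in range(pop_size)]
        # Mutation: small random shifts, clamped to the 8-bit grey range.
        pop = [min(255, max(0, p + random.randint(-10, 10))) for p in parents]
    return max(pop, key=fitness)

best = evolve()
print(best)  # converges near the assumed optimum of 128
```

A real segmentation GA would score each candidate threshold against the image histogram (e.g., a between-class variance criterion) rather than this toy quadratic, but the selection/mutation loop is the same.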
Procedia PDF Downloads 1552001 Preliminary Study of Hand Gesture Classification in Upper-Limb Prosthetics Using Machine Learning with EMG Signals
Authors: Linghui Meng, James Atlas, Deborah Munro
Abstract:
There is an increasing demand for prosthetics capable of mimicking natural limb movements and hand gestures, but precise movement control of prosthetics using only electrode signals continues to be challenging. This study considers the implementation of machine learning as a means of improving accuracy and presents an initial investigation into hand gesture recognition using models based on electromyographic (EMG) signals. EMG signals, which capture muscle activity, are used as inputs to machine learning algorithms to improve prosthetic control accuracy, functionality, and adaptivity. Using logistic regression, a machine learning classifier, this study evaluates the accuracy of classifying two hand gestures from the publicly available Ninapro dataset using two time-series feature extraction approaches: Time Series Feature Extraction (TSFE) and Convolutional Neural Networks (CNNs). Trials were conducted using varying numbers of EMG channels, from one to eight, to determine the impact of channel quantity on classification accuracy. The results suggest that although both approaches can successfully distinguish between hand gesture EMG signals, CNNs outperform TSFE in extracting useful information, in terms of both accuracy and computational efficiency. In addition, although more EMG channels provide more useful information, they also require more complex and computationally intensive feature extractors, and consequently higher channel counts did not perform as well as lower ones. The findings also underscore the potential of machine learning techniques in developing more effective and adaptive prosthetic control systems.Keywords: EMG, machine learning, prosthetic control, electromyographic prosthetics, hand gesture classification, CNN, convolutional neural networks, TSFE, time series feature extraction, channel count, logistic regression, Ninapro, classifiers
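The window-level statistics a time-series feature extractor derives from raw EMG can be sketched as follows. The three features (mean absolute value, root mean square, zero-crossing count) are classic EMG features; the four-sample window is an invented toy, since real EMG windows contain hundreds of samples.

```python
import math

def emg_features(window):
    """Three classic EMG window features: MAV, RMS, and zero-crossing count."""
    mav = sum(abs(x) for x in window) / len(window)
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    zc = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    return mav, rms, zc

mav, rms, zc = emg_features([1.0, -1.0, 2.0, -2.0])
print(mav, rms, zc)  # MAV = 1.5, RMS = sqrt(2.5), 3 zero crossings
```

Feature vectors like this, computed per channel and per window, are what a logistic-regression classifier would consume; a CNN instead learns its own filters directly from the raw windows.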
Procedia PDF Downloads 432000 Factors Affecting Autistic Children's Development during the Early Years in Elementary School: A Longitudinal Study in Taiwan
Authors: Huang Ying
Abstract:
The present study investigated factors affecting children's improvement through the first two years of elementary school in a population-based sample of children with autism in Taiwan. All the children were diagnosed with autism spectrum disorder (ASD) by clinical psychologists according to DSM-IV. Children's development was assessed with the Vineland Adaptive Behavior Scales-Chinese version (VABS-C) in the first and the third grade, and children's improvement was measured as the difference between the standardized total scores of the third and the first year. In Taiwan, school-age children with special-education needs are assigned by the government to different classes, including normal classes (NC), resource classes (RC), and special classes (SC); type of class was therefore one of the independent variables. Moreover, as early intervention is considered crucial, the earliest age at which intervention began was collected from parents. Attention was also included in the analysis: teachers were asked to evaluate children's attention with a 3-item Likert scale, recording how frequently the child paid attention to the class or the task, and the scores were summed. Additionally, standardized VABS-C scores in the first grade were used as pretest scores representing children's developmental level at the beginning of elementary school. Multiple regression was conducted with improvement as the dependent variable. Results showed that children in special classes improved less than those in normal or resource classes. Attention positively predicted improvement, yet the effect of earliest intervention age was not significant. Furthermore, scores in the first grade negatively predicted improvement, indicating that children with higher developmental levels made less progress in the following years. 
These results are to some degree consistent with previous meta-analytic findings that the evidence for the effectiveness of conventional intervention methods remains insufficient.Keywords: attention, early intervention, elementary school, special education in Taiwan
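The negative association between first-grade scores and subsequent improvement is the kind of pattern an ordinary-least-squares slope captures. A closed-form sketch follows; the pretest/gain pairs are fabricated for illustration and are not VABS-C data.

```python
def ols(x, y):
    """Closed-form simple linear regression: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical pretest scores vs. improvement, lying exactly on y = 10 - 0.5x,
# so the fitted slope is negative, mirroring the direction reported above.
pretest = [40.0, 60.0, 80.0, 100.0]
gain = [10 - 0.5 * x for x in pretest]
slope, intercept = ols(pretest, gain)
print(slope, intercept)  # -> -0.5 10.0
```

The study's multiple regression extends this to several predictors (class type, attention, intervention age) fitted simultaneously, but the sign and interpretation of each coefficient work the same way.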
Procedia PDF Downloads 2931999 Finite Difference Modelling of Temperature Distribution around Fire Generated Heat Source in an Enclosure
Authors: A. A. Dare, E. U. Iniegbedion
Abstract:
Industrial furnaces generally involve enclosures of fire, typically initiated by the combustion of gases. The fire leads to a temperature distribution inside the enclosure, and a proper understanding of the temperature and velocity distribution within the enclosure is often required for optimal design and use of the furnace. This study was therefore directed at the numerical modelling of the temperature distribution inside an enclosure typical of a furnace. A mathematical model was developed from the conservation of mass, momentum, and energy. The stream function-vorticity formulation of the governing equations was solved by an alternating direction implicit (ADI) finite difference technique. The finite difference formulation obtained was then developed into a computer code, which was used to determine the temperature, velocities, stream function, and vorticity. The effect of wall heat conduction was also considered by assuming one-dimensional heat flow through the wall. The computer code (a MATLAB program) was used to determine the aforementioned variables. The results showed that the transient temperature distribution assumed a uniform profile that became more chaotic with increasing time. The vertical velocity showed increasingly turbulent behaviour with time, while the horizontal velocity assumed decreasingly laminar behaviour with time. All of these behaviours have also been reported in the literature. The developed model has provided understanding of the heat transfer process in an industrial furnace.Keywords: heat source, modelling, enclosure, furnace
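The finite-difference idea behind such a scheme can be sketched for the simplest case: one explicit (FTCS) step of the 1-D heat equation. The study itself uses a 2-D ADI stream function-vorticity formulation, so the grid, diffusion ratio, and boundary values below are purely illustrative.

```python
def heat_step(u, r):
    """One explicit FTCS step of u_t = alpha * u_xx with fixed boundaries.

    r = alpha * dt / dx**2 must satisfy r <= 0.5 for stability.
    """
    new = u[:]  # boundary nodes u[0] and u[-1] stay fixed
    for i in range(1, len(u) - 1):
        new[i] = u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
    return new

# A hot interior node between two cold fixed walls relaxes toward them.
u = [0.0, 100.0, 0.0]
u = heat_step(u, r=0.25)
print(u)  # -> [0.0, 50.0, 0.0]
```

An ADI scheme replaces this explicit update with implicit sweeps alternating between the two spatial directions, which removes the explicit stability restriction on the time step at the cost of solving tridiagonal systems.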
Procedia PDF Downloads 257