Search results for: infinite memory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1307

707 Inappropriate Effects Which the Use of Computer and Playing Video Games Have on Young People

Authors: Maja Ruzic-Baf, Mirjana Radetic-Paic

Abstract:

The use of computers by children has many positive aspects, including the development of memory, learning methods, problem-solving skills, and a feeling of one's own competence and self-confidence. Playing online video games can encourage socializing with peers who have similar interests, as well as communication; it develops coordination, spatial relations, and presentation. On the other hand, the Internet enables quick access to different information and the exchange of experiences. How children use computers, and what the negative effects of this use can be, depends on various factors. ICT has improved and become easy for everyone to access. In the past 12 years, so many video games have been made that some of them are even free to play. Young people, and even some adults, have simply started to forget about the real outside world because in that other, digital world, they have found something that makes them feel more worthy as a person. This article presents the use of ICT, forms of behavior, and addiction to online video games.

Keywords: addiction to video games, behaviour, ICT, young people

Procedia PDF Downloads 528
706 Environmental Factors and Executive Functions of Children in 5-Year-Old Kindergarten

Authors: Stephanie Duval

Abstract:

The concept of educational success, combined with the overall development of the child in kindergarten, is at the center of current interest, both in research and in the settings responsible for the education of young children. In order to promote it, researchers emphasize the importance of studying the executive functions [EFs] of children in preschool education. More precisely, the EFs, which refer to working memory [WM], inhibition, mental flexibility, and planning, appear to be the pivotal element of a child's educational success. To support children's EFs, and ultimately their educational success, the quality of their environments is increasingly being explored. The question that now arises is how to promote the EFs of young children in the educational environment in order to support their educational success. The objective of this study is to investigate the link between the quality of interactions in 5-year-old kindergarten and children's EFs. The sample consists of 118 children (70 girls, 48 boys) in 12 classes. The quality of interactions is observed with the Classroom Assessment Scoring System [CLASS], and the EFs (i.e., working memory, inhibition, cognitive flexibility, and planning) are measured with administered tests. The hypothesis of this study was that the quality of teacher-child interactions in preschool education, as measured by the CLASS, is associated with children's EFs. The results revealed that the quality of emotional support offered by adults in kindergarten, a domain of the CLASS tool, was positively and significantly related to WM and inhibition skills. The results also suggest that WM is a key skill in the development of EFs, which may be associated with the educational success of the child. However, this hypothesis remains to be clarified, as does the link with educational success. In addition, results showed that factors associated with the family (e.g., parents' income) moderate the relationship between the 'instructional support' domain of the CLASS (e.g., concept development) and children's WM skills. These data suggest a moderating effect of family characteristics on the link between the quality of classroom interactions and EFs. As a future avenue, this project proposes to examine the distinct effects of different environments (family and educational) on children's EFs. More specifically, a future study could examine the influence of the educational environment on EF skills, as well as whether there is a moderating effect of the family environment (e.g., parents' income) on the link between the quality of classroom interactions and children's EFs, as anticipated by this research.

Keywords: executive functions [EFs], environmental factors, quality of interactions, preschool education

Procedia PDF Downloads 350
705 Emotions in Human-Machine Interaction

Authors: Joanna Maj

Abstract:

The idea that emotions could be present in human-machine interactions, on the machine side as well as the human side, is awe-inspiring. Human factors present intriguing components and are examined in detail while discussing this controversial topic. Mood, attention, memory, performance, assessment, causes of emotion, and neurological responses are analyzed as components of the interaction. Problems in computer-based technology, the revenge of the system on its users, design, and applications comprise a major part of the descriptions and examples throughout this paper. The paper also allows for critical thinking while raising intriguing questions regarding future directions of research dealing with emotion in human-machine interactions.

Keywords: biocomputing, biomedical engineering, emotions, human-machine interaction, interfaces

Procedia PDF Downloads 113
704 Development & Standardization of a Literacy Free Cognitive Rehabilitation Program for Patients Post Traumatic Brain Injury

Authors: Sakshi Chopra, Ashima Nehra, Sumit Sinha, Harsimarpreet Kaur, Ravindra Mohan Pandey

Abstract:

Background: Cognitive rehabilitation aims to retrain brain-injured individuals with cognitive deficits to restore or compensate for lost functions. As illiterates and people with low literacy levels represent a significant proportion of the world's population, specific rehabilitation modules for such populations are indispensable. Literacy is significantly associated with all neuropsychological measures, and retraining programs widely use written or spoken techniques which essentially require the patient to read or write. The aim of the study was therefore to develop and standardize a literacy-free neuropsychological rehabilitation program for improving cognitive functioning in patients with mild and moderate Traumatic Brain Injury (TBI). Several studies have pointed to the impairments seen in memory, executive functioning, and attention and concentration post-TBI, so the rehabilitation program focused on these domains. Visual item memorization, stick constructions, symbol cancellations, and colouring techniques were used to construct the retraining program. Methodology: The development of the program consisted of planning, preparing, analyzing, and revising the different modules. The construction focused on retraining immediate and delayed visual memory, planning ability, focused and divided attention, concentration, and response inhibition (to control irritability and aggression). A total of 98 home-based retraining modules were prepared in the 4 domains (42 for memory, 42 for executive functioning, 7 for attention and concentration, and 7 for response inhibition). The standardization was done on 20 healthy controls to review, select, and edit items. For each module, the time taken, errors made, and errors per second were noted down to establish the difficulty level of each module, and the modules were arranged in increasing level of difficulty over a period of 6 weeks. The retraining tasks were then administered to 11 brain-injured individuals (5 after mild TBI and 6 after moderate TBI). These patients were referred from the Trauma Centre to the Clinical Neuropsychology OPD, All India Institute of Medical Sciences, New Delhi, India. Results: The time taken, errors made, and errors per second were analyzed for all domains. Education levels were divided into illiterate, up to 10 years of schooling, 10 years to graduation, and graduation and above. Means and standard deviations were calculated. Between-group and within-group analyses were done using the t-test. The performance of the 20 healthy controls was analyzed; a significant difference between education levels was observed only in the time taken for the attention tasks, with all other domains showing non-significant differences in performance. Comparing the errors and time taken between the patient and control groups, there was a significant difference in all domains at the 0.01 level except for the errors made on executive functioning, indicating that the tool can successfully differentiate between healthy controls and patient groups. Conclusions: Apart from the time taken for symbol cancellations, the entire cognitive rehabilitation program is literacy-free. As it taps the major areas of impairment post-TBI, it could be a useful tool to rehabilitate patient populations with low literacy levels across the world. The next step, already underway, is to test its efficacy in improving cognitive functioning in a randomized controlled clinical trial.

Keywords: cognitive rehabilitation, illiterates, India, traumatic brain injury

Procedia PDF Downloads 314
703 Metaphysics of the Unified Field of the Universe

Authors: Santosh Kaware, Dnyandeo Patil, Moninder Modgil, Hemant Bhoir, Debendra Behera

Abstract:

The Unified Field Theory has been an area of intensive research for many decades. This paper focuses on the philosophy and metaphysics of unified field theory at the Planck scale, and its relationship with superstring theory and Quantum Vacuum Dynamic Physics. We examined the epistemology of questions such as: (1) What is the Unified Field of the universe? (2) Can it actually (a) permeate the complete universe, (b) be localized in bound regions of the universe, (c) extend into the extra dimensions, or (d) live only in extra dimensions? (3) What should be the emergent ontological properties of the Unified Field? (4) How does the universe manifest through its Quantum Vacuum energies? (5) How is the space-time metric coupled to the Unified Field? We present a number of ansatzes, which we outline below. It is proposed that the unified field possesses consciousness as well as a memory, a recording of past history, analogous to the 'Consistent Histories' interpretation of quantum mechanics. We propose a Planck-scale geometry of the Unified Field with a circle-like topology, having 32 energy points on its periphery which are connected to each other by 10-dimensional meta-strings; these are the sources of the manifestation of the different fundamental forces and particles of the universe through its Quantum Vacuum energies. It is also proposed that the sub-energy levels of the 'Conscious Unified Field' are used for the processes of creation, preservation, and rejuvenation of the universe over a period of time by means of negentropy. These epochs can apply to the complete universe, or to localized regions such as galaxies or clusters of galaxies. It is proposed that the Unified Field operates through geometric patterns of its Quantum Vacuum energies, manifesting as various elementary particles by giving spins to zero-point energy elements. The epistemological relationship between unified field theory and superstring theories is examined. The properties of 'consciousness' and 'memory' cascade from the universe into macroscopic objects, and further onto the elementary particles, via a fractal pattern. Other properties of fundamental particles, such as mass, charge, spin, and isospin, also spill out of such a cascade. The manifestations of the unified field can reach into parallel universes, or the 'multiverse', and essentially have an existence independent of space-time. It is proposed that the mass, length, and time scales of the unified theory are smaller than even the Planck scale, at a level which we call 'Super Quantum Gravity' (SQG).

Keywords: super string theory, Planck scale geometry, negentropy, super quantum gravity

Procedia PDF Downloads 252
702 Ab Initio Study of Electronic Structure and Transport of Graphyne and Graphdiyne

Authors: Zeljko Crljen, Predrag Lazic

Abstract:

Graphene has attracted tremendous interest in the field of nanoelectronics and spintronics due to its exceptional electronic properties. However, pristine graphene has no band gap, a feature needed in building some electronic elements. Recently, growing attention has been given to a class of carbon allotropes of graphene with honeycomb structures, in particular to graphyne and graphdiyne. They are characterized by single and double acetylene bonding chains, respectively, connecting the nearest-neighbor hexagonal rings. With an electron density comparable to that of graphene and a prominent gap in their electronic band structures, they appear as promising materials for nanoelectronic components. We studied the electronic structure and transport of infinite sheets of graphyne and graphdiyne and compared them with graphene. A method based on non-equilibrium Green's functions and density functional theory was used in order to obtain a full ab initio self-consistent description of the transport current at different electrochemical bias potentials. The current/voltage (I/V) characteristics show semiconducting behavior with prominent nonlinearities at higher voltages. The calculated band gaps are 0.52 eV and 0.59 eV, respectively, and the effective masses are considerably smaller than those of typical semiconductors. We analyzed the results in terms of transmission eigenchannels and showed that the difference in conductance is directly related to the difference in the internal structure of the allotropes.
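
As a rough illustration of how an I/V curve follows from a transmission function, the sketch below evaluates the Landauer-Büttiker current for a toy, step-like transmission with a 0.52 eV gap. It is not the ab initio NEGF-DFT calculation of the paper; the gap value and the shape of T(E) are illustrative assumptions only.

```python
import numpy as np

# Minimal Landauer-Buttiker sketch: current from a transmission function T(E).
# NOT the paper's NEGF-DFT calculation; the step-like T(E) and the 0.52 eV gap
# are illustrative assumptions only.
e = 1.602176634e-19      # elementary charge (C)
h = 6.62607015e-34       # Planck constant (J s)
kT = 0.025 * e           # thermal energy near room temperature (J)

def fermi(E, mu):
    """Fermi-Dirac occupation at energy E (J) and chemical potential mu (J)."""
    return 1.0 / (1.0 + np.exp((E - mu) / kT))

def transmission(E, gap=0.52 * e):
    """Toy transmission: zero inside the gap, one open channel outside."""
    return np.where(np.abs(E) < gap / 2, 0.0, 1.0)

def current(V, n=4000):
    """Landauer current I(V) = (2e/h) * integral of T(E) [f_L(E) - f_R(E)] dE."""
    E = np.linspace(-1.5 * e, 1.5 * e, n)
    f_diff = fermi(E, +0.5 * e * V) - fermi(E, -0.5 * e * V)
    return (2 * e / h) * np.trapz(transmission(E) * f_diff, E)

for V in (0.2, 0.4, 0.6, 0.8, 1.0):
    print(f"V = {V:.1f} V  ->  I = {current(V):.3e} A")
```

The current stays near zero until the bias window opens past the gap, reproducing the semiconducting nonlinearity described above.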

Keywords: electronic transport, graphene-like structures, nanoelectronics, two-dimensional materials

Procedia PDF Downloads 169
701 Poly(Trimethylene Carbonate)/Poly(ε-Caprolactone) Phase-Separated Triblock Copolymers with Advanced Properties

Authors: Nikola Toshikj, Michel Ramonda, Sylvain Catrouillet, Jean-Jacques Robin, Sebastien Blanquer

Abstract:

Biodegradable and biocompatible block copolymers have risen as the golden materials in both medical and environmental applications. Moreover, if their architecture is controlled, even more advanced applications can be foreseen. In the meantime, organocatalytic ROP has been promoted as a more rapid and cleaner route, compared to traditional organometallic catalysis, towards the efficient synthesis of block copolymer architectures. Therefore, herein we report a novel organocatalytic pathway using guanidine molecules (TBD) for the synthesis of trimethylene carbonate blocks initiated by poly(caprolactone) as a pre-polymer. A pristine PTMC-b-PCL-b-PTMC block copolymer structure, without any residual products and with the clearly desired block proportions, was achieved within 1.5 hours at room temperature and verified by NMR spectroscopy and size-exclusion chromatography. Besides, when elaborating block copolymer films, further stability and improved mechanical properties can be achieved via an additional reticulation step of previously methacrylated block copolymers. Subsequently, stimulated by the insufficient studies on the phase-separation/crystallinity relationship in these semi-crystalline block copolymer systems, their intrinsic thermal and morphological properties were investigated by differential scanning calorimetry and atomic force microscopy. Firstly, by DSC measurements, the block copolymers with χABN values greater than 20 presented two distinct glass transition temperatures, close to those of the respective homopolymers, giving an initial indication of a phase-separated system. Meanwhile, the existence of the crystalline phase was supported by the presence of a melting temperature. As expected, the crystallinity-driven phase-separated morphology predominated in the AFM analysis of the block copolymers. Not even crosslinking in the melted state, and hence the creation of a dense polymer network, disturbed the crystallization phenomena. However, the latter proved sensitive to rapid liquid-nitrogen quenching directly from the melted state. Therefore, AFM analysis of liquid-nitrogen-quenched and crosslinked block copolymer films demonstrated a thermodynamically driven phase separation clearly predominating over the originally crystalline one. These AFM films remained stable, with their morphology unchanged, even after 4 months at room temperature. However, as demonstrated by DSC analysis, once the temperature was raised above the melting temperature of the PCL block, neither the crosslinking nor the liquid-nitrogen quenching shattered the semi-crystalline network, while access to thermodynamically phase-separated structures was possible at temperatures below the poly(caprolactone) melting point. Precisely this coexistence of dual crosslinked/crystalline networks in the same copolymer structure allowed us to establish, for the first time, shape-memory properties in such materials, as verified by thermomechanical analysis. Moreover, the temperature at which the material recovers its original shape depended on the block placement, i.e., whether PTMC or PCL forms the end-block. Therefore, it has been possible to obtain a block copolymer with a transition temperature around 40 °C, thus opening up potential real-life medical applications. In conclusion, the initial study of the phase-separation/crystallinity relationship in PTMC-b-PCL-b-PTMC block copolymers led to the discovery of novel shape-memory materials with superior properties, widely demanded in modern-life applications.

Keywords: biodegradable block copolymers, organocatalytic ROP, self-assembly, shape-memory

Procedia PDF Downloads 115
700 A Comparative Study of Cognitive Factors Affecting Social Distancing among Vaccinated and Unvaccinated Filipinos

Authors: Emmanuel Carlo Belara, Albert John Dela Merced, Mark Anthony Dominguez, Diomari Erasga, Jerome Ferrer, Bernard Ombrog

Abstract:

Social distancing errors are commonly prevalent among both vaccinated and unvaccinated people in the Filipino community. This study aims to identify the cognitive factors involved and relate how they affect daily life. The observed factors include memory, attention, anxiety, decision-making, and stress. Upon applying ergonomic tools and statistical treatments such as the t-test and multiple linear regression, stress and attention turned out to have the greatest impact on social distancing errors.

Keywords: vaccinated, unvaccinated, social distancing, Filipinos

Procedia PDF Downloads 182
699 The Digital Unconscious: Exploring AI Potential to Decode the Human Subconscious

Authors: Khader I. Alkhouri

Abstract:

This paper explores the emerging intersection of artificial intelligence (AI) and subconscious research, examining how AI technologies may revolutionize our understanding of the human mind. We review key AI techniques being applied to decode subconscious processes, discuss potential applications and breakthroughs, and consider the ethical implications and societal impacts of this rapidly advancing field. By leveraging AI's powerful pattern recognition and data analysis capabilities, researchers aim to gain unprecedented insights into implicit memory, unconscious bias, and automatic behaviors. While promising, this research also raises important questions about cognitive privacy and the responsible development of these technologies.

Keywords: artificial intelligence, machine learning, neuroethics, psychological research, subconscious

Procedia PDF Downloads 5
698 Micro-Milling Process Development of Advanced Materials

Authors: M. A. Hafiz, P. T. Matevenga

Abstract:

Micro-level machining of metals is a developing field which has been shown to be a promising approach to produce features on parts in the range of a few to a few hundred microns with acceptable machining quality. It is known that the mechanics (i.e., the material removal mechanism) of micro-machining and conventional machining differ significantly due to the scaling effects associated with tool geometry, tool material, and workpiece material characteristics. Shape memory alloys (SMAs) are metal alloys which display two exceptional properties, pseudoelasticity and the shape memory effect (SME). Nickel-titanium (NiTi) alloys are one such family of unique metal alloys. NiTi alloys are known to be difficult-to-cut materials, specifically with conventional machining techniques, due to their particular properties. Their high ductility, large amount of strain hardening, and unusual stress-strain behaviour are the main properties accountable for their poor machinability in terms of tool wear and workpiece quality. The motivation of this research work was to address the challenges and issues of micro-machining combined with those of machining NiTi alloys, which can affect the desired performance level of the machining outputs. To explore the significance of a range of cutting conditions on surface roughness and tool wear, machining tests were conducted on NiTi. The influence of different cutting conditions and cutting tools on surface and sub-surface deformation in the workpiece was investigated. A design-of-experiments strategy (L9 array) was applied to determine the key process variables. The dominant cutting parameters were determined by analysis of variance. These findings showed that feed rate was the dominant factor for surface roughness, whereas depth of cut was found to be the dominant factor as far as tool wear was concerned. The lowest surface roughness was achieved at a feed rate equal to the cutting edge radius, whereas the lowest flank wear was observed at the lowest depth of cut. Repeated machining trials have yet to be carried out in order to observe tool life, sub-surface deformation, and strain-induced hardening, which are also expected to be amongst the critical issues in micro-machining of NiTi. The machining performance using different cutting fluids and strategies has yet to be studied.
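
A minimal sketch of the factor-ranking step described above (an L9 orthogonal array analysed by sum-of-squares contributions) is given below. The factor levels and the roughness responses are hypothetical placeholders, not the study's measurements; they are chosen so that feed rate comes out dominant, mirroring the reported finding.

```python
import numpy as np

# Minimal Taguchi L9 sketch: rank factors by their sum-of-squares contribution
# to the response. Factor names, levels, and the Ra values are hypothetical.
L9 = np.array([  # columns: cutting speed, feed rate, depth of cut (levels 0..2)
    [0, 0, 0], [0, 1, 1], [0, 2, 2],
    [1, 0, 1], [1, 1, 2], [1, 2, 0],
    [2, 0, 2], [2, 1, 0], [2, 2, 1],
])
Ra = np.array([0.42, 0.55, 0.71, 0.44, 0.63, 0.52, 0.47, 0.58, 0.66])  # um, made up

factors = ["cutting speed", "feed rate", "depth of cut"]
grand_mean = Ra.mean()
for j, name in enumerate(factors):
    level_means = [Ra[L9[:, j] == lvl].mean() for lvl in range(3)]
    ss = 3 * sum((m - grand_mean) ** 2 for m in level_means)  # 3 runs per level
    print(f"{name:13s}  level means = {np.round(level_means, 3)}  SS = {ss:.4f}")
# The factor with the largest sum of squares (feed rate, with these made-up
# numbers) would be flagged as dominant, mirroring the ANOVA step described above.
```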

Keywords: nickel titanium, micro-machining, surface roughness, machinability

Procedia PDF Downloads 325
697 Loading Forces following Addition of 5% Cu in Nickel-Titanium Alloy Used for Orthodontics

Authors: Aphinan Phukaoluan, Surachai Dechkunakorn, Niwat Anuwongnukroh, Anak Khantachawana, Pongpan Kaewtathip, Julathep Kajornchaiyakul, Wassana Wichai

Abstract:

Aims: This study aims to address the amount of force delivered by a NiTiCu orthodontic wire with a ternary composition ratio of 46.0 Ni: 49.0 Ti: 5.0 Cu and to compare the results with a commercial NiTiCu 35 °C orthodontic archwire. Materials and Methods: Nickel (purity 99.9%), titanium (purity 99.9%), and copper (purity 99.9%) were used in this study in the atomic weight ratio 46.0 Ni: 49.0 Ti: 5.0 Cu. The elements were melted to form an alloy using an electrolytic arc furnace in an argon gas atmosphere and homogenized at 800 °C for 1 hr. The alloys were subsequently sliced into thin plates (1.5 mm) with an EDM wire-cutting machine to obtain the specimens, which were cold-rolled to a 30% reduction followed by heat treatment in a furnace at 400 °C for 1 hour. Then, three newly fabricated NiTiCu specimens were cut to nearly identical wire sizes of 0.016 inch x 0.022 inch. Commercial preformed Ormco NiTiCu 35 °C archwires of size 0.016 inch x 0.022 inch were used for comparative purposes. A three-point bending test was performed using a universal testing machine to investigate the forces of the load-deflection curve at oral temperature (36 ± 1 °C) with deflection points at 0.25, 0.5, 0.75, 1.0, 1.25, and 1.5 mm. Descriptive statistics were used to evaluate each variable, and an independent t-test was used to analyze the differences between the groups. Results: Both NiTiCu wires presented typical superelastic properties, as observed from the load-deflection curve. The average force was 341.70 g for loading and 264.18 g for unloading for the 46.0 Ni: 49.0 Ti: 5.0 Cu wire. Similarly, the values were 299.88 g for loading and 201.96 g for unloading for Ormco NiTiCu 35 °C. There were significant differences (p < 0.05) in mean loading and unloading forces between the two NiTiCu wires. The loading and unloading forces for Ormco NiTiCu at each deflection point were lower than those of the 46.0 Ni: 49.0 Ti: 5.0 Cu wire, except at the deflection point of 0.25 mm. Regarding the force difference between successive deflection points of the loading and unloading curves, Ormco NiTiCu 35 °C exerted less force than the 46.0 Ni: 49.0 Ti: 5.0 Cu wire, except for the deflection difference at 1.5-1.25 mm of the unloading force. However, these forces were still within the acceptable limits for orthodontic use. Conclusion: Both the fabricated ternary alloy of 46.0 Ni: 49.0 Ti: 5.0 Cu (atomic weight), with 30% reduction and heat treatment at 400 °C for 1 hr, and the Ormco 35 °C NiTiCu wire presented shape-memory characteristics in their wire form. The unloading forces of both NiTiCu wires were in the range suitable for orthodontic use. This should be a good foundation for further studies towards the development of new orthodontic NiTiCu archwires.
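
A minimal sketch of the statistical treatment described above (descriptive statistics plus an independent t-test on loading forces), assuming SciPy is available, is shown below; the force samples are hypothetical placeholders, not the measured data.

```python
import numpy as np
from scipy import stats

# Sketch of the described analysis: descriptive statistics and an independent
# t-test comparing loading forces of two archwires. Sample values are made up.
fabricated_NiTiCu = np.array([338.2, 345.1, 341.7, 340.0, 343.5])  # grams
ormco_NiTiCu35    = np.array([301.4, 298.6, 299.9, 300.7, 298.8])  # grams

for name, sample in [("46Ni:49Ti:5Cu wire", fabricated_NiTiCu),
                     ("Ormco NiTiCu 35C", ormco_NiTiCu35)]:
    print(f"{name:18s} mean = {sample.mean():6.2f} g, SD = {sample.std(ddof=1):.2f} g")

t, p = stats.ttest_ind(fabricated_NiTiCu, ormco_NiTiCu35, equal_var=False)
print(f"independent t-test: t = {t:.2f}, p = {p:.4f}  (significant if p < 0.05)")
```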

Keywords: loading force, ternary alloy, NiTiCu, shape memory, orthodontic wire

Procedia PDF Downloads 265
696 An Optimal and Efficient Family of Fourth-Order Methods for Nonlinear Equations

Authors: Parshanth Maroju, Ramandeep Behl, Sandile S. Motsa

Abstract:

In this study, we propose a simple and interesting family of fourth-order multi-point methods without memory for obtaining simple roots. This family requires only three functional evaluations (viz. two of the function, f(xn) and f(yn), and one of its first-order derivative, f'(xn)) per iteration. Moreover, the accuracy and validity of the new schemes are tested on a number of numerical examples by comparing them with existing optimal fourth-order methods available in the literature. It is found that they are very useful in high-precision computations. Further, the dynamical study of these methods also supports the theoretical results.
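
The abstract does not spell out the new family's iteration map, so the sketch below uses a classical optimal fourth-order scheme with the same evaluation pattern per iteration (f(xn), f(yn), f'(xn)), namely Ostrowski's method. It illustrates the cost/order trade-off discussed above; it is not the authors' proposed family.

```python
# Ostrowski's method: a classical optimal fourth-order scheme using exactly
# three functional evaluations per step (f(x_n), f(y_n), f'(x_n)). Shown only
# as an illustration of the evaluation/order trade-off, not the new family.
def ostrowski(f, fprime, x0, tol=1e-12, max_iter=50):
    """Fourth-order root finding with three functional evaluations per iteration."""
    x = x0
    for _ in range(max_iter):
        fx, dfx = f(x), fprime(x)
        y = x - fx / dfx                                   # Newton predictor
        fy = f(y)
        x_new = y - fy * fx / (dfx * (fx - 2.0 * fy))      # Ostrowski corrector
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: simple root of f(x) = x^3 - 2x - 5 (a classical test function).
root = ostrowski(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, x0=2.0)
print(root)  # ~2.0945514815
```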

Keywords: basins of attraction, nonlinear equations, simple roots, Newton's method

Procedia PDF Downloads 299
695 Discussion of Blackness in Wrestling

Authors: Jason Michael Crozier

Abstract:

The wrestling territories of the mid-twentieth century in the United States are widely considered the birthplace of modern professional wrestling and, by many professional wrestlers, a beacon of hope for the easing of racial tensions during the civil rights era and beyond. The performers writing on this period speak of racial equality but fail to acknowledge the exploitation of black athletes as a racialized capital commodity who suffered the challenges of systemic racism, codified by a false narrative of aspirational exceptionalism and equality measured by audience diversity. The promoters' ability to equate racial and capital exploitation with equality leads to a broader discussion of the history of Muscular Christianity in the United States and the exploitation of black bodies. Narratives of racial erasure that dominate the historical discourse when examining athleticism and exceptionalism redefined how blackness existed and how physicality and race are conceived of in sport and entertainment spaces. When discussing the implications of race in professional wrestling, it is important to examine the role of promotions as 'imagined communities' where the social agency of wrestlers is defined and quantified based on their 'desired elements' as performers. The intentionally vague nature of this language masks a deep history of racialization that has been perpetuated by promoters and never fully examined by scholars. Sympathetic racism and the omission of cultural identity are also key factors in the limitations and racial barriers placed upon black athletes in the squared circle. The use of sympathetic racism within professional wrestling during the twentieth century sorted black athletes into two distinct categorizations, the 'black savage' or the 'black minstrel'. Black wrestlers of the twentieth century were defined by their strength as a capital commodity and their physicality rather than their knowledge of the business and in-ring skill. These performers had little agency in their ability to shape their own character development inside and outside the ring. Promoters would often create personas that heavily racialized the performer by tying them to a regional past or memory, such as that of slavery in the Deep South, using dog-collar matches and adorning black characters in chains. Promoters softened cultural memory by satirizing the historic legacy of slavery and black identity.

Keywords: sympathetic racism, social agency, racial commodification, stereotyping

Procedia PDF Downloads 111
694 Interactive Garments: Flexible Technologies for Textile Integration

Authors: Anupam Bhatia

Abstract:

Upon reviewing the literature and the pragmatic work done in the field of E-textiles, it is observed that the applications of wearable technologies have found steady growth in the military, medical, industrial, and sports fields, whereas fashion is at a loss as to how to treat this technology and bring it to market. The purpose of this paper is to understand the practical issues of integrating electronics into garments: cutting patterns for mass production, maintaining the basic properties of textiles, and daily maintenance of garments, which hinder the wide adoption of interactive fabric technology within fashion and leisure wear. To understand these practical hindrances, an experimental and laboratory approach is taken. 'Techno Meets Fashion' has been an interactive fashion project in which sensor technologies have been embedded in textiles, resulting in a set of ensembles such as light-emitting garments, sound-sensing garments, proximity garments, shape-memory garments, etc. Smart textiles, especially in the form of textile interfaces, are drastically underused in fashion and other lifestyle product design. Clothing and some other textile products must be washable, which subjects the interactive elements to water and chemical immersion, physical stress, and extreme temperatures. The current state of the art tends to be too fragile for this treatment. The process of mass-producing traditional textiles becomes difficult for interactive textiles, as cutting patterns from larger rolls of cloth and sewing them together to make garments breaks and re-forms electronic connections in an uncontrolled manner. Because of this, interactive fabric elements are integrated by hand into textiles produced by standard methods. The Arduino has surely made embedding electronics into textiles much easier than before; even so, electronics are not yet integral to daily-wear garments. Soft and flexible MEMS interfaces (micro-sensors and micro-actuators) can be an option to make this possible by blending electronics within E-textiles in a way that is seamless and still retains the functions of the circuits as well as the garment. Smart clothes, which simultaneously offer challenging design and utility value, can only be mass-produced if the demands of the body are taken care of, i.e., protection, anthropometry, ergonomics of human movement, and thermo-physiological regulation.

Keywords: ambient intelligence, proximity sensors, shape memory materials, sound sensing garments, wearable technology

Procedia PDF Downloads 371
693 Weak Solutions Of Stochastic Fractional Differential Equations

Authors: Lev Idels, Arcady Ponosov

Abstract:

Stochastic fractional differential equations have recently attracted considerable attention, as they have been used to model real-world processes, which are subject to natural memory effects and measurement uncertainties. Compared to conventional hereditary differential equations, one of the advantages of fractional differential equations is related to more realistic geometric properties of their trajectories that do not intersect in the phase space. In this report, a Peano-like existence theorem for nonlinear stochastic fractional differential equations is proven under very general hypotheses. Several specific classes of equations are checked to satisfy these hypotheses, including delay equations driven by the fractional Brownian motion, stochastic fractional neutral equations and many others.
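
For readers unfamiliar with the fractional setting, a standard Caputo derivative of order α in (0,1) is recalled below, together with a prototypical fractional equation driven by a fractional Brownian motion; this fixes notation only and may differ from the exact operators and noise models used in the report.

```latex
% Standard Caputo fractional derivative of order \alpha \in (0,1); the
% convolution kernel (t-s)^{-\alpha} is what encodes the memory effect.
\[
  {}^{C}D^{\alpha}_{0^{+}} y(t)
  = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} (t-s)^{-\alpha}\, y'(s)\, \mathrm{d}s ,
  \qquad 0 < \alpha < 1 .
\]
% A prototypical (formal) stochastic fractional differential equation driven by
% a fractional Brownian motion B^{H} with Hurst index H -- an assumed generic
% form, not necessarily the one analyzed in the report:
\[
  {}^{C}D^{\alpha}_{0^{+}} y(t) = f\bigl(t, y(t)\bigr) + g\bigl(t, y(t)\bigr)\, \dot{B}^{H}(t).
\]
```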

Keywords: delay equations, operator methods, stochastic noise, weak solutions

Procedia PDF Downloads 186
692 Alternative Key Exchange Algorithm Based on Elliptic Curve Digital Signature Algorithm Certificate and Usage in Applications

Authors: A. Andreasyan, C. Connors

Abstract:

Elliptic Curve Digital Signature Algorithm (ECDSA)-based X.509v3 certificates are becoming more popular due to their short public and private key sizes. Moreover, these certificates can be stored in resource-limited Internet of Things (IoT) devices using less memory and transmitted in network security protocols, such as Internet Key Exchange (IKE), Transport Layer Security (TLS), and Secure Shell (SSH), using less bandwidth. The proposed method gives another advantage, in that it increases the performance of the above-mentioned protocols in terms of key exchange by saving one scalar multiplication operation.
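
For context, the sketch below shows a baseline ephemeral ECDH key agreement with the Python cryptography package, of the kind used inside TLS/IKE/SSH handshakes; each side performs a scalar multiplication with the peer's public point. The paper's certificate-based variant that saves one scalar multiplication is not reproduced here, since the abstract does not give its construction.

```python
# Baseline ephemeral ECDH key agreement (the operation the proposed method
# optimizes), using the Python `cryptography` package.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each side generates an ephemeral key pair on the same curve.
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# Each side performs one scalar multiplication with the peer's public point.
alice_shared = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())
bob_shared = bob_priv.exchange(ec.ECDH(), alice_priv.public_key())
assert alice_shared == bob_shared

# Derive a session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"handshake").derive(alice_shared)
print(session_key.hex())
```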

Keywords: cryptography, elliptic curve digital signature algorithm, key exchange, network security protocol

Procedia PDF Downloads 127
691 A Cognitive Training Program in Learning Disability: A Program Evaluation and Follow-Up Study

Authors: Krisztina Bohacs, Klaudia Markus

Abstract:

To the authors' best knowledge, studies on cognitive program evaluation are lacking, and there is certainly a shortage of programs proven to have high effect sizes with strong retention results. The purpose of our study was to investigate the effectiveness of a comprehensive cognitive training program, namely BrainRx. This cognitive rehabilitation program targets and remediates seven core cognitive skills and related systems of sub-skills through repeated engagement in game-like mental procedures delivered one-on-one by a clinician, supplemented by digital training. A large sample of children with learning disabilities was given pretest and post-test cognitive assessments. The experimental group completed a twenty-week cognitive training program in a BrainRx center. A matched control group received another twenty-week intervention with Feuerstein's Instrumental Enrichment programs. A second matched control group did not receive training. For the pre- and post-tests, we used a general intelligence test to assess IQ and a computer-based test battery for assessing cognition across the lifespan. Multiple regression analyses indicated that the experimental BrainRx treatment group had statistically significantly higher outcomes in attention, working memory, processing speed, logic and reasoning, auditory processing, visual processing, and long-term memory compared to the non-treatment control group, with very large effect sizes. With the exception of logic and reasoning, the BrainRx treatment group realized significantly greater gains in six of the seven cognitive measures listed above compared to the Feuerstein control group. Our one-year retention measures showed that all the cognitive training gains were above ninety percent, with the greatest retention in visual processing, auditory processing, and logic and reasoning. The BrainRx program may be an effective tool to establish long-term cognitive changes in students with learning disabilities. Recommendations are made for treatment centers and special education institutions on the cognitive training of students with special needs. The importance of our study is that a targeted, systematic, progressively loaded, and intensive brain-training approach may significantly change learning disabilities.

Keywords: cognitive rehabilitation training, cognitive skills, learning disability, permanent structural cognitive changes

Procedia PDF Downloads 185
690 Influence of Selected Finishing Technologies on the Roughness Parameters of Stainless Steel Manufactured by Selective Laser Melting Method

Authors: J. Hajnys, M. Pagac, J. Petru, P. Stefek, J. Mesicek, J. Kratochvil

Abstract:

The progressive new 3D metal printing method SLM (Selective Laser Melting) is increasingly being adopted in normal operation. As a result, greater demands are placed on the surface quality of the parts produced in this way. The article deals with research on selected finishing methods (tumbling, face milling, sandblasting, shot peening, and brushing) and their impact on the final surface roughness. Specimens of 20 x 20 x 7 mm, produced using SLM additive technology on the Renishaw AM400, were subjected to these finishing methods with various parameter settings. The areal roughness parameters Sa and Sz were chosen as the evaluation criteria, and the profile parameters Ra and Rz were used as additional measurements. Optical measurement of surface roughness was performed on an Alicona Infinite Focus 5. An experiment conducted to optimize the surface roughness revealed, as expected, that the best roughness parameters were achieved through a face milling operation. Tumbling is particularly suitable for 3D-printed components, as tumbling media are able to reach even complex shapes and, after changing to polishing bodies, achieve a high surface gloss. Surface quality after tumbling depends on the process time. Other methods with satisfactory results are shot peening and tumbling, which should be the focus of further research.
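
A minimal sketch of how the quoted profile parameters can be computed from a measured height profile is given below; the profile is synthetic noise, not Alicona data, and the Rz convention used (mean peak-to-valley over five sampling lengths) is one common choice.

```python
import numpy as np

# Sketch of the profile roughness parameters mentioned above, on a synthetic
# height profile (not measurement data from the Alicona device).
rng = np.random.default_rng(0)
z = rng.normal(0.0, 2.0, size=5000)           # surface heights in micrometres

z = z - z.mean()                               # remove the mean line
Ra = np.mean(np.abs(z))                        # arithmetic mean deviation
# Rz here: mean peak-to-valley height over five equal sampling lengths.
Rz = np.mean([seg.max() - seg.min() for seg in np.array_split(z, 5)])

print(f"Ra = {Ra:.2f} um, Rz = {Rz:.2f} um")
# The areal parameters Sa and Sz reported above are the analogous quantities
# evaluated over a 2D height map rather than a single profile line.
```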

Keywords: additive manufacturing, selective laser melting, SLM, surface roughness, stainless steel

Procedia PDF Downloads 117
689 The Explanation for Dark Matter and Dark Energy

Authors: Richard Lewis

Abstract:

The following assumptions of the Big Bang theory are challenged and found to be false: the cosmological principle, the assumption that all matter formed at the same time, and the assumption regarding the cause of the cosmic microwave background radiation. The evolution of the universe is described based on the conclusion that the universe is finite with a space boundary. This conclusion is reached by ruling out the possibility of an infinite universe or of a universe which is finite with no boundary. In a finite universe, the centre of the universe can be located with reference to our home galaxy (the Milky Way) using the speed relative to the Cosmic Microwave Background (CMB) rest frame and Hubble's law. This places our home galaxy at a distance of approximately 26 million light years from the centre of the universe. Because we are making observations from a point relatively close to the centre of the universe, the universe appears to be isotropic and homogeneous, but this is not the case. The CMB is coming from a source located within the event horizon of the universe. There is sufficient mass in the universe to create an event horizon at the Schwarzschild radius. Galaxies form over time due to the energy released by the expansion of space. Conservation of energy must consider the total energy, which is mass (+ve) plus energy (+ve) plus spacetime curvature (-ve), so that the total energy of the universe is always zero. The predominant location of galaxy formation moves over time from the centre of the universe towards the boundary, so that today the majority of new galaxy formation is taking place beyond our horizon of observation at 14 billion light years.
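
The event-horizon claim above can be checked with a one-line Schwarzschild-radius estimate; the mass figure used below is a commonly quoted order-of-magnitude value for the observable universe, not a number taken from the paper.

```python
# Back-of-envelope check of the event-horizon claim. The mass used (~1.5e53 kg)
# is a commonly quoted order-of-magnitude figure, not a value from the paper.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M = 1.5e53             # rough mass of the observable universe, kg
ly = 9.461e15          # one light year in metres

r_s = 2 * G * M / c**2                      # Schwarzschild radius
print(f"r_s ~ {r_s:.2e} m ~ {r_s / ly / 1e9:.0f} billion light years")
# ~2.2e26 m, i.e. a few tens of billions of light years, comparable to the
# scale of the observable universe, which is the point the abstract relies on.
```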

Keywords: cosmology, dark energy, dark matter, evolution of the universe

Procedia PDF Downloads 122
688 Urban Form, Heritage, and Disaster Prevention: What Do They Have in Common?

Authors: Milton Montejano Castillo, Tarsicio Pastrana Salcedo

Abstract:

Based on the hypothesis that disaster risk is constructed socially and historically, this article shows the importance of keeping alive the historical memory of disaster by means of architectural and urban heritage conservation. This is illustrated with three examples of Latin American World Heritage cities where disasters like floods and earthquakes have shaped urban form. Therefore, the study of urban form or ‘Urban Morphology’ is proposed as a tool to understand and analyze urban transformations with the documentation of the occurrence of disasters. Lessons learned from such cities may be useful to reduce disasters risk in contemporary built environments.

Keywords: conservation, disaster risk reduction, urban morphology, World Heritage

Procedia PDF Downloads 521
687 Implications of Fuel Reloading in Heterogeneous Thorium-Based Fuel Designs for Improved Fuel Cycle Characteristics

Authors: Hendrik Bernard Van Der Walt, Frik Van Niekerk

Abstract:

Fuel models show a reduction in reactivity at beginning of life (BOL) when thorium is added to a reactor core. Thorium emulates the role of a fertile poison and is beneficial for reducing beginning-of-cycle (BOC) excess reactivity. In spite of the build-up of 233U over the duration of a fuel cycle, the effects of fuel reloading have a significant impact on fuel viability, especially in the case of heterogeneous thorium-based fuels. The most common practice for compensating for the reduction in BOC reactivity is the addition of fissile isotopes (uranium fuel with increased enrichment, or plutonium). This study introduces a heterogeneous thorium-based fuel with minimal fissile isotope additions. A pseudo-reloading scheme was developed for numerical simulations of an infinite reactor based on the North Anna 1 reactor operating in Virginia, USA. Use of this reloading pattern allows new thorium-based fuel to be loaded into the reactor model as part of a phasing-in strategy at the end of any conventional reactor cycle. Results demonstrate the effects of thorium-based fuel on fuel cycle characteristics such as fuel cycle length, neutron economy, and material matrix. Application of the above-mentioned approach delivered promising results and presents a heterogeneous thorium-based fuel which could replace the conventional fuel of typical, currently operating (or future) reactors without the need for expensive reactor redesign or fuel recycling strategies.

Keywords: nuclear fuel, nuclear characteristics, nuclear fuel cycle, thorium-based fuel, heterogeneous design, fuel reloading

Procedia PDF Downloads 117
686 Circadian Clock and Subjective Time Perception: A Simple Open Source Application for the Analysis of Induced Time Perception in Humans

Authors: Agata M. Kołodziejczyk, Mateusz Harasymczuk, Pierre-Yves Girardin, Lucie Davidová

Abstract:

Subjective time perception implies a connection to cognitive functions, attention, memory, and awareness, but little is known about its connections with the homeostatic states of the body coordinated by the circadian clock. In this paper, we present results from an experimental study of subjective time perception in volunteers performing physical activity on a treadmill in various phases of their circadian rhythms. Subjects were exposed to several time illusions simulated by programmed timing systems. This study brings a better understanding that can support further improvement of work quality in isolated areas.

Keywords: biological clock, light, time illusions, treadmill

Procedia PDF Downloads 314
685 Nonlinear Evolution on Graphs

Authors: Benniche Omar

Abstract:

We are concerned with abstract fully nonlinear differential equations of the form y'(t) = Ay(t) + f(t, y(t)), where A is an m-dissipative operator (possibly multi-valued) defined on a subset D(A) of a Banach space X with values in X, and f is a given function defined on I×X with values in X. We consider a graph K in I×X. We recall that K is said to be viable with respect to the above abstract differential equation if, for each initial datum in K, there exists at least one trajectory starting from that initial datum and remaining in K at least for a short time. The viability problem has been studied by many authors using various techniques and frameworks. If K is closed, it is shown that a tangency condition, which is mainly linked to the dynamics, is crucial for viability. In the case when X is infinite-dimensional, compactness and convexity assumptions are needed. In this paper, we are concerned with the notion of near viability for a given graph K with respect to y'(t) = Ay(t) + f(t, y(t)). Roughly speaking, the graph K is said to be near viable with respect to y'(t) = Ay(t) + f(t, y(t)) if, for each initial datum in K, there exists at least one trajectory remaining arbitrarily close to K at least for a short time. It is interesting to note that near viability is equivalent to an appropriate tangency condition under mild assumptions on the dynamics. Adding natural convexity and compactness assumptions on the dynamics, we may recover (exact) viability. Here we investigate near viability for a graph K in I×X with respect to y'(t) = Ay(t) + f(t, y(t)), where A and f are as above. We emphasize that the t-dependence of the perturbation f leads us to introduce a new tangency concept. On the basis of tangency conditions expressed in terms of that concept, we formulate criteria for K to be near viable with respect to y'(t) = Ay(t) + f(t, y(t)). As an application, an abstract null-controllability theorem is given.

Keywords: abstract differential equation, graph, tangency condition, viability

Procedia PDF Downloads 125
684 Discovery the Relics of Buddhist Stupa at Thanesar, Kurukshetra

Authors: Chander Shekhar, Manoj Kumar

Abstract:

The present paper deals with the discovery of the relics of a stupa belonging to the Kushana period. These remains were found during scientific clearance work at a mound near Brahma Sarovar, Thanesar, Kurukshetra. This archaeological work was done by the Department of Archaeology & Museums, Haryana Government. The relics of the stupa show that it would have been similar to the Assandh and Damekh stupas. As per Buddhist literature, Gautam Buddha reached Thanesar. In memory of Buddha's journey, King Ashoka built a large stupa at Thanesar on the bank of the Sarasvati River. The Chinese pilgrim Yuan Chuang also referred to a monastery and stupa near Aujas-ghat of Brahma Sarovar. The present remains may be part of the settlement mentioned by Yuan Chuang.

Keywords: archaeology, stupa, Buddhism, excavation

Procedia PDF Downloads 161
683 Resistive Switching Characteristics of Resistive Random Access Memory Devices after Furnace Annealing Processes

Authors: Chi-Yan Chu, Kai-Chi Chuang, Huang-Chung Cheng

Abstract:

In this study, RRAM devices with a TiN/Ti/HfOx/TiN structure were fabricated, and the electrical characteristics of devices without annealing were compared with those after furnace annealing (FA) at temperatures of 400 °C and 500 °C. The RRAM devices annealed at 400 °C showed lower forming, set, and reset voltages than the devices without annealing. However, the RRAM devices annealed at 500 °C did not show any electrical switching characteristics because the TiN/Ti/HfOx/TiN stack was oxidized, as shown by the XPS analysis. From these results, the RRAM devices annealed at 400 °C showed the best electrical characteristics.

Keywords: RRAM, furnace annealing (FA), forming, set and reset voltages, XPS

Procedia PDF Downloads 355
682 Fuel Inventory/ Depletion Analysis for a Thorium-Uranium Dioxide (Th-U) O2 Pin Cell Benchmark Using Monte Carlo and Deterministic Codes with New Version VIII.0 of the Evaluated Nuclear Data File (ENDF/B) Nuclear Data Library

Authors: Jamal Al-Zain, O. El Hajjaji, T. El Bardouni

Abstract:

A (Th-U)O2 fuel pin benchmark made up of 25 w/o U and 75 w/o Th was used in order to analyze the depletion and inventory of the fuel for a pressurized water reactor pin-cell model. The new version VIII.0 of the ENDF/B nuclear data library was used to create a data set in ACE format at various temperatures, processed with the MAKXSF6.2 and NJOY2016 programs, in order to conduct this study and analyze the cross-section data. The infinite multiplication factor, the concentrations and activities of the main fission products, the actinide radionuclides accumulated in the pin cell, and the total radioactivity were all estimated and compared in this study using the Monte Carlo N-Particle 6 (MCNP6.2) and DRAGON5 programs. Additionally, the burn-up (BU)-dependent behavior of the pressurized water reactor (PWR) thorium pin cell was validated and compared against reference data obtained using the Massachusetts Institute of Technology (MIT-MOCUP), Idaho National Engineering and Environmental Laboratory (INEEL-MOCUP), and CASMO-4 codes. The results of this study indicate that all of the codes examined show good agreement.

Keywords: PWR thorium pin cell, ENDF/B-VIII.0, MAKXSF6.2, NJOY2016, MCNP6.2, DRAGON5, fuel burn-up

Procedia PDF Downloads 72
681 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: Gaelle Candel, David Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely used. It helps with two main tasks: displaying results by coloring items according to item class or feature value, and forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are the structure-preservation property and the answer to the crowding problem, in which all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, in which the cluster area is proportional to its size in number, and relationships between clusters are materialized by closeness in the embedding. This algorithm is non-parametric: the transformation from a high- to a low-dimensional space is described but not learned, and two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together. However, this process is costly, as the complexity of t-SNE is quadratic, and it would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While this approach is highly scalable, points could be mapped at the exact same position, making them indistinguishable. This type of model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology for reusing an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once, using the newly obtained embedding as the next support. The successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. This method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity would be reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing observation of the birth, evolution, and death of clusters. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of high-dimensional datasets' dynamics.
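
A rough approximation of the embedding-reuse idea, assuming scikit-learn, is sketched below: a support subset is embedded once, and each new batch is initialised from the coordinates of its nearest support points so that cluster positions stay roughly coherent. This is a simplification for illustration, not the paper's two-cost optimisation.

```python
import numpy as np
from sklearn.manifold import TSNE

# Sketch only: embed a support subset once, then initialise the embedding of a
# new batch from the nearest support points' coordinates. The data and the
# nearest-neighbour initialisation are illustrative assumptions, not the
# method's actual cost function.
rng = np.random.default_rng(42)
support = rng.normal(size=(300, 20))                         # support subset (high-dim)
batch = support + rng.normal(scale=0.1, size=support.shape)  # "new" drifted data

support_emb = TSNE(n_components=2, init="pca", random_state=0).fit_transform(support)

# Initialise each new point at the embedding of its nearest support point.
nearest = np.argmin(((batch[:, None, :] - support[None, :, :]) ** 2).sum(-1), axis=1)
init = support_emb[nearest].astype(np.float64)

batch_emb = TSNE(n_components=2, init=init, random_state=0).fit_transform(batch)
print(batch_emb.shape)   # (300, 2), positioned coherently with support_emb
```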

Keywords: concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning

Procedia PDF Downloads 127
680 Optimal ECG Sampling Frequency for Multiscale Entropy-Based HRV

Authors: Manjit Singh

Abstract:

Multiscale entropy (MSE) is an extensively used index providing a general understanding of the multiscale complexity of the physiologic mechanisms of heart rate variability (HRV), which operate over a wide range of time scales. Accurate selection of the electrocardiogram (ECG) sampling frequency is an essential concern for clinically significant HRV quantification; a high ECG sampling rate increases memory requirements and processing time, whereas a low sampling rate degrades signal quality and results in clinically misinterpreted HRV. In this work, the impact of ECG sampling frequency on MSE-based HRV has been quantified. MSE measures are found to be sensitive to ECG sampling frequency, and the effect of sampling frequency is a function of the time scale.
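
A compact sketch of the MSE procedure (coarse-graining followed by sample entropy at each scale), applied to a synthetic RR-interval series, is given below; m = 2 and r = 0.15·SD are conventional choices and not necessarily those of the study.

```python
import numpy as np

# Multiscale entropy sketch: coarse-grain the series at each scale, then
# compute sample entropy. The RR series is synthetic; m = 2 and r = 0.15*SD
# are conventional parameter choices, not the study's settings.
def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        dist = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        return (dist <= r).sum() - len(templ)   # exclude self-matches
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def coarse_grain(x, scale):
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(1)
rr = 0.8 + 0.05 * rng.standard_normal(1000)     # synthetic RR intervals (s)
for scale in range(1, 6):
    print(f"scale {scale}: SampEn = {sample_entropy(coarse_grain(rr, scale)):.3f}")
```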

Keywords: ECG (electrocardiogram), heart rate variability (HRV), multiscale entropy, sampling frequency

Procedia PDF Downloads 252
679 A Reduced Distributed Sate Space for Modular Petri Nets

Authors: Sawsen Khlifa, Chiheb AMeur Abid, Belhassan Zouari

Abstract:

Modular verification approaches have been widely attempted to cope with the well-known state explosion problem. This paper deals with the modular verification of modular Petri nets. We propose a reduced version of the modular state space of a given modular Petri net. The new structure allows the creation of smaller modular graphs: each one captures the behavior of the corresponding module and records some global information. Hence, this version helps to overcome the explosion problem and to use less memory space. In this condensed structure, the verification of some generic properties concerning one module is limited to the exploration of its associated graph.
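
For contrast with the modular approach, the sketch below builds an ordinary monolithic reachability graph for a small Petri net by breadth-first search; this is the object whose growth causes the state explosion that the proposed reduced modular state space is designed to avoid. The net itself is a made-up toy example.

```python
from collections import deque

# Monolithic reachability-graph construction for a tiny, made-up Petri net.
# Each transition has 'pre' and 'post' place multisets.
transitions = {
    "t1": {"pre": {"p1": 1}, "post": {"p2": 1}},
    "t2": {"pre": {"p2": 1}, "post": {"p1": 1, "p3": 1}},
    "t3": {"pre": {"p3": 2}, "post": {"p4": 1}},
}
initial = {"p1": 1, "p2": 0, "p3": 0, "p4": 0}

def enabled(marking, t):
    return all(marking.get(p, 0) >= n for p, n in t["pre"].items())

def fire(marking, t):
    m = dict(marking)
    for p, n in t["pre"].items():
        m[p] -= n
    for p, n in t["post"].items():
        m[p] = m.get(p, 0) + n
    return m

# Breadth-first exploration of reachable markings.
seen, edges = set(), []
queue = deque([tuple(sorted(initial.items()))])
while queue and len(seen) < 50:      # cap the exploration: this toy net is unbounded
    marking = dict(queue.popleft())
    key = tuple(sorted(marking.items()))
    if key in seen:
        continue
    seen.add(key)
    for name, t in transitions.items():
        if enabled(marking, t):
            succ = tuple(sorted(fire(marking, t).items()))
            edges.append((key, name, succ))
            queue.append(succ)

print(f"{len(seen)} reachable markings, {len(edges)} edges explored")
```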

Keywords: distributed systems, modular verification, petri nets, state space explosion

Procedia PDF Downloads 94
678 A Video Surveillance System Using an Ensemble of Simple Neural Network Classifiers

Authors: Rodrigo S. Moreira, Nelson F. F. Ebecken

Abstract:

This paper proposes a maritime vessel tracker composed of an ensemble of WiSARD weightless neural network classifiers. A failure detector analyzes vessel movement with a Kalman filter and corrects the tracking, if necessary, using FFT matching. The use of the WiSARD neural network to track objects is uncommon. The additional contributions of the present study include a performance comparison with four state-of-the-art trackers, an experimental study of the features that improve maritime vessel tracking, the first use of an ensemble of classifiers to track maritime vessels, and a new quantization algorithm that compares the values of pixel pairs.
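
A minimal sketch of a single WiSARD discriminator (the RAM-based weightless mechanism underlying such an ensemble) is given below: the binary input is split into n-bit tuples, each tuple addresses one RAM node, training writes a 1 at the addressed cell, and the classification score is the number of responding RAMs. This is only the core mechanism, not the paper's ensemble tracker with Kalman-filter failure detection; retina size and tuple size are arbitrary.

```python
import numpy as np

# Minimal WiSARD discriminator: random wiring maps the binary retina into
# n-bit tuples, each tuple addresses one RAM node. Training writes a 1 at the
# addressed cell; the score is the number of RAMs that respond with 1.
class Discriminator:
    def __init__(self, retina_size, tuple_size, seed=0):
        rng = np.random.default_rng(seed)
        self.mapping = rng.permutation(retina_size)   # random input-to-RAM wiring
        self.tuple_size = tuple_size
        self.rams = [dict() for _ in range(retina_size // tuple_size)]

    def _addresses(self, binary_input):
        bits = np.asarray(binary_input)[self.mapping]
        for i in range(len(self.rams)):
            chunk = bits[i * self.tuple_size:(i + 1) * self.tuple_size]
            yield i, int("".join(map(str, chunk)), 2)

    def train(self, binary_input):
        for i, addr in self._addresses(binary_input):
            self.rams[i][addr] = 1

    def score(self, binary_input):
        return sum(self.rams[i].get(addr, 0) for i, addr in self._addresses(binary_input))

# Toy usage: learn a 16-bit pattern and score a noisy version of it.
d = Discriminator(retina_size=16, tuple_size=4)
pattern = np.array([1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1])
d.train(pattern)
noisy = pattern.copy()
noisy[3] ^= 1                                          # flip one bit
print(d.score(pattern), d.score(noisy), "out of", len(d.rams))  # 4 and 3 out of 4
```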

Keywords: RAM memory, WiSARD weightless neural network, object tracking, quantization

Procedia PDF Downloads 292