Search results for: Extended Park's vector approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15952


14722 Smartphone-Based Human Activity Recognition by Machine Learning Methods

Authors: Yanting Cao, Kazumitsu Nawata

Abstract:

As smartphones are upgraded, their software and hardware become smarter, and smartphone-based human activity recognition is accordingly becoming more refined, complex, and detailed. In this context, we analyzed a set of experimental data obtained by observing and measuring 30 volunteers performing six activities of daily living (ADL). Given the large sample size, and in particular a 561-feature vector with time- and frequency-domain variables, cleaning these intractable features and training a proper model is extremely challenging. After a series of feature-selection and parameter-adjustment steps, a well-performing SVM classifier was trained.
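The pipeline described (univariate feature selection followed by a linear SVM) can be sketched as follows; the data, feature count, and scoring rule are illustrative stand-ins, not the paper's 561-feature dataset or its exact method:

```python
import random

random.seed(0)

# Toy stand-in for the 561-feature HAR vectors: 6 features, of which only
# the first two carry the class signal (all values are illustrative).
def make_data(n=200):
    X, y = [], []
    for _ in range(n):
        x = [random.uniform(-1, 1) for _ in range(6)]
        label = 1 if x[0] - x[1] > 0 else -1
        x[0] += 0.5 * label   # widen the margin on the informative features
        x[1] -= 0.5 * label
        X.append(x)
        y.append(label)
    return X, y

def select_features(X, y, k=2):
    """Univariate filter: rank features by class-mean separation."""
    d = len(X[0])
    scores = []
    for j in range(d):
        pos = [x[j] for x, t in zip(X, y) if t == 1]
        neg = [x[j] for x, t in zip(X, y) if t == -1]
        scores.append(abs(sum(pos) / len(pos) - sum(neg) / len(neg)))
    return sorted(range(d), key=lambda j: -scores[j])[:k]

def train_linear_svm(X, y, lam=0.01, epochs=50):
    """Pegasos-style subgradient descent on the hinge loss (no bias term)."""
    w, t = [0.0] * len(X[0]), 0
    for _ in range(epochs):
        for x, label in zip(X, y):
            t += 1
            eta = 1.0 / (lam * t)
            margin = label * sum(wi * xi for wi, xi in zip(w, x))
            w = [(1 - eta * lam) * wi for wi in w]
            if margin < 1:
                w = [wi + eta * label * xi for wi, xi in zip(w, x)]
    return w

X, y = make_data()
feats = select_features(X, y, k=2)
Xs = [[x[j] for j in feats] for x in X]
w = train_linear_svm(Xs, y)
pred = [1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1 for x in Xs]
accuracy = sum(p == t for p, t in zip(pred, y)) / len(y)
```

On this separable toy data the filter recovers the two informative features and the trained classifier reaches near-perfect training accuracy; a real HAR pipeline would additionally hold out test subjects and tune the SVM kernel and regularization.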

Keywords: smart sensors, human activity recognition, artificial intelligence, SVM

Procedia PDF Downloads 136
14721 Numerical Simulation of Phase Transfer during Cryosurgery for an Irregular Tumor Using Hybrid Approach

Authors: Rama Bhargava, Surabhi Nishad

Abstract:

The infusion of nanofluids has dramatically enhanced the heat-carrying capacity of fluids, which is applicable to many engineering and medical processes where temperatures below freezing are required. Cryosurgery is an efficient therapy for the treatment of cancer, but excessive cooling may harm nearby healthy cells. Efforts are therefore made to develop a model that can generate the required low temperature. In the present study, a mathematical model based on the bioheat transfer equation is developed to simulate the heat transfer from the probe to a tumor (with an irregular domain) using a hybrid technique consisting of the element-free Galerkin method with the α-family of approximation. The probe is loaded with nanoparticles. The effects of different nanoparticles, namely Al₂O₃, Fe₃O₄, and Au, on the heat-transfer rate are obtained. It is observed that the temperature can be brought to the (-60°C)-(-30°C) range at a faster freezing rate on the infusion of the different nanoparticles. Besides increasing the freezing rate, the volume fraction of the nanoparticles can also control the size and growth of the ice crystals formed during the freezing process. The study also determines the time required to achieve the desired temperature. The problem is further extended to multiple tumors of different shapes and sizes. The irregular shape of the frozen domain and the direction of ice growth are very sensitive issues, posing a challenge for simulation. Meshfree methods are among the most accurate methods for such problems, as the domain is naturally irregular. The discretization is done using nodes only, and MLS approximation is used to generate the shape functions. Sufficiently accurate results are obtained.
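The bioheat transfer model referred to above is, in its standard Pennes form, as sketched below; the paper's exact nanofluid coefficients are not stated, so the effective-property closure shown (the classical Maxwell model) is an assumption:

```latex
% Pennes bioheat equation with effective (nanofluid-modified) properties
\rho_{\mathrm{eff}} c_{\mathrm{eff}} \frac{\partial T}{\partial t}
  = \nabla \cdot \left( k_{\mathrm{eff}} \nabla T \right)
  + \omega_b \rho_b c_b \left( T_a - T \right) + Q_m

% Maxwell model for the effective conductivity of a dilute suspension
% of particles (conductivity k_p, volume fraction phi) in a base fluid (k_f)
k_{\mathrm{eff}} = k_f \,
  \frac{k_p + 2k_f - 2\phi \, (k_f - k_p)}{k_p + 2k_f + \phi \, (k_f - k_p)}
```

Here $\omega_b$, $\rho_b$, $c_b$, and $T_a$ are the blood perfusion rate, density, specific heat, and arterial temperature, and $Q_m$ is metabolic heat generation; raising $k_{\mathrm{eff}}$ via the particle loading $\phi$ is what accelerates the freezing front.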

Keywords: cryosurgery, EFGM, hybrid, nanoparticles

Procedia PDF Downloads 113
14720 An Investigation of the Relationship between Organizational Culture and Innovation Type: A Mixed Method Study Using the OCAI in a Telecommunication Company in Saudi Arabia

Authors: A. Almubrad, R. Clouse, A. Aljlaoud

Abstract:

Organizational culture (OC) is recognized to influence an organization's propensity to innovate; it may also impede the innovation process from thriving within the organization. The role organizational culture plays in enabling or inhibiting innovation therefore merits exploration, in particular the cultural attributes necessary to reach innovation goals. This study investigates a preliminary heuristic matching OC attributes to the type of innovation most likely to thrive under those attributes. A mixed-methods research approach was adopted to achieve the research aims. Accordingly, participants from a national telecom company in Saudi Arabia took the Organizational Culture Assessment Instrument (OCAI), and a further sample of respondents holding managing-director roles was interviewed in the qualitative phase. Our findings reveal that the market culture type tends to adopt radical innovations to disrupt the market and preserve its market position. In contrast, the adhocracy culture type tends to adopt incremental innovation, which employees find more convenient due to its low level of uncertainty. Our results are an encouraging indication that matching organizational culture attributes to the type of innovation aids innovation management. The study is limited in that it draws its findings from a limited sample of OC attributes, those identified with the adhocracy and market culture types; an extended investigation is merited to explore other organizational culture types and their optimal innovation types.

Keywords: incremental innovation, radical innovation, organizational culture, market culture, adhocracy culture, OCAI

Procedia PDF Downloads 93
14719 Classroom Discourse and English Language Teaching: Issues, Importance, and Implications

Authors: Rabi Abdullahi Danjuma, Fatima Binta Attahir

Abstract:

Classroom discourse is important, and it is worth examining what the phenomenon is and how it helps both teachers and students in a classroom situation. This paper looks at the classroom as a traditional social setting with its own norms and values. It explains discourse as extended communication, in speech or writing, often interactive and dealing with particular topics, and discusses classroom discourse as the language teachers and students use to communicate with each other in the classroom. The paper also considers strategies for effective classroom discourse. Finally, implications and recommendations are drawn.

Keywords: classroom, discourse, learning, student, strategies, communication

Procedia PDF Downloads 586
14718 Biomimetic Architecture from the Inspiration by Nature to the Innovation of the Saharan Architecture

Authors: Yassine Mohammed Benyoucef, Razin Andery Dionisovich

Abstract:

Biomimicry is an old approach, but its scientific conceptualization as an approach to innovation based on the emulation of nature is new, and in recent years it has brought many potential theories and innovations to the field of architecture. Indeed, these innovations have changed our view of other natural organisms as well as of design processes in architecture, and the biomimicry approach now enables genuinely sustainable development. The Sahara region is heading towards a sustainable policy with the desire to develop its rich architectural context. Because of the rapid evolution of architectural and urban concepts and the acceleration of technology on one side, and the pressure of the architectural crisis and accelerated urbanization in Saharan cities on the other, the imperatives of sustainable development, ecology, climate adaptation, and energy needs are strongly imposed. Moreover, new architectural and urban projects in Saharan cities are not reliable in terms of energy efficiency, design, or relationship with the environment. This article discusses the use of a biomimetic strategy in the sustainable development of Saharan architecture. Its aim is to present a synthesis of the biomimicry approach and to propose biomimicry as a solution for the development of Saharan architecture, which can use this approach as a sustainability and innovation strategy. Biomimicry enables effective development strategies and has great potential to meet the current challenges of designing efficient forms and structures, achieving energy efficiency, and addressing climate issues. The Sahara can be fertile ground for great changes, and the use of this approach is the key to the most effective strategies for the sustainable development of Saharan architecture.

Keywords: biomimicry, Sahara, architecture, nature, innovation, technology

Procedia PDF Downloads 181
14717 The Linguistic Fingerprint in Western and Arab Judicial Applications

Authors: Asem Bani Amer

Abstract:

This study examines the linguistic fingerprint in judicial applications, a recent and still-developing forensic technique. It can be adopted to identify criminals by their way of speaking and their characteristic linguistic expressions. This is achieved by clarifying the expression "linguistic fingerprint", its concept, and its extended domain; surveying some of the linguistic fingerprint tools used in Western judicial applications; and outlining a technical vision for a linguistic fingerprint in the Arabic language, which still lacks such judicial applications in this field, through dictionaries, language rhythm, and language structure.

Keywords: linguistic fingerprint, judicial, application, dictionary, picture, rhythm, structure

Procedia PDF Downloads 72
14716 A Clustering-Sequencing Approach to the Facility Layout Problem

Authors: Saeideh Salimpour, Sophie-Charlotte Viaux, Ahmed Azab, Mohammed Fazle Baki

Abstract:

The facility layout problem (FLP) is key to the efficient and cost-effective operation of a system. This paper presents a hybrid heuristic- and mathematical-programming-based approach that conceptually divides the problem into clustering and sequencing. First, clusters of vertically aligned facilities are formed; these are then sequenced horizontally. The methodology yields promising results compared to its counterparts in the literature: it minimizes the distances between facilities that interact most with each other and aims to place the facilities with the most interactions at the centroid of the shop.
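The sequencing idea, placing the most-interacting facilities centrally and measuring flow-weighted distance, can be sketched as follows; the flow matrix is an invented toy instance, not data from the paper, and the heuristic shown is a simplified stand-in for the hybrid method:

```python
# F[i][j] is the material flow between facilities i and j (symmetric, toy data).
F = [
    [0, 8, 3, 1, 2],
    [8, 0, 5, 2, 1],
    [3, 5, 0, 4, 1],
    [1, 2, 4, 0, 6],
    [2, 1, 1, 6, 0],
]

def place_centrally(F):
    """Assign 1-D positions: the facility with the highest total flow goes
    to the centre (offset 0), the rest alternate right and left."""
    n = len(F)
    by_flow = sorted(range(n), key=lambda i: -sum(F[i]))
    offsets, step = [0], 1
    while len(offsets) < n:
        offsets.append(step)
        if len(offsets) < n:
            offsets.append(-step)
        step += 1
    return {fac: off for fac, off in zip(by_flow, offsets)}

def layout_cost(F, pos):
    """Total flow-weighted rectilinear distance along the sequence."""
    n = len(F)
    return sum(F[i][j] * abs(pos[i] - pos[j])
               for i in range(n) for j in range(i + 1, n))

pos = place_centrally(F)
cost = layout_cost(F, pos)
```

For this instance, facility 1 has the largest total flow and lands at the centre; a full FLP solver would then refine such a construction with the mathematical-programming step.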

Keywords: clustering-sequencing approach, mathematical modeling, optimization, unequal facility layout problem

Procedia PDF Downloads 319
14715 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

Evolutionary processes are not linear: long periods of quiet, slow development give way to rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector, the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, offer a satisfactory explanation for these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: a) the biosphere is a single living organism, all parts of which are interconnected, and b) the biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent, creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but the accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution, as natural life origin and development, is a reality; evolution is a coordinated and controlled process; one of evolution's main development vectors is the growing computational complexity of living organisms and of the biosphere's intelligence; the intelligent matter that conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth; and the information acts like software stored in and controlled by the biosphere. Random mutations trigger this software, as stipulated by Darwinian evolutionary theory, and it is further stimulated by the growing demand for the biosphere's global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms to store and handle it, and more intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerating evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches the critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life's creation and evolution. It logically resolves many puzzling problems of current evolutionary theory: speciation, as a result of purposeful GM design; the evolutionary development vector, as a need for growing global intelligence; punctuated equilibrium, which happens when the two conditions a) and b) above are met; the Cambrian explosion; and mass extinctions, which happen when more intelligent species replace outdated creatures.

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 153
14714 Detection of Parkinsonian Freezing of Gait

Authors: Sang-Hoon Park, Yeji Ho, Gwang-Moon Eom

Abstract:

Fast and accurate detection of freezing of gait (FOG) is desirable for the timely application of cueing, which has been shown to ameliorate FOG. Using the frequency spectrum of leg acceleration to derive the freeze index requires considerable calculation and would lead to delayed cueing. We hypothesized that FOG can be reasonably detected from the time-domain amplitude of foot acceleration: a time instant is recognized as FOG if the mean amplitude of the acceleration in the time window surrounding that instant lies in a specific FOG range. The parameters required for FOG detection were optimized by simulated annealing. The suggested time-domain method showed performance comparable to that of frequency-domain methods.
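The time-domain rule described, flagging an instant as FOG when the windowed mean acceleration amplitude falls inside a tuned band, can be sketched as follows; the signal, window size, and band limits are illustrative, not the paper's annealing-optimized values:

```python
# Synthetic foot-acceleration amplitude trace (arbitrary units):
# normal gait ~0.2, a freezing episode with trembling ~1.0 at samples 100-199.
signal = [0.2] * 100 + [1.0] * 100 + [0.2] * 100

WINDOW = 21                  # samples in the centred window (illustrative)
FOG_LO, FOG_HI = 0.6, 1.5    # amplitude band treated as FOG (illustrative)

def is_fog(signal, t, window=WINDOW, lo=FOG_LO, hi=FOG_HI):
    """Flag time instant t as FOG if the mean |acceleration| in the
    window centred on t lies inside the [lo, hi] band."""
    half = window // 2
    seg = signal[max(0, t - half): t + half + 1]
    mean_amp = sum(abs(x) for x in seg) / len(seg)
    return lo <= mean_amp <= hi

flags = [is_fog(signal, t) for t in range(len(signal))]
```

In the paper, the window length and band limits are exactly the parameters tuned by simulated annealing against labelled gait recordings; only the cheap windowed mean is computed online, which is what makes cueing fast.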

Keywords: freezing of gait, detection, Parkinson's disease, time-domain method

Procedia PDF Downloads 428
14713 Approach of Measuring System Analyses for Automotive Part Manufacturing

Authors: S. Homrossukon, S. Sansureerungsigun

Abstract:

This work aims to introduce an efficient, standardized approach to measurement system analysis for the automotive industry. The study started with a literature review on the management and analysis of measurement systems, from which an approach to measurement system management was constructed. The approach was validated by collecting current measurement data with the equipment of interest, a vernier caliper and a micrometer, and analyzing the accuracy and precision of their measurements. Finally, the measurement system was improved and re-evaluated. The study showed that the vernier caliper did not meet the required measuring characteristics with respect to linearity, while both instruments lacked measuring precision. The causes of measurement variation for the equipment of interest were consequently identified. After the improvement, the measuring performance was found to be acceptable against the required standard. Finally, the standardized approach for analyzing automotive measurement systems is summarized.
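Accuracy (bias) and precision (repeatability) checks of this kind can be sketched as follows; the reference length, readings, and acceptance limits are invented for illustration and are not the study's data:

```python
from statistics import mean, stdev

# Repeated caliper readings of a 10.00 mm reference block (illustrative data).
reference = 10.00
readings = [10.02, 10.01, 10.03, 9.99, 10.02,
            10.00, 10.01, 10.02, 10.03, 10.01]

bias = mean(readings) - reference     # accuracy: systematic offset
repeatability = stdev(readings)       # precision: spread of the repeats

# Illustrative acceptance limits; a real MSA would derive these from the
# part tolerance (e.g. as a %GRR against the tolerance band).
BIAS_LIMIT = 0.05
REPEATABILITY_LIMIT = 0.05

accuracy_ok = abs(bias) <= BIAS_LIMIT
precision_ok = repeatability <= REPEATABILITY_LIMIT
```

A linearity check, the characteristic the vernier caliper failed in the study, would repeat this bias calculation at several reference sizes across the instrument's range and test whether the bias drifts with size.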

Keywords: automotive part manufacturing measurement, measuring accuracy, measuring precision, measurement system analyses

Procedia PDF Downloads 300
14712 Examining Risk Based Approach to Financial Crime in the Charity Sector: The Challenges and Solutions, Evidence from the Regulation of Charities in England and Wales

Authors: Paschal Ohalehi

Abstract:

Purpose - The purpose of this paper, which is part of a PhD thesis, is to examine the role of a risk-based approach in minimising financial crime in the charity sector, to offer recommendations for improving the quality of charity regulation while retaining the risk-based approach as a regulatory framework, and to make a case for a new regulatory model. The increase in financial crime in the charity sector has put the role of regulation in minimising financial crime up for debate among researchers and practitioners. Although previous research has addressed the regulation of charities, research on the role of the risk-based approach in minimising financial crime in the charity sector is limited; financial crime is a concern for all organisations, including charities. Design/methodology/approach - This research adopts a social constructionist epistemological position. It is carried out using semi-structured in-depth interviews with 24 randomly selected charity trustees divided into three classes: 10 from small charities, 10 from medium charities, and 4 from large charities. The researcher also interviewed 4 stakeholders in the charity sector (the NFA, the Charity Commission, and two police forces differing in size and area of coverage). Findings - The results show that reliance on the risk-based approach to financial crime in the sector is weak and fragmented, with clear evidence of a disconnect between the regulator and the regulated, leading to little or no regulation of trustees' activities, limited monitoring of charities, and a lack of training and awareness of financial crime in the sector. Originality - This paper shows how the regulation of charities in general, and the risk-based approach in particular, can be improved to meet the expectations of stakeholders, the public, the regulator, and the regulated.

Keywords: risk, risk based approach, financial crime, fraud, self-regulation

Procedia PDF Downloads 366
14711 Experiential Learning for Upholding Entrepreneurship Education: A Case Study from Egypt

Authors: Randa El Bedawy

Abstract:

The exchange of best practices in entrepreneurship education and the use of experiential learning approaches have been growing at a fast pace. Educators should be challenged to promote such a learning approach to bridge the gap between entrepreneurship students and the actual business environment. This study aims to share best practices, experiences, and knowledge in support of entrepreneurship education. It is exploratory qualitative research based on a case study, demonstrating how experiential learning can support learning effectiveness in entrepreneurship education through a set of fourteen tasks used to practically engage students taking an entrepreneurship course at the American University in Cairo. The study sheds light on the rationale for using experiential learning to support entrepreneurship education by illustrating each task along with its learning outcomes, and it explores the benefits and obstacles educators may face when implementing such an approach. The results confirm that an experiential learning approach built on a set of well-designed practical tasks complementing the overall intended learning outcomes is very effective in promoting students' learning of entrepreneurship. However, good preparation by both educators and students is needed to ensure effective implementation of such an approach.

Keywords: business education, entrepreneurship, entrepreneurship education, experiential learning

Procedia PDF Downloads 149
14710 PolyScan: Comprehending Human Polymicrobial Infections for Vector-Borne Disease Diagnostic Purposes

Authors: Kunal Garg, Louise Theusen Hermansan, Kanoktip Puttaraska, Oliver Hendricks, Heidi Pirttinen, Leona Gilbert

Abstract:

The germ theory (one infectious determinant equals one disease) has unarguably advanced our capability to diagnose and treat infectious diseases over the years. Nevertheless, the advent of technology, climate change, and volatile human behavior have brought about drastic changes in our environment, leading us to question the relevance of the germ theory today: will vector-borne disease (VBD) sufferers produce multiple immune responses when tested for multiple microbes? Vector-diseased patients producing multiple immune responses to different microbes would clearly suggest human polymicrobial infections (HPI). Current diagnostic tools have not kept pace with the research findings that would aid in diagnosing patients with polymicrobial infections. This shortcoming has caused misdiagnosis at very high rates, diminishing patients' quality of life through inadequate treatment. Equipped with state-of-the-art scientific knowledge, PolyScan intends to address the pitfalls in current VBD diagnostics. PolyScan is a multiplex, multifunctional enzyme-linked immunosorbent assay (ELISA) platform that can test for numerous VBD microbes and allows simultaneous screening for multiple types of antibodies. To validate PolyScan, Lyme borreliosis (LB) and spondyloarthritis (SpA) patient groups (n = 54 each) were tested for Borrelia burgdorferi, Borrelia burgdorferi round body (RB), Borrelia afzelii, Borrelia garinii, and Ehrlichia chaffeensis against IgM and IgG antibodies. LB serum samples were obtained from Germany and SpA serum samples from Denmark under the relevant ethical approvals. The SpA group represented the chronic LB stage, because reactive arthritis (an SpA subtype) in the form of Lyme arthritis is linked to LB. It was hypothesized that patients from both groups would produce multiple immune responses, which as a consequence would suggest HPI, and that the proportion of multiple immune responses in the SpA group would be significantly larger than in the LB group for both antibodies. It was observed that 26% of LB patients and 57% of SpA patients produced multiple immune responses, in contrast to 33% of LB patients and 30% of SpA patients who produced solitary immune responses, when tested against IgM. Similarly, 52% of LB patients and an astounding 73% of SpA patients produced multiple immune responses, in contrast to 30% of LB patients and 8% of SpA patients who produced solitary immune responses, when tested against IgG. Interestingly, IgM immune dysfunction was also recorded in both patient groups: atypically, 6% of the 18% of LB patients unresponsive with the IgG antibody produced multiple immune responses with the IgM antibody, and similarly, 12% of the 19% of SpA patients unresponsive with IgG produced multiple immune responses with IgM. Thus, the results not only supported the hypotheses but also suggested that IgM may atypically persist longer than IgG. The PolyScan concept will aid clinicians in detecting early, persistent, late, polymicrobial, and immune-dysfunction conditions linked to different VBDs, providing a paradigm shift for the VBD diagnostic industry that will drastically shorten patients' time to adequate treatment.
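The IgM group comparison reported above (26% of 54 LB patients vs. 57% of 54 SpA patients with multiple responses) can be checked with a standard two-proportion z-test; the underlying counts (14/54 and 31/54) are back-calculated from the rounded percentages and are therefore an assumption, not figures from the abstract:

```python
from math import sqrt

# Counts back-calculated from the reported rounded percentages (assumption):
# 14/54 = 25.9% ~ "26%" of LB patients, 31/54 = 57.4% ~ "57%" of SpA patients.
n_lb, multi_lb = 54, 14
n_spa, multi_spa = 54, 31

p1, p2 = multi_lb / n_lb, multi_spa / n_spa
pooled = (multi_lb + multi_spa) / (n_lb + n_spa)
se = sqrt(pooled * (1 - pooled) * (1 / n_lb + 1 / n_spa))
z = (p2 - p1) / se   # z > 1.96 -> difference significant at the 5% level
```

Under these assumed counts the z statistic is well above 1.96, consistent with the abstract's claim of a significantly larger multiple-response proportion in the SpA group.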

Keywords: diagnostics, immune dysfunction, polymicrobial, TICK-TAG

Procedia PDF Downloads 316
14709 A Fundamental Study for Real-Time Safety Evaluation System of Landing Pier Using FBG Sensor

Authors: Heungsu Lee, Youngseok Kim, Jonghwa Yi, Chul Park

Abstract:

A landing pier is subject to safety assessment through visual inspection and design data, but it is difficult to check for damage in real time. In this study, real-time damage detection and safety evaluation methods were investigated. Structural analysis of an arbitrary landing pier structure showed that the inflection points of deformation and moment occur at 10%, 50%, and 90% of the pile length. The critical value for the fiber Bragg grating (FBG) sensor was set according to the safety factor, and an FBG sensor application method for real-time safety evaluation was derived.
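A minimal sketch of such a criterion, comparing FBG strain readings at the 10%, 50%, and 90% pile-length positions against a safety-factor-derived critical value, is shown below; the material limit, safety factor, and readings are invented for illustration:

```python
# Illustrative values: strain limit of the pile material and a safety factor.
YIELD_STRAIN_UE = 1800.0     # microstrain (assumed material limit)
SAFETY_FACTOR = 2.0
critical = YIELD_STRAIN_UE / SAFETY_FACTOR   # alarm threshold = 900 ue

# FBG readings (microstrain) at sensors placed at the inflection points
# identified by the structural analysis (values are invented).
readings = {"0.1L": 350.0, "0.5L": 950.0, "0.9L": 420.0}

def evaluate(readings, critical):
    """Per-sensor status: 'ALARM' if the reading exceeds the critical
    value derived from the safety factor, else 'OK'."""
    return {pos: ("ALARM" if eps > critical else "OK")
            for pos, eps in readings.items()}

status = evaluate(readings, critical)
```

Because FBG interrogators stream wavelength shifts continuously, a threshold rule of this form can run in real time, unlike periodic visual inspection.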

Keywords: FBG sensor, harbor structure, maintenance, safety evaluation system

Procedia PDF Downloads 201
14708 Prediction of Formation Pressure Using Artificial Intelligence Techniques

Authors: Abdulmalek Ahmed

Abstract:

Formation pressure is the main factor affecting the economics and efficiency of drilling operations. Knowing the pore pressure and the parameters that affect it helps to reduce the cost of the drilling process. Many empirical models reported in the literature calculate formation pressure from different parameters: some use only drilling parameters to estimate pore pressure, while others predict it from log data. All of these models require an assumed trend, normal or abnormal, to predict pore pressure. Few researchers have applied artificial intelligence (AI) techniques to predict formation pressure, and then with at most one or two AI methods. The objective of this research is to predict pore pressure from both drilling parameters and log data, namely weight on bit, rotary speed, rate of penetration, mud weight, bulk density, porosity, and delta sonic time. Real field data are used to predict formation pressure with five different AI methods: artificial neural networks (ANN), radial basis functions (RBF), fuzzy logic (FL), support vector machines (SVM), and functional networks (FN). All AI tools were compared with different empirical models. The AI methods estimated formation pressure with high accuracy (high correlation coefficient and low average absolute percentage error) and outperformed all previous models. The advantage of the new technique is its simplicity: it estimates pore pressure without needing assumed trends, unlike other models, which require two different trends (normal or abnormal pressure). Moreover, comparing the AI tools with each other indicates that SVM has the advantage in pore pressure prediction through its fast processing speed and high performance (a correlation coefficient of 0.997 and an average absolute percentage error of 0.14%). Finally, a new empirical correlation for formation pressure was developed using the ANN method that estimates pore pressure with high precision (a correlation coefficient of 0.998 and an average absolute percentage error of 0.17%).
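The two headline metrics, correlation coefficient (R) and average absolute percentage error (AAPE), can be computed as sketched below; a plain least-squares fit by gradient descent on invented synthetic data stands in for the paper's AI models and field data:

```python
# Fit pressure = w1*x1 + w2*x2 + b on synthetic data, then score it with
# R and AAPE. Inputs x1, x2 and the linear ground truth are invented
# stand-ins for the seven drilling/log parameters.
x1 = list(range(1, 11))
x2 = [(7 * i) % 10 for i in range(1, 11)]
y = [2 * a + 3 * c + 5 for a, c in zip(x1, x2)]

def zscore(v):
    m = sum(v) / len(v)
    s = (sum((t - m) ** 2 for t in v) / len(v)) ** 0.5
    return [(t - m) / s for t in v]

z1, z2 = zscore(x1), zscore(x2)
w1 = w2 = b = 0.0
n = len(y)
for _ in range(5000):                       # plain gradient descent on MSE
    pred = [w1 * a + w2 * c + b for a, c in zip(z1, z2)]
    err = [p - t for p, t in zip(pred, y)]
    w1 -= 0.1 * 2 / n * sum(e * a for e, a in zip(err, z1))
    w2 -= 0.1 * 2 / n * sum(e * c for e, c in zip(err, z2))
    b -= 0.1 * 2 / n * sum(err)

pred = [w1 * a + w2 * c + b for a, c in zip(z1, z2)]
aape = 100 / n * sum(abs(p - t) / t for p, t in zip(pred, y))

my, mp = sum(y) / n, sum(pred) / n
r = (sum((t - my) * (p - mp) for t, p in zip(y, pred))
     / (sum((t - my) ** 2 for t in y) ** 0.5
        * sum((p - mp) ** 2 for p in pred) ** 0.5))
```

On this noiseless linear toy problem both metrics are near their ideal values; the point is only the metric definitions, since the paper's R = 0.997 and AAPE = 0.14% come from SVM on real field data.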

Keywords: artificial intelligence (AI), formation pressure, artificial neural networks (ANN), fuzzy logic (FL), support vector machine (SVM), functional networks (FN), radial basis function (RBF)

Procedia PDF Downloads 143
14707 Electret: A Solution of Partial Discharge in High Voltage Applications

Authors: Farhina Haque, Chanyeop Park

Abstract:

The high efficiency, high field, and high power density provided by wide-bandgap (WBG) semiconductors and advanced power electronic converter (PEC) topologies have enabled the dynamic control of power in medium- to high-voltage systems. Although WBG semiconductors outperform conventional silicon-based devices in voltage rating, switching speed, and efficiency, the increased voltage-handling capability, high dv/dt, and compact device packaging increase local electric fields, which are the main causes of partial discharge (PD) in advanced medium- and high-voltage applications. PD, which occurs actively in voids, triple points, and air gaps, is an unavoidable dielectric challenge that causes insulation and device aging; the aging process accelerates over time and eventually leads to complete failure of the application. Hence, it is critical to mitigate PD. Sharp edges, air gaps, triple points, and bubbles are common defects in any medium- to high-voltage device. They are created during device manufacturing and are prone to PD induced by high electric fields because of the low permittivity and low breakdown strength of the gaseous medium filling them. This study introduces a contemporary approach to mitigating PD in high-power-density applications by neutralizing the local electric field. To neutralize the locally enhanced electric fields around triple points, air gaps, sharp edges, and bubbles, electrets are developed and incorporated into high-voltage applications. Electrets are dielectric materials that emit electric fields, carrying embedded electrical charge on their surface and in the bulk. In this study, electrets are fabricated by electrically charging polyvinylidene difluoride (PVDF) films using the widely used triode corona discharge method. To investigate the PD mitigation performance of the fabricated electret films, a series of PD experiments was conducted on both charged and uncharged PVDF films under square voltage stimuli representing PWM waveforms. In addition to single-layer electrets, multiple layers of electrets were tested to mitigate PD caused by higher system voltages. The electret-based approach shows great promise in mitigating PD by neutralizing the local electric field, and the PD measurements suggest that an ultimate solution to this decades-long dielectric challenge may be possible with further development of the electret fabrication process.

Keywords: electrets, high power density, partial discharge, triode corona discharge

Procedia PDF Downloads 196
14706 Fault Tolerant Control System Using a Multiple Time Scale SMC Technique and a Geometric Approach

Authors: Ghodbane Azeddine, Saad Maarouf, Boland Jean-Francois, Thibeault Claude

Abstract:

This paper proposes a new design for an active fault-tolerant flight control system against abrupt actuator faults. The overall system combines a multiple-time-scale sliding mode controller for fault compensation with a geometric approach for fault detection and diagnosis. It is able to accommodate several kinds of partial and total actuator failures using the available healthy redundant actuators. The system first estimates the fault information using the geometric approach; based on this estimate, a new reconfigurable control law designed with the multiple-time-scale sliding mode technique compensates on-line for the effect of the faults. The approach takes advantage of the significant separation between the time scales of the slow and fast aircraft states. The closed-loop stability of the overall system is proved using a Lyapunov technique. A case study on the nonlinear model of the F-16 fighter, subject to total loss of rudder control, confirms the effectiveness of the proposed approach.
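The sliding-mode stability argument invoked here typically has the following generic form (a sketch; the paper's exact surfaces and gains are not given). For a tracking error $e$, a sliding surface is defined and a reachability condition is enforced through a Lyapunov function:

```latex
% Generic sliding surface on the tracking error e
s = \dot{e} + \lambda e, \qquad \lambda > 0

% Lyapunov function and reachability condition guaranteeing
% finite-time convergence to the surface s = 0
V = \tfrac{1}{2} s^2, \qquad
\dot{V} = s \, \dot{s} \le -\eta \, |s|, \quad \eta > 0
```

Once on the surface $s = 0$, the error obeys $\dot{e} = -\lambda e$ and decays exponentially; in the multiple-time-scale variant, separate surfaces of this form are designed for the slow and fast state subsets, with the fast surface reached first.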

Keywords: actuator faults, fault detection and diagnosis, fault tolerant flight control, sliding mode control, multiple time scale approximation, geometric approach for fault reconstruction, Lyapunov stability

Procedia PDF Downloads 362
14705 Involving Participants at the Methodological Design Stage: The Group Repertory Grid Approach

Authors: Art Tsang

Abstract:

In educational research, the scope of investigations has almost always been determined by researchers. As learners are at the forefront of education, it is essential to balance researchers’ and learners’ voices in educational studies. In this paper, a data collection method that helps partly address the dearth of learners’ voices in research design is introduced. Inspired by the repertory grid approach (RGA), the group RGA approach, created by the author and his doctoral student, was successfully piloted with learners in Hong Kong. This method will very likely be of interest and use to many researchers, teachers, and postgraduate students in the field of education and beyond.

Keywords: education, learners, repertory grids, research methods

Procedia PDF Downloads 50
14704 Quantum Information Scrambling and Quantum Chaos in Silicon-Based Fermi-Hubbard Quantum Dot Arrays

Authors: Nikolaos Petropoulos, Elena Blokhina, Andrii Sokolov, Andrii Semenov, Panagiotis Giounanlis, Xutong Wu, Dmytro Mishagli, Eugene Koskin, Robert Bogdan Staszewski, Dirk Leipold

Abstract:

We investigate entanglement and quantum information scrambling (QIS) in a many-body extended and spinless effective Fermi-Hubbard model (EFHM and e-FHM, respectively) that describes a special type of quantum dot array provided by Equal1 Labs' silicon-based quantum computer. The concept of QIS is used in the framework of quantum information processing by quantum circuits and quantum channels. In general, QIS manifests as the delocalization of quantum information over the entire quantum system; more compactly, information about the input cannot be obtained by local measurements of the output of the quantum system. We first introduce the concept of quantum information scrambling and its connection with 4-point out-of-time-order (OTO) correlators. To obtain a quantitative measure of QIS, we use the tripartite mutual information, along the lines of previous works, which measures the mutual information between four different spacetime partitions of the system, and we apply it to the transverse-field Ising (TFI) model to quantify the dynamical spreading of quantum entanglement and information. We then investigate scrambling in the quantum many-body extended Hubbard model with external magnetic field Bz and spin-spin coupling J for both uniform and thermal quantum-channel inputs, and show that it scrambles for specific external tuning parameters (e.g., tunneling amplitudes, on-site potentials, magnetic field). In addition, we compare different Hilbert-space sizes (different numbers of qubits) and show the qualitative and quantitative differences in quantum scrambling as the number of quantum degrees of freedom in the system increases. Moreover, in the thermal case we find a "scrambling phase transition" at a threshold temperature, that is, the temperature at which the channel starts to scramble quantum information. Finally, we make comparisons to the TFI model, highlight the key physical differences between the two systems, and mention some future directions of research.

Keywords: condensed matter physics, quantum computing, quantum information theory, quantum physics

Procedia PDF Downloads 84
14703 A Mathematical Equation to Calculate Stock Price of Different Growth Model

Authors: Weiping Liu

Abstract:

This paper presents an equation to calculate stock prices for different growth models. The equation is derived mathematically using the discounted cash flow method. It has the advantages of being very easy to use and very accurate, and it remains usable even when the first stage is lengthy. The equation is more general because it covers all three popular stock price models, and it can be programmed into a financial calculator or an electronic spreadsheet. In addition, it can be extended to a multistage model, making it more versatile and efficient than the traditional methods.
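To illustrate the discounted-cash-flow reasoning behind such equations, here is a generic two-stage dividend-discount sketch: dividends grow at `g1` for `n` years and at `g2` thereafter (Gordon growth terminal value). This is a standard textbook identity, not the closed form derived in the paper; `d0`, `g1`, `g2`, `r`, and `n` are hypothetical inputs.

```python
def two_stage_price(d0, g1, n, g2, r):
    """Price = PV of dividends growing at g1 for n years, plus the PV of a
    Gordon-growth terminal value growing at g2 forever after."""
    if r <= g2:
        raise ValueError("discount rate must exceed terminal growth rate")
    # Stage 1: discount each of the first n dividends individually.
    pv_stage1 = sum(d0 * (1 + g1) ** t / (1 + r) ** t for t in range(1, n + 1))
    # Stage 2: Gordon growth model applied to the first stage-2 dividend,
    # then discounted back n years.
    d_terminal = d0 * (1 + g1) ** n * (1 + g2)
    terminal_value = d_terminal / (r - g2)
    return pv_stage1 + terminal_value / (1 + r) ** n
```

A useful sanity check: when `g1 == g2`, the two-stage price collapses to the single-stage Gordon model `d0 * (1 + g) / (r - g)` regardless of `n`, which is the kind of special-case consistency a generalized equation must satisfy.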

Keywords: stock price, multistage model, different growth model, discounted cash flow method

Procedia PDF Downloads 394
14702 Adaptive Process Monitoring for Time-Varying Situations Using Statistical Learning Algorithms

Authors: Seulki Lee, Seoung Bum Kim

Abstract:

Statistical process control (SPC) is a practical and effective method for quality control. The most important and widely used technique in SPC is the control chart, whose main goal is to detect any assignable changes that affect the quality output. Most conventional control charts, such as Hotelling's T² chart, are based on the assumption that the quality characteristics follow a multivariate normal distribution. However, modern complicated manufacturing systems require control chart techniques that can efficiently handle nonnormal processes. To overcome the shortcomings of conventional control charts for nonnormal processes, several methods have been proposed that combine statistical learning algorithms with multivariate control charts. Statistical learning-based control charts, such as support vector data description (SVDD)-based charts and k-nearest-neighbors-based charts, have proven their improved performance in nonnormal situations compared with the T² chart. Besides nonnormality, time-varying operations are also quite common in real manufacturing fields because of factors such as product and set-point changes, seasonal variations, catalyst degradation, and sensor drifting. However, traditional control charts cannot accommodate future condition changes of the process because they are formulated from the data recorded in the early stage of the process. In the present paper, we propose an SVDD-based control chart capable of adaptively monitoring time-varying and nonnormal processes. We reformulate the SVDD algorithm into a time-adaptive SVDD algorithm by adding a weighting factor that reflects time-varying situations, and we define an updating region for an efficient model-updating structure of the control chart. The proposed control chart simultaneously allows efficient model updates and timely detection of out-of-control signals. The effectiveness and applicability of the proposed chart were demonstrated through experiments with simulated data and real data from the metal-frame process in mobile-device manufacturing.
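The core idea of time-weighting a monitoring boundary can be sketched in one dimension. The toy below is only a loose analogue of the paper's time-adaptive SVDD: a real SVDD solves a quadratic program for a minimal enclosing hypersphere, whereas here the boundary is just an exponentially weighted mean plus a 3-sigma-style radius. The forgetting factor `lam` and the radius multiplier are illustrative assumptions.

```python
import math

def weighted_center_radius(history, lam=0.1):
    """Exponentially down-weight old observations so the monitoring
    boundary tracks a time-varying process. Toy analogue of the
    time-adaptive SVDD weighting idea, not the paper's formulation."""
    n = len(history)
    weights = [math.exp(-lam * (n - 1 - t)) for t in range(n)]  # newest weight = 1
    wsum = sum(weights)
    center = sum(w * x for w, x in zip(weights, history)) / wsum
    var = sum(w * (x - center) ** 2 for w, x in zip(weights, history)) / wsum
    radius = 3.0 * math.sqrt(var)   # 3-sigma-style boundary, an assumption
    return center, radius

def out_of_control(x, center, radius):
    """Signal when a new observation falls outside the adaptive boundary."""
    return abs(x - center) > radius
```

Because recent points dominate the weighted center, a slow drift moves the boundary with the process instead of flagging every drifted point, which is the behavior a chart built from early-stage data alone cannot provide.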

Keywords: multivariate control chart, nonparametric method, support vector data description, time-varying process

Procedia PDF Downloads 290
14701 Voices of the Grown-Ups: Transnational Rearing among Chinese Families

Authors: Laura Lamas Abraira

Abstract:

Large-scale Chinese immigration to Spain emerged in the 1980s. Engaged in their own businesses or working long hours for other Chinese migrants, young couples had to choose between contracting out the care labour and transnationalising it, as they were unable to combine productive and reproductive tasks. In most cases, they decided to transnationalise the care labour, embodied in the migratory paths of grandparents or children: either the grandparents went to Spain to take care of their grandchildren, or the children were left behind or sent to China after being born in Spain in order to be raised by their extended family members. Very little is known about how people who were raised in a transnational context relate their own experience and agency as care managers within the family care cycle. To fill this gap, this paper inquires into the narratives of these transnationally reared Chinese young adults about their own experiences and expectations (past, present, and future), adopting the care circulation and care cycle approach within a life-course framework. Drawing upon a qualitative study based on a multi-sited ethnography (Spain-China), we argue that young adults raised in a transnational context build their narratives through a process of othering their parents and an essentialisation of their Chinese roots, to be used selectively across different contexts. In doing so, these family narratives become part of a social identity that interacts with other dimensions, such as ethnicity. We suggest that in building their parents' otherness they also build their sameness among peers, as members of the same club, marked by transnational care on a double time basis: their parents' practices as a wrong past, and their own as an amendable future.

Keywords: Chinese families, narratives, transnational care, young adults

Procedia PDF Downloads 374
14700 Approach Based on Fuzzy C-Means for Band Selection in Hyperspectral Images

Authors: Diego Saqui, José H. Saito, José R. Campos, Lúcio A. de C. Jorge

Abstract:

Hyperspectral images and remote sensing are important for many applications. A problem in the use of these images is the high volume of data to be processed, stored, and transferred. Dimensionality reduction techniques can be used to reduce this volume. In this paper, an approach to band selection based on clustering algorithms is presented. The proposed structure is based on the Fuzzy C-Means (or K-Means) and NWHFC algorithms. Attributes new in relation to other studies in the literature, such as kurtosis and low correlation, are also considered. The results of the approach using Fuzzy C-Means and K-Means with different attributes are compared. Both algorithms show similarly good results, particularly when the variance and kurtosis attributes are used in the clustering process, and both are applicable to hyperspectral images.
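The clustering-based selection idea can be sketched as follows: describe each band by a small feature vector (e.g. variance and kurtosis), cluster the bands, and keep one representative band per cluster. The sketch below uses plain K-Means only; the paper's structure also involves Fuzzy C-Means and NWHFC, and the feature values here are made up for illustration.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain K-Means on per-band feature vectors (e.g. variance, kurtosis)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Recompute centroids; keep the old center if a cluster empties.
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers

def select_bands(features, k):
    """Keep one representative band per cluster: the band whose feature
    vector lies closest to its cluster centre."""
    centers = kmeans(features, k)
    selected = []
    for c in centers:
        best = min(range(len(features)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(features[i], c)))
        selected.append(best)
    return sorted(set(selected))
```

Replacing the hard assignment with fuzzy membership degrees would give the Fuzzy C-Means variant; the selection step (nearest band to each centroid) stays the same.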

Keywords: band selection, fuzzy c-means, k-means, hyperspectral image

Procedia PDF Downloads 388
14699 Communicative Strategies in Colombian Political Speech: On the Example of the Speeches of Francia Marquez

Authors: Danila Arbuzov

Abstract:

In this article, the author examines the communicative strategies used in Colombian political discourse, taking as an example the speeches of the Vice President of Colombia, Francia Marquez, who took office in 2022 and marked a new development vector for the Colombian nation. The lexical and syntactic means used to achieve the communicative objectives are analyzed. The material presented may be useful to those interested in various aspects of discursive linguistics, particularly political discourse, as well as in the implementation of communicative strategies in certain types of discourse.

Keywords: political discourse, communication strategies, Colombian political discourse, Colombia, manipulation

Procedia PDF Downloads 97
14698 PID Control of Quad-Rotor Unmanned Vehicle Based on Lagrange Approach Modelling

Authors: A. Benbouali, H. Saidi, A. Derrouazin, T. Bessaad

Abstract:

Aerial robotics is a very exciting research field dealing with a variety of subjects, including attitude control. This paper deals with the control of a four-rotor vertical take-off and landing (VTOL) unmanned aerial vehicle. It presents a mathematical model based on the Lagrange approach for the flight control of an autonomous quad-rotor and describes the controller architecture, which is based on PID regulators. The control method has been simulated in closed loop in different situations, and all the calculation stages and simulation results are detailed.
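A textbook discrete PID regulator of the kind used per axis in such architectures can be sketched as below. The gains and the first-order toy plant are illustrative assumptions, not the tuned values or the Lagrange-derived quad-rotor model from the paper.

```python
class PID:
    """Textbook discrete PID regulator (parallel form)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                     # accumulate I term
        derivative = (error - self.prev_error) / self.dt     # backward difference
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def run(steps=3000, dt=0.01):
    """Regulate a toy first-order attitude model theta' = u to a 1-rad setpoint."""
    pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=dt)
    theta = 0.0
    for _ in range(steps):
        u = pid.step(1.0, theta)
        theta += u * dt
    return theta
```

In a full quad-rotor controller, one such loop per attitude angle (roll, pitch, yaw) and altitude would feed the mixer that allocates thrust to the four rotors.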

Keywords: quad-rotor, Lagrange approach, proportional integral derivative (PID) controller, Matlab/Simulink

Procedia PDF Downloads 386
14697 Understanding Retail Benefits Trade-offs of Dynamic Expiration Dates (DED) Associated with Food Waste

Authors: Junzhang Wu, Yifeng Zou, Alessandro Manzardo, Antonio Scipioni

Abstract:

Dynamic expiration dates (DEDs) play an essential role in reducing food waste in the context of a sustainable cold chain and food system. However, the trade-offs in retail benefits when setting an expiration date on fresh food products are unknown. This study develops a multi-dimensional decision-making model that integrates DEDs with food waste based on wireless sensor network technology. The model takes the initial quality of fresh food and the rate of change of food quality with storage temperature as cross-independent variables to identify the potential impacts on food waste in retail of applying a DEDs system. The results show that the retail benefits of the DEDs system depend on the scenario, despite its advanced technology. In the DEDs system, the storage temperature of the retail shelf has the largest effect on the food waste rate, followed by the rate of change of food quality and the initial quality of the food products. We found that the DEDs system can reduce food waste when food products are stored in lower-temperature areas. Moreover, the potential for food savings over an extended replenishment cycle is significantly greater than with fixed expiration dates (FEDs). On the other hand, the information-sharing approach of the DEDs system is relatively limited in improving the sustainability assessment performance of food waste in retail and can even mislead consumers' choices. The research provides a comprehensive understanding to support the techno-economic choice of DEDs in relation to food waste in retail.
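A minimal sketch of how a sensed shelf temperature could translate into a dynamic date: scale a reference shelf life by a Q10 kinetic rule, under which the quality-loss rate multiplies by `q10` for every 10 °C rise. This is a common textbook model for temperature-dependent food quality decay, used here as an assumed stand-in for the paper's decision model; all parameter values are illustrative.

```python
def dynamic_expiration_days(shelf_life_ref, temp_ref, temp_actual, q10=2.0):
    """Remaining shelf life at temp_actual, given a reference shelf life
    measured at temp_ref, under a Q10 quality-decay rule."""
    rate_ratio = q10 ** ((temp_actual - temp_ref) / 10.0)  # relative decay rate
    return shelf_life_ref / rate_ratio
```

For example, a product rated for 10 days at 4 °C would be dated for only 5 days if the shelf sensor reads 14 °C, and for longer than 10 days on a colder shelf, which is consistent with the finding above that lower-temperature storage is where a DEDs system reduces waste.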

Keywords: dynamic expiry dates (DEDs), food waste, retail benefits, fixed expiration dates (FEDs)

Procedia PDF Downloads 102
14696 Design of Reinforced Concrete (RC) Walls Considering Shear Amplification by Nonlinear Dynamic Behavior

Authors: Sunghyun Kim, Hong-Gun Park

Abstract:

In performance-based design (PBD), the actual performance of the structure is evaluated using nonlinear dynamic analysis (NDA). Unlike in frame structures, in wall structures the base shear force obtained from the NDA is greatly amplified compared with that from elastic analysis. This shear amplification forces repeated design iterations, which makes it difficult for designers to apply the PBD. Therefore, this paper studies the factors that affect shear amplification. The NDA was performed for a 20-story wall model, and from the analysis results, a base shear amplification factor is proposed.

Keywords: performance based design, shear amplification factor, nonlinear dynamic analysis, RC shear wall

Procedia PDF Downloads 373
14695 Improved Computational Efficiency of Machine Learning Algorithm Based on Evaluation Metrics to Control the Spread of Coronavirus in the UK

Authors: Swathi Ganesan, Nalinda Somasiri, Rebecca Jeyavadhanam, Gayathri Karthick

Abstract:

The COVID-19 crisis presents a substantial and critical hazard to worldwide health. Since the occurrence of the disease in the UK in late January 2020, the number of people confirmed to have acquired the illness has increased tremendously across the country, and the number of individuals affected is undoubtedly considerably high. The purpose of this research is to develop a predictive machine learning model that can forecast COVID-19 cases within the UK. This study concentrates on statistical data collected from 31st January 2020 to 31st March 2021 in the United Kingdom. Information on total registered COVID cases, daily new cases, total registered deaths, and daily deaths due to Coronavirus was collected from the World Health Organisation (WHO). Data preprocessing was carried out to identify any missing values, outliers, or anomalies in the dataset. The data were split in an 8:2 ratio for training and testing to forecast future new COVID cases. Support Vector Machines (SVM), Random Forests, and linear regression algorithms were chosen to study model performance in the prediction of new COVID-19 cases. The statistical performance of each model in predicting new COVID cases was evaluated using metrics such as the r-squared value and the mean squared error. Random Forest outperformed the other two machine learning algorithms, with a training accuracy of 99.47% and a testing accuracy of 98.26% when n=30. The mean squared error obtained for Random Forest is 4.05e11, lower than that of the other predictive models used in this study. From the experimental analysis, the Random Forest algorithm performs more effectively and efficiently in predicting new COVID cases, which could help the health sector take relevant control measures against the spread of the virus.
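The evaluation pipeline described above, a chronological 8:2 split plus r-squared and mean-squared-error scoring, can be sketched in a few lines. These are the standard metric definitions, not code from the study; the toy series below is a placeholder for the WHO case counts.

```python
def split_80_20(series):
    """Chronological 8:2 train/test split (no shuffling, as befits a time series)."""
    cut = int(len(series) * 0.8)
    return series[:cut], series[cut:]

def mean_squared_error(y_true, y_pred):
    """MSE = average of squared residuals."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

With these definitions, a model whose predictions track the held-out series closely scores r² near 1 and a small MSE, which is how the Random Forest's advantage over SVM and linear regression is ranked in the study.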

Keywords: COVID-19, machine learning, supervised learning, unsupervised learning, linear regression, support vector machine, random forest

Procedia PDF Downloads 113
14694 Comparing SVM and Naïve Bayes Classifier for Automatic Microaneurysm Detections

Authors: A. Sopharak, B. Uyyanonvara, S. Barman

Abstract:

Diabetic retinopathy is characterized by the development of retinal microaneurysms. The damage can be prevented if the disease is treated in its early stages. In this paper, we compare Support Vector Machine (SVM) and Naïve Bayes (NB) classifiers for automatic microaneurysm detection in images acquired through non-dilated pupils, with the nearest-neighbor classifier used as a baseline for comparison. Detected microaneurysms are validated against expert ophthalmologists' hand-drawn ground truths, and the sensitivity, specificity, precision, and accuracy of each method are compared.
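The four comparison metrics named above all derive from the binary confusion matrix. The sketch below gives the generic definitions; the labels (1 = microaneurysm, 0 = background) are illustrative, not the paper's ground-truth data.

```python
def confusion_metrics(y_true, y_pred):
    """Sensitivity, specificity, precision and accuracy from binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "sensitivity": tp / (tp + fn),   # fraction of true lesions detected
        "specificity": tn / (tn + fp),   # fraction of background kept clean
        "precision": tp / (tp + fp),     # fraction of detections that are real
        "accuracy": (tp + tn) / len(y_true),
    }
```

Reporting all four together matters here because microaneurysms are rare: a classifier can reach high accuracy by predicting background everywhere, so sensitivity and precision carry most of the clinical weight.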

Keywords: diabetic retinopathy, microaneurysm, naive Bayes classifier, SVM classifier

Procedia PDF Downloads 317
14693 Impact of Surface Roughness on Light Absorption

Authors: V. Gareyan, Zh. Gevorkian

Abstract:

We study oblique-incidence light absorption in opaque media with rough surfaces. An analytical approach with modified boundary conditions that take the surface roughness of metallic or dielectric films into account is discussed. Our approach reveals interference-linked terms that modify the dependence of absorption on different characteristics, and we discuss the limits within which the approach holds, from the visible to the microwave region. Polarization and angular dependences of roughness-induced absorption are revealed, and the existence of an incident angle or a wavelength at which the absorptance of a rough surface equals that of a flat surface is predicted. Based on this phenomenon, a method of determining the roughness correlation length is suggested.

Keywords: light, absorption, surface, roughness

Procedia PDF Downloads 42