Search results for: adaptive method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19753

17263 Fluorescence Effect of Carbon Dots Modified with Silver Nanoparticles

Authors: Anna Piasek, Anna Szymkiewicz, Gabriela Wiktor, Jolanta Pulit-Prociak, Marcin Banach

Abstract:

Carbon dots (CDs) have great potential for application in many fields of science. They are characterized by fluorescent properties that can be manipulated, and the nanomaterial has many advantages in addition to these unique properties. CDs are easy to obtain, and their surface can be functionalized in a simple way. In addition, a wide range of raw materials can be used for their synthesis; an interesting possibility is the use of numerous waste materials of natural origin. In the research presented here, the synthesis of CDs was carried out according to the principles of Green chemistry. Beet molasses, a natural raw material with a high sugar content, was used, which makes it an excellent high-carbon precursor for obtaining CDs. To increase the fluorescence effect, the surface of the CDs was modified with silver nanoparticles (Ag-CDs). The CDs were obtained by a microwave-assisted hydrothermal method, and the silver nanoparticles were formed via the chemical reduction method. The syntheses were planned using the Design of Experiments (DoE) method, with the concentration of beet molasses, the temperature and the concentration of nanosilver as variable process parameters; these affected the resulting properties and particle parameters. The Ag-CDs were analyzed by UV-Vis spectroscopy. The fluorescence properties were measured, and the appropriate excitation wavelength selected, by spectrofluorimetry. Particle sizes were checked using the DLS method. The influence of the input parameters on the obtained results was also studied.
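
As a rough illustration of how such a factorial DoE plan can be enumerated, the sketch below builds a two-level full-factorial design over the three process parameters named in the abstract; the factor levels are hypothetical placeholders, not the authors' values.

```python
from itertools import product

# Hypothetical two-level full-factorial plan over the three process parameters
# mentioned above; the level values are illustrative assumptions.
molasses_conc = [5, 15]        # % (w/v), assumed
temperature = [120, 180]       # deg C, assumed microwave hydrothermal set points
nanosilver_conc = [10, 50]     # ppm, assumed

design = list(product(molasses_conc, temperature, nanosilver_conc))
for run, (c_mol, temp, c_ag) in enumerate(design, start=1):
    print(f"run {run:2d}: molasses {c_mol}%  temperature {temp} C  nanosilver {c_ag} ppm")
```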

Keywords: fluorescence, modification, nanosilver, molasses, Green chemistry, carbon dots

Procedia PDF Downloads 85
17262 Bioinformatic Approaches in Population Genetics and Phylogenetic Studies

Authors: Masoud Sheidai

Abstract:

Biologists specializing in population genetics and phylogeny face different research tasks, such as assessing populations' genetic variability and divergence, species relatedness, the evolution of genetic and morphological characters, and the identification of DNA SNPs with adaptive potential. To tackle these problems and reach concise conclusions, they must use proper and efficient statistical and bioinformatic methods as well as suitable genetic and morphological characteristics. In recent years, different bioinformatic and statistical methods, based on various well-documented assumptions, have become the proper analytical tools in the hands of researchers. Species delineation is usually carried out with clustering methods such as K-means clustering, based on distance measures appropriate to the studied features of the organisms. A well-defined species is assumed to be separated from other taxa by molecular barcodes. Species relationships are studied using molecular markers, which are analyzed by methods such as multidimensional scaling (MDS) and principal coordinate analysis (PCoA). Population structuring and genetic divergence within species are usually investigated by PCoA and PCA methods and a network diagram, which are based on bootstrapping of the data. The association of genes and DNA sequences with ecological and geographical variables is determined by LFMM (latent factor mixed model) and redundancy analysis (RDA), which are based on Bayesian and distance methods. Molecular and morphological characters that differentiate the studied species may be identified by linear discriminant analysis (DA) and discriminant analysis of principal components (DAPC). We illustrate these methods and the related conclusions with examples from different edible and medicinal plant species.

Keywords: GWAS analysis, K-Means clustering, LFMM, multidimensional scaling, redundancy analysis

Procedia PDF Downloads 126
17261 Lentiviral-Based Novel Bicistronic Therapeutic Vaccine against Chronic Hepatitis B Induces Robust Immune Response

Authors: Mohamad F. Jamiluddin, Emeline Sarry, Ana Bejanariu, Cécile Bauche

Abstract:

Introduction: Over 360 million people are chronically infected with hepatitis B virus (HBV), of whom 1 million die each year from HBV-associated liver cirrhosis or hepatocellular carcinoma. Current treatment options for chronic hepatitis B depend on interferon-α (IFNα) or nucleos(t)ide analogs, which control virus replication but rarely eliminate the virus. Treatment with PEG-IFNα leads to a sustained antiviral response in only one third of patients. After withdrawal of the drugs, a rebound of viremia is observed in the majority of patients. Furthermore, long-term treatment is associated with the appearance of drug-resistant HBV strains, which are often the cause of therapy failure. Among the new therapeutic avenues being developed, therapeutic vaccines aimed at inducing immune responses similar to those found in resolvers are of growing interest. The high prevalence of chronic hepatitis B necessitates the design of better vaccination strategies capable of eliciting a broad spectrum of cell-mediated immunity (CMI) and humoral immune responses that can control chronic hepatitis B. Induction of HBV-specific T cells and B cells by therapeutic vaccination may be an innovative strategy to overcome virus persistence. Lentiviral vectors developed and optimized by THERAVECTYS, due to their ability to transduce non-dividing cells, including dendritic cells, and to induce CMI responses, have demonstrated their effectiveness as vaccination tools. Method: To develop an HBV therapeutic vaccine that can induce a broad but specific immune response, we generated a recombinant lentiviral vector carrying IRES (Internal Ribosome Entry Site)-containing bicistronic constructs that allow the coexpression of two vaccine products, namely an HBV T-cell epitope vaccine and an HBV virus-like particle (VLP) vaccine. The HBV T-cell epitope vaccine consists of an immunodominant cluster of CD4 and CD8 epitopes separated by spacers; the epitopes are derived from the HBV surface protein, HBV core, HBV X and polymerase. The HBV VLP vaccine is an HBV core protein-based chimeric VLP displaying surface protein B-cell epitopes. In order to evaluate immunogenicity, mice were immunized with the lentiviral constructs by intramuscular injection. The T-cell and antibody responses to the two vaccine products were analyzed using an IFN-γ ELISpot assay and ELISA, respectively, to quantify the adaptive response to HBV antigens. Results: Following a single administration in mice, the lentiviral construct elicited robust antigen-specific IFN-γ responses to the encoded antigens. The HBV T-cell epitope vaccine demonstrated significantly higher T-cell immunogenicity than the HBV VLP vaccine. Importantly, we demonstrated by ELISA that antibodies are induced against both the HBV surface protein and the HBV core protein when mice are injected with the vaccine construct (p < 0.05). Conclusion: Our results highlight that THERAVECTYS lentiviral vectors may represent a powerful platform for immunization strategies against chronic hepatitis B. Our data suggest that the lentiviral vector-based bicistronic construct merits further study, in combination with drugs or as a standalone antigen, as a therapeutic lentiviral HBV vaccine. The THERAVECTYS bicistronic HBV vaccine will be further evaluated in animal efficacy studies.

Keywords: chronic hepatitis B, lentiviral vectors, therapeutic vaccine, virus-like particle

Procedia PDF Downloads 335
17260 Analysis of Non-Uniform Characteristics of Small Underwater Targets Based on Clustering

Authors: Tianyang Xu

Abstract:

Small underwater targets generally have a non-centrosymmetric geometry, and under active sonar detection conditions the acoustic scattering field of the target is spatially inhomogeneous. In view of these problems, this paper takes the hemispherical cylindrical shell as the research object, considers the angle continuity implied in the echo characteristics, and proposes a cluster-driven method for studying the non-uniform angular characteristics of the target echo. First, the target echo features are extracted and feature vectors are constructed. Second, the t-SNE algorithm is used to strengthen the internal relationships among the feature vectors in a low-dimensional feature space and to construct a visual feature space. Finally, the implicit angular relationships among the echo features are extracted under unsupervised conditions by cluster analysis. The reconstruction of the local geometric structure of the target corresponding to the different categories shows that the method can effectively divide the angular intervals of the target's local structure according to its natural acoustic scattering characteristics.
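
The processing chain described above (feature vectors, t-SNE embedding, then unsupervised clustering) can be sketched as follows; the feature matrix, embedding settings, and number of clusters are assumptions for illustration, not the paper's sonar data or parameters.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.cluster import KMeans

# Stand-in echo features: one hypothetical 32-dimensional feature vector per
# 1-degree aspect angle. Real features would come from the sonar echoes.
rng = np.random.default_rng(0)
features = rng.standard_normal((360, 32))

# Embed the feature vectors in a low-dimensional space, then cluster without labels.
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(embedding)

# Each cluster groups aspect angles with similar scattering behaviour.
for c in np.unique(labels):
    angles = np.flatnonzero(labels == c)
    print(f"cluster {c}: {angles.size} angles, e.g. {angles[:5]}")
```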

Keywords: underwater target, non-uniform characteristics, cluster-driven method, acoustic scattering characteristics

Procedia PDF Downloads 135
17259 Comparison of Statistical Methods for Estimating Missing Precipitation Data in the River Subbasin Lenguazaque, Colombia

Authors: Miguel Cañon, Darwin Mena, Ivan Cabeza

Abstract:

In this work, the applicability of statistical methods for estimating missing precipitation data was compared and evaluated in the Lenguazaque river subbasin, located in the departments of Cundinamarca and Boyacá, Colombia. The methods used were simple linear regression, distance rate, local averages, mean rates, correlation with nearby stations, and multiple regression. The effectiveness of the methods was determined using three statistical tools: the correlation coefficient (r²), the standard error of estimation, and the Bland-Altman test of agreement. The analysis was performed by randomly removing real rainfall values in each of the seasons and then estimating them with the aforementioned methodologies to complete the missing data. It was determined that, under the conditions considered, the methods with the highest performance and accuracy were multiple regression with three nearby stations and a random application scheme supported by the precipitation behavior of related data sets.
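
A minimal sketch of the multiple-regression gap-filling idea with three nearby stations is given below, using synthetic rainfall series in place of the Colombian gauge records; the station values and the removal scheme are assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic daily rainfall at three nearby gauges and at the target station.
rng = np.random.default_rng(42)
nearby = rng.gamma(shape=2.0, scale=5.0, size=(365, 3))
target = 0.5 * nearby[:, 0] + 0.3 * nearby[:, 1] + 0.2 * nearby[:, 2] + rng.normal(0, 1, 365)

# Randomly remove observations from the target record, as described in the abstract.
missing = rng.choice(365, size=40, replace=False)
observed = np.setdiff1d(np.arange(365), missing)

# Fit the multiple regression on the complete days, then estimate the removed values.
model = LinearRegression().fit(nearby[observed], target[observed])
estimated = model.predict(nearby[missing])

r2 = model.score(nearby[missing], target[missing])              # agreement with removed values
rmse = np.sqrt(np.mean((estimated - target[missing]) ** 2))     # standard error of estimation
print(r2, rmse)
```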

Keywords: statistical comparison, precipitation data, river subbasin, Bland and Altman

Procedia PDF Downloads 469
17258 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control

Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni

Abstract:

An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and reduced situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator and 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO’s active work in front of the human machine interface (HMI). The derived vigilance level and attention focus trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the VAC’s two sub-components.

Keywords: automation, human factors, air traffic controller, MINIMA, OOTL (Out-Of-The-Loop), EEG (Electroencephalography), HMI (Human Machine Interface)

Procedia PDF Downloads 385
17257 Wind Wave Modeling Using MIKE 21 SW Spectral Model

Authors: Pouya Molana, Zeinab Alimohammadi

Abstract:

Determining wind wave characteristics is essential for coastal and marine engineering projects such as designing coastal and marine structures and estimating sediment transport and coastal erosion rates. In order to predict the significant wave height (Hs), this study applies the third-generation spectral wave model MIKE 21 SW along with the CEM model. For calibration and verification of the SW model, two data sets of meteorological and wave measurements are used. The model was driven by time-varying wind forcing, and the results show that the difference ratio mean, the standard deviation of the difference ratio, and the correlation coefficient of the SW model for the Hs parameter are 1.102, 0.279 and 0.983, respectively, whereas the corresponding values for the CEM method are 0.869, 1.317 and 0.8359. Comparing these results reveals that the CEM method has larger errors than the MIKE 21 SW third-generation spectral wave model, and that a higher correlation coefficient does not necessarily mean higher accuracy.
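
The comparison statistics quoted above can be reproduced on any pair of modeled and measured Hs series along the lines of the sketch below; interpreting the difference ratio as modeled/measured, and the sample values, are assumptions rather than definitions taken from the paper.

```python
import numpy as np

def comparison_metrics(modeled_hs, measured_hs):
    """Difference-ratio mean, its standard deviation and the correlation coefficient.
    The difference ratio is assumed here to be modeled/measured."""
    ratio = np.asarray(modeled_hs) / np.asarray(measured_hs)
    corr = np.corrcoef(modeled_hs, measured_hs)[0, 1]
    return ratio.mean(), ratio.std(ddof=1), corr

# Hypothetical buoy measurements and model output of significant wave height (m).
measured = np.array([0.8, 1.1, 1.5, 2.0, 1.7, 1.2])
mike21 = np.array([0.9, 1.2, 1.6, 2.1, 1.9, 1.3])
print(comparison_metrics(mike21, measured))
```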

Keywords: MIKE 21 SW, CEM method, significant wave height, difference ratio

Procedia PDF Downloads 404
17256 Multiple Fusion Based Single Image Dehazing

Authors: Joe Amalraj, M. Arunkumar

Abstract:

Haze is an atmospheric phenomenon that significantly degrades the visibility of outdoor scenes, mainly because atmospheric particles absorb and scatter light. This paper introduces a novel single-image approach that enhances the visibility of such degraded images. The method is a fusion-based strategy that derives two inputs from the original hazy image by applying a white balance and a contrast-enhancing procedure. To blend the information of the derived inputs effectively and preserve the regions with good visibility, their important features are filtered by computing three measures (weight maps): luminance, chromaticity, and saliency. To minimize artifacts introduced by the weight maps, the approach is designed in a multiscale fashion, using a Laplacian pyramid representation. This paper demonstrates the utility and effectiveness of a fusion-based technique for dehazing based on a single degraded image. The method operates in a per-pixel fashion, which is straightforward to implement. The experimental results demonstrate that the method yields results comparable to, and even better than, more complex state-of-the-art techniques, with the advantage of being suitable for real-time applications.
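
The multiscale blending step can be sketched as follows for grayscale float inputs; this is a generic Laplacian-pyramid fusion in the spirit of the description above, not the authors' exact implementation, and the weight maps are assumed to be precomputed from the luminance, chromaticity, and saliency measures.

```python
import cv2
import numpy as np

def gaussian_pyr(img, levels):
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def laplacian_pyr(img, levels):
    gp = gaussian_pyr(img, levels)
    lp = [gp[i] - cv2.pyrUp(gp[i + 1], dstsize=(gp[i].shape[1], gp[i].shape[0]))
          for i in range(levels - 1)]
    lp.append(gp[-1])
    return lp

def multiscale_fusion(inputs, weights, levels=5):
    """Blend the derived inputs with per-pixel weight maps in a Laplacian pyramid.
    inputs and weights are float32 grayscale arrays of the same shape."""
    eps = 1e-12
    norm = sum(weights) + eps
    weights = [w / norm for w in weights]                    # weights sum to 1 per pixel
    fused = [np.zeros_like(p) for p in laplacian_pyr(inputs[0], levels)]
    for img, w in zip(inputs, weights):
        lp = laplacian_pyr(img, levels)
        gw = gaussian_pyr(w, levels)
        fused = [f + l * g for f, l, g in zip(fused, lp, gw)]
    out = fused[-1]
    for lvl in fused[-2::-1]:                                # collapse coarse to fine
        out = cv2.pyrUp(out, dstsize=(lvl.shape[1], lvl.shape[0])) + lvl
    return np.clip(out, 0.0, 1.0)

# Hypothetical usage: img1, img2 are float32 derived inputs in [0, 1]; w1, w2 are
# their combined luminance/chromaticity/saliency weight maps.
# dehazed = multiscale_fusion([img1, img2], [w1, w2])
```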

Keywords: single image de-hazing, outdoor images, enhancing, DSP

Procedia PDF Downloads 410
17255 Hyperspectral Data Classification Algorithm Based on the Deep Belief and Self-Organizing Neural Network

Authors: Li Qingjian, Li Ke, He Chun, Huang Yong

Abstract:

In this paper, a method combining the Pohl Seidman deep belief network with a self-organizing neural network is proposed to classify targets. The method is mainly aimed at the high nonlinearity of hyperspectral images, the high dimensionality of the samples, and the difficulty of designing a classifier. The main features of the original data are extracted by the deep belief network; during feature extraction, labeled samples are added to fine-tune the network and enrich the main characteristics. The extracted feature vectors are then classified by the self-organizing neural network. This method can effectively reduce the dimensionality of the data in the spectral dimension while preserving a large amount of the raw data information, addresses the long training times of traditional clustering and of deep learning algorithms when few labeled samples are available, and improves classification accuracy and robustness. Simulation results show that the proposed network structure achieves higher classification precision when only a small number of labeled samples is available.

Keywords: DBN, SOM, pattern classification, hyperspectral, data compression

Procedia PDF Downloads 342
17254 The Impact of the Indonesian Working Cabinet Volume II Reshuffle on Abnormal Return and Abnormal Trading Activity of Companies Listed in the Jakarta Islamic Index

Authors: Fatin Fadhilah Hasib, Dewi Nuraini, Nisful Laila, Muhammad Madyan

Abstract:

A major political event such as a cabinet reshuffle can affect stock prices positively or negatively, depending on the perception of each investor and potential investor. This study aims to analyze market movements and trading activity with respect to such an event using the event study method, which measures the movement of the stock exchange and the abnormal return that investors can obtain in relation to the event. The study examines differences in abnormal return and trading volume activity of the companies listed in the Jakarta Islamic Index (JII) before and after the announcement of the Working Cabinet Volume II on 27 July 2016. The observation period covered 21 days in total: 10 days before the event, the event day, and 10 days after the event. The method used is an event study with the market-adjusted model, which observes the market reaction to the information content of an announcement or publicized event. The results show that there is no significant negative or positive reaction in abnormal return or abnormal trading volume before and after the announcement of the cabinet reshuffle, as indicated by statistical tests whose values do not exceed the significance level. JII stock prices simply reflect previous prices without incorporating the information regarding the cabinet reshuffle event. It can be concluded that the capital market is efficient in the weak form.
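
A minimal sketch of the market-adjusted abnormal-return computation over a +/-10-day window is given below; the return series are synthetic placeholders rather than JII data, and the pre/post comparison test is an illustrative choice.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Synthetic daily returns over a 21-day observation window centred on the event day.
rng = np.random.default_rng(7)
days = pd.RangeIndex(-10, 11, name="event_day")
market_ret = pd.Series(rng.normal(0.0005, 0.01, len(days)), index=days)   # index return
stock_ret = market_ret + rng.normal(0.0, 0.01, len(days))                 # one constituent

# Market-adjusted model: AR_t = R_t - R_m,t
abnormal_ret = stock_ret - market_ret
before = abnormal_ret.loc[-10:-1]
after = abnormal_ret.loc[1:10]

# Compare the pre- and post-event abnormal returns.
t_stat, p_value = stats.ttest_ind(before, after)
print(abnormal_ret.mean(), t_stat, p_value)
```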

Keywords: abnormal return, abnormal trading volume activity, event study, political event

Procedia PDF Downloads 294
17253 Spread Spectrum with Notch Frequency Using Pulse Coding Method for Switching Converter of Communication Equipment

Authors: Yasunori Kobori, Futoshi Fukaya, Takuya Arafune, Nobukazu Tsukiji, Nobukazu Takai, Haruo Kobayashi

Abstract:

This paper proposes an EMI spread spectrum technique that enables notch frequencies to be set using a pulse coding method for the DC-DC switching converters of communication equipment. With the proposed spread spectrum technique, notches appear in the spectrum of the switching pulses at frequencies obtained from empirically derived equations, using either of two pulse coding methods: PWM (Pulse Width Modulation) coding or PCM (Pulse Cycle Modulation) coding. The technique would be useful for switching converters in communication equipment that receives standard radio waves, so that reception is not affected by noise from the converters. In the proposed technique, the notch frequencies in the spectrum depend on the pulse coding method. We have investigated applying the technique to switching converters and found good agreement between the notch frequencies and the empirical equations. With PWM coding, the notch frequencies are given by F = k/(WL - WS); with PCM coding, they are given by F = k/(TL - TS).
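
Given the empirical form quoted above, candidate notch frequencies can be tabulated as in the sketch below; treating k as a positive integer and the example pulse widths are assumptions, not values from the paper.

```python
def notch_frequencies_pwm(w_long, w_short, k_max=5):
    """Notch frequencies F = k / (WL - WS) for PWM coding, for k = 1..k_max (assumed)."""
    return [k / (w_long - w_short) for k in range(1, k_max + 1)]

def notch_frequencies_pcm(t_long, t_short, k_max=5):
    """Notch frequencies F = k / (TL - TS) for PCM coding, for k = 1..k_max (assumed)."""
    return [k / (t_long - t_short) for k in range(1, k_max + 1)]

# Hypothetical pulse widths of 2.0 us and 1.5 us give notches at 2 MHz, 4 MHz, ...
print(notch_frequencies_pwm(2.0e-6, 1.5e-6))
```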

Keywords: notch frequency, pulse coding, spread spectrum, switching converter

Procedia PDF Downloads 375
17252 [Keynote Talk]: Unlocking Transformational Resilience in the Aftermath of a Flood Disaster: A Case Study from Cumbria

Authors: Kate Crinion, Martin Haran, Stanley McGreal, David McIlhatton

Abstract:

Past research has demonstrated that disasters are continuing to escalate in frequency and magnitude worldwide, representing a key concern for the global community. Understanding and responding to the increasing risk posed by disaster events has become a central task for disaster managers. An emerging trend within the literature acknowledges the need to move beyond a state of coping and reinstatement of the status quo, towards incremental adaptive change and transformational actions for long-term sustainable development. As such, a growing body of research concerns the understanding of the change required to address ever-increasing and unpredictable disaster events. Capturing transformational capacity and resilience, however, is not without its difficulties, which explains the dearth of attempts to capture this capacity. Adopting a case study approach, this research seeks to enhance awareness of transformational resilience by identifying key components and indicators that determine the resilience of flood-affected communities within Cumbria. Grounding and testing a theoretical resilience framework within the case studies permits the identification of how perceptions of risk influence community resilience actions. Further, it assesses how levels of social capital and connectedness impact the extent of interplay between the resources and capacities that drive transformational resilience. Thus, this research seeks to expand the existing body of knowledge by enhancing the awareness of resilience in post-disaster affected communities, investigating indicators of community capacity building and resilience actions that facilitate transformational resilience during the recovery and reconstruction phase of a flood disaster.

Keywords: capacity building, community, flooding, transformational resilience

Procedia PDF Downloads 289
17251 Partially Knowing of Least Support Orthogonal Matching Pursuit (PKLS-OMP) for Recovering Signal

Authors: Israa Sh. Tawfic, Sema Koc Kayhan

Abstract:

Given a large sparse signal, the goal is to reconstruct the signal precisely and accurately from as few measurements as possible. Although theory suggests this is possible, the difficulty lies in building an algorithm that performs the reconstruction accurately and efficiently. This paper proposes a new, proven method to reconstruct sparse signals that merges Least Support Orthogonal Matching Pursuit (LS-OMP) with the theory of partially known support (PSK), giving a new method called Partially Knowing of Least Support Orthogonal Matching Pursuit (PKLS-OMP). The new methods rely on a greedy algorithm to compute the support, which depends on the number of iterations; to make it faster, PKLS-OMP incorporates the idea of partially known support into the algorithm. The method recovers the original signal efficiently, simply, and accurately if the sampling matrix satisfies the Restricted Isometry Property (RIP). Simulation results also show that it outperforms many algorithms, especially for compressible signals.
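
A minimal sketch of orthogonal matching pursuit seeded with a partially known support conveys the idea; this is a simplified stand-in, not the authors' PKLS-OMP algorithm, and the problem sizes are arbitrary.

```python
import numpy as np

def omp_partial_support(A, y, sparsity, known_support=()):
    """Greedy OMP that starts from a partially known support (simplified sketch,
    not the exact PKLS-OMP procedure)."""
    support = list(known_support)
    residual = y.copy()
    if support:
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    while len(support) < sparsity:
        correlations = np.abs(A.T @ residual)
        correlations[support] = 0.0                 # do not reselect chosen atoms
        support.append(int(np.argmax(correlations)))
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

# Toy example: 10-sparse signal, 80 random measurements, 2 support indices known a priori.
rng = np.random.default_rng(0)
n, m, k = 256, 80, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
true_support = rng.choice(n, k, replace=False)
x_true = np.zeros(n)
x_true[true_support] = rng.standard_normal(k)
x_hat = omp_partial_support(A, A @ x_true, k, known_support=true_support[:2])
print(np.linalg.norm(x_hat - x_true))               # reconstruction error
```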

Keywords: compressed sensing, least support orthogonal matching pursuit, partial knowing support, restricted isometry property, signal reconstruction

Procedia PDF Downloads 245
17250 Technology Futures in Global Militaries: A Forecasting Method Using Abstraction Hierarchies

Authors: Mark Andrew

Abstract:

Geopolitical tensions are at a thirty-year high, and the pace of technological innovation is driving asymmetry in force capabilities between nation states and between non-state actors. Technology futures are a vital component of defence capability growth, and investments in technology futures need to be informed by accurate and reliable forecasts of the options for ‘systems of systems’ innovation, development, and deployment. This paper describes a method for forecasting technology futures developed through an analysis of four key systems’ development stages, namely: technology domain categorisation, scanning results examining novel systems’ signals and signs, potential system-of-systems implications in warfare theatres, and political ramifications in terms of funding and development priorities. The method has been applied to several technology domains, including physical systems (e.g., nano weapons, loitering munitions, inflight charging, and hypersonic missiles), biological systems (e.g., molecular virus weaponry, genetic engineering, brain-computer interfaces, and trans-human augmentation), and information systems (e.g., sensor technologies supporting situation awareness, cyber-driven social attacks, and goal-specification challenges to proliferation and alliance testing). Although the current application of the method has been team-centred, using paper-based rapid prototyping and iteration, the application of autonomous language models (such as GPT-3) is anticipated as a next-stage operating platform. The importance of forecasting accuracy and reliability is considered a vital element in guiding technology development to afford stronger contingencies, as ideological changes are forecast to expand threats to ecology and earth systems, possibly eclipsing the traditional vulnerabilities of nation states. The early results from the method will be subjected to ground truthing using longitudinal investigation.

Keywords: forecasting, technology futures, uncertainty, complexity

Procedia PDF Downloads 116
17249 Parental Involvement and Students' Outcomes: A Study in a Special Education School in Singapore

Authors: E. Er, Y. S. Cheng

Abstract:

The role of parents and caregivers in their children’s education is pivotal. Parental involvement (PI) is often associated with a range of student outcomes. This includes academic achievements, socioemotional development, adaptive skills, physical fitness and school attendance. This study is the first in Singapore to (1) explore the relationship between parental involvement and student outcomes; (2) determine the effects of family structure and socioeconomic status (SES) on parental involvement and (3) investigate factors that inform involvement in parents of children with specific developmental disabilities. Approval for the study was obtained from Nanyang Technological University’s Institutional Review Board in Singapore. The revised version of a comprehensive theoretical model on parental involvement was used as the theoretical framework in this study. Parents were recruited from a SPED school in Singapore which caters to school-aged children (7 to 21 years old). Pearson’s product moment correlation, analysis of variance and multiple regression analyses were used as statistical techniques in this study. Results indicate that there are significant associations between parental involvement and educational outcomes in students with developmental disabilities. Next, SES has a significant impact on levels of parental involvement. In addition, parents in the current study reported being more involved at home, in school activities and the community, when teachers specifically requested their involvement. Home-based involvement was also predicted by parents’ perceptions of their time and energy, efficacy and beliefs in supporting their child’s education, as well as their children’s invitations to be more involved. An interesting and counterintuitive inverse relationship was found between general school invitations and parental involvement at home. Research findings are further discussed, and suggestions are put forth to increase involvement for this specific group of parents.

Keywords: autism, developmental disabilities, intellectual disabilities, parental involvement, Singapore

Procedia PDF Downloads 203
17248 [Keynote Speech]: Bridge Damage Detection Using Frequency Response Function

Authors: Ahmed Noor Al-Qayyim

Abstract:

During the past decades, bridge structures have come to be considered very important parts of transportation networks due to fast urban sprawl. Failures of bridges under operating conditions have led to a focus on updating the default bridge inspection methodology. Structural health monitoring (SHM) using the vibration response has appeared as a promising method to evaluate the condition of structures. The rapid development of sensor technology and of condition assessment techniques based on vibration-based damage detection has made SHM an efficient and economical way to assess bridges. SHM is intended to assess the state of designated bridges and to anticipate probable failures. This paper presents the frequency response function method, which uses captured vibration test data to evaluate the condition of a structure. Furthermore, the main steps of bridge assessment using vibration information are presented. The frequency response function method is applied to experimental data from a full-scale bridge.
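
The FRF itself can be estimated from measured excitation and response records with a standard H1 estimator, as in the sketch below; the signals and sampling rate are hypothetical stand-ins for the full-scale bridge data, not the paper's processing chain.

```python
import numpy as np
from scipy import signal

def frf_h1(excitation, response, fs, nperseg=4096):
    """H1 frequency response function estimate, H1(f) = Pxy(f) / Pxx(f)."""
    f, pxx = signal.welch(excitation, fs=fs, nperseg=nperseg)          # input auto-spectrum
    _, pxy = signal.csd(excitation, response, fs=fs, nperseg=nperseg)  # cross-spectrum
    return f, pxy / pxx

# Hypothetical force and accelerometer signals sampled at 1 kHz.
fs = 1000.0
t = np.arange(0, 60, 1 / fs)
force = np.random.randn(t.size)                       # stand-in for measured excitation
impulse = np.exp(-5 * t[:200]) * np.sin(2 * np.pi * 12 * t[:200])
accel = np.convolve(force, impulse, mode="same")      # stand-in for measured response

f, H = frf_h1(force, accel, fs)
peak_freq = f[np.argmax(np.abs(H))]                   # resonance peaks shift when damage occurs
print(peak_freq)
```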

Keywords: bridge assessment, health monitoring, damage detection, frequency response function (FRF), signal processing, structure identification

Procedia PDF Downloads 350
17247 Detection of Autistic Children's Voice Based on Artificial Neural Network

Authors: Royan Dawud Aldian, Endah Purwanti, Soegianto Soelistiono

Abstract:

In this research, we developed an automatic system to classify children's voices as normal or autistic using modern computational technology, namely computation based on an artificial neural network. The superiority of this technology is its capability in processing and storing data. Digital voice features are obtained from linear predictive coding coefficients computed with the autocorrelation method and transformed into the frequency domain using the fast Fourier transform; these are used as the input of an artificial neural network trained with the back-propagation method, so that the difference between normal and autistic children is determined automatically. The back-propagation results show a successful classification rate of 100% for the normal children's voice experiment data and 100% for the autistic children's voice experiment data. The success rate of the back-propagation classification system for the entire test data is 100%.
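
A minimal sketch of the described feature pipeline (LPC by the autocorrelation method, FFT of the coefficients, back-propagation classifier) is given below; the frame length, model order, network size, and random stand-in data are assumptions, not the study's recordings.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from sklearn.neural_network import MLPClassifier

def lpc_autocorrelation(frame, order=12):
    """LPC coefficients via the autocorrelation (Yule-Walker) method."""
    r = np.correlate(frame, frame, mode="full")[frame.size - 1:frame.size + order]
    return solve_toeplitz(r[:order], r[1:order + 1])

def voice_features(frame, order=12, n_fft=64):
    """Frequency-domain representation of the LPC coefficients."""
    a = lpc_autocorrelation(frame, order)
    return np.abs(np.fft.rfft(a, n=n_fft))

# Hypothetical training data: rows of `frames` are fixed-length voice frames,
# labels 0 = normal, 1 = autistic.
rng = np.random.default_rng(0)
frames = rng.standard_normal((40, 400))
labels = np.repeat([0, 1], 20)
X = np.array([voice_features(f) for f in frames])

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, labels)                                   # back-propagation training
print(clf.score(X, labels))
```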

Keywords: autism, artificial neural network, backpropagation, linear predictive coding, fast Fourier transform

Procedia PDF Downloads 461
17246 Flexible Capacitive Sensors Based on Paper Sheets

Authors: Mojtaba Farzaneh, Majid Baghaei Nejad

Abstract:

This article proposes new flexible capacitive tactile sensors based on paper sheets. The method combines the parameters of the sensor's material and dielectric and forms a new model of flexible capacitive sensor. The article presents a practical explanation of the method's application and advantages. With this new method, it is possible to make a more flexible and accurate sensor in comparison with current models. To assess the performance of the model, a common capacitive sensor is simulated, and the proposed model and one of the existing models are evaluated. The results indicate that the proposed model can enhance the speed and accuracy of the tactile sensor and has less error than the current models. Based on the results of this study, it can be claimed that, in comparison with current models, the proposed model provides more flexibility and more accurate output parameters when the sensor is touched, especially in abnormal situations and on uneven surfaces, and increases accuracy and practicality.

Keywords: capacitive sensor, paper sheets, flexible, tactile, uneven

Procedia PDF Downloads 354
17245 Web Search Engine Based Naming Procedure for Independent Topic

Authors: Takahiro Nishigaki, Takashi Onoda

Abstract:

In recent years, the amount of document data has been increasing with the spread of the Internet. Many methods have been studied for extracting topics from large document collections. We proposed Independent Topic Analysis (ITA) to extract topics that are independent of each other from large document data such as newspaper data. ITA extracts the independent topics from the document data by using Independent Component Analysis. A topic produced by ITA is represented by a set of words. However, such a set of words can be quite different from the topic the user imagines. For example, the top five words with high independence for one topic are Topic1 = {"scor", "game", "lead", "quarter", "rebound"}. This Topic 1 is considered to represent the topic "SPORTS", but the topic name "SPORTS" has to be attached by the user; ITA cannot name topics. Therefore, in this research, we propose a method to obtain topic names that are easy for people to understand by applying a web search engine to the sets of words given by Independent Topic Analysis. In particular, we search for a set of topical words and take the title of the web page returned by the search as the topic name. We also apply the proposed method to real data and verify its effectiveness.

Keywords: independent topic analysis, topic extraction, topic naming, web search engine

Procedia PDF Downloads 120
17244 The Analysis of Thermal Conductivity in Porcine Meat Due to Electricity by Finite Element Method

Authors: Orose Rugchati, Sarawut Wattanawongpitak

Abstract:

This research analyzed the thermal conductivity and heat transfer in porcine meat due to the electric current flowing between parallel electrode plates. A hot-boned pork sample was prepared with dimensions of 2 × 1 × 1 cm. The finite element method, with the ANSYS Workbench program, was applied to simulate this heat transfer problem. In the thermal simulation, the input electrothermal energy was calculated from the measured current flowing through the pork and the input voltage from the DC voltage source. A comparison of heat transfer in the pork for two voltage sources, a 30 V DC voltage and a 60 V pulsed DC voltage (50 ms pulse width, 50% duty cycle), was demonstrated. The results show that the thermal conductivity tends to become steady at temperatures of 40 °C and 60 °C, at around 1.39 W/m·°C and 2.65 W/m·°C for the 30 V DC source and the 60 V pulsed DC source, respectively. When the temperature increased to 50 °C at 5 minutes, the color of the porcine meat at the exposure point began to fade. This technique could be used for predicting the thermal conductivity associated with certain meat characteristics.
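
As a rough illustration of the underlying physics, the sketch below solves a 1-D transient heat conduction problem with a uniform Joule-heating source computed from an applied voltage and an assumed measured current; it is a finite-difference toy with assumed material properties, not the paper's 3-D ANSYS Workbench model.

```python
import numpy as np

# 1-D finite-difference toy of Joule heating between parallel electrodes (illustrative only).
L = 0.02                   # sample length between electrodes, m (2 cm, as in the abstract)
n = 41                     # grid nodes
dx = L / (n - 1)
k = 0.5                    # assumed thermal conductivity of meat, W/(m K)
rho, cp = 1060.0, 3500.0   # assumed density (kg/m^3) and specific heat (J/(kg K))
V, I = 30.0, 0.2           # applied DC voltage and an assumed measured current, A
area = 1e-4                # assumed electrode cross-section, m^2 (1 cm x 1 cm)
q = V * I / (area * L)     # volumetric Joule heating, W/m^3, assumed uniform

T = np.full(n, 25.0)                       # initial temperature, deg C
alpha = k / (rho * cp)
dt = 0.2 * dx**2 / alpha                   # stable explicit time step
for step in range(int(300 / dt)):          # simulate 5 minutes
    T_new = T.copy()
    T_new[1:-1] = T[1:-1] + dt * (alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
                                  + q / (rho * cp))
    T_new[0] = T_new[-1] = 25.0            # electrodes held at ambient (assumption)
    T = T_new
print(T.max())                             # peak mid-sample temperature after 5 minutes
```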

Keywords: thermal conductivity, porcine meat, electricity, finite element method

Procedia PDF Downloads 142
17243 Identification, Isolation and Characterization of Unknown Degradation Products of Cefprozil Monohydrate by HPTLC

Authors: Vandana T. Gawande, Kailash G. Bothara, Chandani O. Satija

Abstract:

The present research work aimed to determine the stability of cefprozil monohydrate (CEFZ) under the stress degradation conditions recommended by the International Conference on Harmonization (ICH) guideline Q1A (R2). Forced degradation studies were carried out under hydrolytic, oxidative, photolytic and thermal stress conditions, and the drug was found to be susceptible to degradation under all of them. Separation was carried out using a High Performance Thin Layer Chromatographic (HPTLC) system. Aluminum plates pre-coated with silica gel 60F254 were used as the stationary phase. The mobile phase consisted of ethyl acetate: acetone: methanol: water: glacial acetic acid (7.5:2.5:2.5:1.5:0.5 v/v). Densitometric analysis was carried out at 280 nm. The system gave a compact spot for cefprozil monohydrate (Rf 0.45). The linear regression analysis data showed a good linear relationship in the concentration range of 200-5,000 ng/band for cefprozil monohydrate. Percent recovery for the drug was found to be in the range of 98.78-101.24%. The method was reproducible, with the percent relative standard deviation (%RSD) for intra- and inter-day precision being < 1.5% over the stated concentration range. The method was validated for precision, accuracy, specificity and robustness, and it has been successfully applied to the analysis of the drug in tablet dosage form. Three unknown degradation products formed under the various stress conditions were isolated by preparative HPTLC and characterized by mass spectroscopic studies.

Keywords: cefprozil monohydrate, degradation products, HPTLC, stress study, stability indicating method

Procedia PDF Downloads 299
17242 The Jordanian Traditional Dress of Women as a Form of Cultural Heritage

Authors: Sarah Alkhateeb

Abstract:

This research explores the Jordanian traditional dress of women as a form of cultural heritage. The dress of the Jordanian woman expresses her social and cultural functions and reflects the local environment in its social and cultural frameworks and in the natural determinants of climate and terrain, in addition to expressing the person's social status and position on the social ladder of any society. The traditional dress of Jordanian women is therefore distinguished by its abundance and diversity. Few studies have been conducted on the Jordanian traditional dress of women; this lack of studies needs highlighting, and the characteristics of the dress have to be featured and documented as a part of cultural heritage. The main aim of this research is to contribute to, or develop, a conservation strategy to save this part of cultural heritage from loss. This research will use a qualitative approach and follow the ethnographic method. The data will be gathered from a primary source, a single focus group discussion with the TIRAZ museum team; the Jordanian traditional dress will be explored across three regions, the North, Middle and South of Jordan, investigating the regional differences and focusing on the details of the individual garments.

Keywords: Jordanian traditional dress, cultural heritage, tiraz museum, ethnographic method

Procedia PDF Downloads 168
17241 Performance of Constant Load Feed Machining for Robotic Drilling

Authors: Youji Miyake

Abstract:

In aircraft assembly, a large number of preparatory holes are required for screw and rivet joints. Currently, many holes are drilled manually because it is difficult to machine them using conventional computerized numerical control (CNC) machines. The application of industrial robots to drill the holes has been considered as an alternative to CNC machines. However, the rigidity of robot arms is so low that vibration is likely to occur during drilling. In this study, constant-load feed machining is proposed as a method to perform high-precision drilling while minimizing the thrust force, which is considered to be the cause of the vibration. In this method, the drill feed is realized by a constant load applied to the tool, so that the thrust force is theoretically kept below the applied load. The performance of the proposed method was experimentally examined through deep-hole drilling of plastic and simultaneous drilling of metal/plastic stack plates. It was confirmed that deep-hole drilling and simultaneous drilling could be performed without generating vibration by controlling the tool feed rate within the appropriate range.

Keywords: constant load feed machining, robotic drilling, deep hole, simultaneous drilling

Procedia PDF Downloads 199
17240 Vulnerability of People to Climate Change: Influence of Methods and Computation Approaches on Assessment Outcomes

Authors: Adandé Belarmain Fandohan

Abstract:

Climate change has become a major concern globally, particularly in rural communities that have to find rapid coping solutions. Several vulnerability assessment approaches have been developed in the last decades. This comes with a higher risk that different methods will lead to different conclusions, making comparisons difficult and decision-making inconsistent across areas. The effect of methods and computational approaches on estimates of people's vulnerability was assessed using data collected from the Gambia. Twenty-four indicators reflecting the vulnerability components (exposure, sensitivity, and adaptive capacity) were selected for this purpose. Data were collected through household surveys and key informant interviews; one hundred and fifteen respondents were surveyed across six communities and two administrative districts. Results were compared over three computational approaches: maximum value transformation normalization, z-score transformation normalization, and simple averaging. Regardless of the approach used, communities with high exposure to climate change and extreme events were the most vulnerable. Furthermore, vulnerability was strongly related to the socio-economic characteristics of farmers. The survey evidenced variability in vulnerability among communities and administrative districts. Comparing outputs across approaches, people in the study area were overall found to be highly vulnerable using the simple average and maximum value transformations, whereas they were only moderately vulnerable using the z-score transformation approach. It is suggested that discrepancies induced by the assessment approach be accounted for in international debates, and that assessment approaches be harmonized/standardized, so that outputs are comparable across regions. This would also likely increase the relevance of decision-making for adaptation policies.
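
The three computational approaches can be contrasted on a toy indicator table as sketched below; the indicators, values, and equal-weight aggregation are illustrative assumptions, not the Gambian survey data or the study's exact index construction.

```python
import numpy as np
import pandas as pd

# Hypothetical indicator values for three communities (rows) and three indicators (columns).
indicators = pd.DataFrame({
    "exposure_index":      [0.8, 0.4, 0.6],
    "crop_diversity":      [2, 5, 8],
    "household_income":    [300, 800, 1500],
}, index=["community_A", "community_B", "community_C"])

# 1) maximum value transformation: scale each indicator by its maximum
max_value = indicators / indicators.max()

# 2) z-score transformation: centre and scale each indicator
z_score = (indicators - indicators.mean()) / indicators.std()

# 3) simple averaging: equal-weight aggregation of the indicators per community
vulnerability = pd.DataFrame({
    "max_value_index": max_value.mean(axis=1),
    "z_score_index": z_score.mean(axis=1),
    "simple_average": indicators.mean(axis=1),
})
print(vulnerability)   # the ranking may agree while the vulnerability class differs
```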

Keywords: maximum value transformation, simple averaging, vulnerability assessment, West Africa, z-score transformation

Procedia PDF Downloads 105
17239 A Method to Ease the Military Certification Process by Taking Advantage of Civil Standards in the Scope of Human Factors

Authors: Burcu Uçan

Abstract:

The certification approach differs between civil and military projects in aviation. The sets of criteria and standards created by airworthiness authorities for the determination of the certification basis are distinct. While civil standards are more understandable and clear, because they not only include detailed specifications but also come with guidance materials such as Advisory Circulars, military criteria do not provide this level of guidance. Therefore, specifications that are more negotiable, and sometimes more difficult to reconcile, arise for the certification basis of a military aircraft. This study investigates a method for developing a military specification set by taking advantage of civil standards, with regard to the European Military Airworthiness Criteria (EMACC), which establishes the airworthiness criteria for aircraft systems. Airworthiness Certification Criteria (MIL-HDBK-516C) is a handbook published for guidance that contains qualitative evaluations for military aircraft, whereas Certification Specifications (CS-29) are published for civil aircraft by the European Union Aviation Safety Agency (EASA). The method compares and contrasts the specifications contained in MIL-HDBK-516C and CS-29 within the scope of Human Factors. Human Factors supports human performance and aims to improve system performance by drawing on knowledge from a range of scientific disciplines; it focuses on how people perform their tasks and on reducing the risk of accidents occurring due to human physical and cognitive limitations. Hence, regardless of whether the project is civil or military, the specifications must take human limits into account at a certain level. This study presents an advisory method for this purpose. The method develops a solution for the military certification process by identifying the CS requirement corresponding to each criterion in MIL-HDBK-516C by means of EMACC. This eases understanding of the expectations of the criteria and the establishment of derived requirements. As a result, it may not always be preferable to derive new requirements; instead, it is possible to add remarks that make the expectation of the criteria and the required verification methods more comprehensible for all stakeholders. This study contributes to creating a certification basis for military aircraft, which is difficult and time-consuming for stakeholders to agree upon due to gray areas in the military certification process.

Keywords: human factors, certification, aerospace, requirement

Procedia PDF Downloads 79
17238 An Improved Modular Multilevel Converter Voltage Balancing Approach for Grid Connected PV System

Authors: Safia Bashir, Zulfiqar Memon

Abstract:

During the last decade, renewable energy sources, in particular solar photovoltaics (PV), have gained increased attention. Consequently, various PV converter topologies have emerged. Among these topologies, the modular multilevel converter (MMC) is considered one of the most promising for grid-connected PV systems due to its modularity and transformerless features. When it comes to the safe operation of the MMC, balancing the submodule (SM) voltages plays a critical role. This paper proposes a balancing approach based on space vector PWM (SVPWM). Unlike existing techniques, this method generates the switching vectors for the MMC by using only one SVPWM for the upper arm; the lower arm switching vectors are obtained by taking the complement of the upper arm switching vectors. The use of a single SVPWM not only simplifies the calculation but also helps reduce the circulating current in the MMC. The proposed method is verified through simulation in Matlab/Simulink and compared with other available modulation methods. The results validate the ability of the suggested method to balance the SM capacitor voltages and reduce the circulating current, which will help reduce the power losses of the PV system.

Keywords: capacitor voltage balancing, circulating current, modular multilevel converter, PV system

Procedia PDF Downloads 160
17237 Genetic Algorithm and Multi Criteria Decision Making Approach for Compressive Sensing Based Direction of Arrival Estimation

Authors: Ekin Nurbaş

Abstract:

One of the essential challenges in array signal processing, which has drawn enormous research interest over the past several decades, is estimating the direction of arrival (DOA) of plane waves impinging on an array of sensors. In recent years, Compressive Sensing (CS)-based DoA estimation methods have been proposed, and it has been found that CS-based algorithms achieve significant performance for DoA estimation even in scenarios with multiple coherent sources. On the other hand, the Genetic Algorithm, a method that provides a solution strategy inspired by natural selection, has been applied to sparse representation problems in recent years and provides significant improvements in performance. With all of this in consideration, this paper proposes a method that combines the Genetic Algorithm (GA) and Multi-Criteria Decision Making (MCDM) approaches for Direction of Arrival (DoA) estimation in the Compressive Sensing (CS) framework. In this method, a multi-objective optimization problem is generated by splitting the norm minimization and reconstruction loss minimization parts of the Compressive Sensing algorithm. With the help of the Genetic Algorithm, multiple non-dominated solutions are obtained for the defined multi-objective optimization problem. Among the Pareto-frontier solutions, the final solution is obtained with multiple MCDM methods. Moreover, the performance of the proposed method is compared with CS-based methods in the literature.

Keywords: genetic algorithm, direction of arrival esitmation, multi criteria decision making, compressive sensing

Procedia PDF Downloads 148
17236 Indoor Fingerprint Localization Using 5G NR Multi-SSB Beam Features with GAN-Based Interpolation

Authors: LiRen Kang, LingXia Li, KaiKai Liu, Yue Jin, ZengShan Tian

Abstract:

With the widespread adoption of 5G technology in the Internet of Things (IoT), indoor localization methods based on 5G signals have gradually become a research hotspot. However, traditional methods often perform poorly in environments with multipath interference and signal attenuation. To address these challenges, this paper proposes an innovative fingerprint localization method that utilizes the multiple synchronization signal block (SSB) beam features of 5G signals combined with generative adversarial networks (GANs) for interpolation. Our method incorporates a ray tracing model as an auxiliary tool, integrating signal propagation models to enhance the interpolation process. We precisely extract the multiple SSB beam features from 5G signals, and in the localization stage, a deep neural network (DNN) is used for localization. Field tests show that localization errors of less than 1.5 meters can be achieved in an indoor environment of about 200 square meters. Our method represents a 56.7% improvement compared to traditional methods that use received signal strength (RSS) as a single feature.
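
A minimal sketch of the offline/online fingerprinting idea with a DNN regressor is shown below; the beam-feature layout, network size, and synthetic signal model are assumptions, not the paper's 5G NR measurement setup, and the GAN interpolation and ray-tracing steps are omitted.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Hypothetical fingerprint database: each row holds per-SSB-beam signal levels,
# each target is an (x, y) position inside a roughly 14 m x 14 m (~200 m^2) area.
rng = np.random.default_rng(1)
n_points, n_beams = 500, 8
positions = rng.uniform(0, 14, size=(n_points, 2))
beam_anchors = rng.uniform(0, 14, size=(n_beams, 2))          # assumed beam reference points
fingerprints = (-70 - 2.5 * np.linalg.norm(positions[:, None, :] - beam_anchors, axis=2)
                + rng.normal(0, 1, size=(n_points, n_beams)))  # toy beam-wise levels (dB)

# Offline stage: train the DNN to map multi-beam features to coordinates.
scaler = StandardScaler().fit(fingerprints)
dnn = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=3000, random_state=0)
dnn.fit(scaler.transform(fingerprints), positions)

# Online stage: estimate positions for new measurements and report the error in meters.
query = fingerprints[:10]
estimate = dnn.predict(scaler.transform(query))
error = np.linalg.norm(estimate - positions[:10], axis=1)
print(error.mean())
```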

Keywords: 5G NR, fingerprint localization, generative adversarial networks, Internet of Things, indoor localization systems

Procedia PDF Downloads 8
17235 Wasting Human and Computer Resources

Authors: Mária Csernoch, Piroska Biró

Abstract:

The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This approach has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement: (1) the lack of a definition of correctly edited, formatted documents. Consequently, end-users do not know whether their methods and results are correct or not; they are not aware of their ignorance, and that ignorance does not allow them to realize their lack of knowledge. (2) The end-users’ problem-solving methods. We have found that in non-traditional programming environments, end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, which have proved less effective than deep-approach methods. Based on these findings, we have developed deep-approach methods that are based on, and adapted from, traditional programming languages. In this study, we focus on the most popular type of birotical documents, text-based documents. We have provided a definition of correctly edited text and, based on this definition, adapted the debugging method known in programming. According to the method, before real text editing takes place, a thorough debugging of already existing texts and a categorization of the errors are carried out. With this method, in advance of real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface-approach methods. Its advantages are that proper text handling requires much less human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and data retrieval is much more effective than from error-prone documents.

Keywords: deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources

Procedia PDF Downloads 383
17234 Temperature Investigations in Two Types of Crimped Connections Using Experimental Determinations

Authors: C. F. Ocoleanu, A. I. Dolan, G. Cividjian, S. Teodorescu

Abstract:

In this paper, we carry out temperature investigations of two types of superposed crimped connections using experimental determinations. All samples use eight 7.1 × 3 mm² copper wires crimped by two methods: the first method uses one crimp indent, and the second is a proposed method with two crimp indents. The ferrule is a parallel one. We study the influence of the number and position of the crimp indents. The samples are heated with AC current at different current values until a steady-state heating regime is reached. After obtaining the temperature values, we compare them and present the conclusions.

Keywords: crimped connections, experimental determinations, temperature, heat transfer

Procedia PDF Downloads 270