Search results for: time domain analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 40019

37559 Chemical Fingerprinting of the Ephedrine Pathway to Methamphetamine

Authors: Luke Andrighetto, Paul G. Stevenson, Luke C. Henderson, Jim Pearson, Xavier A. Conlan

Abstract:

As pseudoephedrine, a common ingredient in cold and flu medications, is closely monitored and restricted in Australia, alternative methods of accessing it are of interest. The impurities and by-products of every reaction step of pseudoephedrine/ephedrine and methamphetamine synthesis have been mapped in order to develop a chemical fingerprint based on the synthetic route. Likewise, seized methamphetamine contains a combination of different cutting agents and starting materials. Therefore, in-silico optimised two-dimensional HPLC with DryLab® and OpenMS® software has been used to efficiently separate complex seizure samples. An excellent match between simulated and real separations was observed. Targeted separation of model compounds was completed with significantly reduced method development time. This study produced a two-dimensional separation regime that offers unprecedented separation power (separation space) while maintaining an analysis time faster than those previously reported for gas chromatography, single-dimension high performance liquid chromatography, or capillary electrophoresis.

Keywords: chemical fingerprint, ephedrine, methamphetamine, two-dimensional HPLC

Procedia PDF Downloads 450
37558 Optimization of Quercus cerris Bark Liquefaction

Authors: Luísa P. Cruz-Lopes, Hugo Costa e Silva, Idalina Domingos, José Ferreira, Luís Teixeira de Lemos, Bruno Esteves

Abstract:

The liquefaction of cork-based tree barks has attracted increasing interest due to its potential for innovation in the lumber and wood industries. In this particular study, the bark of Quercus cerris (Turkish oak) is used due to its appreciable amount of cork tissue, although of inferior quality compared to the cork provided by other Quercus trees. This study aims to optimize the conditions of alkaline-catalysed liquefaction with respect to several parameters. To better comprehend the chemical characteristics of Quercus cerris bark, a complete chemical analysis was performed. The liquefaction was carried out in a double-jacketed reactor heated with oil, using glycerol and a glycerol/ethylene glycol mixture as solvents and potassium hydroxide as a catalyst, and varying the temperature, liquefaction time, and granulometry. Due to the low liquefaction efficiency of the first experimental procedures, different washing techniques after the filtration step, using methanol and methanol/water, were also studied. The chemical analysis showed that Quercus cerris bark is mostly composed of suberin (ca. 30%) and lignin (ca. 24%), as well as hemicelluloses insoluble in hot water (ca. 23%). In the liquefaction stage, the highest yields were obtained using a methanol/ethylene glycol mixture as reagent with a time and temperature of 120 minutes and 200 ºC, respectively. Using a granulometry of <80 mesh leads to better results, even though this parameter barely influences the liquefaction efficiency. Regarding the filtration stage, washing the residue with methanol and then distilled water leads to a considerable increase in the final liquefaction percentages, which shows that this procedure is effective at liquefying the suberin content and the lignocellulosic fraction.

Keywords: liquefaction, Quercus cerris, polyalcohol liquefaction, temperature

Procedia PDF Downloads 323
37557 Scheduling Residential Daily Energy Consumption Using Bi-criteria Optimization Methods

Authors: Li-hsing Shih, Tzu-hsun Yen

Abstract:

Because of the long-term commitment to net zero carbon emissions, utility companies include more renewable energy supply, which generates electricity with time and weather restrictions. This leads to time-of-use electricity pricing to reflect the actual cost of energy supply. From an end-user point of view, better residential energy management is needed to incorporate time-of-use prices and assist end users in scheduling their daily use of electricity. This study uses bi-criteria optimization methods to schedule daily energy consumption by minimizing the electricity cost and maximizing the comfort of end users. Unlike most previous research, this study schedules users' activities rather than household appliances, to obtain better measures of users' comfort/satisfaction. The relation between each activity and the use of different appliances can be defined by users. The comfort level is highest when the time and duration of an activity completely meet the user's expectation, and it decreases when they do not. A questionnaire survey was conducted to collect data for establishing regression models that describe users' comfort levels when the execution time and duration of activities differ from user expectations. Six regression models representing the comfort levels for six types of activities were established from the survey responses. A computer program was developed to evaluate the electricity cost and the comfort level for each feasible schedule and then find the non-dominated schedules. The epsilon-constraint method is used to find the optimal schedule among the non-dominated schedules. A hypothetical case is presented to demonstrate the effectiveness of the proposed approach and the computer program. Using the program, users can obtain the optimal schedule of daily energy consumption by inputting the intended time and duration of activities and the given time-of-use electricity prices.
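
As a concrete illustration of the bi-criteria procedure described above, the sketch below enumerates candidate activity start times, computes a cost from time-of-use prices and a toy comfort score, keeps the non-dominated schedules, and then applies the epsilon-constraint step. All activity data, prices, and the comfort function are hypothetical stand-ins for the survey-based regression models and the authors' program.

```python
# Minimal sketch of bi-criteria scheduling with the epsilon-constraint method.
from itertools import product

TOU_PRICE = [0.08] * 7 + [0.20] * 11 + [0.12] * 6   # assumed price per kWh for hours 0-23

# Each activity: preferred start hour, duration (hours), appliance load (kW); all hypothetical.
ACTIVITIES = {
    "laundry": {"pref_start": 20, "duration": 2, "load": 1.5},
    "cooking": {"pref_start": 18, "duration": 1, "load": 2.0},
}

def cost(schedule):
    """Electricity cost of a schedule {activity: start_hour} under the TOU tariff."""
    total = 0.0
    for name, start in schedule.items():
        a = ACTIVITIES[name]
        total += sum(TOU_PRICE[(start + h) % 24] * a["load"] for h in range(a["duration"]))
    return total

def comfort(schedule):
    """Toy comfort model: highest when the start matches the preferred time."""
    return sum(max(0.0, 10.0 - 2.0 * abs(start - ACTIVITIES[name]["pref_start"]))
               for name, start in schedule.items())

# Enumerate feasible schedules and keep the non-dominated (Pareto) ones.
names = list(ACTIVITIES)
candidates = [dict(zip(names, starts)) for starts in product(range(24), repeat=len(names))]
points = [(cost(s), comfort(s), s) for s in candidates]
pareto = [p for p in points
          if not any(q[0] <= p[0] and q[1] >= p[1] and (q[0] < p[0] or q[1] > p[1])
                     for q in points)]

# Epsilon-constraint step: minimise cost subject to a minimum acceptable comfort level.
EPSILON = 15.0
best = min((p for p in pareto if p[1] >= EPSILON), key=lambda p: p[0])
print("cost:", best[0], "comfort:", best[1], "schedule:", best[2])
```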

Keywords: bi-criteria optimization, energy consumption, time-of-use price, scheduling

Procedia PDF Downloads 43
37556 Internal Combustion Engine Fuel Composition Detection by Analysing Vibration Signals Using ANFIS Network

Authors: M. N. Khajavi, S. Nasiri, E. Farokhi, M. R. Bavir

Abstract:

Alcohol fuels are renewable, have low pollution, and have a high octane number; therefore, they are important as fuels in internal combustion engines. Determining the percentage of these alcohols blended with gasoline is a complicated, time-consuming, and expensive process. Nowadays, these measurements are done in equipped laboratories, based on international standards. The aim of this research is to determine the blend percentage of different fuels based on vibration analysis of engine block signals; by doing so, considerable savings in time and cost can be achieved. Five different fuels were prepared, consisting of pure gasoline (G) as the base fuel and blends of this fuel with different percentages of ethanol and methanol. For example, the volumetric combination of pure gasoline with 10 percent ethanol is called E10. By this convention, M10 (10% methanol plus 90% pure gasoline), E30 (30% ethanol plus 70% pure gasoline), and M30 (30% methanol plus 70% pure gasoline) were prepared. To simulate real working conditions for this experiment, the vehicle was mounted on a chassis dynamometer and run at 1900 rpm and a 30 kW load. To measure the engine block vibration, a three-axis accelerometer was mounted between cylinders 2 and 3. After acquisition of the vibration signal, eight time-domain features of these signals were used as inputs to an Adaptive Neuro-Fuzzy Inference System (ANFIS). The designed ANFIS was trained to classify these five different fuels. The results show a suitable classification ability of the designed ANFIS network, with 96.3 percent correct classification.
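
As a hedged sketch of the signal-processing step described above, the snippet below computes eight common time-domain features from one engine-block acceleration record. The abstract does not list which eight features were used, so the set here is an assumption; the resulting feature vectors would then feed the ANFIS classifier.

```python
# Minimal sketch of time-domain feature extraction from a vibration signal.
import numpy as np
from scipy.stats import kurtosis, skew

def time_domain_features(signal: np.ndarray) -> np.ndarray:
    """Return eight illustrative time-domain features of one acceleration record."""
    rms = np.sqrt(np.mean(signal ** 2))
    peak = np.max(np.abs(signal))
    return np.array([
        np.mean(signal),          # mean value
        np.std(signal),           # standard deviation
        rms,                      # root mean square
        peak,                     # peak amplitude
        peak / rms,               # crest factor
        kurtosis(signal),         # kurtosis
        skew(signal),             # skewness
        np.mean(np.abs(signal)),  # average rectified value
    ])

# One feature vector per fuel sample (G, E10, E30, M10, M30) would then be used
# to train the ANFIS classifier described in the abstract.
```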

Keywords: internal combustion engine, vibration signal, fuel composition, classification, ANFIS

Procedia PDF Downloads 388
37555 AI Software Algorithms for Drivers Monitoring within Vehicles Traffic - SiaMOTO

Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu

Abstract:

Creating a personalized statistic for an individual within the population using IT systems, based on the searches and intercepted spheres of interest they manifest, is just one 'atom' of the artificial intelligence analysis network. However, the ability to generate statistics based on individual data intercepted from large demographic areas leads to reasoning like that issued by a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that allows the interception of car events caused by a driver, positioning them in time and space. The device's connection to the vehicle allows the creation of a data source whose analysis can produce psychological and behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers, each with a unique fingerprint in their approach to driving. In this paper, we aimed to explain the software infrastructure of the SiaMOTO system, a system designed to monitor and improve drivers' driving behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.

Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO

Procedia PDF Downloads 67
37554 Comparative Analysis of Two Approaches to Joint Signal Detection, ToA and AoA Estimation in Multi-Element Antenna Arrays

Authors: Olesya Bolkhovskaya, Alexey Davydov, Alexander Maltsev

Abstract:

In this paper, two approaches to joint signal detection, time of arrival (ToA), and angle of arrival (AoA) estimation in a multi-element antenna array are investigated. Two scenarios were considered: in the first, the waveform of the useful signal is known a priori; in the second, the waveform of the desired signal is unknown. For the first scenario, the antenna array signal processing is based on multi-element matched filtering (MF), followed by a non-coherent detection scheme and maximum likelihood (ML) parameter estimation blocks. For the second scenario, the signal processing is based on estimation of the covariance matrix of the antenna array elements, followed by eigenvector analysis and ML parameter estimation blocks. The performance characteristics of both signal processing schemes are thoroughly investigated and compared for different useful signal and noise parameters.
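
For the second (unknown-waveform) scenario, the covariance/eigenvector stage can be sketched as follows: estimate the sample covariance of the array snapshots, separate a noise subspace from its eigenvectors, and scan a spatial spectrum for the AoA. A uniform linear array with half-wavelength spacing is assumed; this is illustrative rather than the authors' exact processing chain.

```python
# Minimal sketch of covariance-based AoA estimation (noise-subspace spectrum).
import numpy as np

def aoa_spectrum(snapshots: np.ndarray, n_sources: int, n_angles: int = 181):
    """snapshots: (n_elements, n_snapshots) complex array output; returns (angles, spectrum)."""
    n_elements = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(R)                      # eigenvalues in ascending order
    noise_subspace = eigvecs[:, : n_elements - n_sources]     # eigenvectors of the smallest eigenvalues
    angles = np.linspace(-90, 90, n_angles)
    spectrum = np.empty(n_angles)
    for i, theta in enumerate(np.deg2rad(angles)):
        # steering vector of a uniform linear array with half-wavelength spacing (assumed geometry)
        a = np.exp(1j * np.pi * np.arange(n_elements) * np.sin(theta))
        spectrum[i] = 1.0 / np.abs(a.conj() @ noise_subspace @ noise_subspace.conj().T @ a)
    return angles, spectrum   # peaks of the spectrum give the AoA estimates
```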

Keywords: antenna array, signal detection, ToA, AoA estimation

Procedia PDF Downloads 478
37553 Titanium-Aluminium Oxide Coating on Aluminized Steel

Authors: Fuyan Sun, Guang Wang, Xueyuan Nie

Abstract:

In this study, a plasma electrolytic oxidation (PEO) process was used to form a titanium-aluminium oxide coating on aluminized steel. The present work mainly studied the effects of the PEO treatment time on the properties of the coating. A potentiodynamic polarization corrosion test was employed to investigate the corrosion resistance of the coating. The friction coefficient and wear resistance of the coating were studied using a pin-on-disc test. The thermal transfer behaviours of uncoated and PEO-coated aluminized steels were also studied. The treatment time of the PEO process significantly influenced the properties of the titanium oxide coating: samples with a longer treatment time performed better in terms of corrosion and wear protection. This paper demonstrates that different treatment times can alter the surface behaviour of the coating material.

Keywords: titanium-aluminum oxide, plasma electrolytic oxidation, corrosion, wear, thermal property

Procedia PDF Downloads 344
37552 A Comparative Study of GTC and PSP Algorithms for Mining Sequential Patterns Embedded in Database with Time Constraints

Authors: Safa Adi

Abstract:

This paper considers the problem of mining sequential patterns embedded in a database while handling the time constraints as defined in the GSP algorithm (a level-wise algorithm). We compare two previous approaches, GTC and PSP, which retain the general principles of GSP. Furthermore, this paper discusses the PG-hybrid algorithm, which combines PSP and GTC. The results show that PSP and GTC are more efficient than GSP; on the other hand, the GTC algorithm performs better than PSP. The PG-hybrid algorithm uses the PSP algorithm for the first two passes over the database and the GTC approach for the following scans. Experiments show that the hybrid approach is very efficient for short, frequent sequences.
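
The time constraints that distinguish GTC and PSP from plain level-wise mining can be illustrated with the simplified check below: it tests whether a candidate sequence occurs in a data-sequence while respecting min-gap and max-gap constraints. The data layout and parameter names are illustrative, and the greedy matching omits the backtracking that a full GSP-style implementation performs.

```python
# Minimal sketch of a GSP-style occurrence check with min-gap/max-gap time constraints.
def occurs_with_time_constraints(candidate, data_sequence, min_gap, max_gap):
    """
    candidate:     list of itemsets, e.g. [{"a"}, {"b", "c"}]
    data_sequence: list of (timestamp, itemset) pairs, sorted by timestamp
    """
    def find_from(idx, itemset, earliest, latest):
        # Find the first transaction at or after position idx whose timestamp lies in
        # [earliest, latest] and whose itemset contains the candidate itemset.
        for i in range(idx, len(data_sequence)):
            t, items = data_sequence[i]
            if t > latest:
                return None
            if t >= earliest and itemset <= items:
                return i, t
        return None

    prev_time = None
    idx = 0
    for itemset in candidate:
        earliest = -float("inf") if prev_time is None else prev_time + min_gap
        latest = float("inf") if prev_time is None else prev_time + max_gap
        hit = find_from(idx, itemset, earliest, latest)
        if hit is None:
            return False
        idx, prev_time = hit[0] + 1, hit[1]
    return True
```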

Keywords: database, GTC algorithm, PSP algorithm, sequential patterns, time constraints

Procedia PDF Downloads 370
37551 A Parametric Investigation into the Free Vibration and Flutter Characteristics of High Aspect Ratio Aircraft Wings Using Polynomial Distributions of Stiffness and Mass Properties

Authors: Ranjan Banerjee, W. D. Gunawardana

Abstract:

Free vibration and flutter analysis plays a major part in aircraft design and is indeed a mandatory requirement. In particular, high aspect ratio transport airliner wings are prone to free vibration and flutter problems that must be addressed during the design process, as demanded by the airworthiness authorities. The purpose of this paper is to carry out a detailed free vibration and flutter analysis for a wide range of high aspect ratio aircraft wings and to generate design curves that provide useful insight and understanding of aircraft design from an aeroelastic perspective. In the initial stage of the investigation, the bending and torsional stiffnesses of a number of transport aircraft wings are critically examined to see whether it is possible to express the stiffness distributions in polynomial form in a sufficiently accurate manner. A similar attempt is made for the mass and mass moment of inertia distributions of the wing. Once the choice of stiffness and mass distributions in polynomial form is made, the high aspect ratio wing is idealised structurally by a series of bending-torsion coupled beams. Then the dynamic stiffness method is applied to compute the natural frequencies and mode shapes of the wing. Next, the wing is idealised aerodynamically; to this end, unsteady aerodynamics of the Theodorsen type is employed to represent the harmonically oscillating wing. Following this step, a normal mode method using generalised coordinates is applied to formulate the flutter problem. In essence, the generalised mass, stiffness, and aerodynamic matrices are combined to obtain the flutter matrix, which is subsequently solved in the complex domain to determine the flutter speed and flutter frequency. In the final stage of the investigation, an exhaustive parametric study is carried out by varying significant wing parameters to generate design curves which help to predict the free vibration and flutter behaviour of high aspect ratio transport aircraft wings in a generic manner. It is in the aeroelastic context of aircraft design that the results are expected to be most useful.
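
The final step described above, combining the generalised mass, stiffness, and aerodynamic matrices into a flutter matrix solved in the complex domain, can be stated compactly as follows; the notation is the standard one for this formulation and is assumed rather than quoted from the paper.

```latex
% Flutter eigenproblem in generalised coordinates q: M, K are the generalised mass
% and stiffness matrices, A(k) the unsteady (Theodorsen-type) aerodynamic matrix,
% rho the air density, V the airspeed, b the wing semi-chord.
\[
\left[\, -\omega^{2}\,\mathbf{M} \;+\; \mathbf{K}
   \;-\; \tfrac{1}{2}\,\rho\, V^{2}\, \mathbf{A}(k) \,\right]\mathbf{q} \;=\; \mathbf{0},
\qquad k = \frac{\omega b}{V},
\]
\[
\det\!\left[\, -\omega^{2}\,\mathbf{M} + \mathbf{K}
   - \tfrac{1}{2}\,\rho\, V^{2}\, \mathbf{A}(k) \,\right] = 0 ,
\]
% solved in the complex domain for the flutter speed V and flutter frequency omega.
```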

Keywords: high-aspect ratio wing, flutter, dynamic stiffness method, free vibration, aeroelasticity

Procedia PDF Downloads 274
37550 Robust Stabilization against Unknown Consensus Network

Authors: Myung-Gon Yoon, Jung-Ho Moon, Tae Kwon Ha

Abstract:

This paper considers a robust stabilization problem for a single agent in a multi-agent consensus system composed of identical agents, when the network topology of the system is completely unknown. It is shown that the transfer function of an agent in a consensus system can be described in the frequency domain as a multiplicative perturbation of the isolated agent transfer function. Applying known robust stabilization results, we present sufficient conditions for robust stabilization of an agent against an unknown network topology.
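
The frequency-domain idea above can be sketched with the standard multiplicative-uncertainty description and the corresponding small-gain-type sufficient condition; the symbols are assumed here and the paper's precise conditions may differ.

```latex
% P(s): isolated agent transfer function; Delta(s): unknown perturbation induced by
% the consensus network; C(s): controller stabilising the isolated agent.
\[
P_{\text{net}}(s) \;=\; P(s)\,\bigl(1 + \Delta(s)\bigr),
\qquad
\lVert \Delta \rVert_{\infty} \;\le\; \gamma ,
\]
\[
\text{robust stability (sufficient condition):}\quad
\left\lVert \frac{P(s)\,C(s)}{1 + P(s)\,C(s)} \right\rVert_{\infty} \;<\; \frac{1}{\gamma}.
\]
```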

Keywords: single agent control, multi-agent system, transfer function, graph angle

Procedia PDF Downloads 437
37549 Ultra-Sensitive and Real Time Detection of ZnO NW Using QCM

Authors: Juneseok You, Kuewhan Jang, Chanho Park, Jaeyeong Choi, Hyunjun Park, Sehyun Shin, Changsoo Han, Sungsoo Na

Abstract:

Nanomaterials can have toxic effects on human beings and ecological systems. Sensors have been developed to detect toxic materials, and standards for toxic materials have been established. Zinc oxide nanowire (ZnO NW) is a known toxic material: when it ionizes in the cell body, cell components are overexposed to Zn ions, which causes critical damage or death. In this paper, we detected ZnO NW in water using a QCM (Quartz Crystal Microbalance) and ssDNA (single-stranded DNA). We achieved a response time of 30 minutes for real-time detection and a limit of detection (LOD) of 100 pg/mL.

Keywords: zinc oxide nanowire, QCM, ssDNA, toxic material, biosensor

Procedia PDF Downloads 414
37548 Crashworthiness Optimization of an Automotive Front Bumper in Composite Material

Authors: S. Boria

Abstract:

In recent years, it has become possible to improve the crashworthiness of an automotive body structure from the very beginning of the design stage, thanks to the development of specific optimization tools. It is well known that finite element codes can help the designer investigate the crash performance of structures under dynamic impact. Therefore, by coupling nonlinear mathematical programming procedures and statistical techniques with FE simulations, it is possible to optimize the design with a reduced number of analytical evaluations. In engineering applications, optimization methods that are based on statistical techniques and utilize estimated models, called meta-models, are quickly spreading. A meta-model is an approximation of a detailed simulation model based on a dataset of inputs identified by the design of experiments (DOE); the number of simulations needed to build it depends on the number of variables. Among the various meta-modeling techniques, the Kriging method appears excellent in accuracy, robustness, and efficiency compared to the others when applied to crashworthiness optimization. Therefore, such a meta-model was used in this work in order to improve the structural optimization of a bumper for a racing car in composite material subjected to frontal impact. The specific energy absorption is the objective function to maximize, and the geometrical parameters, subject to some design constraints, are the design variables. LS-DYNA was interfaced with the LS-OPT tool in order to find the optimized solution through a domain reduction strategy. With the use of the Kriging meta-model, the crashworthiness characteristics of the composite bumper were improved.
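
A minimal sketch of the Kriging meta-model workflow is given below, using a Gaussian process surrogate fitted on a small DOE sample and then searched for the predicted optimum. The design variables, the toy objective, and the grid search are hypothetical stand-ins for the LS-DYNA simulations and the LS-OPT domain-reduction loop.

```python
# Minimal sketch of Kriging (Gaussian process) surrogate optimisation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_simulation(x):
    """Placeholder for an FE crash run returning the specific energy absorption (toy objective)."""
    return -(x[0] - 0.6) ** 2 - (x[1] - 0.3) ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(12, 2))            # DOE sample of two normalised design variables
y = np.array([expensive_simulation(x) for x in X])

# Fit the Kriging meta-model on the DOE data.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)

# Search the meta-model on a dense grid (stands in for the domain-reduction loop).
grid = np.array(np.meshgrid(np.linspace(0, 1, 101), np.linspace(0, 1, 101))).reshape(2, -1).T
x_best = grid[np.argmax(gp.predict(grid))]
print("predicted optimum:", x_best, "(to be verified with a new FE run at this point)")
```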

Keywords: composite material, crashworthiness, finite element analysis, optimization

Procedia PDF Downloads 243
37547 Containment/Penetration Analysis for the Protection of Aircraft Engine External Configuration and Nuclear Power Plant Structures

Authors: Dong Wook Lee, Adrian Mistreanu

Abstract:

The authors have studied a method for analyzing containment and penetration using explicit nonlinear finite element analysis. This method may be used at the concept design stage for the protection of external configurations or components of aircraft engines and nuclear power plant structures. This paper describes the modeling method, the results obtained from it, and the comparison of the results with those calculated by a simple analytical method. It shows that the containment capability obtained by the proposed method matches well with the analytically calculated containment capability.

Keywords: computer aided engineering, containment analysis, finite element analysis, impact analysis, penetration analysis

Procedia PDF Downloads 123
37546 The Use of Stroke Journey Map in Improving Patients' Perceived Knowledge in Acute Stroke Unit

Authors: C. S. Chen, F. Y. Hui, B. S. Farhana, J. De Leon

Abstract:

Introduction: Stroke can lead to long-term disability, affecting one's quality of life. Providing stroke education to patients and family members is essential to optimize stroke recovery and prevent recurrent stroke. Currently, nurses conduct stroke education by handing out pamphlets and explaining their contents to patients. However, this is not always effective, as nurses have varying levels of knowledge and the depth of content discussed with the patient may not be consistent. With the advancement of information technology, health education is increasingly being disseminated via electronic software, and studies have shown this to benefit patients. Hence, a multi-disciplinary team consisting of doctors, nurses, and allied health professionals was formed to create the stroke journey map software to deliver consistent and concise stroke education. Research Objectives: To evaluate the effectiveness of using stroke journey map software in improving patients' perceived knowledge in the acute stroke unit during hospitalization. Methods: Patients admitted to the acute stroke unit were given the stroke journey map software during patient education. The software consists of 31 brightly coloured interactive slides and 4 videos, based on input provided by the multi-disciplinary team. Participants were assessed with pre- and post-survey questionnaires before and after viewing the software. The questionnaire consists of 10 questions with a 5-point Likert scale, which sums to a total score of 50. The inclusion criteria are patients diagnosed with ischemic stroke who are cognitively alert and oriented. This study was conducted between May 2017 and October 2017, and participation was voluntary. Results: A total of 33 participants took part in the study. The results demonstrated that the use of a stroke journey map as a stroke education medium was effective in improving patients' perceived knowledge. A comparison of pre- and post-implementation data revealed an overall mean increase in patients' perceived knowledge from 24.06 to 40.06. The data were further broken down to evaluate patients' perceived knowledge in 3 domains: (1) understanding of the disease process; (2) management and treatment plans; (3) post-discharge care. Each domain saw an increase in mean score, from 10.7 to 16.2, 6.9 to 11.9, and 6.6 to 11.7, respectively. Project Impact: The implementation of the stroke journey map has a positive impact in terms of (1) increasing patients' perceived knowledge, which could contribute to greater empowerment of health; (2) reducing the need for printed stroke education material, making it environmentally friendly; (3) decreasing the time nurses spend on giving education, resulting in more time to attend to patients' needs. Conclusion: This study has demonstrated the benefit of using a stroke journey map as a platform for stroke education. Overall, it has increased patients' perceived knowledge of their disease process, management and treatment plans, and the discharge process.

Keywords: acute stroke, education, ischemic stroke, knowledge, stroke

Procedia PDF Downloads 151
37545 Multi-source Question Answering Framework Using Transformers for Attribute Extraction

Authors: Prashanth Pillai, Purnaprajna Mangsuli

Abstract:

Oil exploration and production companies invest considerable time and effort to extract essential well attributes (like well status, surface and target coordinates, wellbore depths, event timelines, etc.) from unstructured data sources such as technical reports, which are often non-standardized, multimodal, and highly domain-specific by nature. It is also important to consider the context when extracting attribute values from reports that contain information on multiple wells/wellbores. Moreover, semantically similar information may often be depicted in different data syntax representations across multiple pages and document sources. We propose a hierarchical multi-source fact extraction workflow based on a deep learning framework to extract essential well attributes at scale. An information retrieval module based on the transformer architecture was used to rank relevant pages in a document source utilizing the page image embeddings and semantic text embeddings. A question answering framework utilizing the LayoutLM transformer was used to extract attribute-value pairs, incorporating the text semantics and layout information from the top relevant pages in a document. To better handle context while dealing with multi-well reports, we incorporate a dynamic query generation module to resolve ambiguities. The extracted attribute information from various pages and documents is standardized to a common representation using a parser module to facilitate information comparison and aggregation. Finally, we use a probabilistic approach to fuse information extracted from multiple sources into a coherent well record. The applicability of the proposed approach and the related performance were studied on several real-life well technical reports.
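
A hedged sketch of the page-ranking step is shown below: pages are scored against a query by cosine similarity of their embeddings and the most relevant ones are passed on to the question answering module. How the embeddings are produced (a transformer text encoder and a page-image encoder, per the abstract) is assumed to happen elsewhere, and the blending weight is hypothetical rather than the authors' exact retrieval module.

```python
# Minimal sketch of embedding-based page ranking for a document source.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def rank_pages(query_emb, page_text_embs, page_image_embs, alpha=0.7):
    """Blend semantic-text and page-image similarity; return page indices, most relevant first."""
    scores = [alpha * cosine(query_emb, t) + (1 - alpha) * cosine(query_emb, i)
              for t, i in zip(page_text_embs, page_image_embs)]
    return np.argsort(scores)[::-1]

# The top-ranked pages would then be passed to the LayoutLM-based question answering
# module to extract attribute-value pairs such as well status or wellbore depth.
```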

Keywords: natural language processing, deep learning, transformers, information retrieval

Procedia PDF Downloads 182
37544 Effect of Interaction between Colchicine Concentrations and Treatment Time Duration on the Percentage of Chromosome Polyploidy of Crepis capillaris (with and without 2B Chromosome) in vitro Culture

Authors: Payman A. A. Zibari, Mosleh M. S. Duhoky

Abstract:

These experiments were conducted at the Tissue Culture Laboratory, Faculty of Agriculture, University of Duhok, during the period from January 2011 to May 2013. The objective of this study was to examine the effects of the interaction between colchicine concentration and treatment duration on chromosome polyploidy in Crepis capillaris (with and without the 2B chromosome) during fifteen passages until plants were regenerated from the callus. The data showed that a high percentage of chromosome polyploidy can generally be obtained with a high colchicine concentration and a long treatment duration.

Keywords: polyploidy, Crepis capillaris, colchicine, B chromosome

Procedia PDF Downloads 178
37543 Microwave Imaging by Application of Information Theory Criteria in MUSIC Algorithm

Authors: Majid Pourahmadi

Abstract:

The performance of the time-reversal MUSIC algorithm degrades dramatically in the presence of strong noise and multiple scattering (i.e., when scatterers are close to each other). This is due to error in determining the number of scatterers. The present paper provides a new approach to alleviate this problem using an information theoretic criterion referred to as minimum description length (MDL). The merits of the novel approach are confirmed by numerical examples. The results indicate that time-reversal MUSIC yields accurate estimates of the target locations even with considerable noise and multiple scattering in the received signals.
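
The MDL criterion used to fix the number of scatterers can be sketched as below, in the standard Wax-Kailath form applied to the eigenvalues of the measured (covariance or multistatic) data matrix; the paper's exact formulation may differ in its details.

```python
# Minimal sketch of the MDL model-order criterion over covariance eigenvalues.
import numpy as np

def mdl_num_sources(eigvals: np.ndarray, n_snapshots: int) -> int:
    """eigvals: eigenvalues sorted in descending order; returns the k minimising MDL."""
    p = len(eigvals)
    scores = []
    for k in range(p):
        tail = eigvals[k:]
        geo = np.exp(np.mean(np.log(tail)))      # geometric mean of the p-k smallest eigenvalues
        arith = np.mean(tail)                    # arithmetic mean of the same eigenvalues
        log_likelihood = n_snapshots * (p - k) * np.log(geo / arith)
        penalty = 0.5 * k * (2 * p - k) * np.log(n_snapshots)
        scores.append(-log_likelihood + penalty)
    return int(np.argmin(scores))                # estimated number of scatterers
```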

Keywords: microwave imaging, time reversal, MUSIC algorithm, minimum description length (MDL)

Procedia PDF Downloads 318
37542 A Study of Behaviors in Using Social Networks of Corporate Personnel of Suan Sunandha Rajabhat University

Authors: Wipada Chaiwchan

Abstract:

This research aims to study the behaviors in using social networks of corporate personnel of Suan Sunandha Rajabhat University. The sample consisted of two groups: 1) 70 academic officers and 2) 143 operation officers. The research tool was a questionnaire; the data were analyzed using percentage, mean (X), and standard deviation (S.D.), with an independent-sample t-test to test the difference between the means of two independent samples, one-way ANOVA for the analysis of variance, and multiple comparisons using Fisher's least significant difference (LSD). The study found that most corporate personnel use social networks for information awareness, namely knowledge and online conferences via social media, for an average of more than 3 hours per day, every day, within their daily working time, and that they have computers connected to the Internet at home and use such communication in their operational processes. Behaviors in using social networks were examined in relation to gender, age, job title, department, and type of personnel. Hypothesis testing and analysis of variance covered three aspects: the use of online social networks, the attitude of the users, and security. For the corporate personnel of Suan Sunandha Rajabhat University, the results were at a high level overall and for each item: use of the social network (X = 3.22), attitude of the users (X = 3.06), security (X = 3.11), and overall behavior (X = 3.11).

Keywords: social network, behaviors, social media, computer information systems

Procedia PDF Downloads 382
37541 Cryptocurrency Realities: Insights from Social and Economic Psychology

Authors: Sarah Marie

Abstract:

In today's dynamic financial landscape, cryptocurrencies represent a paradigm shift characterized by innovation and intense debate. This study probes their transformative potential and the challenges they present, offering a balanced perspective that recognizes both their promise and their pitfalls. Emulating the engaging style of a TED Talk, this research goes beyond academic analysis, serving as a critical bridge to reconcile the perspectives of cryptocurrency skeptics and enthusiasts and fostering a well-informed dialogue. The study employs a mixed-method approach, analyzing current trends, regulatory landscapes, and public perceptions in the cryptocurrency domain. It distinguishes genuine innovators in this field from ostentatious opportunists, echoing the sentiment that real innovation should be separated from mere showmanship; the latter are easy enough to spot, posing with their Lamborghinis outside 'crypto' conventions. Major findings reveal a complex scenario dominated by regulatory uncertainties, market volatility, and security issues, emphasizing the need for a coherent regulatory framework that balances innovation with risk management and sustainable practices. The study underscores the importance of transparency and consumer protection in fostering responsible growth within the cryptocurrency ecosystem. In conclusion, the research advocates for education, innovation, and ethical governance in the realm of cryptocurrencies. It calls for collaborative efforts to navigate the intricacies of this evolving landscape and to realize its full potential in a responsible, inclusive, and forward-thinking manner.

Keywords: financial landscape, innovation, public perception, transparency

Procedia PDF Downloads 32
37540 Discrete Tracking Control of Nonholonomic Mobile Robots: Backstepping Design Approach

Authors: Alexander S. Andreev, Olga A. Peregudova

Abstract:

In this paper, we propose a discrete tracking control for nonholonomic mobile robots with two degrees of freedom. The electro-mechanical model of a mobile robot moving on a horizontal surface without slipping, with two rear wheels driven by two independent DC electric motors and one front roller wheel, is considered. We present a backstepping design based on the Euler approximate discrete-time model of the continuous-time plant. Theoretical considerations are verified by numerical simulation. The work was supported by RFFI (15-01-08482).
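
A minimal sketch of the Euler approximate discrete-time model underlying the backstepping design is given below, for the kinematic (unicycle) part only; the electro-mechanical model in the paper additionally includes the DC-motor actuator dynamics, which are omitted here.

```python
# Minimal sketch of the Euler approximate discrete-time unicycle model.
import numpy as np

def euler_step(state, v, omega, T):
    """One Euler step of the unicycle kinematics.
    state = (x, y, theta); v, omega = linear and angular velocity inputs; T = sampling period."""
    x, y, theta = state
    return np.array([
        x + T * v * np.cos(theta),
        y + T * v * np.sin(theta),
        theta + T * omega,
    ])

# Example: propagate the discrete-time model along a constant-turn trajectory.
T = 0.05
state = np.array([0.0, 0.0, 0.0])
for k in range(100):
    state = euler_step(state, v=0.5, omega=0.2, T=T)
print(state)
```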

Keywords: actuator dynamics, back stepping, discrete-time controller, Lyapunov function, wheeled mobile robot

Procedia PDF Downloads 397
37539 Use of Hierarchical Temporal Memory Algorithm in Heart Attack Detection

Authors: Tesnim Charrad, Kaouther Nouira, Ahmed Ferchichi

Abstract:

In order to reduce the number of deaths due to heart problems, we propose the use of the Hierarchical Temporal Memory (HTM) algorithm, a real-time anomaly detection algorithm. HTM is a cortical learning algorithm, modelled on the neocortex, used for anomaly detection; in other words, it is based on a conceptual theory of how the human brain works. It is powerful in predicting unusual patterns, anomaly detection, and classification. In this paper, HTM has been implemented and tested on ECG datasets in order to detect cardiac anomalies. Experiments showed good performance in terms of specificity, sensitivity, and execution time.

Keywords: cardiac anomalies, ECG, HTM, real time anomaly detection

Procedia PDF Downloads 206
37538 Prevalence, Median Time, and Associated Factors with the Likelihood of Initial Antidepressant Change: A Cross-Sectional Study

Authors: Nervana Elbakary, Sami Ouanes, Sadaf Riaz, Oraib Abdallah, Islam Mahran, Noriya Al-Khuzaei, Yassin Eltorki

Abstract:

Major Depressive Disorder (MDD) requires therapeutic intervention during the initial month after diagnosis for better disease outcomes. International guidelines recommend a duration of 4–12 weeks for an initial antidepressant (IAD) trial at an optimized dose to obtain a response. If depressive symptoms persist after this duration, guidelines recommend switching, augmenting, or combining strategies as the next step. Most patients with MDD in the mental health setting have been labeled incorrectly as treatment-resistant when in fact they have not been subjected to an adequate trial of guideline-recommended therapy. Premature discontinuation of the IAD due to ineffectiveness can have unfavorable consequences. Avoiding irrational practices such as subtherapeutic doses of the IAD, premature switching between antidepressants, and unjustified polypharmacy can help the disease go into remission. We aimed to determine the prevalence and the patterns of the strategies applied after an IAD was changed because of a suboptimal response as the primary outcome. Secondary outcomes included the median survival time on the IAD before any change and the predictors associated with IAD change. This was a retrospective cross-sectional study conducted in the Mental Health Services in Qatar. A dataset between January 1, 2018, and December 31, 2019, was extracted from the electronic health records, and inclusion and exclusion criteria were defined and applied. The sample size was calculated to be at least 379 patients. Descriptive statistics were reported as frequencies and percentages, in addition to means and standard deviations. The median time from IAD initiation to any change strategy was calculated using survival analysis, and the associated predictors were examined using unadjusted and adjusted Cox regression models. A total of 487 patients met the inclusion criteria of the study. The average age of the participants was 39.1 ± 12.3 years. Patients with a first MDD episode (255, 52%) constituted the major part of our sample compared to the relapse group (206, 42%). About 431 (88%) of the patients had an occurrence of IAD change to any strategy before the end of the study. Almost half of the sample (212 (49%); 95% CI [44–53%]) had their IAD changed within 30 days or less. Switching was consistently more common than combination or augmentation at any time point. The median time to IAD change was 43 days, with 95% CI [33.2–52.7]. Five independent variables (age, bothersome side effects, non-optimization of the dose before any change, comorbid anxiety, and a first-onset episode) were significantly associated with the likelihood of IAD change in the unadjusted analysis. The factors statistically associated with a higher hazard of IAD change in the adjusted analysis were younger age, non-optimization of the IAD dose before any change, and comorbid anxiety. Because almost half of the patients in this study changed their IAD as early as within the first month, efforts to avoid treatment failure are needed to ensure patient-treatment targets are met. The findings of this study can give direct clinical guidance to health care professionals, since an optimized, evidence-based use of antidepressant medication can improve the clinical outcomes of patients with MDD, and can also help identify high-risk factors that could worsen the survival time on the IAD, such as young age and comorbid anxiety.
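
A hedged sketch of the survival analysis described above is given below using the lifelines package: a Kaplan-Meier estimate of the median time to IAD change and a Cox proportional-hazards model for the adjusted predictors. The data frame and column names are hypothetical; the study used patients' electronic health records.

```python
# Minimal sketch of median time-to-change and adjusted Cox regression with lifelines.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# One row per patient: days to IAD change (or censoring), event indicator, and candidate predictors.
df = pd.DataFrame({
    "days_to_change":   [30, 43, 90, 12, 365, 75, 28, 150, 60, 21],
    "changed":          [1,  1,  1,  1,  0,   1,  1,  0,   1,  1],
    "age":              [25, 39, 51, 33, 60,  44, 29, 58,  41, 35],
    "comorbid_anxiety": [1,  1,  0,  1,  0,   0,  1,  0,   0,  1],
    "dose_optimized":   [0,  1,  1,  0,  1,   1,  0,  1,   1,  0],
})

# Median time on the initial antidepressant before any change strategy.
kmf = KaplanMeierFitter().fit(df["days_to_change"], df["changed"])
print("median time to IAD change (days):", kmf.median_survival_time_)

# Adjusted Cox model: hazard of IAD change given the candidate predictors.
cph = CoxPHFitter().fit(df, duration_col="days_to_change", event_col="changed")
cph.print_summary()
```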

Keywords: initial antidepressant, dose optimization, major depressive disorder, comorbid anxiety, combination, augmentation, switching, premature discontinuation

Procedia PDF Downloads 134
37537 Analysis the Nexus among Ethnic Polarization, Globalization and Export Diversification of Pakistan

Authors: Naima Mubeen

Abstract:

Multi-ethnic societies play a crucial role in managing relevant policies and their implementation. Pakistan is a classic case of multicultural identity, social evils, and a wide range of preferential ethnic policies. The major objectives of this study are to explore the relationship between ethnic diversity, globalization, and the export diversification of Pakistan. For the empirical analysis of this underlying nexus, this study used the autoregressive distributed lag (ARDL) technique with time series data from 1970 to 2016. The empirical findings of this study reveal that ethnic diversity is an essential component for enhancing globalization and export diversification in the case of Pakistan. Regarding the promotion of globalization and export diversification in different forums of the country, this study suggests that the government should take steps to promote a more cohesive society through a fair, justice-based system and awareness programs.

Keywords: ethnic diversity, social exclusion, globalization, export diversification

Procedia PDF Downloads 102
37536 Frequency Recognition Models for Steady State Visual Evoked Potential Based Brain Computer Interfaces (BCIs)

Authors: Zeki Oralhan, Mahmut Tokmakçı

Abstract:

SSVEP-based brain computer interface (BCI) systems are preferred because of their high information transfer rate (ITR) and practical use. ITR is the parameter of overall BCI performance. For a high ITR value, one requirement of a BCI system is high accuracy. In this study, we investigated recognizing SSVEPs in a shorter time and with a lower error rate. In the experiment, there were 8 flickers on a liquid crystal display (LCD). Participants gazed at the flicker with a 12 Hz frequency and a 50% duty cycle on the LCD for 10 seconds. During the experiment, EEG signals were acquired via an EEG device. The EEG data were filtered in the preprocessing stage. After that, Canonical Correlation Analysis (CCA), Multiset CCA (MsetCCA), phase constrained CCA (PCCA), and Multiway CCA (MwayCCA) methods were applied to the data. The highest average accuracy value was reached when MsetCCA was applied.
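
A minimal sketch of the baseline CCA frequency recognition step (against which MsetCCA, PCCA, and MwayCCA are compared) is given below; the stimulus frequencies, sampling rate, and EEG array shape are assumptions, not values from the experiment.

```python
# Minimal sketch of standard CCA-based SSVEP frequency recognition.
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 250                                   # sampling rate (Hz), assumed
FREQS = [8, 9, 10, 11, 12, 13, 14, 15]     # the 8 flicker frequencies, assumed
N_HARMONICS = 2

def reference_signals(freq, n_samples):
    """Sine/cosine reference set at the stimulus frequency and its harmonics."""
    t = np.arange(n_samples) / FS
    refs = []
    for h in range(1, N_HARMONICS + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)

def recognize_frequency(eeg):
    """eeg: (n_samples, n_channels) filtered EEG segment; returns the detected frequency."""
    scores = []
    for f in FREQS:
        Y = reference_signals(f, eeg.shape[0])
        cca = CCA(n_components=1).fit(eeg, Y)
        u, v = cca.transform(eeg, Y)
        scores.append(abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1]))
    return FREQS[int(np.argmax(scores))]
```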

Keywords: brain computer interface, canonical correlation analysis, human computer interaction, SSVEP

Procedia PDF Downloads 253
37535 Kinematic Analysis of Human Gait for Typical Postures of Walking, Running and Cart Pulling

Authors: Nupur Karmaker, Hasin Aupama Azhari, Abdul Al Mortuza, Abhijit Chanda, Golam Abu Zakaria

Abstract:

Purpose: The purpose of gait analysis is to determine the biomechanics of the joints, the phases of the gait cycle, the graphical and analytical analysis of the degree of rotation, the electrical activity of the muscles, and the force exerted on the hip joint during walking, running, and cart pulling. Methods and Materials: Visual gait analysis and electromyography were used to detect the degree of rotation of the joints and the electrical activity of the muscles. In the cinematography method, an object is observed from different sides and recorded on video. The cart-pulling sequence was divided into frames with respect to time using video-splitter software. The phases of the gait cycle, the degrees of joint rotation, the EMG profiles, and the force analysis during walking and running were taken from previous papers. The gait cycle and the degrees of joint rotation during cart pulling were obtained using a video camera, a stopwatch, video-splitter software, and Microsoft Excel. Results and Discussion: During cart pulling, the force exerted on the hip is the resultant of various forces: the vector sum of the force Fg = mg, due to the body weight of the person, and Fa = ma, due to the acceleration of the motion. The maximum stance phase was observed during cart pulling and the minimum during running. The maximum degree of rotation of the hip joint was observed during cart pulling, of the knee during running, and of the ankle during cart pulling; the minimum rotation of the hip was observed during walking and of the ankle during running. During cart pulling, the dynamic force depends on the walking velocity, the body weight, and the load weight. Conclusions: 80% of people suffer from gait-related diseases with increasing age, and proper care should be taken during cart pulling. It would be better to establish a gait laboratory to determine gait-related diseases. If the way of cart pulling is changed, i.e., the design of the cart-pulling machine and the load-bearing system, then it would be possible to reduce the risk of limb loss, flat foot syndrome, and varicose veins in the lower limbs.
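
The hip-force statement above can be written as a short worked equation (symbols follow the abstract; the angle theta between the two components is introduced here for illustration only):

```latex
% Resultant force on the hip during cart pulling as the vector sum of the
% gravitational and inertial terms.
\[
\vec{F}_{\text{hip}} \;=\; \vec{F}_g + \vec{F}_a \;=\; m\,\vec{g} + m\,\vec{a},
\qquad
\lvert \vec{F}_{\text{hip}} \rvert \;=\; m\sqrt{\,g^{2} + a^{2} + 2\,g\,a\cos\theta\,},
\]
% where \theta is the angle between the two component forces.
```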

Keywords: kinematic, gait, gait lab, phase, force analysis

Procedia PDF Downloads 567
37534 Stochastic Model Predictive Control for Linear Discrete-Time Systems with Random Dither Quantization

Authors: Tomoaki Hashimoto

Abstract:

Recently, feedback control systems using random dither quantizers have been proposed for linear discrete-time systems. However, the constraints imposed on state and control variables have not yet been taken into account in the design of feedback control systems with random dither quantization. Model predictive control is a kind of optimal feedback control in which the control performance over a finite future is optimized with a performance index that has a moving initial and terminal time; an important advantage of model predictive control is its ability to handle constraints imposed on state and control variables. Based on the model predictive control approach, the objective of this paper is to present a control method that satisfies probabilistic state constraints for linear discrete-time feedback control systems with random dither quantization. In other words, this paper provides a method for solving optimal control problems subject to probabilistic state constraints for such systems.
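
The control problem described above can be sketched as a chance-constrained model predictive control formulation of the following standard form; the notation is assumed and the paper's exact problem statement may differ.

```latex
% At each time k, over a horizon N, with Q, R, P weighting matrices and q(.) the
% random dither quantizer applied to the control input:
\[
\min_{u_{k},\dots,u_{k+N-1}} \;
\mathbb{E}\!\left[\sum_{j=0}^{N-1} \bigl( x_{k+j}^{\top} Q\, x_{k+j} + u_{k+j}^{\top} R\, u_{k+j} \bigr)
 + x_{k+N}^{\top} P\, x_{k+N}\right]
\]
\[
\text{s.t.}\quad x_{j+1} = A\, x_{j} + B\, q(u_{j}),
\qquad
\Pr\bigl( x_{j} \in \mathcal{X} \bigr) \;\ge\; 1-\varepsilon,
\]
% where \varepsilon is the admissible probability of state-constraint violation.
```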

Keywords: optimal control, stochastic systems, random dither, quantization

Procedia PDF Downloads 428
37533 Collision Theory Based Sentiment Detection Using Discourse Analysis in Hadoop

Authors: Anuta Mukherjee, Saswati Mukherjee

Abstract:

Data are growing every day. Social networking sites such as Twitter are becoming an integral part of our daily lives, contributing to a large increase in the growth of data. Twitter is a rich source, especially for sentiment detection or mining, since people often express honest opinions through tweets. However, although sentiment analysis is a well-researched topic in text, this analysis using Twitter data poses additional challenges, since tweets are unstructured data with abbreviations and without strict grammatical correctness. We have employed collision theory to achieve sentiment analysis of Twitter data. We have also incorporated discourse analysis into the collision-theory-based model to detect accurate sentiment from tweets. In addition, we used the retweet field to assign weights to certain tweets and obtained the overall weightage of a topic provided in the form of a query. Hadoop has been exploited for speed. Our experiments show effective results.
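
A minimal sketch of the retweet-weighted aggregation idea is given below, written as plain map/reduce-style functions; the simple lexicon score here is only a stand-in for the collision-theory-based sentiment model, and the actual system runs these stages on Hadoop.

```python
# Minimal sketch of retweet-weighted sentiment aggregation in map/reduce style.
from collections import defaultdict

POSITIVE = {"good", "great", "love"}
NEGATIVE = {"bad", "terrible", "hate"}

def map_tweet(tweet):
    """tweet: dict with 'text' and 'retweet_count'; emits (sentiment_label, weight)."""
    words = tweet["text"].lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    weight = 1 + tweet["retweet_count"]          # retweets increase the tweet's weight
    label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return label, weight

def reduce_counts(pairs):
    """Sum the weights per sentiment label to get the overall weightage of the topic."""
    totals = defaultdict(int)
    for label, weight in pairs:
        totals[label] += weight
    return dict(totals)

tweets = [
    {"text": "Love this phone, great battery", "retweet_count": 12},
    {"text": "terrible service, never again", "retweet_count": 3},
]
print(reduce_counts(map(map_tweet, tweets)))
```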

Keywords: sentiment analysis, twitter, collision theory, discourse analysis

Procedia PDF Downloads 515
37532 Performance Analysis of Vision-Based Transparent Obstacle Avoidance for Construction Robots

Authors: Siwei Chang, Heng Li, Haitao Wu, Xin Fang

Abstract:

Construction robots are receiving more and more attention as a promising solution to the manpower shortage in the construction industry. The development of intelligent control techniques that help the robots avoid transparent and reflective building obstacles is crucial for guaranteeing the adaptability and flexibility of mobile construction robots in complex construction environments. With the boom of computer vision techniques, a number of studies have proposed vision-based methods for transparent obstacle avoidance to improve operation accuracy. However, vision-based methods are also associated with disadvantages such as high computational cost. To provide better perception and value evaluation, this study analyzes the performance of vision-based techniques for avoiding transparent building obstacles. To achieve this, commonly used sensors, including a lidar, an ultrasonic sensor, and a USB camera, are mounted on the robotic platform to detect obstacles. A Raspberry Pi 3 computer board is employed to run the data collection and control algorithms, and the TurtleBot3 Burger is used to test the programs. On-site experiments are carried out to observe the performance in terms of success rate and detection distance; the control variables include obstacle shapes and environmental conditions. The findings demonstrate how effective vision-based strategies are for transparent building obstacle avoidance and provide insights and informed knowledge for introducing computer vision techniques in this domain.

Keywords: construction robot, obstacle avoidance, computer vision, transparent obstacle

Procedia PDF Downloads 64
37531 Numerical Modeling of Air Shock Wave Generated by Explosive Detonation and Dynamic Response of Structures

Authors: Michał Lidner, Zbigniew Szcześniak

Abstract:

The ability to estimate blast load overpressure properly plays an important role in the safety design of buildings. The issue of blast loading on structural elements has been studied for many years. However, in many literature reports the shock wave overpressure is estimated with a simplified triangular or exponential distribution in time, which introduces errors when comparing the real and numerical reactions of elements. Nonetheless, it is possible to approximate the real blast load overpressure as a function of time more closely. The paper presents a method for the numerical analysis of the air shock wave propagation phenomenon. It uses the Finite Volume Method and takes into account energy losses due to heat transfer, with respect to the adiabatic process rule. A system of three equations (conservation of mass, momentum, and energy) describes the flow of a volume of gaseous medium in the area remote from building compartments, which can inhibit the movement of the gas. For validation, three cases of shock wave flow were analyzed: a free-field explosion, an explosion inside an insusceptible steel tube (the 1D case), and an explosion inside an insusceptible cube (the 3D case). The results of the numerical analysis were compared with literature reports; the values of the impulse, the pressure, and its duration were studied. Overall, good convergence of the numerical results with experiments was achieved, and the most important parameters were well reflected. Additionally, analyses of the dynamic response of one of the considered structural elements were made.
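
The system of three conservation equations referred to above can be written in the usual conservative form used for a Finite Volume discretisation; the notation is standard and assumed rather than quoted from the paper, with a loss term standing in for the heat-transfer losses.

```latex
% Conservation of mass, momentum, and energy for the gaseous medium, with an ideal-gas
% closure; q_loss represents the heat-transfer losses mentioned in the abstract.
\[
\frac{\partial \rho}{\partial t} + \nabla\!\cdot(\rho\,\mathbf{u}) = 0,
\qquad
\frac{\partial (\rho\,\mathbf{u})}{\partial t} + \nabla\!\cdot(\rho\,\mathbf{u}\otimes\mathbf{u}) + \nabla p = 0,
\]
\[
\frac{\partial E}{\partial t} + \nabla\!\cdot\bigl[(E + p)\,\mathbf{u}\bigr] = -\,q_{\text{loss}},
\qquad
E = \frac{p}{\gamma - 1} + \tfrac{1}{2}\,\rho\,\lvert\mathbf{u}\rvert^{2}.
\]
```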

Keywords: adiabatic process, air shock wave, explosive, finite volume method

Procedia PDF Downloads 174
37530 Change of Physicochemical Properties of Grain in the Germination of Chickpea Grain

Authors: Mira Zhonyssova, Nurlaym Ongarbayeva, Makpal Atykhanova

Abstract:

The quality indicators of chickpea grain and the absorption of water at different temperatures by chickpea grain were studied. The organoleptic and physicochemical changes during the germination of chickpeas were also studied, and the total duration of chickpea grain germination was determined. As a result of the analysis of the experimental data, it was found that the germination time at which the chickpea sprout length reaches 0.5-3 mm varies from 21 to 25 hours. The change in the volume of chickpea grain during germination was investigated. It was found that in the first 2 hours the volume of the chickpeas changes only slightly, by 38%; this is due to the process of adsorption of water to a critical state. From 2 to 9 hours, swelling of the chickpea grain is observed: the vital activity of the cells increases, the enzymatic systems become active, the respiratory coefficient increases, and gibberellin, which stimulates the formation of a number of enzymes, is released. During this period, there is a sharp increase in the volume of the chickpea grains, up to 138%. From 9 to 19 hours, 'sprouting' of the chickpea grains is observed and no morphological changes occur in the corcule (embryo); the grain volume remains at 138%. From 19 hours, the grain growth process begins, and the grain volume increases to 143%.

Keywords: chickpea, seeds, legumes, germination, physicochemical properties

Procedia PDF Downloads 43