Search results for: continuous speed profile data
26721 Exposure to Radio Frequency Waves of Mobile Phone and Temperature Changes of Brain Tissue
Authors: Farhad Forouharmajd, Hossein Ebrahimi, Siamak Pourabdian
Abstract:
Introduction: The prevalent use of cell phones (mobile phones) has led to increasing concern about the effect of radiofrequency waves on the physiology of the human body. This study was conducted to determine how temperature changes at different depths of brain tissue during exposure to the radiofrequency waves of a cell phone. Methodology: This study was an empirical investigation. A cow's brain tissue was placed in a compartment, and the effects of the cell phone's radiofrequency waves were analyzed during and after exposure, at three depths of 2, 12, and 22 mm in the tissue, at distances of 4 mm and 4 cm between the tissue and the cell phone, for 15 min. A Lutron thermometer was used to measure the tissue temperature, and data analysis was done with the Lutron software. Findings: During exposure at the 4 mm distance, the temperature rose faster at the 22 mm depth than at the 2 mm and 12 mm depths: the tissue temperatures at the 2, 12, and 22 mm depths increased by 0.29 ˚C, 0.31 ˚C, and 0.37 ˚C, respectively, relative to the base temperature (the tissue temperature before exposure). At the 4 cm distance, the temperature rise likewise increased with tissue depth, and the same trend of greater warming at greater depth persisted after the exposure ended. The temperature at the 22 mm depth rose fastest during exposure. Conclusion: The radiofrequency waves of the cell phone not only increased the tissue temperature at all measured depths, but the increase was greatest at the deepest measured point (22 mm). In fact, the thermal effect of the radiofrequency waves was stronger at greater depths.
Keywords: mobile phone, radio frequency waves, brain tissue, temperature
Procedia PDF Downloads 201
26720 A Study of Barriers and Challenges Associated with Agriculture E-commerce in Afghanistan
Authors: Khwaja Bahman Qaderi, Noorullah Rafiqee
Abstract:
Background: With today's growing number of Internet users, e-commerce has become a viable model for strengthening relationships between sellers, entrepreneurs, and consumers due to its speed, efficiency, and cost reduction. Agriculture is the economic backbone for 80 percent of the Afghan population, and according to MCIT statistics, there are currently around 10 million internet users in Afghanistan. Given these figures, e-commerce might be expected to play a substantial role in Afghan agriculture, yet it appears to be little used. Objective: This study examines the scope of e-commerce in Afghanistan's agriculture enterprises, how they harness the potential of internet users, and what obstacles they face in implementing e-commerce in their businesses. Method: The study distributed a 39-question questionnaire to agribusinesses in five different zones of Afghanistan. After extracting the responses and excluding the incomplete questionnaires, 280 were included in the analysis, on which a non-parametric sign test was performed. Result: E-commerce in Afghanistan faces four major categories of obstacles (political, economic, Internet-related, and technological), and no company in the country has fully implemented e-commerce. E-commerce is still in its infancy among agricultural companies: Internet use is largely limited to email and to sharing product images on Facebook and Instagram for advertising purposes, and no company conducts international transactions via the Internet. Conclusion: This study identifies the challenges and barriers facing agricultural e-commerce in Afghanistan in order to find effective ways of harnessing the country's Internet users and increasing online sales of agricultural products.
Keywords: E-commerce, barriers and challenges, agriculture companies, Afghanistan
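The analysis step named above rests on the non-parametric sign test. Below is a minimal sketch of that test; the response coding and the example vector are hypothetical, not the study's data.

```python
# Minimal sketch of a two-sided exact sign test: each respondent's answer to a
# barrier statement is reduced to agreement (+1), disagreement (-1), or neutral (0)
# and tested against a 50/50 split. The response vector below is hypothetical.
from math import comb

def sign_test(responses, hypothesized_median=0):
    plus = sum(1 for r in responses if r > hypothesized_median)
    minus = sum(1 for r in responses if r < hypothesized_median)
    n = plus + minus                                  # ties are discarded
    k = min(plus, minus)
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n   # Binomial(n, 0.5)
    return plus, minus, min(p, 1.0)

# Hypothetical Likert responses to "political instability is a barrier"
responses = [1] * 180 + [-1] * 70 + [0] * 30
print(sign_test(responses))   # a very small p-value -> the barrier is significant
```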
Procedia PDF Downloads 89
26719 Nuclear Decay Data Evaluation for 217Po
Authors: S. S. Nafee, A. M. Al-Ramady, S. A. Shaheen
Abstract:
Evaluated nuclear decay data for the 217Po nuclide are presented in this work. These data include recommended values for the half-life T1/2 and for the α-, β−-, and γ-ray emission energies and probabilities. Decay data from the 221Rn α decay and the 217Bi β− decay are presented. Q(α) has been updated based on the recently published Atomic Mass Evaluation AME2012. In addition, the log ft values were calculated using the Logft program from the ENSDF evaluation package, and the total internal conversion coefficients were calculated using the BrIcc program. Recommended values for the multipolarities have been assigned based on recent measurements, yielding a better intensity balance at the 254 keV and 264 keV gamma transitions.
Keywords: nuclear decay data evaluation, mass evaluation, total conversion coefficients, atomic mass evaluation
Procedia PDF Downloads 433
26718 The Unscented Kalman Filter Implementation for the Sensorless Speed Control of a Permanent Magnet Synchronous Motor
Authors: Justas Dilys
Abstract:
This paper addresses the implementation and optimization of an Unscented Kalman Filter (UKF) for sensorless control of a Permanent Magnet Synchronous Motor (PMSM) using an ARM Cortex-M3 microcontroller. Various optimization levels based on reducing the arithmetic calculations were implemented on the ARM Cortex-M3 microcontroller. The execution time of the UKF estimator was at most 90 µs without loss of accuracy. Moreover, simulation studies of the Unscented Kalman Filter were carried out in Matlab to explore the usability of the UKF in a sensorless PMSM drive.
Keywords: unscented Kalman filter, ARM, PMSM, implementation
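At the heart of the UKF is the unscented transform, which propagates a set of sigma points through the nonlinear model instead of linearizing it. The following is a minimal NumPy sketch of that core step only; the state dimension, scaling parameters, and toy propagation function are assumptions, and the paper's PMSM model and fixed-point Cortex-M3 implementation are not reproduced here.

```python
# Minimal sketch of the unscented transform at the core of a UKF.
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)        # matrix square root
    pts = np.vstack([x, x + S.T, x - S.T])       # 2n+1 sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    return pts, wm, wc

def unscented_transform(f, x, P):
    pts, wm, wc = sigma_points(x, P)
    Y = np.array([f(p) for p in pts])            # propagate through the nonlinearity
    y = wm @ Y                                   # predicted mean
    d = Y - y
    Py = (wc[:, None] * d).T @ d                 # predicted covariance
    return y, Py

# Toy propagation standing in for the PMSM state update (assumption).
f = lambda s: np.array([s[0] + 0.01 * s[1], 0.99 * s[1]])
mean, cov = unscented_transform(f, np.array([1.0, 0.5]), np.eye(2) * 0.1)
print(mean, cov)
```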
Procedia PDF Downloads 167
26717 Experimental Investigation of Mechanical Friction Influence in Semi-Hydraulic Clutch Actuation System Over Mileage
Authors: Abdul Azarrudin M. A., Pothiraj K., Kandasamy Satish
Abstract:
In the current automotive scenario, there is a demand for more sophistication and a more comfortable driving feel in the passenger segments. Clutch pedal effort is one such customer touch point in manual transmission vehicles, where the driver continuously operates the clutch pedal throughout the driving maneuvers. Hence, optimum pedal effort must be ensured both in the green (new) condition and over mileage for fatigue-free driving. Friction is one of the predominant factors, and it tends to degrade the system's function over time. One such semi-hydraulic system shows a load efficiency of only about 70-75% over its lifetime due to the increase in friction, which raises the pedal effort and fatigues the vehicle driver. This work studies friction at different interfaces and its influence on the fulcrum points over mileage, with the objective of understanding the trend over mileage and determining alternative ways of resolving it. One approach is to reduce friction: various friction-reducing interfaces, such as a metal-to-metal interface, were investigated experimentally and are detailed further. Specific attention has been paid to the fulcrum load and its contact interfaces. The main experimental results for three different contact interfaces are presented, with the ultimate aim of achieving less fatigue and a consistently low pedal effort over a longer period, thus smoothing operation for the end user. Experimental validation was carried out on a rig-level test setup to capture the performance under static conditions, and a vehicle-level test was performed in parallel to record any additional influences.
Keywords: automobile, clutch, friction, fork
Procedia PDF Downloads 124
26716 Affects Associations Analysis in Emergency Situations
Authors: Joanna Grzybowska, Magdalena Igras, Mariusz Ziółko
Abstract:
Association rule learning is an approach for discovering interesting relationships in large databases. The analysis of relations, invisible at first glance, is a source of new knowledge which can subsequently be used for prediction. We used this data mining technique (which is an automatic and objective method) to learn about interesting affect associations in a corpus of emergency phone calls. We also made an attempt to match the revealed rules with their possible situational context. The corpus was collected and subjectively annotated by two researchers. Each of the 3306 recordings contains information on emotion: (1) type (sadness, weariness, anxiety, surprise, stress, anger, frustration, calm, relief, compassion, contentment, amusement, joy), (2) valence (negative, neutral, or positive), and (3) intensity (low, typical, alternating, high). Additional information that gives a clue to the speaker’s emotional state was also annotated: speech rate (slow, normal, fast), characteristic vocabulary (filled pauses, repeated words), and conversation style (normal, chaotic). Exponentially many rules can be extracted from a set of items (an item is a single piece of previously annotated information). To generate the rules in the form of an implication X → Y (where X and Y are frequent k-itemsets), the Apriori algorithm was used; it avoids performing needless computations. Then, two basic measures (Support and Confidence) and several additional symmetric and asymmetric objective measures (e.g. Laplace, Conviction, Interest Factor, Cosine, correlation coefficient) were calculated for each rule. Each applied interestingness measure revealed different rules, and we selected some top rules for each measure. Owing to the specificity of the corpus (emergency situations), most of the strong rules contain only negative emotions. There are, though, strong rules including neutral or even positive emotions. Three examples of the strongest rules are: {sadness} → {anxiety}; {sadness, weariness, stress, frustration} → {anger}; {compassion} → {sadness}. Association rule learning revealed the strongest configurations of affects (as well as configurations of affects with affect-related information) in our emergency phone calls corpus. The acquired knowledge can be used for prediction to fill in the emotional profile of a new caller. Furthermore, a rule-related possible context analysis may be a clue to the situation a caller is in.
Keywords: data mining, emergency phone calls, emotional profiles, rules
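For concreteness, the sketch below shows how rules of the form X → Y are scored with Support and Confidence and how the Apriori idea prunes infrequent candidates; the toy "call" itemsets are hypothetical stand-ins for the annotated corpus.

```python
# Minimal sketch of mining rules X -> Y with Support and Confidence, Apriori-style.
from itertools import combinations

calls = [
    {"sadness", "anxiety", "fast speech"},
    {"sadness", "weariness", "stress", "frustration", "anger"},
    {"compassion", "sadness"},
    {"calm", "relief"},
    {"sadness", "anxiety", "chaotic"},
]

def support(itemset, transactions):
    return sum(itemset <= t for t in transactions) / len(transactions)

min_sup, min_conf = 0.4, 0.7
items = {i for t in calls for i in t}

# Apriori idea: only extend itemsets that are already frequent.
frequent = [frozenset([i]) for i in items if support({i}, calls) >= min_sup]
pairs = [a | b for a, b in combinations(frequent, 2)
         if len(a | b) == 2 and support(a | b, calls) >= min_sup]

for itemset in pairs:
    for x in itemset:
        X, Y = frozenset([x]), itemset - {x}
        conf = support(itemset, calls) / support(X, calls)
        if conf >= min_conf:
            print(f"{set(X)} -> {set(Y)}  support={support(itemset, calls):.2f}  confidence={conf:.2f}")
```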
Procedia PDF Downloads 408
26715 Inclusive Practices in Health Sciences: Equity Proofing Higher Education Programs
Authors: Mitzi S. Brammer
Abstract:
Given that the cultural make-up of programs of study in institutions of higher learning is becoming increasingly diverse, much has been written about cultural diversity from a university-level perspective. However, there is little data on specific programs and how they address inclusive practices when teaching and working with marginalized populations. This research study aimed to discover baseline knowledge and attitudes of health sciences faculty, instructional staff, and students related to inclusive teaching/learning and interactions. Quantitative data were collected via an anonymous online survey (one designed for students and another designed for faculty/instructional staff) using a web-based program called Qualtrics. Quantitative data were analyzed for faculty/instructional staff and students, respectively, using descriptive and comparative statistics (t-tests). Additionally, some participants voluntarily engaged in a focus group discussion in which qualitative data were collected around these same variables. Collecting qualitative data to triangulate the quantitative data added trustworthiness to the overall findings. The research team analyzed the collected data, compared the identified categories and trends between faculty/staff and students, and reported the results as well as implications for future study and professional practice.
Keywords: inclusion, higher education, pedagogy, equity, diversity
Procedia PDF Downloads 67
26714 Employability Skills: The Route to Achieve Demographic Dividend in India
Authors: Malathi Iyer, Jayesh Vaidya
Abstract:
India's demographic dividend will last for thirty more years. However, a reduction in the birth rate, an increase in the working population, improvements in medicine, and better health practices will lead to an ever-expanding elderly population, placing an additional burden on the economy and eventually bringing the demographic dividend to an end. To reap the dividend, India needs to train its youth for employability. The need of the hour is to improve their life skills, which enable the youth to become industrious and remain continuously employed. The study examines the employability skill gaps that exist among commerce students. The analysis indicates the relation between the core course of study and the right skills for the workforce, together with the steps taken to open the window for the demographic dividend.
Keywords: demographic dividend, life skills, employability, workforce
Procedia PDF Downloads 522
26713 Two Day Ahead Short Term Load Forecasting Neural Network Based
Authors: Firas M. Tuaimah
Abstract:
This paper presents an Artificial Neural Network (ANN) based approach for short-term load forecasting, specifically for two days ahead. Two seasons, summer and winter, are considered for the Iraqi power system; the hourly load demand is the most important input variable for ANN-based load forecasting. The recorded daily load profile with a lead time of 1-48 hours for July and December of the year 2012 was obtained from the operation and control center of the Iraqi Ministry of Electricity. The comparison of results shows that the neural network gives a good prediction of the load two days ahead.
Keywords: short-term load forecasting, artificial neural networks, back propagation learning, hourly load demand
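As an illustration of the kind of model described above, the sketch below trains a small feed-forward network (backpropagation via scikit-learn's MLPRegressor) to map the previous 48 hourly loads to the next 48 hours; the synthetic load series and network sizes are assumptions, since the Iraqi demand data are not reproduced here.

```python
# Minimal sketch of a two-day-ahead hourly load forecast with a feed-forward network.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
hours = np.arange(24 * 120)                                   # ~4 months of hourly data
load = 1000 + 200 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 20, hours.size)

lag, horizon = 48, 48                                          # last 48 h -> next 48 h
X = np.array([load[i:i + lag] for i in range(load.size - lag - horizon)])
y = np.array([load[i + lag:i + lag + horizon] for i in range(load.size - lag - horizon)])

split = int(0.8 * len(X))
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(X[:split], y[:split])

pred = model.predict(X[split:])
mape = np.mean(np.abs(pred - y[split:]) / y[split:]) * 100
print(f"test MAPE: {mape:.2f}%")
```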
Procedia PDF Downloads 464
26712 Image Segmentation with Deep Learning of Prostate Cancer Bone Metastases on Computed Tomography
Authors: Joseph M. Rich, Vinay A. Duddalwar, Assad A. Oberai
Abstract:
Prostate adenocarcinoma is the most common cancer in males, with osseous metastases as the commonest site of metastatic prostate carcinoma (mPC). Treatment monitoring is based on the evaluation and characterization of lesions on multiple imaging studies, including Computed Tomography (CT). Monitoring of the osseous disease burden, including follow-up of lesions and identification and characterization of new lesions, is a laborious task for radiologists. Deep learning algorithms are increasingly used to perform tasks such as identification and segmentation for osseous metastatic disease and provide accurate information regarding metastatic burden. Here, nnUNet was used to produce a model which can segment CT scan images of prostate adenocarcinoma vertebral bone metastatic lesions. nnUNet is an open-source Python package that adds optimizations to the deep learning-based UNet architecture but has not been extensively combined with transfer learning techniques because this functionality is not readily available. The IRB-approved study data set includes imaging studies from patients with mPC who were enrolled in clinical trials at the University of Southern California (USC) Health Science Campus and Los Angeles County (LAC)/USC medical center. Manual segmentation of metastatic lesions was completed by an expert radiologist, Dr. Vinay Duddalwar (20+ years in radiology and oncologic imaging), to serve as ground truth for the automated segmentation. Despite nnUNet’s success on some medical segmentation tasks, it only produced an average Dice Similarity Coefficient (DSC) of 0.31 on the USC dataset. DSC results fell in a bimodal distribution, with most scores falling either over 0.66 (reasonably accurate) or at 0 (no lesion detected). Applying more aggressive data augmentation techniques dropped the DSC to 0.15, and reducing the number of epochs reduced the DSC to below 0.1. Datasets have been identified for transfer learning, which involves balancing the size and similarity of the dataset. Identified datasets include the Pancreas data from the Medical Segmentation Decathlon, Pelvic Reference Data, and CT volumes with multiple organ segmentations (CT-ORG). Some of the challenges of producing an accurate model from the USC dataset include the small dataset size (115 images), 2D data (as nnUNet generally performs better on 3D data), and the limited amount of public data capturing annotated CT images of bone lesions. Optimizations and improvements will be made by applying transfer learning and generative methods, including incorporating generative adversarial networks and diffusion models in order to augment the dataset. Performance with different libraries, including MONAI and custom architectures with Pytorch, will be compared. In the future, molecular correlations will be tracked with radiologic features for the purpose of multimodal composite biomarker identification. Once validated, these models will be incorporated into evaluation workflows to optimize radiologist evaluation. Our work demonstrates the challenges of applying automated image segmentation to small medical datasets and lays a foundation for techniques to improve performance. As machine learning models become increasingly incorporated into the workflow of radiologists, these findings will help improve the speed and accuracy of vertebral metastatic lesion detection.
Keywords: deep learning, image segmentation, medicine, nnUNet, prostate carcinoma, radiomics
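Since the DSC values quoted above drive all of the model comparisons, a minimal sketch of how the metric is computed on a pair of binary masks may be useful; the tiny masks here are illustrative only and unrelated to the USC data.

```python
# Minimal sketch of the Dice Similarity Coefficient (DSC) on binary masks.
import numpy as np

def dice(pred, truth, eps=1e-7):
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return (2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps)

truth = np.zeros((8, 8), dtype=int); truth[2:5, 2:5] = 1      # 3x3 "lesion"
pred = np.zeros((8, 8), dtype=int);  pred[3:6, 3:6] = 1       # shifted prediction
print(f"DSC = {dice(pred, truth):.2f}")   # overlap of 4 px -> 2*4/(9+9) ≈ 0.44
```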
Procedia PDF Downloads 96
26711 An Analysis of Sequential Pattern Mining on Databases Using Approximate Sequential Patterns
Authors: J. Suneetha, Vijayalaxmi
Abstract:
Sequential pattern mining involves applying data mining methods to large data repositories to extract usage patterns. Sequential pattern mining methodologies are used to analyze the data and identify patterns. The patterns have been used to implement efficient systems that can recommend based on previously observed patterns, make predictions, improve the usability of systems, detect events, and, in general, help in making strategic product decisions. In this paper, the performance of approximate sequential pattern mining is analyzed, where approximate patterns are defined as patterns approximately shared by many sequences. Approximate sequential patterns can effectively summarize and represent the databases by identifying the underlying trends in the data. An extensive and systematic performance study was conducted over synthetic and real data. The results demonstrate that ApproxMAP is effective and scalable in mining large sequence databases with long patterns.
Keywords: multiple data, performance analysis, sequential pattern, sequence database scalability
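As a minimal sketch of the underlying notion of support for a sequential pattern (exact subsequence containment rather than the approximate matching used by ApproxMAP), the toy sequences below are hypothetical:

```python
# Support of a sequential pattern: the fraction of sequences that contain the
# pattern as an order-preserving (not necessarily contiguous) subsequence.
def contains(sequence, pattern):
    it = iter(sequence)
    return all(item in it for item in pattern)   # order-preserving subsequence test

def support(pattern, database):
    return sum(contains(seq, pattern) for seq in database) / len(database)

database = [
    ["home", "search", "product", "cart", "checkout"],
    ["home", "product", "cart"],
    ["search", "product", "checkout"],
    ["home", "search", "cart", "checkout"],
]
print(support(["search", "checkout"], database))   # 0.75
print(support(["product", "cart"], database))      # 0.5
```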
Procedia PDF Downloads 342
26710 Improving the Utility of Social Media in Pharmacovigilance: A Mixed Methods Study
Authors: Amber Dhoot, Tarush Gupta, Andrea Gurr, William Jenkins, Sandro Pietrunti, Alexis Tang
Abstract:
Background: The COVID-19 pandemic has driven pharmacovigilance towards a new paradigm. Nowadays, more people than ever before are recognising and reporting adverse reactions from medications, treatments, and vaccines. In the modern era, with over 3.8 billion users, social media has become the most accessible medium for people to voice their opinions and so provides an opportunity to engage with more patient-centric and accessible pharmacovigilance. However, the pharmaceutical industry has been slow to incorporate social media into its modern pharmacovigilance strategy. This project aims to make social media a more effective tool in pharmacovigilance, and so reduce drug costs, improve drug safety and improve patient outcomes. This will be achieved by firstly uncovering and categorising the barriers facing the widespread adoption of social media in pharmacovigilance. Following this, the potential opportunities of social media will be explored. We will then propose realistic, practical recommendations to make social media a more effective tool for pharmacovigilance. Methodology: A comprehensive systematic literature review was conducted to produce a categorised summary of these barriers. This was followed by conducting 11 semi-structured interviews with pharmacovigilance experts to confirm the literature review findings whilst also exploring the unpublished and real-life challenges faced by those in the pharmaceutical industry. Finally, a survey of the general public (n = 112) ascertained public knowledge, perception, and opinion regarding the use of their social media data for pharmacovigilance purposes. This project stands out by offering perspectives from the public and pharmaceutical industry that fill the research gaps identified in the literature review. Results: Our results gave rise to several key analysis points. Firstly, inadequacies of current Natural Language Processing algorithms hinder effective pharmacovigilance data extraction from social media, and where data extraction is possible, there are significant questions over its quality. Social media also contains a variety of biases towards common drugs, mild adverse drug reactions, and the younger generation. Additionally, outdated regulations for social media pharmacovigilance do not align with new, modern General Data Protection Regulations (GDPR), creating ethical ambiguity about data privacy and level of access. This leads to an underlying mindset of avoidance within the pharmaceutical industry, as firms are disincentivised by the legal, financial, and reputational risks associated with breaking ambiguous regulations. Conclusion: Our project uncovered several barriers that prevent effective pharmacovigilance on social media. As such, social media should be used to complement traditional sources of pharmacovigilance rather than as a sole source of pharmacovigilance data. However, this project adds further value by proposing five practical recommendations that improve the effectiveness of social media pharmacovigilance. These include: prioritising health-orientated social media; improving technical capabilities through investment and strategic partnerships; setting clear regulatory guidelines using multi-stakeholder processes; creating an adverse drug reaction reporting interface inbuilt into social media platforms; and, finally, developing educational campaigns to raise awareness of the use of social media in pharmacovigilance. 
Implementation of these recommendations would speed up the efficient, ethical, and systematic adoption of social media in pharmacovigilance.
Keywords: adverse drug reaction, drug safety, pharmacovigilance, social media
Procedia PDF Downloads 82
26709 Medical Knowledge Management since the Integration of Heterogeneous Data until the Knowledge Exploitation in a Decision-Making System
Authors: Nadjat Zerf Boudjettou, Fahima Nader, Rachid Chalal
Abstract:
Knowledge management aims to acquire and represent knowledge relevant to a domain, a task, or a specific organization in order to facilitate access, reuse, and evolution. This usually means building, maintaining, and evolving an explicit representation of knowledge. The next step is to provide access to that knowledge, that is, to disseminate it in order to enable effective use. Knowledge management in the medical field aims to improve the performance of the medical organization by allowing individuals in the care facility (doctors, nurses, paramedics, etc.) to capture, share, and apply collective knowledge in order to make optimal decisions in real time. In this paper, we propose a knowledge management approach based on a technique for integrating heterogeneous data in the medical field by creating a data warehouse, a technique for extracting knowledge from medical data by choosing a data mining technique, and finally a technique for exploiting that knowledge in a case-based reasoning system.
Keywords: data warehouse, data mining, knowledge discovery in database, KDD, medical knowledge management, Bayesian networks
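To make the final, case-based reasoning step concrete, here is a minimal retrieval sketch (a weighted nearest-neighbour lookup in a case base); the features, weights, and decisions are hypothetical placeholders rather than anything from the proposed medical system.

```python
# Minimal sketch of case retrieval for case-based reasoning.
import numpy as np

case_base = [
    {"features": np.array([38.5, 120, 0.9]), "decision": "treatment A"},
    {"features": np.array([37.0, 80, 0.2]),  "decision": "observation"},
    {"features": np.array([39.2, 140, 1.0]), "decision": "treatment B"},
]
weights = np.array([0.5, 0.3, 0.2])          # assumed relative importance of each feature

def retrieve(query, base):
    # weighted Euclidean distance; the closest stored case is reused
    dists = [np.sqrt(np.sum(weights * (c["features"] - query) ** 2)) for c in base]
    return base[int(np.argmin(dists))]

new_patient = np.array([38.8, 125, 0.85])
print(retrieve(new_patient, case_base)["decision"])   # -> "treatment A"
```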
Procedia PDF Downloads 395
26708 Mean Shift-Based Preprocessing Methodology for Improved 3D Buildings Reconstruction
Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour
Abstract:
In this work, we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data acquired from airborne scanners over densely built urban areas. On the one hand, high resolution image data corrupted by noise from lossy compression techniques are appropriately smoothed while preserving the optical edges; on the other, low resolution LiDAR data in the form of a normalized Digital Surface Model (nDSM) are upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and upsampling capabilities using synthetic RGB-z data show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology to the 3D reconstruction of buildings in a pilot region of Athens, Greece, results in a significant visual improvement of the 3D building block model.
Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift
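For readers unfamiliar with the filter itself, the sketch below shows the basic mean shift iteration with a Gaussian kernel on a synthetic 1D signal; it is a simplification of the joint (range-domain) mean shift used in the paper, and all numbers are assumptions.

```python
# Minimal sketch of mean shift filtering: each point is repeatedly moved to the
# Gaussian-weighted mean of the data in its neighbourhood, smoothing noise while
# distinct modes (edges/surfaces) are preserved.
import numpy as np

def mean_shift_1d(points, bandwidth=1.0, n_iter=20):
    shifted = points.astype(float).copy()
    for _ in range(n_iter):
        for i, p in enumerate(shifted):
            w = np.exp(-((points - p) ** 2) / (2 * bandwidth ** 2))   # kernel weights
            shifted[i] = (w * points).sum() / w.sum()                  # local weighted mean
    return shifted

rng = np.random.default_rng(1)
signal = np.concatenate([rng.normal(0.0, 0.1, 50), rng.normal(5.0, 0.1, 50)])  # two "surfaces"
smoothed = mean_shift_1d(signal, bandwidth=0.5)
print(np.round(smoothed[:3], 2), np.round(smoothed[-3:], 2))  # values collapse toward ~0.0 and ~5.0
```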
Procedia PDF Downloads 315
26707 Transdermal Medicated-Layered Extended-Release Patches for Co-delivery of Carbamazepine and Pyridoxine
Authors: Sarah K. Amer, Walaa Alaa
Abstract:
Epilepsy is an important cause of mortality and morbidity, according to WHO statistics. It is characterized by the presence of frequent seizures occurring more than 24 hours apart. Carbamazepine (CBZ) is considered a first-line treatment for epilepsy. However, reports have shown that oral CBZ formulations fail to achieve optimum systemic delivery, minimize side effects, and enhance patient compliance. Moreover, the literature has highlighted the lack of a therapeutically efficient CBZ transdermal formulation and the need for one, owing to its easy and convenient method of application and its capability to attain higher bioavailability and more extended release profiles compared to conventional oral CBZ tablets. This work aims to prepare CBZ microspheres (MS) embedded in a transdermal gel containing vitamin B, so that the two can be co-delivered. The MS were prepared by the emulsion-solvent diffusion method using Eudragit S as the core-forming polymer together with hydroxypropyl methylcellulose (HPMC). The MS appeared spherical and porous in nature, offering a large surface area and high entrapment efficiency of CBZ. The transdermal gel was prepared by the solvent-evaporation technique using HPMC, which offered high entrapment efficiency, and Eudragit S, which provided an extended-release profile. Polyethylene glycol, Span 80, and pyridoxine were also added. Data indicate that combinations of CBZ with pyridoxine can reduce epileptic seizures without affecting motor coordination. Extended-release profiles were evident for this system. The patches were furthermore tested for thickness, moisture content, folding endurance, spreadability, and viscosity. This novel pharmaceutical formulation could have a great influence on seizure control, offering better therapeutic effects.
Keywords: epilepsy, carbamazepine, pyridoxine, transdermal
Procedia PDF Downloads 59
26706 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company
Authors: Rahma Al Balushi
Abstract:
Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOPs). Hence, a documented SOP for the Geographical Information System (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed by the Asset Information & GIS department. This paper describes in detail the GIS data submission process and the journey to develop the current process. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, thanks to this process, GIS is ready to be integrated with other systems and to serve as the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOC) and to scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) are consequently made based on GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data system updates, data release/reporting, and data alterations also helped to reduce the missing attributes of GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between the year 2017 and the year 2021. Overall, it is concluded that through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS systems.
Keywords: asset management ISO55001, standard procedures process, governance, geodatabase, NOC, CMMS
Procedia PDF Downloads 207
26705 Importance of Ethics in Cloud Security
Authors: Pallavi Malhotra
Abstract:
This paper examines the importance of ethics in cloud computing. In modern society, cloud computing offers individuals and businesses virtually unlimited space for storing and processing data and information. Much of the data and information stored in the cloud by various users, such as banks, doctors, architects, engineers, lawyers, consulting firms, and financial institutions, requires a high level of confidentiality and safeguarding. Cloud computing offers centralized storage and processing of data, which has contributed immensely to the growth of businesses and improved the sharing of information over the internet. However, the accessibility and management of data and servers by a third party raise concerns regarding the privacy of clients’ information and possible manipulation of the data by third parties. This document suggests the approaches various stakeholders should take to address the ethical issues involved in cloud-computing services. Ethical education and training are key for all stakeholders involved in the handling of data and information stored or processed in the cloud.
Keywords: IT ethics, cloud computing technology, cloud privacy and security, ethical education
Procedia PDF Downloads 325
26704 Study on the Self-Location Estimate by the Evolutional Triangle Similarity Matching Using Artificial Bee Colony Algorithm
Authors: Yuji Kageyama, Shin Nagata, Tatsuya Takino, Izuru Nomura, Hiroyuki Kamata
Abstract:
In a previous study, a technique to estimate self-location using a lunar image was proposed. In this paper, we consider improving the conventional method with an FPGA implementation in mind. Specifically, we introduce the Artificial Bee Colony algorithm to reduce the search time. In addition, we use fixed-point arithmetic to enable high-speed operation on the FPGA.
Keywords: SLIM, Artificial Bee Colony Algorithm, location estimate, evolutional triangle similarity
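A minimal sketch of the Artificial Bee Colony search loop is shown below, with a simple quadratic cost standing in for the triangle-similarity matching error (an assumption; the paper's actual cost function, fixed-point arithmetic, and FPGA mapping are not represented).

```python
# Minimal Artificial Bee Colony (ABC) sketch minimizing a toy 2D cost function.
import numpy as np

rng = np.random.default_rng(0)
dim, n_food, limit, cycles = 2, 10, 20, 100
lo, hi = -5.0, 5.0
cost = lambda x: np.sum((x - np.array([1.5, -2.0])) ** 2)      # minimum at (1.5, -2.0)

foods = rng.uniform(lo, hi, (n_food, dim))
fit = np.array([cost(f) for f in foods])
trials = np.zeros(n_food, dtype=int)

def neighbour(i):
    k = rng.integers(n_food)
    while k == i:
        k = rng.integers(n_food)
    phi = rng.uniform(-1, 1, dim)
    return np.clip(foods[i] + phi * (foods[i] - foods[k]), lo, hi)

for _ in range(cycles):
    # employed + onlooker phases: greedy local moves around food sources
    probs = 1 / (1 + fit); probs /= probs.sum()
    for i in list(range(n_food)) + list(rng.choice(n_food, n_food, p=probs)):
        cand = neighbour(i)
        c = cost(cand)
        if c < fit[i]:
            foods[i], fit[i], trials[i] = cand, c, 0
        else:
            trials[i] += 1
    # scout phase: abandon exhausted sources
    for i in np.where(trials > limit)[0]:
        foods[i] = rng.uniform(lo, hi, dim)
        fit[i], trials[i] = cost(foods[i]), 0

best = foods[np.argmin(fit)]
print(best, fit.min())    # converges near (1.5, -2.0)
```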
Procedia PDF Downloads 518
26703 Evaluation of Practicality of On-Demand Bus Using Actual Taxi-Use Data through Exhaustive Simulations
Authors: Jun-ichi Ochiai, Itsuki Noda, Ryo Kanamori, Keiji Hirata, Hitoshi Matsubara, Hideyuki Nakashima
Abstract:
We conducted exhaustive simulations for data assimilation and for evaluating service quality under various settings of a new shared transportation system called SAVS. Computational social simulation is a key technology for designing new social services, such as SAVS, as new transportation services. One open issue in SAVS was determining the service scale through social simulation. Using our exhaustive simulation framework, OACIS, we performed data assimilation and evaluated the effects of SAVS based on actual taxi-use data from Tajimi city, Japan. Finally, we obtained the conditions for realizing the new service with reasonable service quality.
Keywords: on-demand bus system, social simulation, data assimilation, exhaustive simulation
Procedia PDF Downloads 321
26702 Exploring the Discrepancy: The Influence of Instagram in Shaping Idealized Lifestyles and Self-Perceptions Among Indian University Students
Authors: Dhriti Kirpalani
Abstract:
The survey aims to explore the impact of Instagram on the perception of lifestyle aspirations (such as social life, fitness, fashion trends, etc.) and on the perception of self in relation to an idealized lifestyle. Amidst today's media-saturated environment, university students are constantly exposed to idealized portrayals of lifestyles, often leading to unrealistic expectations and dissatisfaction with their own lives. This study investigates the impact of media on university students' perceptions of their own lifestyle, the discrepancy between their self-perception and the idealized lifestyle, and their mental health. Employing a mixed-methods approach, the study combines quantitative and qualitative data collection methods to understand the issue comprehensively. A literature review was conducted to determine the effects of idealized lifestyle portrayals on Instagram; however, the Indian setting has received less attention. The researchers wish to employ a convenience sampling method among undergraduate students from India. The scales to be employed for quantitative analysis are the Negative Social Media Comparison Scale (NSMCS), the Lifestyle Satisfaction Scale (LSS), the Psychological Well-being Scale (PWB), and the Self-Perception Profile for Adolescents (SPPA). The qualitative aspect would include in-depth interviews to provide deeper insights into participants' experiences and the mechanisms by which media influences their lifestyle aspirations and mental health. As an exploratory study, the basis of the idea is found in the social comparison theory described by Leon Festinger. The findings aim to inform interventions that promote realistic expectations about lifestyle, reduce the negative effects of media on university students, and improve their mental health and well-being.
Keywords: declined self-perception, idealized lifestyle, Instagram, Indian university students, social comparison
Procedia PDF Downloads 39
26701 Structural Behavior of Precast Foamed Concrete Sandwich Panel Subjected to Vertical In-Plane Shear Loading
Authors: Y. H. Mugahed Amran, Raizal S. M. Rashid, Farzad Hejazi, Nor Azizi Safiee, A. A. Abang Ali
Abstract:
Experimental and analytical studies were carried out to examine the structural behavior of the precast foamed concrete sandwich panel (PFCSP) under vertical in-plane shear load. Six full-scale PFCSP specimens were developed with varying heights to study an important parameter, the slenderness ratio (H/t). The production technique of the PFCSP and the test setup procedure are described. The results obtained from the experimental tests were analysed in terms of in-plane shear strength capacity, load-deflection profile, load-strain relationship, slenderness ratio, shear cracking patterns, and mode of failure. An analytical study using finite element analysis was implemented, and theoretical values of the ultimate in-plane shear strength were calculated using the adopted ACI 318 equation for reinforced concrete walls, with the aim of predicting the in-plane shear strength of the PFCSP. The decrease in slenderness ratio from 24 to 14 produced an increase of 26.51% and 21.91% in the ultimate in-plane shear strength capacity, as obtained experimentally and in the FEA models, respectively. The experimental test results, FEA model data, and theoretical calculation values were compared and showed significant agreement with a high degree of accuracy. Therefore, on the basis of the results obtained, the PFCSP wall has potential use as an alternative to the conventional load-bearing wall system.
Keywords: deflection curves, foamed concrete (FC), load-strain relationships, precast foamed concrete sandwich panel (PFCSP), slenderness ratio, vertical in-plane shear strength capacity
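The abstract does not state which ACI 318 clause was adopted; as a heavily hedged illustration only, the sketch below evaluates the commonly cited ACI 318-style expression for the nominal in-plane shear strength of a reinforced concrete wall, Vn = Acv(αc λ √f'c + ρt fy), in US customary units. Whether this matches the equation used in the paper, and all numerical inputs, are assumptions.

```python
# Hedged sketch of an ACI 318-style nominal in-plane wall shear strength (psi units).
# All inputs are hypothetical and illustrative only.
from math import sqrt

def wall_shear_strength(Acv_in2, fc_psi, rho_t, fy_psi, alpha_c=2.0, lam=1.0):
    """Nominal in-plane shear strength Vn in pounds."""
    return Acv_in2 * (alpha_c * lam * sqrt(fc_psi) + rho_t * fy_psi)

# Hypothetical panel: 4 in. thick x 40 in. long, f'c = 3000 psi, 0.25% web steel, Grade 60
Vn = wall_shear_strength(Acv_in2=4 * 40, fc_psi=3000, rho_t=0.0025, fy_psi=60000)
print(f"Vn ≈ {Vn / 1000:.1f} kips")
```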
Procedia PDF Downloads 220
26700 Models of Innovation Processes and Their Evolution: A Literature Review
Authors: Maier Dorin, Maier Andreea
Abstract:
Today, any organization - regardless of its specific activity - must be prepared to face continuous radical changes, innovation thus becoming a condition of survival in a globalized market. Not all managers have an overall view of the real size of the necessary innovation potential. Unfortunately, there is still no common (and correct) understanding of the term innovation among managers. Moreover, not all managers are aware of the need for innovation. This article highlights and analyzes a series of models of innovation processes and their evolution. The models analyzed encompass both the strategic level and the operational one within an organization, indicating innovation performance at each level. As the literature review shows, there are no easy answers to the innovation process, just as there are no shortcuts to great results. Successful companies do not have a silver innovation bullet: they do not get results by making one or a few things better than others; they make everything better.
Keywords: innovation, innovation process, business success, models of innovation
Procedia PDF Downloads 401
26699 Succeeding through Disruption: Exploring the Factors Influencing the Adoption of Disruptive Technologies in the Mobile Telecommunications Industry in Zimbabwe
Authors: Africa Makasi
Abstract:
The research explored factors influencing the adoption of disruptive technologies in the mobile telecommunications industry in Zimbabwe. Data was gathered from the second biggest competitor in the industry, with over 3 million subscribers, as the main case study. The survey was conducted by purposively selecting 70 respondents from a population of 3,000,000 (three million) active subscribers in the company’s database; a skip interval of 42,857 was used to randomly select the sample. Customer representatives were selected from the company’s five regional offices using a two-stage cluster sampling technique, while employee participants were purposively selected from the company’s head office. Self-administered questionnaires were used in the research. A pilot test was conducted, and the reliability of the research instruments was assessed; the results of the pilot study were analyzed for reliability using SPSS. The results confirmed that the style of leadership and its thrust may help speed up or slow down the adoption of disruptive technologies. This was reflected by a p-value of 0.01, which is less than 0.05. The null hypothesis was thus rejected, confirming the strong relationship between leadership and the adoption of disruptive technology. Similar results were also obtained with respect to staff competence, availability of funding, and the type of infrastructure available. Future research should look at organizational ambidexterity, as well as exploitation and exploration paradigms, in organizations in the telecommunications industry and their impact on the adoption of disruptive technologies.
Keywords: disruptive innovation, adoption, mobile telecommunication industry, exploration and exploitation
Procedia PDF Downloads 369
26698 Representations of Childcare Robots as a Controversial Issue
Authors: Raya A. Jones
Abstract:
This paper interrogates online representations of robot companions for children, including promotional material by manufacturers, media articles, and technology blogs. The significance of the study lies in its contribution to understanding attitudes to robots. The prospect of childcare robots is particularly controversial ethically and is associated with emotive arguments. The sampled material is restricted to relatively recent posts (the past three years), though the analysis identifies both continuous and changing themes across the past decade. The method extrapolates social representations theory towards examining the ways in which information about robotic products is provided for the general public. Implications for the social acceptance of robot companions for the home and for robot ethics are considered.
Keywords: acceptance of robots, childcare robots, ethics, social representations
Procedia PDF Downloads 252
26697 Optimal Pricing Based on Real Estate Demand Data
Authors: Vanessa Kummer, Maik Meusel
Abstract:
Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information; for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete data is used. Usually, however, the proportion of complete data is rather small, which leads to most information being neglected. In addition, the data that is complete might be strongly distorted, and the reason that data is missing might itself contain information, which is ignored with that approach. An interesting question is, therefore, whether, for economic analyses such as the one at hand, there is added value in using the whole data set with the imputed missing values compared to using the usually small percentage of complete data (the baseline). It is also interesting to see how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, or neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all data in the model, the distortion of the first training set (the complete data) vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data. After having found the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates which are based on imputed data sets do not differ significantly from each other; however, the demand estimate that is derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Also, demand estimates derived from the whole data set are much more accurate than the baseline estimation. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning
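A minimal sketch of the performance-measurement loop described above: known values are masked at random, imputed with an unsupervised method (per-cluster means from k-means here), and compared with the held-back truth. The synthetic two-column "subscription" data, cluster count, and masking rate are assumptions.

```python
# Mask known values, impute them, and score the imputation against the truth.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
rooms = rng.integers(1, 7, 500)                                # desired number of rooms
price = 800 * rooms + rng.normal(0, 200, 500)                  # willingness to pay (hypothetical CHF)
X_true = np.column_stack([rooms, price]).astype(float)

X = X_true.copy()
mask = rng.random(500) < 0.2                                   # hide 20% of the price entries
X[mask, 1] = np.nan

# impute: cluster on the fully observed column, fill with the per-cluster mean price
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X[:, [0]])
X_imp = X.copy()
for c in np.unique(labels):
    X_imp[(labels == c) & mask, 1] = np.nanmean(X[labels == c, 1])

rmse_cluster = np.sqrt(np.mean((X_imp[mask, 1] - X_true[mask, 1]) ** 2))
rmse_baseline = np.sqrt(np.mean((np.nanmean(X[:, 1]) - X_true[mask, 1]) ** 2))
print(f"cluster-mean imputation RMSE: {rmse_cluster:.0f}, column-mean RMSE: {rmse_baseline:.0f}")
```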
Procedia PDF Downloads 285
26696 Parallel Pipelined Conjugate Gradient Algorithm on Heterogeneous Platforms
Authors: Sergey Kopysov, Nikita Nedozhogin, Leonid Tonkov
Abstract:
The article presents a parallel iterative solver for large sparse linear systems which can be used on a heterogeneous platform. Traditionally, the problem of solving linear systems does not scale well on multi-CPU/multi-GPU clusters. For example, most attempts to implement the classical conjugate gradient method at best kept the runtime constant as the problem was enlarged. The paper proposes the pipelined variant of the conjugate gradient method (PCG), a formulation that is potentially better suited for hybrid CPU/GPU computing since it requires only one synchronization point per iteration instead of two for standard CG. The standard and pipelined CG methods need the vector entries generated by the current GPU and the other GPUs for the matrix-vector products, so the communication between GPUs becomes a major performance bottleneck on a multi-GPU cluster. The article presents an approach to minimize the communication between the parallel parts of the algorithms. Additionally, computation and communication can be overlapped to reduce the impact of data exchange. Using the pipelined version of the CG method with one synchronization point, together with asynchronous calculations and communications and load balancing between the CPU and GPU, allows the large linear systems to be solved scalably. The algorithm is implemented with the combined use of the MPI, OpenMP, and CUDA technologies. We show that an almost optimal speedup may be reached on 8 CPUs/2 GPUs (relative to a one-GPU execution). The parallelized solver achieves a speedup of up to 5.49 times on 16 NVIDIA Tesla GPUs, as compared to one GPU.
Keywords: conjugate gradient, GPU, parallel programming, pipelined algorithm
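To make the single synchronization point concrete, here is a NumPy sketch of the unpreconditioned pipelined CG recurrences in the Ghysels-Vanroose formulation (an assumption about the exact variant used; preconditioning and the MPI/OpenMP/CUDA mapping are not shown). Both dot products of an iteration are formed back-to-back, so in a distributed implementation they can share one global reduction that overlaps with the matrix-vector product.

```python
# Sketch of pipelined CG: one fused reduction per iteration instead of two.
import numpy as np

def pipelined_cg(A, b, tol=1e-10, max_iter=200):
    x = np.zeros_like(b)
    r = b - A @ x
    w = A @ r
    z = s = p = np.zeros_like(b)
    gamma_old = alpha = 0.0
    for k in range(max_iter):
        gamma = r @ r                     # --- single synchronization point:
        delta = w @ r                     #     both reductions happen together
        q = A @ w                         # matvec (overlapped with the reduction in practice)
        if k == 0:
            beta, alpha = 0.0, gamma / delta
        else:
            beta = gamma / gamma_old
            alpha = gamma / (delta - beta * gamma / alpha)
        z = q + beta * z                  # recurrences replace the second reduction
        s = w + beta * s
        p = r + beta * p
        x = x + alpha * p
        r = r - alpha * s
        w = w - alpha * z
        gamma_old = gamma
        if np.sqrt(gamma) < tol:
            break
    return x, k

n = 200
A = np.random.default_rng(0).random((n, n))
A = A @ A.T + n * np.eye(n)               # symmetric positive definite test matrix
b = np.ones(n)
x, its = pipelined_cg(A, b)
print(its, np.linalg.norm(A @ x - b))     # residual should be tiny
```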
Procedia PDF Downloads 165
26695 Unlocking the Puzzle of Borrowing Adult Data for Designing Hybrid Pediatric Clinical Trials
Authors: Rajesh Kumar G
Abstract:
A challenging aspect of any clinical trial is to carefully plan the study design to meet the study objective in an optimal way and to validate the assumptions made during protocol design. When it is a pediatric study, there is the added challenge of stringent guidelines and difficulty in recruiting the necessary subjects. Unlike adult trials, there is not much historical data available for pediatrics, yet such data is required to validate assumptions when planning pediatric trials. Typically, pediatric studies are initiated as soon as approval is obtained for a drug to be marketed for adults, so the pediatric study can be well planned using the historical information from the adult study together with available pediatric pilot study data or simulated pediatric data. Generalizing a historical adult study to a new pediatric study is a tedious task; however, it is possible by integrating various statistical techniques and exploiting the advantages of a hybrid study design, which helps achieve the study objective more smoothly even in the presence of many constraints. This research paper explains how the hybrid study design can be planned, together with an integrated technique (SEV), for the pediatric study. In brief, the SEV technique (Simulation, Estimation using borrowed adult data and Bayesian methods, and Validation) simulates the planned study data and obtains the desired estimates in order to validate the assumptions. This method of validation can be used to improve the accuracy of data analysis, ensuring that results are as valid and reliable as possible, and allows informed decisions to be made well ahead of study initiation. Based on the collected data, this technique also provides insight into best practices when using data from a historical study and simulated data alike.
Keywords: adaptive design, simulation, borrowing data, Bayesian model
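As a hedged sketch of the estimation step, the example below borrows adult data through a power prior (discount factor a0) in a normal-normal conjugate model for a treatment effect; the data, variances, and a0 values are assumptions, and the paper's actual SEV model is not reproduced.

```python
# Borrowing adult data via a power prior in a normal-normal conjugate model.
import numpy as np

rng = np.random.default_rng(0)
adult = rng.normal(0.8, 1.0, 400)        # simulated adult treatment-effect observations
peds = rng.normal(0.6, 1.0, 30)          # simulated (small) pediatric sample
sigma2 = 1.0                             # assumed known observation variance

def posterior(peds, adult, a0, sigma2, prior_mean=0.0, prior_var=100.0):
    # precision-weighted update: pediatric data at full weight, adult data at weight a0
    prec = 1 / prior_var + len(peds) / sigma2 + a0 * len(adult) / sigma2
    mean = (prior_mean / prior_var + peds.sum() / sigma2 + a0 * adult.sum() / sigma2) / prec
    return mean, 1 / prec

for a0 in (0.0, 0.3, 1.0):               # 0 = ignore adults, 1 = pool fully
    m, v = posterior(peds, adult, a0, sigma2)
    print(f"a0={a0:.1f}: posterior mean={m:.3f}, sd={np.sqrt(v):.3f}")
```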
Procedia PDF Downloads 76
26694 Evaluation of Antidiabetic Activity of a Combination Extract of Nigella Sativa & Cinnamomum Cassia in Streptozotocin Induced Type-I Diabetic Rats
Authors: Ginpreet Kaur, Mohammad Yasir Usmani, Mohammed Kamil Khan
Abstract:
Diabetes mellitus is a disease with a high global burden and results in significant morbidity and mortality. In India, the number of people suffering from diabetes is expected to rise from 19 million to 57 million by 2025. At present, interest in herbal remedies is growing as a way to reduce the side effects associated with conventional dosage forms, such as oral hypoglycemic agents and insulin, for the treatment of diabetes mellitus. Our aim was to investigate the antidiabetic activity of a combination extract of N. sativa & C. cassia in streptozotocin-induced type-I diabetic rats. Thus, the present study was undertaken to screen the postprandial glucose excursion potential through α-glucosidase inhibitory activity (in vitro) and the effect of the combination extract of N. sativa & C. cassia in streptozotocin-induced type-I diabetic rats (in vivo). In addition, changes in body weight, plasma glucose, lipid profile, and kidney profile were also determined. The IC50 values for both the extract and acarbose were calculated by the extrapolation method. The combination extract of N. sativa & C. cassia at different dosages (100 and 200 mg/kg orally) and metformin (50 mg/kg orally) as the standard drug were administered for 28 days, after which biochemical estimations, body weights, and an oral glucose tolerance test (OGTT) were determined. In vitro, the combination extract showed a much stronger inhibitory effect than the individual extracts. The results reveal that the combination extract of N. sativa & C. cassia produced a significant decrease in plasma glucose (p<0.0001), total cholesterol, and LDL levels when compared with the STZ group. The decreased levels of BUN and creatinine revealed the protection afforded by the N. sativa & C. cassia extracts against nephropathy associated with diabetes. The combination of N. sativa & C. cassia significantly improved glucose tolerance to exogenously administered glucose (2 g/kg) at the 60, 90, and 120 min intervals of the OGTT in high-dose streptozotocin-induced diabetic rats compared with the untreated control group. Histopathological studies showed that treatment with the N. sativa & C. cassia extracts, alone and in combination, restored pancreatic tissue integrity and was able to regenerate the STZ-damaged pancreatic β cells. Thus, the present study reveals that the combination of N. sativa & C. cassia extracts has significant α-glucosidase inhibitory activity and great potential as a new source for diabetes treatment.
Keywords: lipid levels, OGTT, diabetes, herbs, glucosidase
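As an illustration of how an IC50 can be obtained by interpolating percent inhibition against concentration (one simple form of the extrapolation approach mentioned above), the values below are hypothetical rather than the study's measurements.

```python
# Minimal sketch of IC50 estimation by interpolating % inhibition vs. log concentration.
import numpy as np

conc = np.array([25, 50, 100, 200, 400])          # µg/mL (hypothetical)
inhibition = np.array([22, 38, 55, 71, 84])       # % α-glucosidase inhibition (hypothetical)

log_c = np.log10(conc)
ic50_log = np.interp(50, inhibition, log_c)       # inhibition must be increasing for interp
ic50 = 10 ** ic50_log
print(f"IC50 ≈ {ic50:.0f} µg/mL")                 # falls between 50 and 100 µg/mL here
```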
Procedia PDF Downloads 431
26693 Hydrodynamic Characterisation of a Hydraulic Flume with Sheared Flow
Authors: Daniel Rowe, Christopher R. Vogel, Richard H. J. Willden
Abstract:
The University of Oxford’s recirculating water flume is a combined wave and current test tank with a 1 m depth, 1.1 m width, and 10 m long working section, and is capable of flow speeds up to 1 ms−1. This study documents the hydrodynamic characteristics of the facility in preparation for experimental testing of horizontal axis tidal stream turbine models. The turbine to be tested has a rotor diameter of 0.6 m and is a modified version of one of two model-scale turbines tested in previous experimental campaigns. An Acoustic Doppler Velocimeter (ADV) was used to measure the flow at high temporal resolution at various locations throughout the flume, enabling the spatial uniformity and turbulence flow parameters to be investigated. The mean velocity profiles exhibited high levels of spatial uniformity at the design speed of the flume, 0.6 ms−1, with variations in the three-dimensional velocity components on the order of ±1% at the 95% confidence level, along with a modest streamwise acceleration through the measurement domain, a target 5 m working section of the flume. A high degree of uniformity was also apparent for the turbulence intensity, with values ranging between 1-2% across the intended swept area of the turbine rotor. The integral scales of turbulence exhibited a far higher degree of variation throughout the water column, particularly in the streamwise and vertical scales. This behaviour is believed to be due to the high signal noise content leading to decorrelation in the sampling records. To achieve more realistic levels of vertical velocity shear in the flume, a simple procedure to practically generate target vertical shear profiles in open-channel flows is described. Here, the authors arranged a series of non-uniformly spaced parallel bars across the width of the flume, normal to the onset flow. By adjusting the resistance grading across the height of the working section, the downstream profiles could be modified accordingly, characterised by changes in the velocity profile power law exponent, 1/n. Considering the significant temporal variation in a tidal channel, the choice of the exponent denominator, n = 6 and n = 9, effectively provides an achievable range around the much-cited value of n = 7 observed at many tidal sites. The resulting flow profiles, which we intend to use in future turbine tests, have been characterised in detail. The results indicate non-uniform vertical shear across the survey area and reveal substantial corner flows, arising from the differential shear between the target vertical and cross-stream shear profiles throughout the measurement domain. In vertically sheared flow, the rotor-equivalent turbulence intensity ranges between 3.0-3.8% throughout the measurement domain for both bar arrangements, while the streamwise integral length scale grows from a characteristic dimension on the order of the bar width, similar to the flow downstream of a turbulence-generating grid. The experimental tests are well-defined and repeatable and serve as a reference for other researchers who wish to undertake similar investigations.
Keywords: acoustic Doppler velocimeter, experimental hydrodynamics, open-channel flow, shear profiles, tidal stream turbines
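A minimal sketch of the two quantities discussed above: fitting the 1/n power-law exponent of a vertical velocity profile, u(z) = U_ref (z/z_ref)^(1/n), and computing turbulence intensity from an ADV velocity record. The sample numbers are synthetic, not the flume measurements.

```python
# Power-law shear fit and turbulence intensity from synthetic measurements.
import numpy as np

# --- power-law shear profile fit (synthetic "measured" profile with n ≈ 7) ---
z = np.linspace(0.1, 0.9, 9)                       # heights above bed, m (1 m depth)
z_ref, U_ref = 0.5, 0.6                            # reference height and speed (m, m/s)
u = U_ref * (z / z_ref) ** (1 / 7) + np.random.default_rng(0).normal(0, 0.005, z.size)

slope = np.polyfit(np.log(z / z_ref), np.log(u / U_ref), 1)[0]
print(f"fitted 1/n = {slope:.3f}  (n ≈ {1 / slope:.1f})")

# --- turbulence intensity from a streamwise velocity time series ---
u_t = 0.6 + 0.012 * np.random.default_rng(1).standard_normal(20000)   # ADV-like samples, m/s
TI = np.std(u_t) / np.mean(u_t) * 100
print(f"turbulence intensity ≈ {TI:.1f}%")
```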
Procedia PDF Downloads 86
26692 Resilience of the American Agriculture Sector
Authors: Dipak Subedi, Anil Giri, Christine Whitt, Tia McDonald
Abstract:
This study aims to understand the impact of the pandemic on the overall economic well-being of the agricultural sector of the United States. The two key metrics used to examine economic well-being are the bankruptcy rate of U.S. farm operations and the operating profit margin. One of the primary reasons for farm operations in the U.S. to file for bankruptcy is continuous negative profit or a significant decrease in profit. The pandemic caused significant supply and demand shocks in the domestic market. Furthermore, the ongoing trade disruptions, especially with China, also impacted the prices of agricultural commodities. The significantly reduced demand for ethanol and the closure of meat processing plants affected both livestock and crop producers. This study uses data from courts to examine the bankruptcy rate of U.S. farm operations over time. Preliminary results suggest there was not an increase in farm operations filing for bankruptcy in 2020, most likely because of record-high government payments to producers in 2020: the Federal Government made direct payments of more than $45 billion in 2020. One commonly used economic metric for farm profitability is the operating profit margin (OPM), which measures profitability as a share of the total value of production and government payments. The Economic Research Service of the United States Department of Agriculture defines a farm operation to be in a) a high-risk zone if the OPM is less than 10 percent and b) a low-risk zone if the OPM is higher than 25 percent. For this study, OPM was calculated for small, medium, and large-scale farm operations using data from the Agricultural Resource Management Survey (ARMS). Results show that, except for small family farms, the share of farms in the high-risk zone decreased in 2020 compared to the most recent non-pandemic year, 2019. This was most likely due to higher commodity prices at the end of 2020 and record-high government payments. Further investigation suggests that a lower share of smaller farm operations received (lower average) government payments, resulting in a large share (over 70 percent) being in the critical zone. This study should be of interest to multiple stakeholders, including policymakers across the globe, as it shows the resilience of the U.S. agricultural system as well as (some) impact of government payments.
Keywords: U.S. farm sector, COVID-19, operating profit margin, farm bankruptcy, ag finance, government payments to the farm sector
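A minimal sketch of the OPM metric and the ERS risk-zone thresholds quoted above; the example farm figures are hypothetical, and the "medium risk" label for the band between the two thresholds is an assumption.

```python
# Operating profit margin (OPM) and risk-zone classification.
def operating_profit_margin(operating_profit, value_of_production, government_payments):
    """OPM as a share of the total value of production plus government payments."""
    return operating_profit / (value_of_production + government_payments)

def risk_zone(opm):
    if opm < 0.10:
        return "high risk"
    if opm > 0.25:
        return "low risk"
    return "medium risk"

# hypothetical small family farm: $18k operating profit on $150k production + $45k payments
opm = operating_profit_margin(18_000, 150_000, 45_000)
print(f"OPM = {opm:.1%} -> {risk_zone(opm)}")      # ≈ 9.2% -> high risk
```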
Procedia PDF Downloads 89