Search results for: improved sparrow search algorithm
6545 Small Entrepreneurs as Creators of Chaos: Increasing Returns Requires Scaling
Authors: M. B. Neace, Xin Gao
Abstract:
Small entrepreneurs are ubiquitous. Regardless of location, their success depends on several behavioral characteristics and several market conditions. In this concept paper, we extend this paradigm to include elements from the science of chaos. Our observations, research findings, literature search and intuition lead us to the proposition that all entrepreneurs seek increasing returns, as did the many small entrepreneurs we have interviewed over the years. There will be a few whose initial perturbations may create tsunami-like waves of increasing returns over time, resulting in very large market consequences: the butterfly effect. When small entrepreneurs perturb the marketplace and their initial efforts take root, a series of phase-space transitions begins to occur. They sustain the stream of increasing returns by scaling up. Chaos theory contributes to our understanding of this phenomenon. Sustaining and nourishing the increasing returns of small entrepreneurs as complex adaptive systems requires scaling. In this paper we focus on the most critical element of the small-entrepreneur scaling process: the mindset of the owner-operator.
Keywords: entrepreneur, increasing returns, scaling, chaos
Procedia PDF Downloads 456
6544 Design and Development of a Computerized Medical Record System for Hospitals in Remote Areas
Authors: Grace Omowunmi Soyebi
Abstract:
A computerized medical record system is a collection of medical information about a person that is stored on a computer. One principal problem of most hospitals in rural areas is the use of a paper-based file management system for keeping records. A lot of time is wasted when a patient visits the hospital, possibly in an emergency, and the nurse or attendant has to search through voluminous files before the patient's file can be retrieved; this delay may cause an unexpected event to happen to the patient. This data mining application is to be designed using a structured system analysis and design method, which will help in a well-articulated analysis of the existing file management system, a feasibility study, and proper documentation of the design and implementation of a computerized medical record system. This computerized system will replace the file management system and help to quickly retrieve a patient's record with increased data security, provide access to clinical records for decision-making, and reduce the time it takes for a patient to be attended to.
Keywords: programming, data, software development, innovation
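The retrieval speed-up the abstract argues for can be shown in miniature: a paper file room forces a linear scan through folders, while an indexed store makes lookup by patient ID a constant-time operation. The sketch below is illustrative only; the field names are assumptions, not the system's actual schema.

```python
def build_index(records):
    """Index patient records by ID so retrieval becomes a hash lookup
    instead of a linear search through files. Field names are assumed."""
    return {rec["patient_id"]: rec for rec in records}

def find_patient(index, patient_id):
    """O(1) average-case retrieval, versus O(n) for a manual file search."""
    return index.get(patient_id)
```

A real implementation would back this with a database rather than an in-memory dictionary, but the asymptotic argument is the same.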
Procedia PDF Downloads 87
6543 Genomic Adaptation to Local Climate Conditions in Native Cattle Using Whole Genome Sequencing Data
Authors: Rugang Tian
Abstract:
In this study, we generated whole-genome sequence (WGS) data from 110 native cattle. Together with whole-genome sequences from worldwide cattle populations, we estimated the genetic diversity and population genetic structure of different cattle populations. Our findings revealed clustering of cattle groups in line with their geographic locations. We identified noticeable genetic differentiation between indigenous cattle breeds and commercial populations. Among all studied cattle groups, lower genetic diversity measures were found in commercial populations; however, high genetic diversity was detected in some local cattle, particularly in the Rashoki and Mongolian breeds. Our search for potential genomic regions under selection in native cattle revealed several candidate genes related to immune response and cold-shock proteins on multiple chromosomes, such as TRPM8, NMUR1, PRKAA2, SMTNL2 and OXR1, that are involved in energy metabolism and metabolic homeostasis.
Keywords: cattle, whole-genome, population structure, adaptation
Procedia PDF Downloads 74
6542 A Scoping Review of Trends in Climate Change Research in Ghana
Authors: Emmanuel Bintaayi Jeil, Kabila Abass, David Forkuor, Divine Odame Appiah
Abstract:
In Ghana, the nature and trends of climate change-related research are not clear. This study synthesises research evidence on climate change published in Ghana between 1999 and 2018. Data for the review were gathered using a set of search words in Google Scholar, Web of Science, ProQuest, and ScienceDirect, and were analysed following the scoping review guidelines stipulated by the Joanna Briggs Institute. A total of 114 eligible articles were identified and included in the synthesis. Findings revealed that research on climate change in Ghana is growing steadily, with most of the studies conducted in 2018. Trends in climate change research in Ghana relate to agriculture and development. There is a lack of attention to climate change issues related to women, water availability and management, and health. Future research should therefore focus on addressing these issues, in addition to alternative livelihoods for vulnerable people.
Keywords: scoping review, trends, climate change, research, Ghana
Procedia PDF Downloads 123
6541 Mechanical Behavior of 16NC6 Steel Hardened by Burnishing
Authors: Litim Tarek, Taamallah Ouahiba
Abstract:
This work concerns the physico-geometrical aspect of the surface layers of 16NC6 steel that has undergone burnishing treatment by a hard steel ball. The results show that the optimal effects of burnishing are closely linked to the shape and material of the active part of the device, as well as to the surface plastic deformation ability of the material to be treated. Thus, the roughness is improved by more than 70%, and the consolidation rate is increased by 30%. In addition, modeling of the rational traction curves yields a work-hardening coefficient of up to 0.3 in the presence of burnishing.
Keywords: 16NC6 steel, burnishing, hardening, roughness
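The "work hardening coefficient of up to 0.3" reported above is conventionally the exponent of a power-law fit to the rational (true) stress–strain curve. The abstract does not state which hardening law was fitted; as a hedged illustration, the commonly used Hollomon form would read

```latex
\sigma = K\,\varepsilon^{\,n}, \qquad n \lesssim 0.3 \quad \text{(burnished 16NC6, assumed fit)}
```

where \(\sigma\) is the true stress, \(\varepsilon\) the true plastic strain, \(K\) the strength coefficient, and \(n\) the work-hardening exponent.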
Procedia PDF Downloads 164
6540 A Bibliometric Analysis of Ukrainian Research Articles on SARS-COV-2 (COVID-19) in Compliance with the Standards of Current Research Information Systems
Authors: Sabina Auhunas
Abstract:
These days in Ukraine, Open Science is developing dramatically for the benefit of scientists in all branches, providing an opportunity to take a closer look at studies by foreign scientists, as well as to deliver their own scientific data to national and international journals. However, when it comes to generalizing data on the science activities of Ukrainian scientists, these data are often integrated into e-systems that operate on inconsistent and barely related information sources. To resolve these issues, developed countries productively use e-systems designed to store and manage research data, such as Current Research Information Systems, which enable combining uncompiled data obtained from different sources. An algorithm for selecting SARS-CoV-2 research articles was designed, by means of which we collected the set of papers published by Ukrainian scientists and uploaded by August 1, 2020. The resulting metadata (document type, open access status, citation count, h-index, most cited documents, international research funding, author counts, the bibliographic relationship of journals) were taken from the Scopus and Web of Science databases. The study also considered information from COVID-19/SARS-CoV-2-related documents published from December 2019 to September 2020, directly from documents published by authors with a territorial affiliation to Ukraine. These databases provide the information necessary for bibliometric analysis, including details such as copyright, which may not be available in other databases (e.g., ScienceDirect). Search criteria and results for each online database were considered according to the WHO classification of the virus and the disease caused by this virus and are represented (Table 1). First, we identified 89 research papers that provided the final data set after consolidation and removal of duplicates; however, only 56 papers were used for the analysis.
The total number of documents by results from the WoS database came to 21,641 (48 affiliated with Ukraine among them); in the Scopus database, it came to 32,478 (41 affiliated with Ukraine among them). In the publication activity of Ukrainian scientists, the following areas prevailed: Education, educational research (9 documents, 20.58%); Social Sciences, interdisciplinary (6 documents, 11.76%); and Economics (4 documents, 8.82%). The highest publication activity by institution type was reported for the Ministry of Education and Science of Ukraine (36% of the published scientific papers, or 7 documents), followed by Danylo Halytsky Lviv National Medical University (5 documents, 15%) and the P. L. Shupyk National Medical Academy of Postgraduate Education (4 documents, 12%). Research activities by Ukrainian scientists were funded mainly by five entities: the Belgian Development Cooperation, the National Institutes of Health (NIH, U.S.), the United States Department of Health & Human Services, a grant from the Whitney and Betty MacMillan Center for International and Area Studies at Yale, and a grant from the Yale Women Faculty Forum. Based on the results of the analysis, we obtained a set of published articles and preprints to be assessed on a variety of features in upcoming studies, including citation count, most cited documents, the bibliographic relationship of journals, and reference linking. Further research on the development of the national scientific e-database continues using new analytical methods.
Keywords: content analysis, COVID-19, scientometrics, text mining
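The consolidation and duplicate-removal step described above (89 records reduced before analysis) can be sketched simply: merge the Scopus and WoS exports and key each record on its DOI when present, falling back to a normalized title otherwise. This is a minimal sketch with assumed field names, not the authors' actual algorithm.

```python
def deduplicate(records):
    """Merge bibliographic records from multiple databases, keeping the
    first occurrence per key. Keys on DOI if present, else on a
    lowercased, stripped title. Field names are illustrative assumptions."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or rec.get("title", "").lower().strip()
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```

A production pipeline would add fuzzier title matching (duplicate titles rarely match exactly across databases), but the keying idea is the same.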
Procedia PDF Downloads 115
6539 Green and Facile Fabrication and Characterization of Fe/ZnO Hollow Spheres and Photodegradation of Azo Dyes
Authors: Seyed Mohsen Mousavi, Ali Reza Mahjoub, Bahjat Afshari Razani
Abstract:
In this work, Fe/ZnO hollow spherical structures with high surface area were prepared by the hydrothermal method, using glucose as a template and an ultrasonic bath at room temperature, and were characterized by FT-IR, XRD, FE-SEM and BET. The photocatalytic activity of the synthesized Fe/ZnO hollow spheres was studied in the degradation of Congo Red and Methylene Blue as azo dyes. The results showed that the photocatalytic activity of the Fe/ZnO hollow spherical structures is improved compared with ZnO hollow spheres and other morphologies.
Keywords: azo dyes, Fe/ZnO hollow sphere, hollow sphere nanostructures, photocatalyst
Procedia PDF Downloads 370
6538 Drivers of the Performance of Members of a Social Incubator Considering the Values of Work: A Qualitative Study with Social Entrepreneurs
Authors: Leticia Lengler, Vania Estivalete, Vivian Flores Costa, Tais De Andrade, Lisiane Fellini Faller
Abstract:
Social entrepreneurship has emerged and driven a new development perspective and, as the literature mentions, is based on innovation and, mainly, on the creation of social value rather than personal and shareholder wealth. In this field of study, one focus of discussion refers to the distinct characteristics of the individuals responsible for socially directed initiatives, named social entrepreneurs. To contribute to this perspective, the present study aims to identify the values related to work that guide the performance of social entrepreneurs who are members of enterprises developed within a social incubator at a federal institution of higher education in Brazil. Each person's value system is present in different facets of his or her life, manifesting itself in choices and in the way relationships with other people in society are conducted. The values of work in particular, the focus of this research, play a significant role in organizational studies, since they are considered one of the important guiding principles of the behavior of individuals in the work environment. Regarding the method, a descriptive, qualitative study was carried out. For data collection, 24 entrepreneurs, members of five different enterprises belonging to the social incubator, were interviewed. The research instrument consisted of three open questions, which could be answered with the support of a "disc of values", an artifact organized to clearly present the values of work to the respondents. The analysis of the interviews took into account categories defined a priori, based on the model proposed by previous authors who validated these constructs within their research contexts, contemplating the following dimensions: Self-determination and stimulation; Safety; Conformity; Universalism and benevolence; Achievement; and Power.
It should be noted that, to provide a better understanding for the interviewees, in the "disc of values" used in the research these dimensions were represented by the objectives that define them, respectively: Challenge; Financial independence; Commitment; Welfare of others; Personal success; and Power. Some preliminary results show that, as guiding principles, priority is given to the work values related to Self-determination and stimulation, Conformity, and Universalism and benevolence. Such findings point to the importance given by these individuals to independent thinking and acting, as well as to novelty and constant challenge. They also demonstrate an appreciation of commitment to their enterprise, the people who make it up and the quality of their work, and point to the relevance of the possibility of contributing to the greater social good, that is, the search for the well-being of close people and of society, as implied in models of social entrepreneurship from the literature. With a lower degree of priority, the values denominated Safety and Achievement, i.e., the financial question at work and the search for satisfaction and personal success through the use of socially recognized skills, were mentioned with little emphasis by the social entrepreneurs. The value Power was not considered a guiding principle of work by the respondents.
Keywords: qualitative study, social entrepreneur, social incubator, values of work
Procedia PDF Downloads 260
6537 Impact of 99mTc-MDP Bone SPECT/CT Imaging in Failed Back Surgery Syndrome
Authors: Ching-Yuan Chen, Lung-Kwang Pan
Abstract:
Objective: Back pain is a major health problem, costing the health budget billions annually in Taiwan. Thousands of back surgeries are performed each year, with up to 40% of patients complaining of back pain post-surgery, a condition termed failed back surgery syndrome (FBSS), although diagnosis in these patients may be complex. The aim of this study is to assess the feasibility of using bone SPECT/CT imaging to localize the active lesions causing persistent, recurrent or new backache after spine surgery. Materials and Methods: Bone SPECT/CT imaging was performed after the intravenous injection of 20 mCi of 99mTc-MDP for all patients with a diagnosis of FBSS. Patients were evaluated on subjective pain relief, functional improvement and degree of satisfaction by reviewing medical records and questionnaires over a follow-up of 2 or more years. Results: A total of 16 patients surveyed in our hospital from Jan. 2015 to Dec. 2016 were enrolled. Four patients whose SPECT/CT imaging confirmed significant lesions underwent revision surgery (surgical treatment group). The mean visual analogue scale (VAS) decreased by 5.3 points and the mean Oswestry disability index (ODI) improved by 38 points in the surgical group. The remaining 12 patients showed no significant lesions on SPECT/CT imaging and received drug treatment (medical treatment group). The mean VAS decreased by only 2.1 points and the mean ODI improved by 12.6 points in the medical treatment group. In the post-therapeutic evaluation, the pain of the surgical treatment group showed satisfactory improvement. In the medical treatment group, 10 of the 12 patients were also satisfied with the symptom relief, while the other 2 did not improve significantly. Conclusions: Findings on SPECT/CT imaging readily explained the patients' pain.
We recommend SPECT/CT imaging as a feasible and useful clinical tool for improving diagnostic confidence and specificity when evaluating patients with FBSS.
Keywords: failed back surgery syndrome, Oswestry disability index, SPECT/CT imaging, 99mTc-MDP, visual analogue scale
Procedia PDF Downloads 173
6536 Heuristic to Generate Random X-Monotone Polygons
Authors: Kamaljit Pati, Manas Kumar Mohanty, Sanjib Sadhu
Abstract:
A heuristic has been designed to generate a random simple monotone polygon from a given set of ‘n’ points lying on a 2-dimensional plane. Our heuristic generates a random monotone polygon in O(n) time after O(n log n) preprocessing time, which improves on previous work in which a random monotone polygon is produced in the same O(n) time but with O(k) preprocessing time for n < k < n². However, our heuristic does not generate all possible random polygons with uniform probability. The space complexity of our proposed heuristic is O(n).
Keywords: sorting, monotone polygon, visibility, chain
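The cost structure described above (sort once in O(n log n), then emit each polygon in O(n)) follows from the two-chain view of an x-monotone polygon: an upper chain traversed left to right and a lower chain traversed right to left. The sketch below illustrates only that decomposition; it is not the paper's heuristic, and unlike the paper's method it does not guarantee the two chains never cross (i.e., simplicity) for adversarial point sets.

```python
import random

def random_x_monotone_polygon(points, rng=random):
    """Sort once (the O(n log n) preprocessing), then build an x-monotone
    vertex order in O(n): each interior point joins the upper or lower
    chain at random. Illustrative sketch of the chain decomposition only."""
    pts = sorted(points)                      # by x, then y
    first, last = pts[0], pts[-1]
    upper, lower = [], []
    for p in pts[1:-1]:
        (upper if rng.random() < 0.5 else lower).append(p)
    # traverse the upper chain left to right, the lower chain right to left
    return [first] + upper + [last] + lower[::-1]
```

Every output is x-monotone by construction: x-coordinates increase up to the rightmost vertex and decrease afterwards.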
Procedia PDF Downloads 427
6535 Optimization of Leaching Properties of a Low-Grade Copper Ore Using Central Composite Design (CCD)
Authors: Lawrence Koech, Hilary Rutto, Olga Mothibedi
Abstract:
Worldwide demand for copper has led to an intensive search for methods of extraction and recovery of copper from different sources. This study investigates the leaching properties of a low-grade copper ore by optimizing the leaching variables using response surface methodology. The effects of the key parameters, i.e., temperature, solid-to-liquid ratio, stirring speed and pH, on the leaching rate constant were investigated using a pH-stat apparatus. A central composite design (CCD) of experiments was used to develop a quadratic model that correlates the leaching variables with the rate constant. The results indicated that the model is in good agreement with the experimental data, with a correlation coefficient (R²) of 0.93. Temperature and solid-to-liquid ratio were found to have the most substantial influence on the leaching rate constant. The optimum operating conditions for copper leaching from the ore were identified as a temperature of 65 °C, a solid-to-liquid ratio of 1.625 and a stirring speed of 325 rpm, which yielded an average leaching efficiency of 93.16%.
Keywords: copper, leaching, CCD, rate constant
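A CCD like the one mentioned above combines, in coded units, two-level factorial corners, axial (star) points at ±α, and replicated center points; the quadratic model is then fitted to the responses at these runs. The generator below is a hedged sketch of that point set, not the study's actual design matrix.

```python
from itertools import product

def central_composite_design(k, alpha=None, n_center=1):
    """Generate CCD runs in coded units for k factors: 2**k factorial
    corners, 2k axial points at +/-alpha, and n_center center points.
    alpha defaults to the rotatable value (2**k)**0.25. Illustrative only."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25
    corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return corners + axial + center
```

For the four factors studied (temperature, solid-to-liquid ratio, stirring speed, pH), `central_composite_design(4)` gives 16 + 8 + 1 = 25 runs with a single center point; real designs usually replicate the center point several times to estimate pure error.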
Procedia PDF Downloads 242
6534 Co-management Organizations: A Way to Facilitate Sustainable Management of the Sundarbans Mangrove Forests of Bangladesh
Authors: Md. Wasiul Islam, Md. Jamius Shams Sowrov
Abstract:
The Sundarbans is the largest single tract of mangrove forest in the world, located in the southwest corner of Bangladesh. It is a unique ecosystem and an important breeding and nursery ground for great biodiversity. It supports the livelihoods of about 3.5 million coastal dwellers and also protects the coastal belt and inland areas from various natural calamities. Historically, management of the Sundarbans was controlled by the Bangladesh Forest Department following a top-down approach without the involvement of local communities. Such a fence-and-fine, blueprint-based approach was not effective in protecting the forest, and the Sundarbans degraded severely in the recent past: fifty percent of the total tree cover has been lost in the last 30 years. Therefore, a local multi-stakeholder, bottom-up co-management approach was introduced in some parts of the Sundarbans in 2006 to improve the biodiversity status by enhancing the protection level of the forest. Various co-management organizations were introduced under the co-management approach, through which local community members could actively take part in activities related to the management and welfare of the Sundarbans, including the decision-making process. Against this backdrop, the objective of the study was to assess the performance of co-management organizations in facilitating sustainable management of the Sundarbans mangrove forests. The qualitative study used face-to-face interviews to collect data with two sets of semi-structured questionnaires. A total of 40 respondents from eight villages under two forest ranges participated in the research: 32 representatives of the local communities and 8 official representatives involved in the co-management approach, interviewed using a snowball sampling technique.
The study shows that the co-management approach improved the governance system of the Sundarbans through the active participation of local community members and their interactions with officials via the platform of co-management organizations. It facilitated accountability and transparency to some extent through formal and informal rules and regulations. It also improved the power structure of the management process by fostering local empowerment, particularly of women. Moreover, people were able to learn from their interactions with and within the co-management organizations, and the interventions improved environmental awareness and promoted social learning. The respondents considered good governance the most important factor for achieving the goal of sustainable management and biodiversity conservation of the Sundarbans. The success of the co-management planning process also depends on the active and functional participation of different stakeholders, including the local communities, with co-management organizations considered the most functional platform. However, the governance system was also facing various challenges that created barriers to the sustainable management of the Sundarbans mangrove forest, and some members were still involved in illegal forest operations that obstructed sustainable management. Respondents recommended greater patronage from the government, along with financial and logistic incentives for alternative income-generation opportunities and an effective participatory monitoring and evaluation system, to improve sustainable management of the Sundarbans.
Keywords: Bangladesh, co-management approach, co-management organizations, governance, Sundarbans, sustainable management
Procedia PDF Downloads 179
6533 Review of Concepts and Tools Applied to Assess Risks Associated with Food Imports
Authors: A. Falenski, A. Kaesbohrer, M. Filter
Abstract:
Introduction: Risk assessments can be performed in various ways and with different degrees of complexity. In order to assess the risks associated with imported foods, additional information needs to be taken into account compared to a risk assessment of regional products. The present review gives an overview of currently available best-practice approaches and data sources used for food import risk assessments (IRAs). Methods: A literature review was performed. PubMed was searched for articles about food IRAs published in the years 2004 to 2014 (English and German texts only, search string “(English [la] OR German [la]) (2004:2014 [dp]) import [ti] risk”). Titles and abstracts were screened for import risks in the context of IRAs. The finally selected publications were analysed according to a predefined questionnaire extracting the following information: risk assessment guidelines followed, modelling methods used, data and software applied, and the existence of an analysis of uncertainty and variability. IRAs cited in these publications were also included in the analysis. Results: The PubMed search resulted in 49 publications, 17 of which contained information about import risks and risk assessments. Within these, 19 cross-references were identified as being of interest for the present study. These included original articles, reviews and guidelines. At least one of the guidelines of the World Organisation for Animal Health (OIE) or the Codex Alimentarius Commission was referenced in each of the IRAs, for imports of animals or for imports concerning foods, respectively. Interestingly, a combination of both was also used to assess the risk associated with the import of live animals serving as a source of food. Methods ranged from fully quantitative IRAs using probabilistic models and dose-response models to qualitative IRAs in which decision trees or severity tables were set up using parameter estimates based on expert opinion. Calculations were done using @Risk, R or Excel.
Most heterogeneous was the type of data used, ranging from general information on imported goods (food, live animals) to pathogen prevalence in the country of origin. These data were either publicly available in databases or lists (e.g., OIE WAHID and Handystatus II, FAOSTAT, Eurostat, TRACES), accessible at a national level (e.g., herd information) or open only to a small group of people (flight passenger import data at national airport customs offices). In some of the IRAs an uncertainty analysis was mentioned, but calculations were performed in only a few cases. Conclusion: The current state of the art in assessing the risks of imported foods is characterized by great heterogeneity in methodology and data used. Information is often gathered on a case-by-case basis and reformatted by hand in order to perform the IRA. This analysis therefore illustrates the need for a flexible, modular framework connecting existing data sources with data analysis and modelling tools. Such an infrastructure could pave the way to IRA workflows applicable ad hoc, e.g., in a crisis situation.
Keywords: import risk assessment, review, tools, food import
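The fully quantitative IRAs mentioned above typically propagate parameters such as prevalence and inspection sensitivity through a stochastic model (in tools like @Risk or R). As a minimal, hypothetical sketch of one such step (the model form, parameter names and values are ours, not taken from any reviewed IRA):

```python
import random

def p_undetected_infection(prevalence, lot_size, test_sensitivity,
                           n_sims=20000, seed=42):
    """Monte Carlo estimate of the probability that a lot of imported
    units contains at least one infected unit missed by inspection.
    Hypothetical model, for illustration only."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        for _ in range(lot_size):
            # a unit is a 'hit' if it is infected AND escapes the test
            if rng.random() < prevalence and rng.random() > test_sensitivity:
                hits += 1
                break
    return hits / n_sims
```

A real IRA would sample the inputs from uncertainty distributions rather than fixing them, which is exactly the uncertainty analysis the review found to be rarely carried through.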
Procedia PDF Downloads 302
6532 Modern Technology-Based Methods in Neurorehabilitation for Social Competence Deficit in Children with Acquired Brain Injury
Authors: M. Saard, A. Kolk, K. Sepp, L. Pertens, L. Reinart, C. Kööp
Abstract:
Introduction: Social competence is often impaired in children with acquired brain injury (ABI), but evidence-based rehabilitation for social skills has remained undeveloped. Modern technology-based methods create effective and safe learning environments for pediatric social skills remediation. The aim of the study was to implement our structured model of neurorehabilitation for socio-cognitive deficit using multitouch-multiuser tabletop (MMT) computer-based platforms and virtual reality (VR) technology. Methods: 40 children aged 8-13 years (yrs) participated in the pilot study: 30 with ABI (epilepsy, traumatic brain injury and/or tic disorder) and 10 healthy age-matched controls. Of the patients, 12 have completed the training (M = 11.10 yrs, SD = 1.543) and 20 are still in training or in the waiting-list group (M = 10.69 yrs, SD = 1.704). All children performed the first individual and paired assessments. For patients, second evaluations were performed after the intervention period. Two interactive applications were implemented in the rehabilitation design: Snowflake software on the MMT tabletop and NoProblem on the DiamondTouch Table (DTT), which allowed paired training (2 children at once). In individual training sessions, an HTC Vive VR device was also used, with VR metaphors of difficult social situations to treat social anxiety and train social skills. Results: At baseline (B) evaluations, patients had higher deficits in executive functions on the BRIEF parents' questionnaire (M = 117, SD = 23.594) compared to healthy controls (M = 22, SD = 18.385). The most impaired components of social competence were emotion recognition, Theory of Mind (ToM) skills, cooperation, verbal/non-verbal communication, and pragmatics (Friendship Observation Scale scores only 25-50% out of 100% for patients). In the Sentence Completion Task and the Spence Anxiety Scale, the patients reported a lack of friends, behavioral problems, bullying at school, and social anxiety.
Outcome evaluations: Snowflake on MMT improved executive and cooperation skills, and DTT developed communication skills, metacognitive skills, and coping. VR, video modelling and role-plays improved social attention, emotional attitude and gestural behaviors, and decreased social anxiety. NEPSY-II showed improvement in Affect Recognition (B: M = 7, SD = 5.01 vs outcome (O): M = 10, SD = 5.85), Verbal ToM (B = 8, SD = 3.06 vs O = 10, SD = 4.08), and Contextual ToM (B = 8, SD = 3.15 vs O = 11, SD = 2.87). The ToM Stories test showed an improved understanding of Intentional Lying (B = 7, SD = 2.20 vs O = 10, SD = 0.50) and Sarcasm (B = 6, SD = 2.20 vs O = 7, SD = 2.50). Conclusion: Neurorehabilitation based on the structured model of neurorehabilitation for socio-cognitive deficit in children with ABI was effective for social skills remediation. The model helps to understand theoretical connections between components of social competence and modern interactive computerized platforms. We encourage therapists to implement these next-generation devices in the rehabilitation process, as MMT and VR interfaces are motivating for children, thus ensuring good compliance. Improving children's social skills is important for their and their families' quality of life and social capital.
Keywords: acquired brain injury, children, social skills deficit, technology-based neurorehabilitation
Procedia PDF Downloads 120
6531 Neural Network and Support Vector Machine for Prediction of Foot Disorders Based on Foot Analysis
Authors: Monireh Ahmadi Bani, Adel Khorramrouz, Lalenoor Morvarid, Bagheri Mahtab
Abstract:
Background: Foot disorders are common musculoskeletal problems. Plantar pressure distribution measurement is one of the most important parts of foot disorder diagnosis for quantitative analysis. However, the association between plantar pressure and foot disorders is not clear. With the growth of datasets and machine learning methods, the relationship between foot disorders and plantar pressures can be detected. Significance of the study: The purpose of this study was to predict the probability of common foot disorders based on peak plantar pressure distribution and center of pressure during walking. Methodology: 2,323 participants were assessed in a foot therapy clinic between 2015 and 2021. Foot disorders were diagnosed by an experienced physician, and the participants were then asked to walk on a force plate scanner. After data preprocessing, owing to differences in walking time and foot size, we normalized the samples for time and foot size. Selected force plate variables were used as input to a deep neural network (DNN), and the probability of each foot disorder was measured. In the next step, we used a support vector machine (SVM) and ran the dataset for each foot disorder (classification of yes or no). We compared the DNN and the SVM for foot disorder prediction based on plantar pressure distributions and center of pressure. Findings: The results demonstrated that the accuracy of the deep learning architecture is sufficient for most clinical and research applications in the study population. In addition, the SVM approach achieved greater accuracy, enabling applications for foot disorder diagnosis. The detection accuracy was 71% with the deep learning algorithm and 78% with the SVM algorithm. Moreover, working with the peak plantar pressure distribution was more accurate than with the center-of-pressure dataset.
Conclusion: Both algorithms, the deep neural network and the SVM, will help therapists and patients to improve the data pool and enhance foot disorder prediction with less expense and error, once some restrictions are properly removed.
Keywords: deep neural network, foot disorder, plantar pressure, support vector machine
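The time and foot-size normalization described in the methodology can be sketched as follows: linearly resample each trial's pressure series to a fixed number of samples (time normalization) and scale by foot length (size normalization). This is a hedged illustration; the study's exact resampling and scaling scheme is not specified.

```python
def normalize_trial(pressures, foot_length, n_samples=100):
    """Resample a plantar-pressure time series to n_samples points by
    linear interpolation, then divide by foot length. Illustrative sketch
    of the normalization step, not the study's exact procedure."""
    m = len(pressures)
    resampled = []
    for i in range(n_samples):
        t = i * (m - 1) / (n_samples - 1)   # fractional index into the trial
        lo = int(t)
        hi = min(lo + 1, m - 1)
        frac = t - lo
        value = pressures[lo] * (1 - frac) + pressures[hi] * frac
        resampled.append(value / foot_length)
    return resampled
```

After this step, trials of different durations and foot sizes share one fixed-length feature vector and can be fed to the DNN or SVM directly.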
Procedia PDF Downloads 358
6530 A Study of Resin-Dye Fixation on Dyeing Properties of Cotton Fabrics Using Melamine Based Resins and a Reactive Dye
Authors: Nurudeen Ayeni, Kasali Bello, Ovi Abayeh
Abstract:
A study of the effect of dye–resin complexation on the degree of dye absorption was carried out using Procion Blue MX-R to dye cotton fabric in the presence of hexamethylol melamine (MR 6) and its phosphate derivative (MPR 4) for resination. The highest degree of dye exhaustion was obtained at 40 °C for 1 hour, with the resinated fabric showing more affinity for the dye than the ordinary fiber. Improved fastness properties were recorded, which indicates a relatively higher stability of the dye–resin–cellulose network formed.
Keywords: cotton fabric, reactive dye, dyeing, resination
Procedia PDF Downloads 408
6529 Facial Recognition and Landmark Detection in Fitness Assessment and Performance Improvement
Authors: Brittany Richardson, Ying Wang
Abstract:
For physical therapy, exercise prescription, athlete training, and regular fitness training, it is crucial to perform health or fitness assessments periodically. An accurate assessment is propitious for tracking recovery progress, preventing potential injury, and making long-range training plans. Assessments include basic measurements (height, weight, blood pressure, heart rate, body fat, etc.) and advanced evaluations (muscle group strength, stability-mobility, movement evaluation, etc.). In the current standard assessment procedures, the accuracy of assessments, especially advanced evaluations, largely depends on the experience of physicians, coaches, and personal trainers, and it is challenging to track clients’ progress. Unlike the traditional assessment, in this paper we present a deep learning based facial recognition algorithm for accurate, comprehensive, and trackable assessment. Based on the results of our assessment, physicians, coaches, and personal trainers are able to adjust the training targets and methods. The system categorizes the difficulty level of the current activity for the client or user, and furthermore makes more comprehensive assessments by tracking muscle groups over time using a designed landmark detection method. The system also includes functions for grading and correcting the clients' form during exercise. Experienced coaches and personal trainers can tell clients' limits based on their facial expressions and muscle group movements, even during the first several sessions. Similarly, using a convolutional neural network, the system is trained on people’s facial expressions to differentiate challenge levels for clients, and uses landmark detection to capture subtle changes in muscle group movements.
It measures the proximal mobility of the hips and thoracic spine, the proximal stability of the scapulothoracic region, the distal mobility of the glenohumeral joint, and their effect on the kinetic chain. This system integrates data from other fitness assistant devices, including but not limited to the Apple Watch and Fitbit, for improved training and testing performance. The system itself doesn’t require history data for an individual client, but a client's history data can be used to create a more effective exercise plan. In order to validate the performance of the proposed work, an experimental design is presented. The results show that the proposed work contributes towards improving the quality of exercise plans, execution, progress tracking, and performance.
Keywords: exercise prescription, facial recognition, landmark detection, fitness assessments
Procedia PDF Downloads 134
6528 Localization of Near Field Radio Controlled Unintended Emitting Sources
Authors: Nurbanu Guzey, S. Jagannathan
Abstract:
Locating radio controlled (RC) devices using their unintended emissions is of great interest given security concerns. The weak nature of these emissions requires a near-field localization approach, since it is hard to detect these signals in the far-field region of an array. Instead of only angle estimation, near-field localization also requires range estimation of the source, which makes this method more complicated than far-field models. The challenges of locating such devices in a near-field region and in a real-time environment are analyzed in this paper. An ESPRIT-like near-field localization scheme is utilized for both angle and range estimation, and a 1-D search with symmetric subarrays is provided. Two 7-element uniform linear antenna arrays (ULA) are employed for locating the RC source. Experimental results of location estimation for one unintended emitting walkie-talkie at different positions are given.
Keywords: localization, angle of arrival (AoA), range estimation, array signal processing, ESPRIT, uniform linear array (ULA)
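For the angle-estimation half of the problem, a minimal ESPRIT sketch on a simulated 7-element ULA is shown below. For brevity it uses the far-field single-source model; the paper's near-field scheme additionally estimates range using symmetric subarrays, which this sketch omits:

```python
import numpy as np

M, d = 7, 0.5          # sensors, element spacing in wavelengths
theta = 20.0           # true angle of arrival (degrees)
N = 2000               # snapshots
rng = np.random.default_rng(1)

# Far-field steering vector and noisy array snapshots.
a = np.exp(-2j * np.pi * d * np.arange(M) * np.sin(np.radians(theta)))
s = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
x = np.outer(a, s) + noise

# Sample covariance and signal subspace (one source -> one eigenvector).
R = x @ x.conj().T / N
_, v = np.linalg.eigh(R)
Es = v[:, -1:]

# Rotational invariance between the two overlapping subarrays.
phi = np.linalg.lstsq(Es[:-1], Es[1:], rcond=None)[0]
mu = np.angle(np.linalg.eigvals(phi)[0])
theta_est = np.degrees(np.arcsin(-mu / (2 * np.pi * d)))
print(f"estimated AoA: {theta_est:.1f} deg")
```

The near-field extension replaces the simple phase ramp with a quadratic (Fresnel) phase model, from which range is recovered in a second step.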
Procedia PDF Downloads 526
6527 Maximum Power Point Tracking Using FLC Tuned with GA
Authors: Mohamed Amine Haraoubia, Abdelaziz Hamzaoui, Najib Essounbouli
Abstract:
The pursuit of the MPPT has led to the development of many kinds of controllers, one of which is the fuzzy logic controller, which has proven its worth. To further tune this controller, this paper will discuss and analyze the use of genetic algorithms to tune the fuzzy logic controller. It will provide an introduction to both systems, and test their compatibility and performance.
Keywords: fuzzy logic controller, fuzzy logic, genetic algorithm, maximum power point, maximum power point tracking
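As a rough illustration of the GA side of the approach, the toy below uses a genetic algorithm to locate the maximum power point of a simplified, made-up P-V curve; tuning a full FLC's membership-function parameters follows the same evaluate/select/crossover/mutate loop, only with a longer chromosome:

```python
import numpy as np

rng = np.random.default_rng(0)

def pv_power(v):
    # Simplified, illustrative P-V curve (not a real panel model):
    # current collapses exponentially near open-circuit voltage ~21 V.
    i = 5.0 * (1.0 - np.exp((v - 21.0) / 2.0))
    return np.clip(v * i, 0.0, None)

pop = rng.uniform(0.0, 21.0, size=40)        # chromosomes: candidate voltages
for _ in range(60):
    fit = pv_power(pop)
    parents = pop[np.argsort(fit)[-20:]]     # selection: keep the fitter half
    kids = (parents + rng.permutation(parents)) / 2.0   # crossover: blend pairs
    kids += rng.normal(scale=0.3, size=kids.size)       # mutation
    pop = np.concatenate([parents, np.clip(kids, 0.0, 21.0)])

v_mpp = pop[np.argmax(pv_power(pop))]
print(f"GA-found MPP voltage: {v_mpp:.2f} V, power: {pv_power(v_mpp):.1f} W")
```

Elitism (keeping the parents unchanged) guarantees the best candidate is never lost between generations, which matters when the fitness evaluation is an expensive FLC simulation.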
Procedia PDF Downloads 373
6526 Mobile Asthma Action Plan for Adolescent with Asthma: A Systematic Review
Authors: Reisy Tane
Abstract:
Asthma is a common health problem in adolescents. Self-management is one way to improve health status in adolescents with asthma, and mobile technology has the potential to improve self-management in this group. Objective: The aim of this study was to determine the effectiveness of using a mobile Asthma Action Plan (AAP) to improve self-management. Method: This study is a systematic review using the PRISMA template. The literature search started on 1 September 2017, using the electronic databases ProQuest and Google Scholar with the keywords ‘Mobile AAP’ and ‘Adolescent Asthma’. Results and Conclusion: The mobile AAP is effective in improving self-management in adolescents with asthma because it is easy to use and provides information appropriately. The improvement of self-management will enhance the quality of life of adolescents with asthma. This study recommends adding parental control content to the application, consistent with the Family-Centered Care (FCC) philosophy of pediatric nursing. In addition, the development of applications for other chronic diseases, such as diabetes mellitus and congestive heart failure, is expected.
Keywords: asthma, mobile AAP, adolescent, self-management
Procedia PDF Downloads 196
6525 The Effect of Closed Circuit Television Image Patch Layout on Performance of a Simulated Train-Platform Departure Task
Authors: Aaron J. Small, Craig A. Fletcher
Abstract:
This study investigates the effect of closed circuit television (CCTV) image patch layout on performance of a simulated train-platform departure task. The within-subjects experimental design measures target detection rate and response latency during a CCTV visual search task conducted as part of the procedure for safe train dispatch. Three interface designs were developed by manipulating CCTV image patch layout. Eye movements, perceived workload and system usability were measured across experimental conditions. Task performance was compared to identify significant differences between conditions. The results of this study have not been determined.
Keywords: rail human factors, workload, closed circuit television, platform departure, attention, information processing, interface design
Procedia PDF Downloads 168
6524 Streamlining .NET Data Access: Leveraging JSON for Data Operations in .NET
Authors: Tyler T. Procko, Steve Collins
Abstract:
New features in .NET (6 and above) permit streamlined access to information residing in JSON-capable relational databases, such as SQL Server (2016 and above). Traditional methods of data access now comparatively involve unnecessary steps which compromise system performance. This work posits that the established ORM (Object Relational Mapping) based methods of data access in applications and APIs result in common issues, e.g., object-relational impedance mismatch. Recent developments in C# and .NET Core combined with a framework of modern SQL Server coding conventions have allowed better technical solutions to the problem. As an amelioration, this work details the language features and coding conventions which enable this streamlined approach, resulting in an open-source .NET library implementation called Codeless Data Access (CODA). Canonical approaches rely on ad-hoc mapping code to perform type conversions between the client and back-end database; with CODA, no mapping code is needed, as JSON is freely mapped to SQL and vice versa. CODA streamlines API data access by improving on three aspects of immediate concern to web developers, database engineers and cybersecurity professionals: simplicity, speed and security. Simplicity is engendered by cutting out the “middleman” steps, effectively making API data access a white box, whereas traditional methods are a black box. Speed is improved because of the fewer translational steps taken, and security is improved as attack surfaces are minimized. An empirical evaluation of the speed of the CODA approach in comparison to ORM approaches is provided and demonstrates that the CODA approach is significantly faster. CODA presents substantial benefits for API developer workflows by simplifying data access, resulting in better speed and security and allowing developers to focus on productive development rather than being mired in data access code.
Future considerations include a generalization of the CODA method and extension outside of the .NET ecosystem to other programming languages.
Keywords: API data access, database, JSON, .NET core, SQL server
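The "no mapping code" idea can be illustrated outside .NET as well. In the sketch below, SQLite's JSON1 functions stand in for SQL Server's FOR JSON / OPENJSON, and the table and column names are invented for the example; this is not CODA's actual API:

```python
import json
import sqlite3

# An in-memory database with a toy table (names are illustrative).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
db.executemany("INSERT INTO customer VALUES (?, ?, ?)",
               [(1, "Acme", "Tampa"), (2, "Globex", "Miami")])

# Read path: the database itself emits JSON, so no object-relational
# mapping layer is needed on the client side.
rows = db.execute(
    "SELECT json_object('id', id, 'name', name, 'city', city) FROM customer"
).fetchall()
payload = [json.loads(r[0]) for r in rows]

# Write path: a JSON document from the client is unpacked by the database.
doc = json.dumps({"id": 3, "name": "Initech", "city": "Orlando"})
db.execute("INSERT INTO customer "
           "SELECT json_extract(?, '$.id'), json_extract(?, '$.name'), "
           "json_extract(?, '$.city')", (doc, doc, doc))
n_rows = db.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
print(payload, n_rows)
```

Because both conversions happen inside the engine, the client code never declares entity classes or mapping configuration, which is the essence of the simplicity claim above.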
Procedia PDF Downloads 66
6523 Freight Time and Cost Optimization in Complex Logistics Networks, Using a Dimensional Reduction Method and K-Means Algorithm
Authors: Egemen Sert, Leila Hedayatifar, Rachel A. Rigg, Amir Akhavan, Olha Buchel, Dominic Elias Saadi, Aabir Abubaker Kar, Alfredo J. Morales, Yaneer Bar-Yam
Abstract:
The complexity of providing timely and cost-effective distribution of finished goods from industrial facilities to customers makes effective operational coordination difficult, yet effectiveness is crucial for maintaining customer service levels and sustaining a business. Logistics planning becomes increasingly complex with growing numbers of customers, varied geographical locations, the uncertainty of future orders, and sometimes extreme competitive pressure to reduce inventory costs. Linear optimization methods become cumbersome or intractable due to the large number of variables and nonlinear dependencies involved. Here we develop a complex systems approach to optimizing logistics networks based upon dimensional reduction methods and apply our approach to a case study of a manufacturing company. In order to characterize the complexity in customer behavior, we define a “customer space” in which individual customer behavior is described by only the two most relevant dimensions: the distance to production facilities over current transportation routes and the customer's demand frequency. These dimensions provide essential insight into the domain of effective strategies for customers: direct and indirect strategies. In the direct strategy, goods are sent to the customer directly from a production facility using box or bulk trucks. In the indirect strategy, in advance of an order by the customer, goods are shipped to an external warehouse near the customer using trains and then "last-mile" shipped by trucks when orders are placed. Each strategy applies to an area of the customer space, with an indeterminate boundary between them; company policy generally determines the location of the boundary. We then identify the optimal delivery strategy for each customer by constructing a detailed model of the costs of transportation and temporary storage in a set of specified external warehouses.
The customer space provides an aggregate view of customer behaviors and characteristics. It allows policymakers to compare customers and develop strategies based on the aggregate behavior of the system as a whole. In addition to optimization over existing facilities, we propose additional warehouse locations using customer logistics data and the k-means algorithm. We apply these methods to a medium-sized American manufacturing company with a particular logistics network consisting of multiple production facilities, external warehouses, and customers, along with three types of shipment methods (box truck, bulk truck, and train). For the case study, our method forecasts 10.5% savings on yearly transportation costs and an additional 4.6% savings with three new warehouses.
Keywords: logistics network optimization, direct and indirect strategies, k-means algorithm, dimensional reduction
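The warehouse-siting step can be sketched with k-means over the two-dimensional customer space; the coordinates and demand frequencies below are synthetic stand-ins, not the case-study data:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic customer space: (distance to nearest plant, order frequency),
# drawn around three regional concentrations of customers.
centers = np.array([[100.0, 5.0], [400.0, 20.0], [800.0, 2.0]])
X = np.vstack([c + rng.normal(scale=[30.0, 1.5], size=(50, 2)) for c in centers])

# Three candidate warehouse locations = three cluster centroids.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for j, c in enumerate(km.cluster_centers_):
    print(f"candidate warehouse {j}: distance~{c[0]:.0f}, freq~{c[1]:.1f}")
```

In practice the two axes have very different scales, so the study's cost model, rather than raw Euclidean distance, should weight them before clustering; the sketch skips that step.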
Procedia PDF Downloads 139
6522 Amine Sulphonic Acid Additives for Improving Energy Storage Capacity in Alkaline Gallocyanine Flow Batteries
Authors: Eduardo Martínez González, Mousumi Dey, Pekka Peljo
Abstract:
Transitioning to a renewable energy model is inevitable owing to the effects of climate change. Renewable energies aim at sustainability and a positive impact on the environment, but they are intermittent; their connection to the electrical grid depends on creating long-term, efficient, and low-cost energy storage devices. Redox flow batteries are attractive technologies for addressing this problem, as they store energy in solution in external tanks known as the posolyte (the solution storing positive charge) and the negolyte (the solution storing negative charge). During the charging process of the device, the posolyte and negolyte solutions are pumped into an electrochemical cell (which has the anode and cathode separated by an ionic membrane), where they undergo oxidation and reduction reactions at the electrodes, respectively. The electrogenerated species should be stable and diffuse into the bulk solution. It has been possible to connect gigantic redox flow batteries to the electrical grid. However, the devices created do not fit the sustainability criteria, since their electroactive material consists of vanadium (a scarce and expensive material) dissolved in an acidic medium (e.g., 9 mol L-1 H₂SO₄) that is highly corrosive; so, work is being done on the design of organic electroactive electrolytes (posolytes and negolytes) for operation at different pH values, including neutral medium. As a main characteristic, negolyte species should have low reduction potential values, while the reverse is true for the oxidation process of posolytes. A wide variety of negolytes that store one and up to two electrons per molecule (in aqueous medium) have been published. The gallocyanine compound was recently introduced as an electroactive material for developing alkaline flow battery negolytes. The system can store two electrons per molecule, but its unexpectedly low water solubility was improved with an amino sulphonic acid additive.
The cycling stability of an improved gallocyanine electrolyte was demonstrated by operating a flow battery cell (pairing the system with a posolyte composed of a ferri/ferrocyanide solution) outside a glovebox. We also discovered that while the additive improves the solubility of gallocyanine, there is a kinetic price to pay for this advantage. Therefore, in this work, the effect of different amino sulphonic acid derivatives on the kinetics and solubility of the gallocyanine compound was studied in alkaline solutions. The additive providing a faster electron transfer rate and high solubility was tested in a flow battery cell. An aqueous organic flow battery electrolyte working outside a glovebox with 15 mAh L-1 will be discussed. Acknowledgments: To the Bi3BoostFlowBat Project (2021-2025), funded by the European Research Council, for support with infrastructure, reagents, and a postdoctoral fellowship to Dr. Martínez-González.
Keywords: alkaline flow battery, gallocyanine electroactive material, amine-sulphonic acid additives, improved solubility
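As a back-of-the-envelope check (our own numbers, not the paper's), the volumetric capacity of a two-electron negolyte follows from Faraday's law, Q = nFc; inverting it shows that the reported 15 mAh L-1 would correspond to a sub-millimolar concentration of active species:

```python
F = 96485.0        # Faraday constant, C/mol
n = 2              # electrons stored per gallocyanine molecule

def capacity_mAh_per_L(c_mol_per_L):
    # Q [C/L] = n * F * c ; dividing by 3.6 converts coulombs to mAh.
    return n * F * c_mol_per_L / 3.6

# Concentration implied by 15 mAh/L (illustrative inversion of the formula).
c = 15.0 * 3.6 / (n * F)
print(f"{c * 1000:.2f} mmol/L -> {capacity_mAh_per_L(c):.1f} mAh/L")
```

The same formula shows why solubility matters so much for flow batteries: every tenfold gain in soluble concentration is a tenfold gain in volumetric capacity.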
Procedia PDF Downloads 28
6521 DHL CSI Solution Design Project
Authors: Mohammed Al-Yamani, Yaser Miaji
Abstract:
The DHL Customer Solutions and Innovation Department (CSI) has been experiencing difficulties when comparing quotes for different customers in different years. Currently, employees process data by opening several loaded Excel files where the quotes are stored and manually copying values to another Excel workbook where the comparison is made. This project consists of developing a new and effective database for the DHL CSI department so that information is stored altogether in the same catalog. That being said, we have been assigned to find an efficient algorithm that can deal with the different formats of the Excel workbooks to copy and store the express customer rates for core products (DOX, WPX, IMP) for comparison purposes.
Keywords: DHL, solution design, ORACLE, EXCEL
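Once each workbook's rows have been read (e.g., with a library such as openpyxl), the consolidation itself reduces to normalizing the differing headers into one catalog keyed by customer, year, and product. The sketch below illustrates that step with plain dictionaries; the header aliases and sample rows are invented, while the DOX/WPX/IMP product codes come from the brief:

```python
# Each workbook arrives with its own header spelling; a header-alias map
# normalizes rows into one catalog keyed by (customer, year, product).
ALIASES = {
    "cust": "customer", "customer name": "customer",
    "yr": "year", "rate": "quote", "price": "quote",
    "prod": "product",
}

def normalize(row):
    # Lowercase each header and map known variants onto canonical names.
    return {ALIASES.get(k.strip().lower(), k.strip().lower()): v
            for k, v in row.items()}

def build_catalog(rows):
    catalog = {}
    for row in rows:
        r = normalize(row)
        catalog[(r["customer"], int(r["year"]), r["product"])] = float(r["quote"])
    return catalog

# Two rows as they might arrive from two differently formatted workbooks.
rows = [
    {"Cust": "Acme", "Yr": "2021", "Prod": "DOX", "Rate": "12.5"},
    {"Customer Name": "Acme", "Year": "2022", "Product": "DOX", "Price": "13.1"},
]
catalog = build_catalog(rows)
# Year-over-year comparison now needs no manual copy-pasting between files.
print(catalog[("Acme", 2022, "DOX")] - catalog[("Acme", 2021, "DOX")])
```

In the proposed system the catalog would live in the Oracle database rather than in memory, but the normalization logic is the same.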
Procedia PDF Downloads 410
6520 Improved Performance Using Adaptive Pre-Coding in the Cellular Network
Authors: Yong-Jun Kim, Jae-Hyun Ro, Chang-Bin Ha, Hyoung-Kyu Song
Abstract:
This paper proposes a cooperative transmission scheme with pre-coding, because cellular communication requires high reliability. The cooperative transmission scheme uses a pre-coding method with limited feedback information among small cells. In particular, the proposed scheme has an adaptive mode according to the position of the mobile station. Thus, the demands of recent wireless communication are met by this scheme. The simulation results show that the proposed scheme has better performance than the conventional scheme in the cellular network.
Keywords: CDD, cellular network, pre-coding, SPC
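The limited-feedback codebook selection at the heart of such pre-coding schemes can be sketched as follows; a small DFT codebook and a random two-antenna channel stand in for the paper's CDD/SPC setup, which is not specified in detail here:

```python
import numpy as np

rng = np.random.default_rng(0)
Nt, bits = 2, 2                       # transmit antennas, feedback bits

# DFT-based codebook: 2**bits unit-norm precoding vectors.
k = np.arange(2 ** bits)
codebook = np.exp(2j * np.pi * np.outer(np.arange(Nt), k) / len(k)) / np.sqrt(Nt)

# Rayleigh channel seen by the mobile station.
h = (rng.standard_normal(Nt) + 1j * rng.standard_normal(Nt)) / np.sqrt(2)

# The mobile station feeds back only the index of the best codeword,
# i.e. the one maximizing the effective channel gain |h^H w|^2.
gains = np.abs(h.conj() @ codebook) ** 2
best = int(np.argmax(gains))
print(f"feedback index {best}: gain {gains[best]:.2f}, codebook mean {gains.mean():.2f}")
```

The adaptive mode described above would switch the codebook (or fall back to open-loop CDD) depending on the mobile station's position, keeping the feedback load at a few bits either way.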
Procedia PDF Downloads 569
6519 Online Bakery Management System Proposal
Authors: Alexander Musyoki, Collins Odour
Abstract:
Over the past few years, the bakery industry in Kenya has experienced significant growth, largely due to the increased adoption of technology and automation in bakery processes, more specifically the adoption of bakery management systems to help in running bakeries. While these systems have been largely responsible for improved productivity and efficiency in bakeries, most of them are now outdated and pose more challenges than benefits. The proposed online bakery management system described in this paper aims to address this by allowing bakery owners to track inventory, budgets, job progress, and data analytics on each job. In doing so, it promotes Sustainable Development Goals 3 and 12, which aim to ensure healthy lives and promote sustainable economic growth; the proposed benefits of these features include scalability, easy accessibility, reduced acquisition costs, better reliability, and improved functionality that will allow bakeries to become more competitive, reduce waste, and track inventory more efficiently. To better understand the challenges, a comprehensive study was performed to assess traditional systems and determine whether an online bakery management system can prove advantageous to bakery owners. The study gathered feedback from bakery owners and employees in Nairobi County, Kenya, using an online survey, with a response rate of about 86% from the target population. The responses cited complex and hard-to-use bakery management systems (59.7%), lack of portability from one device to another (58.1%), and high acquisition costs (51.6%) as the top challenges of traditional bakery management systems. On the other hand, the top benefits that most respondents expected from an online bakery management system were better reliability (58.1%) and reduced acquisition costs (58.1%).
Overall, the findings suggest that an online bakery management system has many advantages over traditional systems and is likely to be well received in the market. In conclusion, the proposed online bakery management system has the potential to improve the efficiency and competitiveness of small-sized bakeries in Nairobi County. Further research is recommended to expand the sample size and diversity of respondents and to conduct more in-depth analyses of the data collected.
Keywords: ICT, technology and automation, bakery management systems, food innovation
Procedia PDF Downloads 78
6518 Sparse Representation Based Spatiotemporal Fusion Employing Additional Image Pairs to Improve Dictionary Training
Authors: Dacheng Li, Bo Huang, Qinjin Han, Ming Li
Abstract:
Remotely sensed imagery with high spatial and temporal resolution, which is hard to acquire with current land observation satellites, has been considered a key factor for monitoring environmental changes at both global and local scales. On the basis of the limited high spatial-resolution observations, a challenging line of studies called spatiotemporal fusion has been developed for generating high spatiotemporal-resolution images by employing auxiliary low spatial-resolution data with high-frequency observations. However, a majority of spatiotemporal fusion approaches suffer from unsatisfactory assumptions, empirical but unstable parameters, low accuracy, or inefficient performance. Although spatiotemporal fusion methodology based on sparse representation theory has advantages in capturing reflectance changes, stability, and execution efficiency (even more so when overcomplete dictionaries have been pre-trained), the retrieval of a high-accuracy dictionary and its effect on fusion results are still pending issues. In this paper, we employ additional image pairs (here, each image pair includes a Landsat Operational Land Imager and a Moderate Resolution Imaging Spectroradiometer acquisition covering part of Baotou, China) only in the coupled dictionary training process based on the K-SVD (K-means Singular Value Decomposition) algorithm, and attempt to improve the fusion results of two existing sparse representation based fusion models (utilizing one and two available image pairs, respectively). The results show that more eligible image pairs are probably related to a more accurate overcomplete dictionary, which generally indicates a better image representation and then contributes to effective fusion performance, provided that the added image pair has seasonal aspects and image spatial structure features similar to the original image pair.
It is, therefore, reasonable to construct a multi-dictionary training pattern for generating a series of high spatial-resolution images based on limited acquisitions.
Keywords: spatiotemporal fusion, sparse representation, K-SVD algorithm, dictionary learning
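The coupled dictionary training experiment can be sketched as below. scikit-learn's DictionaryLearning stands in for K-SVD (both alternate sparse coding with dictionary updates), and the synthetic patches stand in for the Landsat/MODIS difference images; the point illustrated is simply that the additional image pair enlarges the training set:

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)

# Synthetic 8x8 "patch" data with a known sparse structure, standing in for
# difference-image patches extracted from Landsat/MODIS pairs.
true_dict = rng.normal(size=(16, 64))
true_dict /= np.linalg.norm(true_dict, axis=1, keepdims=True)

def make_patches(n):
    idx = rng.integers(0, 16, size=(n, 3))       # 3 active atoms per patch
    w = rng.normal(size=(n, 3))
    return np.einsum("nk,nkd->nd", w, true_dict[idx])

base_pair = make_patches(100)                    # patches from one image pair
extra_pair = make_patches(100)                   # the additional image pair

for name, patches in [("one pair", base_pair),
                      ("two pairs", np.vstack([base_pair, extra_pair]))]:
    dico = DictionaryLearning(n_components=16, alpha=0.5, max_iter=10,
                              random_state=0)
    codes = dico.fit_transform(patches)          # sparse codes via OMP
    recon = codes @ dico.components_
    err = np.linalg.norm(patches - recon) / np.linalg.norm(patches)
    print(f"{name}: dictionary {dico.components_.shape}, rel. recon error {err:.2f}")
```

Whether the extra pair actually helps depends, as the abstract notes, on its seasonal and spatial similarity to the original pair; with dissimilar pairs the enlarged training set can dilute rather than sharpen the dictionary.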
Procedia PDF Downloads 261
6517 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction
Authors: Reza Safdari, Goli Arji, Robab Abdolkhani, Maryam Zahmatkeshan
Abstract:
Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has great potential for transforming these data into information. Using data mining techniques to generate predictive models for identifying those at risk is very helpful in reducing the effects of the disease. The present study aimed to collect data related to risk factors of myocardial infarction from patients’ medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. Data were related to patients admitted to Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011. Data were collected using a four-section data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, as well as positive and negative predictive values were determined, and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential for facilitating the management of a patient with a specific disease. Health interventions or lifestyle changes can therefore be guided by these models to improve the health conditions of individuals at risk.
Keywords: decision trees, neural network, myocardial infarction, data mining
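The head-to-head model comparison and the reported metrics can be sketched with scikit-learn as follows; the data are synthetic stand-ins for the 350-record hospital dataset, and the five features only loosely mimic the risk factors named above:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

# Synthetic stand-in: 350 records, 5 risk-factor-like features
# (hypertension, DLP, smoking, diabetes, blood group).
X, y = make_classification(n_samples=350, n_features=5, n_informative=4,
                           n_redundant=0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

results = {}
for name, model in [("decision tree", DecisionTreeClassifier(random_state=0)),
                    ("neural network", MLPClassifier(max_iter=2000, random_state=0))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    results[name] = {"sensitivity": tp / (tp + fn),
                     "specificity": tn / (tn + fp),
                     "PPV": tp / (tp + fp),
                     "NPV": tn / (tn + fn)}
print(results)
```

On the study's real data the neural network had the highest sensitivity; on synthetic data the ranking can of course differ, which is exactly why these metrics are computed on a held-out set.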
Procedia PDF Downloads 429
6516 Assessing of Social Comfort of the Russian Population with Big Data
Authors: Marina Shakleina, Konstantin Shaklein, Stanislav Yakiro
Abstract:
The digitalization of modern human life over the last decade has facilitated the acquisition, storage, and processing of data, which are used to detect changes in consumer preferences and to improve the internal efficiency of the production process. This emerging trend has attracted academic interest in the use of big data in research. The study focuses on modeling the social comfort of the Russian population for the period 2010-2021 using big data. Big data provides enormous opportunities for understanding human interactions at the scale of society, with rich spatial and temporal dynamics. One of the most popular big data sources is Google Trends. The methodology for assessing social comfort using big data involves several steps: 1. 574 words were selected based on the Harvard IV-4 Dictionary, adjusted to fit the reality of everyday Russian life. The set of keywords was further cleansed by excluding queries consisting of verbs and words with several lexical meanings. 2. Search queries were processed to ensure comparability of results: transformation of the data to a 10-point scale, elimination of popularity peaks, detrending, and deseasoning. The proposed methodology for keyword search and Google Trends processing was implemented as a script in the Python programming language. 3. Block and summary integral indicators of social comfort were constructed using the first modified principal component, yielding the weighting coefficients of the block components. According to the study, social comfort is described by 12 blocks: ‘health’, ‘education’, ‘social support’, ‘financial situation’, ‘employment’, ‘housing’, ‘ethical norms’, ‘security’, ‘political stability’, ‘leisure’, ‘environment’, ‘infrastructure’. According to the model, the summary integral indicator increased by 54% to 4.631 points; the average annual growth rate was 3.6%, which exceeds the rate of economic growth by 2.7 p.p.
The value of the indicator describing social comfort in Russia is determined 26% by ‘social support’, 24% by ‘education’, 12% by ‘infrastructure’, 10% by ‘leisure’, and the remaining 28% by the other blocks. Among the 25% most popular searches, 85% are negative in nature and are mainly related to the blocks ‘security’, ‘political stability’, and ‘health’, for example, ‘crime rate’ and ‘vulnerability’. Among the 25% most unpopular queries, 99% were positive and mostly related to the blocks ‘ethical norms’, ‘education’, and ‘employment’, for example, ‘social package’ and ‘recycling’. In conclusion, the introduction of the latent category ‘social comfort’ into the scientific vocabulary deepens the theory of the quality of life of the population in terms of studying an individual's involvement in society and expands the subjective aspect of the measurement of various indicators. The integral assessment of social comfort demonstrates the overall picture of the development of the phenomenon over time and space and quantitatively evaluates ongoing socio-economic policy. The application of big data to the assessment of latent categories gives stable results, which opens up possibilities for practical implementation.
Keywords: big data, Google Trends, integral indicator, social comfort
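Steps 2 and 3 of the pipeline (rescaling each query series to a 10-point scale, detrending and deseasoning it, then weighting the block components with the first principal component) can be sketched as follows, with synthetic monthly series standing in for the cleaned Google Trends data:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(144)  # 12 years of monthly observations, 2010-2021

# Synthetic block series standing in for the 12 cleaned query blocks:
# a mild trend plus annual seasonality plus noise.
blocks = np.stack([10 + 0.02 * t + np.sin(2 * np.pi * t / 12) +
                   rng.normal(scale=0.5, size=t.size) for _ in range(12)])

def preprocess(x, period=12):
    x = 10 * (x - x.min()) / (x.max() - x.min())           # 10-point scale
    trend = np.polyval(np.polyfit(np.arange(x.size), x, 1), np.arange(x.size))
    x = x - trend                                          # detrend (linear)
    seasonal = x.reshape(-1, period).mean(axis=0)          # monthly means
    return x - np.tile(seasonal, x.size // period)         # deseason

Z = np.stack([preprocess(b) for b in blocks])

# First principal component -> weights of the 12 blocks in the summary indicator.
cov = np.cov(Z)
eigval, eigvec = np.linalg.eigh(cov)
w = np.abs(eigvec[:, -1])
w /= w.sum()
summary = w @ Z
print("block weights:", np.round(w, 3))
```

The study's "modified" first principal component likely differs in detail (e.g., sign handling and normalization conventions are our assumptions here), but the weighting logic is the same: blocks that co-vary most strongly with the common factor receive the largest weights.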
Procedia PDF Downloads 200