Search results for: search data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25401

25011 Influence of Optimization Method on Parameters Identification of Hyperelastic Models

Authors: Bale Baidi Blaise, Gilles Marckmann, Liman Kaoye, Talaka Dya, Moustapha Bachirou, Gambo Betchewe, Tibi Beda

Abstract:

This work highlights the capability of the particle swarm optimization (PSO) method to identify the parameters of hyperelastic models. The study compares this method with the Genetic Algorithm (GA), Least Squares (LS), Pattern Search Algorithm (PSA), Beda-Chevalier (BC), and Levenberg-Marquardt (LM) methods. Four classic hyperelastic models are used to test the different methods through parameter identification. The study then compares the ability of these models to reproduce the experimental Treloar data in simple tension, biaxial tension, and pure shear.
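
As a minimal illustration of the kind of identification loop compared in the abstract, the sketch below fits two parameters of a hypothetical Mooney-Rivlin-type uniaxial stress model to synthetic data with a basic PSO. The model form, the synthetic data, and the PSO settings are illustrative assumptions, not the authors' implementation or the Treloar data set.

```python
import numpy as np

# Hypothetical incompressible Mooney-Rivlin uniaxial nominal stress (illustrative only):
# sigma(lam) = 2*(lam - lam**-2) * (C10 + C01/lam)
def model_stress(params, lam):
    C10, C01 = params
    return 2.0 * (lam - lam**-2) * (C10 + C01 / lam)

def sse(params, lam, sigma_exp):
    return np.sum((model_stress(params, lam) - sigma_exp) ** 2)

def pso_identify(lam, sigma_exp, n_particles=30, n_iter=200, bounds=(0.0, 1.0)):
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, 2))          # candidate parameter sets
    v = np.zeros_like(x)                               # particle velocities
    pbest = x.copy()
    pbest_cost = np.array([sse(p, lam, sigma_exp) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5                          # inertia and acceleration weights
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 2))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        cost = np.array([sse(p, lam, sigma_exp) for p in x])
        improved = cost < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], cost[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

# Synthetic "experimental" curve standing in for uniaxial test data.
lam = np.linspace(1.1, 6.0, 20)
sigma_exp = model_stress((0.16, 0.015), lam) + np.random.default_rng(1).normal(0, 0.01, lam.size)
params, residual = pso_identify(lam, sigma_exp)
print("identified C10, C01:", params, "SSE:", residual)
```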

Keywords: particle swarm optimization, identification, hyperelastic, model

Procedia PDF Downloads 142
25010 A Systematic Review of the Methodological and Reporting Quality of Case Series in Surgery

Authors: Riaz A. Agha, Alexander J. Fowler, Seon-Young Lee, Buket Gundogan, Katharine Whitehurst, Harkiran K. Sagoo, Kyung Jin Lee Jeong, Douglas G. Altman, Dennis P. Orgill

Abstract:

Introduction: Case series are an important and common study type. Currently, no guideline exists for reporting case series, and there is evidence of key data being missed from such reports. We propose to develop a reporting guideline for case series using a methodologically robust technique. The first step in this process is a systematic review of literature relevant to the reporting deficiencies of case series. Methods: A systematic review of methodological and reporting quality in surgical case series was performed. The electronic search strategy was developed by an information specialist and included MEDLINE, EMBASE, the Cochrane Methods Register, Science Citation Index, and Conference Proceedings Citation Index, from the start of indexing until 5th November 2014. Independent screening, eligibility assessments, and data extraction were performed. Included articles were analyzed for five areas of deficiency: failure to use standardized definitions; missing or selective data; lack of transparency or incomplete reporting; failure to consider alternate study designs; and other issues. Results: The database searching identified 2,205 records. Through the process of screening and eligibility assessment, 92 articles met the inclusion criteria. The frequencies of methodological and reporting issues identified were: failure to use standardized definitions (57%), missing or selective data (66%), lack of transparency or incomplete reporting (70%), failure to consider alternate study designs (11%), and other issues (52%). Conclusion: The methodological and reporting quality of surgical case series needs improvement. Our data show that clear, evidence-based guidelines for the conduct and reporting of case series may be useful to those planning or conducting them.

Keywords: case series, reporting quality, surgery, systematic review

Procedia PDF Downloads 341
25009 Business Intelligence for Profiling of Telecommunication Customer

Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro

Abstract:

Business intelligence is a methodology that exploits data to produce information and knowledge systematically; it can support the decision-making process. Two core methods in business intelligence are data warehousing and data mining. A data warehouse stores historical data derived from transactional data; for data modelling in the data warehouse, we apply dimensional modelling as proposed by Kimball. Data mining is used to extract patterns from the data and gain insight from them, and it has many techniques, one of which is segmentation. For profiling telecommunication customers, we segment customers according to their usage of services, their invoices, and their payments. Customers can then be grouped according to their characteristics, and the profitable customers can be identified. We apply the K-Means clustering algorithm for segmentation, using the RFM (Recency, Frequency, and Monetary) model as the input variables. All data mining steps are carried out with IBM SPSS Modeler.
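
A minimal sketch of the segmentation step described above, assuming an RFM table has already been extracted from the warehouse. The column names, cluster count, and scaling are illustrative choices, and the paper itself uses IBM SPSS Modeler rather than Python.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical RFM table: one row per customer (days since last use, usage count, billed amount).
rfm = pd.DataFrame({
    "recency_days": [5, 40, 3, 75, 12, 60, 8, 90],
    "frequency":    [42, 4, 55, 2, 30, 6, 48, 1],
    "monetary":     [310.0, 45.0, 400.0, 20.0, 250.0, 60.0, 380.0, 15.0],
})

X = StandardScaler().fit_transform(rfm)                 # put R, F, M on a comparable scale
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
rfm["segment"] = kmeans.labels_

# Profile each segment; segments with high frequency/monetary and low recency
# are candidates for the "profitable customer" group mentioned in the abstract.
print(rfm.groupby("segment").mean())
```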

Keywords: business intelligence, customer segmentation, data warehouse, data mining

Procedia PDF Downloads 450
25008 Keyword Advertising: Still Need Construction in European Union; Perspective on Interflora vs. Marks and Spencer

Authors: Mohammadbagher Asghariaghamashhadi

Abstract:

Internet users are normally linked automatically to an advertisement sponsored by a bidder when they enter a trademarked keyword in a search engine. This advertisement appears beside the search results. Through the process of keyword advertising, advertisers can connect with many Internet users and let them know about their goods and services. This practice has generated heated disagreements among legal scholars, trademark proprietors, advertisers, search engine owners, and consumers. The use of trademarks in keyword advertising has therefore been one of the most debated issues in trademark law for several years. This entirely new way of using trademarks over the Internet has provoked a discussion concerning the core concepts of trademark law. With respect to legal issues, European Union (EU) trademark law is mostly governed by the Trademark Directive and the Community Trademark Regulation. Article 5 of the Directive and Article 9 of the Regulation determine the circumstances in which a trademark owner holds the right to prohibit a third party's use of his or her registered sign. Harmonized EU trademark law has proved to be ambiguous on whether such use of a trademark amounts to trademark infringement. The case law of the European Court of Justice (ECJ) interpreting this legislation is mostly unfavorable to trademark owners, and this ambivalence is also exhibited in the case law of EU Member States. European keyword advertisers simply could not tell which uses of a competitor's trademark were lawful. In recent years, the ECJ has continuously expanded the scope and reach of trademark protection in the EU. Notably, inconsistencies in the Court's system of infringement criteria have clearly come to the fore, and this approach has been criticized by analysts who believe that the Court should have adopted a more traditional approach to the analysis of trademark infringement, as suggested by its Advocate General, in order to arrive at the same conclusion. Within the European case law on keyword advertising, one of the most disputed cases is Interflora vs. Marks and Spencer, which is still ongoing. This study examines and critically analyzes the decisions of the ECJ, the High Court of England, and the Court of Appeal, and critically addresses the keyword advertising issue within European trademark legislation.

Keywords: ECJ, Google, Interflora, keyword advertising, Marks and Spencer, trademark infringement

Procedia PDF Downloads 322
25007 Creation of Greenhouses by Students, Using the Own Installations of the University and Increasing the Growth of Plants

Authors: Espinosa-Garza G., Loera-Hernandez I., Antonyan N.

Abstract:

To innovate, it is necessary to carry out projects directed towards the search for improvement. Agricultural techniques and the design of greenhouses have been studied by undergraduate engineering students from the Tecnológico de Monterrey using the campus facilities. The purpose of this project was to encourage students to create innovations and to help rural populations of the state solve one of the problems they are dealing with nowadays. The main objective of the project was to search for an alternative technique that allows the “chile piquín” plant, also known as Capsicum annuum, to germinate and grow more quickly. The “chile piquín” is one of the original crops of Mexico and has formed the basis of the Mesoamerican cultures’ diet since the pre-Hispanic era. To meet today’s demand, new alternative methods are required to increase the growth of the “chile piquín”. The project lasted one semester, with the participation of engineering students from multiple majors. The most important result of this academic experience was that the students could analyze the needs of their community and were capable of introducing new and innovative ideas aimed at resolving them. This article discusses the pedagogic methodologies that made it possible to carry out this project.

Keywords: academic experience, chile piquín, engineering education, greenhouse design, innovation

Procedia PDF Downloads 126
25006 Autonomous Strategic Aircraft Deconfliction in a Multi-Vehicle Low Altitude Urban Environment

Authors: Loyd R. Hook, Maryam Moharek

Abstract:

With the envisioned future growth of low altitude urban aircraft operations for airborne delivery service and advanced air mobility, strategies to coordinate and deconflict aircraft flight paths must be prioritized. Autonomous coordination and planning of flight trajectories is the preferred approach to the future vision in order to increase safety, density, and efficiency over manual methods employed today. Difficulties arise because any conflict resolution must be constrained by all other aircraft, all airspace restrictions, and all ground-based obstacles in the vicinity. These considerations make pair-wise tactical deconfliction difficult at best and unlikely to find a suitable solution for the entire system of vehicles. In addition, more traditional methods which rely on long time scales and large protected zones will artificially limit vehicle density and drastically decrease efficiency. Instead, strategic planning, which is able to respond to highly dynamic conditions and still account for high density operations, will be required to coordinate multiple vehicles in the highly constrained low altitude urban environment. This paper develops and evaluates such a planning algorithm which can be implemented autonomously across multiple aircraft and situations. Data from this evaluation provide promising results with simulations showing up to 10 aircraft deconflicted through a relatively narrow low-altitude urban canyon without any vehicle to vehicle or obstacle conflict. The algorithm achieves this level of coordination beginning with the assumption that each vehicle is controlled to follow an independently constructed flight path, which is itself free of obstacle conflict and restricted airspace. Then, by preferencing speed change deconfliction maneuvers constrained by the vehicles flight envelope, vehicles can remain as close to the original planned path and prevent cascading vehicle to vehicle conflicts. Performing the search for a set of commands which can simultaneously ensure separation for each pair-wise aircraft interaction and optimize the total velocities of all the aircraft is further complicated by the fact that each aircraft's flight plan could contain multiple segments. This means that relative velocities will change when any aircraft achieves a waypoint and changes course. Additionally, the timing of when that aircraft will achieve a waypoint (or, more directly, the order upon which all of the aircraft will achieve their respective waypoints) will change with the commanded speed. Put all together, the continuous relative velocity of each vehicle pair and the discretized change in relative velocity at waypoints resembles a hybrid reachability problem - a form of control reachability. This paper proposes two methods for finding solutions to these multi-body problems. First, an analytical formulation of the continuous problem is developed with an exhaustive search of the combined state space. However, because of computational complexity, this technique is only computable for pairwise interactions. For more complicated scenarios, including the proposed 10 vehicle example, a discretized search space is used, and a depth-first search with early stopping is employed to find the first solution that solves the constraints.
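
The closing idea of the abstract, a depth-first search with early stopping over a discretized set of speed commands, can be illustrated with the toy sketch below. The straight-line flight paths, candidate speeds, separation distance, and sampling step are simplifying assumptions for illustration and are not the authors' algorithm or parameters.

```python
import numpy as np

# Each aircraft flies a straight segment from start to goal at a commanded speed (toy model).
aircraft = [
    {"start": np.array([0.0, 0.0]),      "goal": np.array([1000.0, 0.0])},
    {"start": np.array([1000.0, 80.0]),  "goal": np.array([0.0, 80.0])},
    {"start": np.array([500.0, -400.0]), "goal": np.array([500.0, 400.0])},
]
CANDIDATE_SPEEDS = [20.0, 25.0, 30.0]   # m/s, assumed to lie inside each flight envelope
MIN_SEPARATION = 60.0                   # m, assumed separation requirement
DT = 1.0                                # s, sampling step for the conflict check

def position(ac, speed, t):
    d = ac["goal"] - ac["start"]
    dist = np.linalg.norm(d)
    travelled = min(speed * t, dist)    # hold at the goal once it is reached
    return ac["start"] + d / dist * travelled

def pair_ok(ac_i, v_i, ac_j, v_j):
    t_end = max(np.linalg.norm(ac_i["goal"] - ac_i["start"]) / v_i,
                np.linalg.norm(ac_j["goal"] - ac_j["start"]) / v_j)
    for t in np.arange(0.0, t_end + DT, DT):
        if np.linalg.norm(position(ac_i, v_i, t) - position(ac_j, v_j, t)) < MIN_SEPARATION:
            return False
    return True

def dfs(assigned):
    """Depth-first assignment of speeds, pruning as soon as any pair conflicts."""
    k = len(assigned)
    if k == len(aircraft):
        return assigned                 # early stop: first full conflict-free assignment
    for v in CANDIDATE_SPEEDS:
        if all(pair_ok(aircraft[i], assigned[i], aircraft[k], v) for i in range(k)):
            result = dfs(assigned + [v])
            if result is not None:
                return result
    return None

print("speed commands:", dfs([]))
```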

Keywords: strategic planning, autonomous, aircraft, deconfliction

Procedia PDF Downloads 71
25005 Diagnostic Performance of Mean Platelet Volume in the Diagnosis of Acute Myocardial Infarction: A Meta-Analysis

Authors: Kathrina Aseanne Acapulco-Gomez, Shayne Julieane Morales, Tzar Francis Verame

Abstract:

Mean platelet volume (MPV) is the most accurate measure of the size of platelets and is routinely measured by most automated hematological analyzers. Several studies have shown associations between MPV and cardiovascular risks and outcomes. Although its measurement may provide useful data, MPV remains to be a diagnostic tool that is yet to be included in routine clinical decision making. The aim of this systematic review and meta-analysis is to determine summary estimates of the diagnostic accuracy of mean platelet volume for the diagnosis of myocardial infarction among adult patients with angina and/or its equivalents in terms of sensitivity, specificity, diagnostic odds ratio, and likelihood ratios, and to determine the difference of the mean MPV values between those with MI and those in the non-MI controls. The primary search was done through search in electronic databases PubMed, Cochrane Review CENTRAL, HERDIN (Health Research and Development Information Network), Google Scholar, Philippine Journal of Pathology, and Philippine College of Physicians Philippine Journal of Internal Medicine. The reference list of original reports was also searched. Cross-sectional, cohort, and case-control articles studying the diagnostic performance of mean platelet volume in the diagnosis of acute myocardial infarction in adult patients were included in the study. Studies were included if: (1) CBC was taken upon presentation to the ER or upon admission (within 24 hours of symptom onset); (2) myocardial infarction was diagnosed with serum markers, ECG, or according to accepted guidelines by the Cardiology societies (American Heart Association (AHA), American College of Cardiology (ACC), European Society of Cardiology (ESC); and, (3) if outcomes were measured as significant difference AND/OR sensitivity and specificity. The authors independently screened for inclusion of all the identified potential studies as a result of the search. Eligible studies were appraised using well-defined criteria. Any disagreement between the reviewers was resolved through discussion and consensus. The overall mean MPV value of those with MI (9.702 fl; 95% CI 9.07 – 10.33) was higher than in those of the non-MI control group (8.85 fl; 95% CI 8.23 – 9.46). Interpretation of the calculated t-value of 2.0827 showed that there was a significant difference in the mean MPV values of those with MI and those of the non-MI controls. The summary sensitivity (Se) and specificity (Sp) for MPV were 0.66 (95% CI; 0.59 - 0.73) and 0.60 (95% CI; 0.43 – 0.75), respectively. The pooled diagnostic odds ratio (DOR) was 2.92 (95% CI; 1.90 – 4.50). The positive likelihood ratio of MPV in the diagnosis of myocardial infarction was 1.65 (95% CI; 1.20 – 22.27), and the negative likelihood ratio was 0.56 (95% CI; 0.50 – 0.64). The intended role for MPV in the diagnostic pathway of myocardial infarction would perhaps be best as a triage tool. With a DOR of 2.92, MPV values can discriminate between those who have MI and those without. For a patient with angina presenting with elevated MPV values, it is 1.65 times more likely that he has MI. Thus, it is implied that the decision to treat a patient with angina or its equivalents as a case of MI could be supported by an elevated MPV value.
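
The relationship between the pooled estimates reported above can be checked with a few lines of arithmetic: the sketch below reproduces the likelihood ratios and diagnostic odds ratio from the summary sensitivity and specificity, using point estimates only and ignoring the confidence intervals.

```python
# Reproduce the pooled point estimates reported in the abstract.
sensitivity = 0.66
specificity = 0.60

lr_positive = sensitivity / (1 - specificity)      # ~1.65
lr_negative = (1 - sensitivity) / specificity      # ~0.57 (abstract reports 0.56 after rounding)
diagnostic_odds_ratio = lr_positive / lr_negative  # ~2.9  (abstract reports 2.92)

print(f"LR+ = {lr_positive:.2f}, LR- = {lr_negative:.2f}, DOR = {diagnostic_odds_ratio:.2f}")
```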

Keywords: mean platelet volume, MPV, myocardial infarction, angina, chest pain

Procedia PDF Downloads 56
25004 Using Printouts as Social Media Evidence and Its Authentication in the Courtroom

Authors: Chih-Ping Chang

Abstract:

Unlike traditional objective evidence, social media evidence has its own characteristics: it is easily tampered with, it is recoverable, and it cannot be read without using other devices (such as a computer). The original identity of a simple screenshot taken from a social network site must therefore be questioned. When the police search for and seize digital information, a common practice is to directly print out the digital data obtained and ask the parties present to sign the printout, without taking the original digital data back. Beyond the issue of original identity, this way of obtaining evidence may have two further consequences. First, it invites the allegation that the evidence was tampered with, that the police wanted to frame the suspect and falsified the evidence. Second, it makes it difficult to discover hidden information. The core evidence associated with a crime may not appear in the visible contents of files; by examining the original file, data related to it, such as the original producer, creation time, modification date, and even GPS location, can be revealed from hidden information. Therefore, how to present this kind of evidence in the courtroom is arguably the most important task in ruling on social media evidence. This article first introduces forensic software such as EnCase, TCT, and FTK, and analyzes how it can prove identity with other digital data. Turning back to the courtroom, the second part of the article discusses the legal standard for authentication of social media evidence and the application of such forensic software in court. In conclusion, the article offers a rethinking of what kind of authenticity this rule of evidence pursues: does the legal system automatically transcribe scientific knowledge, or does it seek to better render justice, not only on the basis of scientific fact but also through multifaceted debate?

Keywords: federal rule of evidence, internet forensic, printouts as evidence, social media evidence, United States v. Vayner

Procedia PDF Downloads 271
25003 A Systematic Review of Sensory Processing Patterns of Children with Autism Spectrum Disorders

Authors: Ala’a F. Jaber, Bara’ah A. Bsharat, Noor T. Ismael

Abstract:

Background: Sensory processing is a fundamental skill needed for the successful performance of daily living activities. These skills are impaired as part of the neurodevelopmental issues in children with autism spectrum disorder (ASD). This systematic review aimed to summarize the evidence on the differences in sensory processing and motor characteristics between children with ASD and typically developing (TD) children. Method: This systematic review followed the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses. The search terms included sensory, motor, condition, and child-related terms or phrases. The electronic search utilized Academic Search Ultimate, CINAHL Plus with Full Text, ERIC, MEDLINE, MEDLINE Complete, Psychology and Behavioral Sciences Collection, and SocINDEX with Full Text databases. The hand search involved looking for potential studies in the references of related studies. The inclusion criteria were: studies published in English between the years 2009-2020 that included children aged 3-18 years with a confirmed ASD diagnosis according to the DSM-V criteria, included a control group of typical children, included outcome measures related to sensory processing and/or motor functions, and were available in full text. The review of included studies followed the Oxford Centre for Evidence-Based Medicine guidelines, the Guidelines for Critical Review Form of Quantitative Studies, and the guidelines for conducting systematic reviews by the American Occupational Therapy Association. Results: Eighty-eight full-text studies on the differences between children with ASD and children with TD in terms of sensory processing and motor characteristics were reviewed, of which eighteen articles were included in the quantitative synthesis. The results reveal that children with ASD had more extreme sensory processing patterns than children with TD, such as hyper-responsiveness and hypo-responsiveness to sensory stimuli. Also, children with ASD had limited gross and fine motor abilities and lower strength, endurance, balance, eye-hand coordination, movement velocity, cadence, and dexterity, with a higher rate of gait abnormalities than children with TD. Conclusion: This systematic review provides preliminary evidence suggesting that motor functioning should be addressed in the evaluation of and intervention for children with ASD, and that sensory processing should be supported among children with TD. Future research should investigate how performance and engagement in daily life activities are affected by sensory processing and motor skills.

Keywords: sensory processing, occupational therapy, children, motor skills

Procedia PDF Downloads 107
25002 Imputation Technique for Feature Selection in Microarray Data Set

Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam

Abstract:

Analysing DNA microarray data sets is a great challenge that bioinformaticians face due to the complications of using statistical and machine learning techniques. The challenge is doubled if the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing values. One of the most important data analysis processes on a microarray data set is feature selection. This process finds the most important genes that affect a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
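
The abstract does not specify the pipeline, but a common baseline for the combination it describes, imputing missing expression values and then ranking genes, can be sketched as follows. The KNN imputer and the ANOVA F-test selector are illustrative stand-ins, not necessarily the authors' choices, and the data are synthetic.

```python
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)

# Toy microarray matrix: 40 samples x 200 genes, with ~5% of values missing.
X = rng.normal(size=(40, 200))
y = rng.integers(0, 2, size=40)                     # disease / control labels
X[rng.random(X.shape) < 0.05] = np.nan

# Step 1: impute each missing expression value from the k most similar samples.
X_imputed = KNNImputer(n_neighbors=5).fit_transform(X)

# Step 2: feature selection - keep the 20 genes most associated with the label.
selector = SelectKBest(score_func=f_classif, k=20).fit(X_imputed, y)
top_gene_indices = selector.get_support(indices=True)
print("selected gene indices:", top_gene_indices)
```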

Keywords: DNA microarray, feature selection, missing data, bioinformatics

Procedia PDF Downloads 536
25001 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework

Authors: Lutful Karim, Mohammed S. Al-kahtani

Abstract:

Sensors are being used in various applications such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data. Thus, aggregating and filtering sensor data are significantly important for designing an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. Thus, this paper introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation (PDDA) scheme for the lowest layer, at the sensors. Simulation results show that the PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay.

Keywords: big data, clustering, tree topology, data aggregation, sensor networks

Procedia PDF Downloads 306
25000 Development of High Temperature Mo-Si-B Based In-situ Composites

Authors: Erhan Ayas, Buse Katipoğlu, Eda Metin, Rifat Yılmaz

Abstract:

The search has begun for new materials that can be used at temperatures even higher than the service temperature (~1150 °C) at which nickel-based superalloys are currently used. This search must also meet the increasing demands for improved energy efficiency. Materials studied for aerospace applications are expected to have good oxidation resistance. Mo-Si-B alloys, which have higher operating temperatures than nickel-based superalloys, are candidate ultra-high temperature materials for gas turbines and jet engines. The Moss and Mo₅SiB₂ (T2) phases exhibit high melting temperatures, excellent high-temperature creep strength, and good oxidation resistance; however, their low fracture toughness at room temperature is a disadvantage, although this property can be improved through optimum Moss phase content and microstructure control. High density is also a problem for structural parts; for example, in turbine rotors, the higher the weight, the higher the centrifugal force, which reduces the creep life of the material. The density of nickel-based superalloys and of the T2 phase of the Mo-Si-B alloy is in the range of 8.6 - 9.2 g/cm³, whereas the Moss phase has a density of 10.2 g/cm³, above that of nickel-based superalloys. With some ceramic-based additions, this value can be brought closer to the optimum.

Keywords: molybdenum, composites, in-situ, mmc

Procedia PDF Downloads 45
24999 Pedagogy to Involve Research Process in an Undergraduate Physical Fitness Course: A Case Study

Authors: Indhumathi Gopal

Abstract:

Undergraduate research is well documented in science, technology, engineering, and mathematics (STEM) disciplines, as well as in the neurosciences and microbiology, though it is hardly part of the physical fitness and wellness discipline. However, students need experiential learning opportunities, like internships and research assistantships, to get ahead with graduate schools and become gainfully employed. The first step towards this goal is to have students complete a simple research project in a semester-long course. The value of research experiences and how to integrate research activity into a physical fitness and wellness course are discussed. The investigator looks into a mini research project, “Awareness of Obesity among College Students”, and explains how to guide students through the research process, including journal searches, data collection, and basic statistics. In addition, students are introduced to the statistical package SPSS 22.0 to assist with data evaluation. The lab component of the combined lecture-physical activity course could include measuring students’ weight and height to obtain body mass index (BMI). Students could then categorize themselves in accordance with the World Health Organization’s guidelines. The results obtained after completing the data analysis help students become aware of their own potential health risks associated with overweight and obesity. Overweight and obesity are risk factors for hypertension, hypercholesterolemia, heart disease, stroke, diabetes, and certain types of cancer. It is hoped that this experience will get students interested in scientific studies and help them gain confidence, think critically, and develop problem-solving and communication skills.
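
As a concrete example of the lab computation described above, the snippet below derives BMI from weight and height and applies the standard WHO adult categories; the example values are illustrative.

```python
def bmi(weight_kg, height_m):
    """Body mass index = weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def who_category(bmi_value):
    # WHO adult classification cut-offs.
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25.0:
        return "normal weight"
    if bmi_value < 30.0:
        return "overweight"
    return "obese"

value = bmi(82.0, 1.75)
print(f"BMI = {value:.1f} -> {who_category(value)}")   # BMI = 26.8 -> overweight
```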

Keywords: physical fitness, undergraduate research experience, obesity, BMI

Procedia PDF Downloads 45
24998 Sparse Principal Component Analysis: A Least Squares Approximation Approach

Authors: Giovanni Merola

Abstract:

Sparse principal component analysis aims to find principal components with few non-zero loadings. We derive such sparse solutions by adding a genuine sparsity requirement to the original principal component analysis (PCA) objective function. This approach differs from others because it preserves PCA's original optimality: uncorrelatedness of the components and least squares approximation of the data. To identify the best subset of non-zero loadings, we propose a branch-and-bound search and an iterative elimination algorithm. The latter finds sparse solutions with large loadings and can be run without specifying in advance the cardinality of the loadings or the number of components to compute. We give thorough comparisons with existing sparse PCA methods and several examples on real datasets.
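
A much-simplified sketch of the elimination idea is shown below: compute the leading least-squares component, then repeatedly drop the variable with the smallest absolute loading and refit on the remaining support. This is only an illustration of a backward-elimination loop under those assumptions, not the authors' algorithm, which additionally enforces uncorrelatedness across components and offers a branch-and-bound search.

```python
import numpy as np

def leading_loadings(X, support):
    """First principal-axis loadings of the column subset `support`,
    embedded back into a full-length (sparse) loading vector."""
    Xs = X[:, support] - X[:, support].mean(axis=0)
    _, _, vt = np.linalg.svd(Xs, full_matrices=False)
    v = np.zeros(X.shape[1])
    v[support] = vt[0]
    return v

def backward_sparse_pc(X, target_nonzeros):
    support = list(range(X.shape[1]))
    v = leading_loadings(X, support)
    while len(support) > target_nonzeros:
        # eliminate the supported variable whose loading contributes least
        weakest = support[int(np.argmin(np.abs(v[support])))]
        support.remove(weakest)
        v = leading_loadings(X, support)
    return v

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
X[:, :3] += np.outer(rng.normal(size=100), [2.0, 2.0, 2.0])   # three correlated, informative columns
print(np.round(backward_sparse_pc(X, target_nonzeros=3), 3))
```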

Keywords: SPCA, uncorrelated components, branch-and-bound, backward elimination

Procedia PDF Downloads 345
24997 Review of Published Articles on Climate Change and Health in Two Francophone Newspapers: 1990-2015

Authors: Mathieu Hemono, Sophie Puig-Malet, Patrick Zylberman, Avner Bar-Hen, Rainer Sauerborn, Stefanie Schütte, Niamh Herlihi, Antoine Flahault et Anneliese Depoux

Abstract:

Since the IPCC released its first report in 1990, an increasing number of peer-reviewed publications have reported the health risks associated with climate change. Although there is a large body of evidence supporting the association between climate change and poor health outcomes, the media is inconsistent in the attention it pays to the subject. This study aims to analyze the modalities and rhetoric used in the media concerning the impact of climate change on health, in order to better understand the media's role in information dissemination. A review was conducted of articles published between 1990 and 2015 in the francophone newspapers Le Monde and Jeune Afrique. A detailed search strategy including specific climate and health terminology was used to search the newspapers' online databases. 1202 articles were identified as having referenced the terms climate change and health. Inclusion and exclusion criteria were applied to narrow the search to articles referencing the effects of climate change on human health, and 160 articles were included in the final analysis. Data were extracted and categorized to create a structured database allowing further investigation and analysis. The review indicated that although 66% of the selected newspaper articles reference scientific evidence of the impact of climate change on human health, the focus on the topic is limited to major political events or to circumstances relating to public health crises. The main findings also include that, among the many direct and indirect health outcomes, infectious diseases are the health outcome most often highlighted in association with climate change. Lastly, the articles suggest that while developed countries have caused most of the greenhouse effect, the Global South is more immediately affected. Overall, the reviewed articles reinforce the need for international cooperation in finding a solution to mitigate the effects of climate change on health. The manner in which scientific results are communicated and disseminated impacts individual and collective perceptions of the topic in the public sphere and affects the political will to shape policy. The results of this analysis will underline the modalities of the rhetoric of transparency and provide the basis for a perception study of media discourses. This study is part of an interdisciplinary project called 4CHealth that compares the results of research on the scientific, political, and press literature to better understand how knowledge on climate change and health circulates within those different fields and whether and how it is translated into real-world change.

Keywords: climate change, health, health impacts, communication, media, rhetoric, awareness, Global South, Africa

Procedia PDF Downloads 394
24996 Employing QR Code as an Effective Educational Tool for Quick Access to Sources of Kindergarten Concepts

Authors: Ahmed Amin Mousa, M. Abd El-Salam

Abstract:

This study discusses a simple solution to the problem of the shortage of learning resources for kindergarten teachers. Kindergarten teachers often cannot access proper resources through the usual search methods, such as libraries or search engines; furthermore, these methods require a long time and considerable effort in preparation. The study is expected to facilitate access to learning resources. Moreover, it suggests a potential direction for using QR codes inside the classroom. The present work proposes that QR codes can be used for digitizing kindergarten curricula and accessing various learning resources. It investigates using QR codes for storing information related to the concepts that kindergarten teachers use in the current educational setting. The researchers have prepared a guide for kindergarten teachers based on the official Egyptian curriculum. The guide provides different learning resources for each scientific and mathematical concept in the curriculum, and each learning resource is represented as a QR code image that contains its URL. Therefore, kindergarten teachers can use smartphone applications to read the QR codes and display the related learning resources to students immediately. The guide was provided to a group of 108 teachers for use inside their classrooms. The results showed that the teachers approved of the guide and gave positive feedback.
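
Generating such an image from a resource URL takes only a few lines with the open-source `qrcode` package (an illustrative choice, not necessarily the tool used by the authors); it can be installed with `pip install qrcode[pil]`, and the URL below is a placeholder.

```python
import qrcode

# Hypothetical learning-resource URL for one kindergarten concept.
resource_url = "https://example.org/resources/counting-to-ten"

img = qrcode.make(resource_url)       # build the QR code image
img.save("counting_to_ten_qr.png")    # embed this image in the printed teacher guide
```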

Keywords: kindergarten, child, learning resources, QR code, smart phone, mobile

Procedia PDF Downloads 263
24995 The Search of Possibility of Running Six Sigma Process in IT Education Center

Authors: Mohammad Amini, Aliakbar Alijarahi

Abstract:

This research, titled ‘The Search of Possibility of Running Six Sigma Process in IT Education Center’, aims to test the possibility of running and using the Six Sigma process in an IT education center system. Six Sigma is a well-established method for reducing process errors. To evaluate the feasibility of running Six Sigma in the IT education center, several variables relevant to the process were selected: - the amount of support from the organization's top management for the process; - the current level of expertise; - the ability of the training system to compensate for shortcomings; - the degree of match between the current culture and the Six Sigma culture; - the current level of quality compared with the quality gained from running Six Sigma. To evaluate these variables, four questions were selected and, to gather the answers, a questionnaire form with 28 questions was prepared and distributed in our target population. Since our working environment is highly competitive, the organization needs to reduce errors to a minimum; otherwise, it loses its customers. The questionnaire form was given to 55 persons, and 50 completed forms were returned. After analyzing the forms, the following results were obtained: - the IT education center needs to use and run this system (Six Sigma) to improve its process quality; - most of the factors needed to run Six Sigma exist in the IT education center, but more support is needed.

Keywords: education, customer, self-action, quality, continuous improvement process

Procedia PDF Downloads 319
24994 Examining the Functional and Practical Aspects of Iranian Painting as a Visual-Identity Language in Iranian Graphics

Authors: Arezoo Seifollahi

Abstract:

One of the topics that is receiving much attention in artistic circles in Iran today, and has been the subject of many conversations, is the issue of Iranian graphics. In this research, the functional and practical aspects of Iranian painting as a visual-identity language in Iranian graphics are investigated, relying on Iranian cultural and social posters, in order to gain an understanding of the trends in contemporary graphic art in Iran and to help define the identity of its graphics. To arrive at Iranian graphics, the issue of identity and what it consists of is first examined, and this category is then traced through Iran and the history of the country in order to reveal the characteristics of what has come down to us today as Iranian identity. Next, the search for Iranian identity in the art of this land, especially the art of painting, and then in contemporary painting, is discussed. After that, Iranian identity is investigated in Iranian graphics. To understand Iranian graphics, after a brief description of its contemporary history, this art is examined at the period under consideration. By using an inductive method, examining the posters of each period and taking into account the related cultural and social conditions, we tried to obtain a general and comprehensive understanding of the graphic features of each period.

Keywords: Iranian painting, graphic visual language, Iranian identity, social cultural poster

Procedia PDF Downloads 19
24993 The State of Oral Health after COVID-19 Lockdown: A Systematic Review

Authors: Faeze omid, Morteza Banakar

Abstract:

Background: The COVID-19 pandemic has had a significant impact on global health and healthcare systems, including oral health. The lockdown measures implemented in many countries have led to changes in oral health behaviors, access to dental care, and the delivery of dental services. However, the extent of these changes and their effects on oral health outcomes remains unclear. This systematic review aims to synthesize the available evidence on the state of oral health after the COVID-19 lockdown. Methods: We conducted a systematic search of electronic databases (PubMed, Embase, Scopus, and Web of Science) and grey literature sources for studies reporting on oral health outcomes after the COVID-19 lockdown. We included studies published in English between January 2020 and March 2023. Two reviewers independently screened the titles, abstracts, and full texts of potentially relevant articles and extracted data from included studies. We used a narrative synthesis approach to summarize the findings. Results: Our search identified 23 studies from 12 countries, including cross-sectional surveys, cohort studies, and case reports. The studies reported on changes in oral health behaviors, access to dental care, and the prevalence and severity of dental conditions after the COVID-19 lockdown. Overall, the evidence suggests that the lockdown measures had a negative impact on oral health outcomes, particularly among vulnerable populations. There were decreases in dental attendance, increases in dental anxiety and fear, and changes in oral hygiene practices. Furthermore, there were increases in the incidence and severity of dental conditions, such as dental caries and periodontal disease, and delays in the diagnosis and treatment of oral cancers. Conclusion: The COVID-19 pandemic and associated lockdown measures have had significant effects on oral health outcomes, with negative impacts on oral health behaviors, access to care, and the prevalence and severity of dental conditions. These findings highlight the need for continued monitoring and interventions to address the long-term effects of the pandemic on oral health.

Keywords: COVID-19, oral health, systematic review, dental public health

Procedia PDF Downloads 49
24992 Personalization of Context Information Retrieval Model via User Search Behaviours for Ranking Document Relevance

Authors: Kehinde Agbele, Longe Olumide, Daniel Ekong, Dele Seluwa, Akintoye Onamade

Abstract:

One major problem of most existing information retrieval systems (IRS) is that they provide uniform access and retrieval results to individual users, based solely on the query terms the user issues to the system. When using an IRS, users often present search queries made of ad-hoc keywords. It is then up to the IRS to obtain a precise representation of the user's information need and the context of that information. In effect, the volume and range of Internet documents are growing exponentially, which makes it difficult for a user to obtain information that precisely matches the user's interest. Diverse combination techniques are used to achieve this goal. This is due, firstly, to the fact that users often do not present queries to an IRS that optimally represent the information they want, and secondly, to the fact that the measure of a document's relevance is highly subjective across users. In this paper, we address the problem by investigating the optimization of an IRS to individual information needs in order of relevance, through the development of algorithms that optimize the ranking of documents retrieved from the IRS. The paper takes a two-fold approach to retrieving domain-specific documents. The first is the design of the context of information: the context of a query determines the relevance of retrieved information using personalization and context-awareness, so executing the same query in diverse contexts often leads to diverse result rankings based on user preferences. The second is that the relevant context aspects should be incorporated in a way that supports the knowledge domain representing users' interests. In this paper, evolutionary algorithms are incorporated to improve the effectiveness of the IRS. A context-based information retrieval system that learns individual needs from user-provided relevance feedback is developed, and its retrieval effectiveness is evaluated using precision and recall metrics. The results demonstrate how attributes from user interaction behavior can be used to improve IR effectiveness.
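
A compact sketch of the feedback loop evaluated in the paper is given below: documents and query are represented as tf-idf vectors, the query is adjusted from user-marked relevant and non-relevant results (here with a classic Rocchio update standing in for the paper's evolutionary optimization), and ranking quality is scored with precision and recall. The documents, feedback, and weights are all invented for illustration.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "python programming tutorial for beginners",
    "advanced python data structures and algorithms",
    "healthy cooking recipes for beginners",
    "machine learning with python and scikit-learn",
    "gardening tips for spring",
]
relevant = {0, 1, 3}                       # user feedback: which documents were actually useful

vectorizer = TfidfVectorizer()
D = vectorizer.fit_transform(docs).toarray()
q = vectorizer.transform(["tutorial for beginners"]).toarray()[0]

def rank(query_vec):
    scores = D @ query_vec / (np.linalg.norm(D, axis=1) * np.linalg.norm(query_vec) + 1e-12)
    return np.argsort(-scores)

def precision_recall(ranking, k=3):
    retrieved = set(ranking[:k])
    hits = len(retrieved & relevant)
    return hits / k, hits / len(relevant)

print("before feedback:", precision_recall(rank(q)))

# Rocchio update: move the query toward relevant documents, away from non-relevant ones.
alpha, beta, gamma = 1.0, 0.75, 0.15
rel = D[list(relevant)].mean(axis=0)
nonrel = D[[i for i in range(len(docs)) if i not in relevant]].mean(axis=0)
q_new = alpha * q + beta * rel - gamma * nonrel

print("after feedback:", precision_recall(rank(q_new)))   # re-rank with the adjusted query
```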

Keywords: context, document relevance, information retrieval, personalization, user search behaviors

Procedia PDF Downloads 437
24991 Clustering of Association Rules of ISIS & Al-Qaeda Based on Similarity Measures

Authors: Tamanna Goyal, Divya Bansal, Sanjeev Sofat

Abstract:

For world-threatening terrorist attacks, where early detection, distinction, and prediction are effective diagnostic techniques, many data mining and statistical approaches exist to assure the functionally accurate and precise analysis of terrorism data. The computational extraction of derived patterns is a non-trivial task that comprises specific domain discovery by means of sophisticated algorithm design and analysis. This paper proposes an approach for similarity extraction: the useful attributes are obtained from the available datasets of terrorist attacks, a feature selection technique based on statistical impurity measures is applied, and clustering techniques based on similarity measures follow. On the basis of the degree of participation of attributes in the rules, the associative dependencies between the attacks are analyzed. To compute the similarity among the discovered rules, we apply a weighted similarity measure. Finally, the rules are grouped by applying hierarchical clustering. We have applied the approach to an open-source dataset to determine the usability and efficiency of our technique, and a literature search was also carried out to support the efficiency and accuracy of our results.
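
The grouping step can be illustrated as follows: each association rule is reduced to the set of attributes participating in it, a weighted Jaccard-style similarity is computed for every pair, and the resulting distance matrix is fed to hierarchical clustering. The rules, weights, and cut-off below are invented for illustration and do not come from the paper's dataset or its exact similarity measure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Toy association rules, each represented by the attributes participating in it.
rules = [
    {"weapon_type", "region", "group"},
    {"weapon_type", "region", "casualties"},
    {"target_type", "attack_type"},
    {"target_type", "attack_type", "group"},
]
# Assumed attribute weights reflecting degree of participation in the rules.
weights = {"weapon_type": 2.0, "region": 1.5, "group": 1.0,
           "casualties": 1.0, "target_type": 2.0, "attack_type": 1.5}

def weighted_jaccard(a, b):
    inter = sum(weights[x] for x in a & b)
    union = sum(weights[x] for x in a | b)
    return inter / union

n = len(rules)
sim = np.ones((n, n))
for i in range(n):
    for j in range(i + 1, n):
        sim[i, j] = sim[j, i] = weighted_jaccard(rules[i], rules[j])
dist = 1.0 - sim                                    # similarity -> distance, zero diagonal

Z = linkage(squareform(dist), method="average")     # agglomerative (hierarchical) clustering
labels = fcluster(Z, t=0.5, criterion="distance")
print("rule cluster labels:", labels)
```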

Keywords: association rules, clustering, similarity measure, statistical approaches

Procedia PDF Downloads 294
24990 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm, combining Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in the place of traditional gradient based optimization processes. QAOA’s were first introduced in 2014, where, at the time, their algorithm performed better than the traditional best known classical algorithm for Max-cut graphs. Whilst classical algorithms have improved since then and have returned to being faster and more efficient, this was a huge milestone for quantum computing, and their work is often used as a benchmarking tool and a foundational tool to explore variants of QAOA’s. This, alongside with other famous algorithms like Grover’s or Shor’s, highlights to the world the potential that quantum computing holds. It also presents the reality of a real quantum advantage where, if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hopes of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate in creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the availability of number of qubits limiting the scale of the problem that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can also be parallelized to speed up the search and optimization problem. The evaluation of the cost function, like in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOA’s with respect to purely classical algorithms. Thirdly, because of the nature and structure of EA’s, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA algorithm attached to QAOA can perform on par with the traditional QAOA with a Cobyla optimizer, which is a linear based method, and in some instances, it can even create a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, due to either speedups or quality of the solution, this initial result is promising and show the potential of EAs in this field. Further tests need to be performed on an array of different graphs with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 33
24989 Heuristic Search Algorithm (HSA) for Enhancing the Lifetime of Wireless Sensor Networks

Authors: Tripatjot S. Panag, J. S. Dhillon

Abstract:

The lifetime of a wireless sensor network can be effectively increased by using scheduling operations. Once the sensors are randomly deployed, the task at hand is to find the largest number of disjoint sets of sensors such that every sensor set provides complete coverage of the target area. At any instant, only one of these disjoint sets is switched on, while all others are switched off. This paper proposes a heuristic search method to find the maximum number of disjoint sets that completely cover the region. A population of randomly initialized members is made to explore the solution space, and a set of heuristics is applied to guide the members to a possible solution in their neighborhood. The heuristics accelerate the convergence of the algorithm. The best solution explored by the population is recorded and continuously updated. The proposed algorithm has been tested on applications that require sensing of multiple target points, referred to as point coverage applications. Results show that the proposed algorithm outclasses the existing algorithms: it always finds the optimum solution, and does so with fewer fitness function evaluations than the existing approaches.
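
For the point-coverage setting described above, a simple greedy baseline (not the paper's population-based heuristic search) shows the structure of the problem: repeatedly pull sensors out of the pool to form a complete cover of all targets, and count how many such disjoint covers can be formed. Field size, sensing range, and counts are illustrative assumptions.

```python
import random

random.seed(0)

# Toy point-coverage instance: targets and sensors on a 100x100 field, fixed sensing range.
targets = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(10)]
sensors = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(60)]
SENSING_RANGE = 30.0

def covered_targets(sensor):
    sx, sy = sensor
    return {i for i, (tx, ty) in enumerate(targets)
            if (sx - tx) ** 2 + (sy - ty) ** 2 <= SENSING_RANGE ** 2}

def greedy_cover(pool):
    """Greedily build one complete cover from the sensors still in the pool."""
    uncovered = set(range(len(targets)))
    chosen, candidates = [], set(pool)
    while uncovered:
        best = max(candidates, default=None,
                   key=lambda s: len(covered_targets(sensors[s]) & uncovered))
        if best is None or not covered_targets(sensors[best]) & uncovered:
            return None                       # cannot complete another cover
        chosen.append(best)
        uncovered -= covered_targets(sensors[best])
        candidates.remove(best)
    return chosen

def disjoint_covers(pool):
    pool, covers = set(pool), []
    while True:
        cover = greedy_cover(pool)
        if cover is None:
            return covers
        covers.append(cover)
        pool -= set(cover)                    # disjointness: each sensor serves in only one cover

covers = disjoint_covers(range(len(sensors)))
print(f"{len(covers)} disjoint covers found; only one is switched on at a time")
```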

Keywords: coverage, disjoint sets, heuristic, lifetime, scheduling, Wireless sensor networks, WSN

Procedia PDF Downloads 429
24988 The Use Management of the Knowledge Management and the Information Technologies in the Competitive Strategy of a Self-Propelling Industry

Authors: Guerrero Ramírez Sandra, Ramos Salinas Norma Maricela, Muriel Amezcua Vanesa

Abstract:

This article presents the beginning of a wider study that intends to demonstrate how, within organizations of the automotive industry in the city of Querétaro, knowledge management and technological management are required, together with people's initiative and the interaction embedded within the organization, in an appropriate environment that facilitates the conversion of information with a wide range of information technology management (ITM). A company was identified for the pilot study of this research, from which descriptive and inferential research information was obtained. The results of the pilot suggest that some respondents did not recognize the knowledge management topic, even though staff have access to information technology (IT) that serves to enhance access to knowledge (through the internet, email, databases, and external and internal company personnel, suppliers, customers, and competitors); this implies that there are knowledge management (KM) problems. The data show that even academically well-prepared organizations often do not recognize the importance of knowledge in the business, nor of its implementation, which in the end strongly influences how it is managed; this should guide the company towards greater insight in its search for a competitive strategy, given that the company has an excellent technological infrastructure and yet KM is not exploited. Cultural diversity is another factor that was observed among the staff.

Keywords: Knowledge Management (KM), Technological Knowledge Management (TKM), Technology Information Management (TI), access to knowledge

Procedia PDF Downloads 477
24987 Energy Efficient Clustering with Adaptive Particle Swarm Optimization

Authors: KumarShashvat, ArshpreetKaur, RajeshKumar, Raman Chadha

Abstract:

Wireless sensor networks have the principal characteristic of restricted energy, with the limitation that the energy of the nodes cannot be replenished. To increase the lifetime in this scenario, the WSN route for data transmission is chosen such that the energy used along the selected route is minimal. Such an energy-efficient network needs a sound infrastructure, because the infrastructure affects the network lifespan. Clustering is a technique in which nodes are grouped into disjoint, non-overlapping sets; in this technique, data is collected at the cluster head. In this paper, an Adaptive PSO algorithm is proposed that forms energy-aware clusters by minimizing the cost of locating the cluster head. The main concern is the suitability of the swarms, addressed by adjusting the learning parameters of PSO. Particle Swarm Optimization converges quickly at the beginning of the search, but over time it becomes stable and may be trapped in local optima. In the suggested network model, swarms are given the intelligence of spiders, which makes them capable of avoiding premature convergence and helps them escape from local optima. A comparative analysis with traditional PSO shows that the new algorithm considerably enhances performance when multi-dimensional functions are taken into consideration.

Keywords: Particle Swarm Optimization, adaptive – PSO, comparison between PSO and A-PSO, energy efficient clustering

Procedia PDF Downloads 224
24986 Energy Saving Study of Mass Rapid Transit by Optimal Train Coasting Operation

Authors: Artiya Sopharak, Tosaphol Ratniyomchai, Thanatchai Kulworawanichpong

Abstract:

This paper presents an energy-saving study of Mass Rapid Transit (MRT) using an optimal train coasting operation. Dynamic train movement with four modes of operation is considered in this study: accelerating, constant speed or cruising, coasting, and braking. The acceleration rate, the deceleration rate, and the coasting starting point are taken into account to determine the optimal train speed profile during coasting mode, considering the energy saving and an acceptable travel time in comparison to the base case with no coasting operation. In this study, a mathematical method, the Quadratic Search Method (QDS), is used to carry out the optimization. A single MRT train service between two stations, with a distance of 2 km and a maximum speed of 80 km/h, is taken as the case study. Regarding coasting mode operation, the results show that the longer the coasting distance, the less energy is consumed in cruising mode and the less braking energy is required. Conversely, the shorter the coasting distance, the more energy is consumed in cruising mode and the more braking energy is required.
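
The quadratic search idea, fit a parabola through three trial coasting points and jump to its minimum, can be sketched as below. The energy function is a made-up convex surrogate, not the paper's train-dynamics simulation, and the tolerances are illustrative.

```python
def energy(coast_start_km):
    """Hypothetical surrogate: total traction + braking energy (kWh) as a function of
    where coasting begins on a 2 km inter-station run (purely illustrative)."""
    return 4.0 + 3.0 * (coast_start_km - 1.35) ** 2

def quadratic_search(f, x0, x1, x2, tol=1e-4, max_iter=50):
    """Successive parabolic interpolation: fit a parabola through the three best points
    and move to its vertex until the step size falls below `tol`."""
    pts = sorted([(x0, f(x0)), (x1, f(x1)), (x2, f(x2))], key=lambda p: p[1])
    for _ in range(max_iter):
        (xa, fa), (xb, fb), (xc, fc) = pts
        num = (xa - xb) ** 2 * (fa - fc) - (xa - xc) ** 2 * (fa - fb)
        den = (xa - xb) * (fa - fc) - (xa - xc) * (fa - fb)
        if den == 0:
            break
        x_new = xa - 0.5 * num / den          # vertex of the interpolating parabola
        if abs(x_new - xa) < tol:
            return x_new
        pts = sorted(pts[:2] + [(x_new, f(x_new))], key=lambda p: p[1])
    return pts[0][0]

best = quadratic_search(energy, 0.5, 1.0, 1.8)
print(f"optimal coasting start = {best:.3f} km, energy = {energy(best):.3f} kWh")
```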

Keywords: energy saving, coasting mode, mass rapid transit, quadratic search method

Procedia PDF Downloads 272
24985 Control the Flow of Big Data

Authors: Shizra Waris, Saleem Akhtar

Abstract:

Big data is a research area receiving attention from academia and the IT community. In the digital world, the amounts of data produced and stored have increased rapidly within a short period of time; consequently, this fast-increasing rate of data has created many challenges. In this paper, we use the functionalism and structuralism paradigms to analyze the genesis of big data applications and their current trends. This paper presents a complete discussion of state-of-the-art big data technologies based on group and stream data processing, and the strengths and weaknesses of these technologies are analyzed. The study also covers big data analytics techniques, processing methods, some reported case studies from different vendors, several open research challenges, and the opportunities brought about by big data. The similarities and differences of these techniques and technologies, based on their important limitations, are also investigated. Emerging technologies are suggested as a solution for big data problems.

Keywords: computer, it community, industry, big data

Procedia PDF Downloads 162
24984 Approaches of Flight Level Selection for an Unmanned Aerial Vehicle Round-Trip in Order to Reach Best Range Using Changes in Flight Level Winds

Authors: Dmitry Fedoseyev

Abstract:

The ultimate success of unmanned aerial vehicles (UAVs) depends largely on the effective control of their flight, especially in variable wind conditions. This paper investigates different approaches to selecting the optimal flight level to maximize the range of UAVs. We propose to consider methods based on mathematical models of atmospheric conditions, as well as the use of sensor data and machine learning algorithms to automatically optimize the flight level in real-time. The proposed approaches promise to improve the efficiency and range of UAVs in various wind conditions, which may have significant implications for the application of these systems in various fields, including geodesy, environmental surveillance, and search and rescue operations.
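
One of the simplest mathematical-model approaches mentioned above can be written down directly: for a round trip of one-way length D at true airspeed V, a level with along-track wind component w costs D/(V+w) + D/(V-w) of flight time, so the best level minimizes that sum (for a pure round trip, any along-track wind lengthens the total, so the calmest level wins). The wind table and aircraft numbers below are invented for illustration.

```python
# Hypothetical along-track wind component (m/s, positive = tailwind outbound) per flight level.
winds_by_level_m = {100: 2.0, 300: 6.0, 500: 11.0, 700: 9.0}

TRUE_AIRSPEED = 18.0     # m/s, assumed cruise speed of the UAV
LEG_DISTANCE = 20_000.0  # m, one-way distance of the round trip

def round_trip_time(wind, v=TRUE_AIRSPEED, d=LEG_DISTANCE):
    """Outbound with the wind, return against it; infeasible if the headwind exceeds airspeed."""
    if abs(wind) >= v:
        return float("inf")
    return d / (v + wind) + d / (v - wind)

best_level = min(winds_by_level_m, key=lambda lvl: round_trip_time(winds_by_level_m[lvl]))
print(f"best flight level: {best_level} m "
      f"(round-trip time {round_trip_time(winds_by_level_m[best_level]) / 60:.1f} min)")
```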

Keywords: drone, UAV, flight trajectory, wind-searching, efficiency

Procedia PDF Downloads 18
24983 Cellulose Acetate/Polyacrylic Acid Filled with Nano-Hydroxapatite Composites: Spectroscopic Studies and Search for Biomedical Applications

Authors: E. M. AbdelRazek, G. S. ElBahy, M. A. Allam, A. M. Abdelghany, A. M. Hezma

Abstract:

Polymeric biocomposites of hydroxyapatite/polyacrylic acid were prepared, and their thermal and mechanical properties were improved by the addition of cellulose acetate. FTIR spectroscopy and X-ray diffraction analysis were employed to examine the physical and chemical characteristics of the biocomposites. Two organic/inorganic composite weight ratios (60/40 and 70/30), at which the material crystallinity reaches a value appropriate for the intended applications, were studied; scanning electron microscopy revealed that the HAp nanoparticles are uniformly distributed throughout the polymeric matrix. Kinetic parameters were determined from the weight loss data using non-isothermal thermogravimetric analysis (TGA), and the main degradation steps were described and discussed. The mechanical properties of the composites were evaluated by measuring tensile strength and elastic modulus. The data indicate that the addition of cellulose acetate can make the homogeneous composite scaffolds significantly more resistant to higher stress. The elastic modulus of the composites was also improved by the addition of cellulose acetate, making them more appropriate for bio-applications.

Keywords: biocomposite, chemical synthesis, infrared spectroscopy, mechanical properties

Procedia PDF Downloads 435
24982 Building an Ontology for Researchers: An Application of Topic Maps and Social Information

Authors: Yu Hung Chiang, Hei Chia Wang

Abstract:

In academia, it is important for researchers to find a proper research domain. Many researchers refer to conference issues to find interesting or new topics. Furthermore, conference issues can help researchers recognize current research trends in their field and learn about cutting-edge developments in their specialty. However, conference information published online may be widely distributed and is not easy to consolidate. Many researchers use the search engines of journals or conference proceedings to filter information in order to get what they want, but such search engines have their limitations; for instance, researchers cannot find the associated topics that may be useful to them. Hence, knowledge management (KM) could be a way to resolve these issues. In KM, ontologies are widely adopted, but most existing ontology construction methods do not consider the social information between target users. To be effective in academic KM, this study proposes a method of constructing research Topic Maps using the Open Directory Project (ODP) and Social Information Processing (SIP). By capturing social information from conference websites, i.e., co-authorship or collaborator information, research topics can be associated among related researchers. Finally, the experiments show that the Topic Maps successfully help researchers find the information they need more easily and quickly, as well as construct associations between research topics.

Keywords: knowledge management, topic map, social information processing, ontology extraction

Procedia PDF Downloads 272