Search results for: successful learning
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8981

3881 Exploring the Role of Humorous Dialogues in Advertisements of Pakistani Network Companies: Analysis of Discourses through Multi-Modal Critical Approach

Authors: Jane E. Alam Solangi

Abstract:

This study explores the role of humorous dialogues in the advertisements of cellular network companies in Pakistan, which employ a broad range of techniques to promote their products and induce consumers to buy them. Analysis of the collected data shows that advertisers use humorous dialogues as a significant device to a great extent: entertainment in the advertisement is accompanied by texts and humorous discourses that influence consumers’ buying decisions and tend to neutralize personal and social values. Earlier scholarship has shown that the skilful deployment of humorous devices leads to successful marketing of the relevant products. To analyze the humorous discourse devices, the multi-modal approach of Fairclough (1989) is used, accompanied by Kress and van Leeuwen’s (1996) framework of visual grammar. The overall findings verify the role of humorous devices in shaping consumers’ decisions to buy the products that interest them; humor thus acts as a breaker of the monotonous rhythm of advertisements.

Keywords: advertisements, devices, humorous, multi-modality, networks, Pakistan

Procedia PDF Downloads 85
3880 Experiences of Timing Analysis of Parallel Embedded Software

Authors: Muhammad Waqar Aziz, Syed Abdul Baqi Shah

Abstract:

The execution time analysis is fundamental to the successful design and execution of real-time embedded software. In such analysis, the Worst-Case Execution Time (WCET) of a program is a key measure, on the basis of which system tasks are scheduled. The WCET analysis of embedded software is also needed for system understanding and to guarantee its behavior. WCET analysis can be performed statically (without executing the program) or dynamically (through measurement). Traditionally, research on WCET analysis assumes sequential code running on single-core platforms. However, as computation is steadily moving towards a combination of parallel programs and multi-core hardware, new challenges in WCET analysis need to be addressed. In this article, we report our experiences of performing the WCET analysis of Parallel Embedded Software (PES) running on a multi-core platform. The primary purpose was to investigate how WCET estimates of PES can be computed statically, and how they can be derived dynamically. Our experiences, as reported in this article, include the challenges we faced, possible solutions to these challenges and the workarounds that were developed. This article also provides observations on the benefits and drawbacks of deriving the WCET estimates using the said methods and provides useful recommendations for further research in this area.
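As a rough illustration of the measurement-based (dynamic) approach that the abstract contrasts with static analysis, the sketch below times a toy task over sample inputs and keeps the largest observed execution time, inflated by a safety margin; the task, inputs, and margin are hypothetical illustrations, not the authors' actual experimental setup:

```python
import random
import time

def measured_wcet(task, inputs, runs_per_input=50, margin=1.2):
    """Measurement-based WCET estimate: execute the task repeatedly
    over representative inputs and keep the largest observed time.
    Observed maxima underestimate the true WCET, so the result is
    inflated by a safety margin."""
    worst = 0.0
    for x in inputs:
        for _ in range(runs_per_input):
            start = time.perf_counter()
            task(x)
            worst = max(worst, time.perf_counter() - start)
    return worst * margin

# Toy task whose running time depends on its input.
def task(n):
    return sum(i * i for i in range(n))

estimate = measured_wcet(task, inputs=[random.randint(100, 1000) for _ in range(20)])
print(estimate > 0.0)
```

A static analyzer would instead bound the loop counts and model the hardware; the measurement approach above is simpler but, as the abstract notes, gives estimates rather than guarantees.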

Keywords: embedded software, worst-case execution-time analysis, static flow analysis, measurement-based analysis, parallel computing

Procedia PDF Downloads 309
3879 Enhancing the Rollability of Cu-Ge-Ni Alloy through Heat Treatment Methods

Authors: Morteza Hadi

Abstract:

This research investigates the potential enhancement of the rollability of Cu-Ge-Ni alloy through the mitigation of microstructural and compositional inhomogeneities via two distinct heat treatment methods: homogenization and solution treatment. To achieve this objective, the alloy with the desired composition was fabricated using a vacuum arc remelting (VAR) furnace, followed by sample preparation for microstructural, compositional, and heat treatment analyses at varying temperatures and durations. Characterization was conducted employing optical and scanning electron microscopy (SEM), X-ray diffraction (XRD), and Vickers hardness testing. The results indicate that a minimum duration of 10 hours is necessary for adequate homogenization of the alloy at 750°C. This heat treatment effectively removes coarse dendrites from the casting microstructure and significantly reduces elemental segregation. However, despite these improvements, the presence of a second phase with markedly different hardness from the matrix results in poor rolling ability. The optimal time for solution treatment at various temperatures was determined, with the most effective cycle identified as 750°C for 2 hours followed by rapid quenching in water. This process induces the formation of a single-phase microstructure and complete elimination of the second phase, as confirmed by X-ray diffraction analysis. The results demonstrate a reduction in hardness of 30 Vickers, and the elimination of microstructural unevenness enables successful thickness reduction of up to 50% through rolling without cracking.

Keywords: Cu-Ge-Ni alloy, homogenization, solution treatment, rollability

Procedia PDF Downloads 35
3878 Prototyping the Problem Oriented Medical Record for Connected Health Based on TypeGraphQL

Authors: Sabah Mohammed, Jinan Fiaidhi, Darien Sawyer

Abstract:

Data integration of health through connected services can save lives in the event of a medical emergency and provide efficient and effective interventions for the benefit of patients through the integration of bedside and benchside clinical research. Such integration will support all the winds of change in healthcare by being predictive, pre-emptive, personalized, problem-oriented and participatory. Prototyping a healthcare system that enables data integration has long been a big challenge for healthcare. However, an innovative solution has started to emerge by focusing on problem lists, where everything connects to the problem list, forming a growing graph. This notion was introduced by Dr. Lawrence Weed in the early 1970s, but the enabling technologies were not mature enough to support a successful implementation prototype. In this article, we describe our efforts in prototyping Dr. Lawrence Weed's problem-oriented medical record (POMR) and his patient case schema (SOAP) to shape a prototype for connected health. For this, we use the TypeGraphQL API and our enterprise-based QL4POMR to describe a web-based gateway for healthcare services connectivity. Our prototype has successfully connected to the HL7 FHIR medical record and the OpenTarget biomedical repositories.

Keywords: connected health, problem-oriented healthcare record, SOAP, QL4POMR, TypeGraphQL

Procedia PDF Downloads 81
3877 The Determinant Factors of Technology Adoption for Improving Firm’s Performance: Toward a Conceptual Model

Authors: Zainal Arifin, Avanti Fontana

Abstract:

Considering that the TOE framework is the most useful instrument for studying technology adoption in a firm context, this paper analyzes the influence of technological, organizational and environmental (TOE) factors on the dynamic capabilities (DCs) associated with a technology adoption strategy for improving firm performance. By focusing on the determinant factors of technology adoption at the firm level, the study contributes to the broader literature on the resource-based view (RBV) and dynamic capabilities. No existing study connects the TOE factors directly to DCs; this paper proposes technology adoption as a functional competence/capability that mediates the relationship between these factors and firm performance. The study presents a conceptual model of the indirect effects of DCs at the firm level, which can be key predictors of firm performance in a dynamic business environment. The results of this research are most relevant to top corporate executives (boards of directors) or top management teams (TMTs) who seek to provide supporting ‘hardware’ content and conditions, such as technological, organizational and environmental factors, and to improve the firm's ‘software’ abilities, such as adaptive, absorptive and innovative capability, in order to achieve successful technology adoption in the organization. Two mediating factors are also elaborated in this paper: timing and external networks. Further research to establish the empirical results is highly recommended.

Keywords: technology adoption, TOE framework, dynamic capability, resource-based view

Procedia PDF Downloads 315
3876 Floating Quantifiers in Hijazi Arabic

Authors: Tagreed Alzahrani

Abstract:

The syntax of quantifiers has received much attention from linguists, philosophers and logicians within different frameworks and in various languages. However, the syntax of Arabic quantifiers has received limited attention in the literature, especially in relation to floating quantifiers. There have been a few discussions of floating quantifiers in Modern Standard Arabic (henceforth MSA), but analyses of their counterparts in other Saudi dialects are rare. Therefore, the aim of the paper is to provide a clear description of floating quantifiers (FQs) in the Hijazi dialect (henceforth HA) by utilising two approaches: the adverbial approach and the derivational (stranding) analysis. Linguists have long tried to explain the floating-quantifier phenomenon, as exemplified in the following sentences: 1. All the friends have watched the movie. 2. The friends have all watched the movie. The adverbial approach assumes that the floating quantifier is a type of adverb, because it occupies the adverbial position next to the verb. Thus, the subject in the first example is all the friends, and the subject in the second example is the friends, with all becoming an adverb, as it is located in an adverbial position. In the stranding analysis, by contrast, it is argued that the floating quantifier becomes stranded when its complement moves to a higher position in the sentence [SPEC, TP]. Therefore, both sentences have the same subject, all the friends, although in the second example the friends has moved to a higher position and stranded the quantifier all. The paper investigates floating quantifiers in HA using both approaches. The analysis shows that neither view is entirely successful in providing a unified account of FQs in HA.

Keywords: floating quantifier, adverbial analysis, stranding approach, universal quantifier

Procedia PDF Downloads 339
3875 The Role of Knowledge Management in Global Software Engineering

Authors: Samina Khalid, Tehmina Khalil, Smeea Arshad

Abstract:

Knowledge management is an essential ingredient of successful coordination in globally distributed software engineering. Various frameworks, knowledge management systems (KMSs), and tools have been proposed to foster coordination and communication between virtual teams, but practical implementations of these solutions are rarely found, as organizations face challenges in implementing a knowledge management system. For this purpose, a literature review was first conducted to investigate the challenges that prevent organizations from implementing a KMS; taking these challenges into account, the need for an integrated solution was traced: a standardized KMS that can easily store tacit and explicit knowledge and thereby facilitate coordination and collaboration among virtual teams. The literature review has already shown that knowledge is a complex perception with profound meanings, and one of the most important resources contributing to the competitive advantage of an organization. To meet the challenges caused by improperly managed project knowledge among virtual teams in global software engineering (GSE), we suggest making use of the cloud computing model. In this research, a distributed architecture to support KM storage is proposed: a conceptual framework of KM as a service in the cloud. The presented framework is enhanced, and the conceptual framework of KM is embedded into it to store project-related knowledge for future use.

Keywords: management, global software development, global software engineering

Procedia PDF Downloads 510
3874 Communal Shipping Container Home Design for Reducing Homelessness in Glasgow

Authors: Matthew Brady

Abstract:

Lack of affordable housing for individuals has the potential to create gaps in society, which result in thousands of people facing homelessness every year in some of the world's most affluent cities. This paper examines strategies for providing a more economical living environment for single occupants, focusing on comparisons of successful examples of reducing homeless populations around the world, with an emphasis on social inclusion and community living. It practically explores the architectural considerations of ensuring a suitable living environment for multiple single-occupancy residents, as well as selecting appropriate materials to keep costs at a manageable level for investment from local governments. The aim of this paper is to make practical recommendations for low-cost communal living space, with particular reference to recycled shipping-container homes on an unused site on the River Clyde in Glasgow. Ideally, the suggestions and recommendations put forward in this paper can be replicated or used for reference in similar situations. The proposal is sensitive to people's standard of living; adapting homes to match may be one solution to reducing the number of people evicted from unaffordable homes as the generally upward global trend of urbanization continues.

Keywords: affordable housing, community living, shipping container, urban regeneration

Procedia PDF Downloads 164
3873 Real-Time Recognition of Dynamic Hand Postures on a Neuromorphic System

Authors: Qian Liu, Steve Furber

Abstract:

To explore how the brain may recognize objects in its general, accurate and energy-efficient manner, this paper proposes the use of a neuromorphic hardware system formed from a Dynamic Vision Sensor (DVS) silicon retina in concert with the SpiNNaker real-time Spiking Neural Network (SNN) simulator. As a first step in the exploration of this platform, a recognition system for dynamic hand postures is developed, enabling the study of the methods used in the visual pathways of the brain. Inspired by the behaviour of the primary visual cortex, Convolutional Neural Networks (CNNs) are modeled using both linear perceptrons and spiking Leaky Integrate-and-Fire (LIF) neurons. In this study's largest configuration using these approaches, a network of 74,210 neurons and 15,216,512 synapses is created and operated in real-time using 290 SpiNNaker processor cores in parallel, with 93.0% accuracy. A smaller network using only 1/10th of the resources is also created, again operating in real-time, and it is able to recognize the postures with an accuracy of around 86.4%, only 6.6% lower than the much larger system. The recognition rate of the smaller network developed on this neuromorphic system is sufficient for a successful hand posture recognition system, and demonstrates a much-improved cost-to-performance trade-off in its approach.
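The spiking LIF neurons mentioned above follow the standard leaky integrate-and-fire dynamic; the sketch below is a generic textbook version with made-up parameters, not the SpiNNaker model configuration used in the study:

```python
def lif_spike_train(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                    v_reset=0.0, v_thresh=1.0, r_m=1.0):
    """Simulate a leaky integrate-and-fire neuron with Euler integration.
    Returns the list of time steps at which the neuron spiked."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates the input.
        v += (-(v - v_rest) + r_m * i_in) * dt / tau
        if v >= v_thresh:      # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_reset        # reset after the spike
    return spikes

# A constant supra-threshold current produces a regular spike train;
# a sub-threshold current produces none.
spikes = lif_spike_train([1.5] * 200)
print(len(spikes) > 0)
```

In a spiking CNN such as the one described, each convolutional unit would be a neuron of this kind driven by weighted spike input rather than a constant current.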

Keywords: spiking neural network (SNN), convolutional neural network (CNN), posture recognition, neuromorphic system

Procedia PDF Downloads 450
3872 An Innovation and Development System for a New Hybrid Composite Technology in Aerospace Industry

Authors: M. Fette, J. P. Wulfsberg, A. Herrmann, R. H. Ladstaetter

Abstract:

Present and future lightweight design represents an important key to the successful implementation of energy-saving, fuel-efficient and environmentally friendly means of transport in the aerospace and automotive industry. In this context, the use of carbon fibre reinforced plastics (CFRP), which are distinguished by their outstanding mechanical properties at relatively low weight, promises significant improvements. Due to the reduction of the total mass, with the resulting lower fuel or energy consumption and CO2 emissions during the operational phase, commercial aircraft and future vehicles will increasingly be made of CFRP. An auspicious technology for the efficient and economic production of high-performance thermoset composites and hybrid structures for future lightweight applications is the combination of carbon fibre sheet moulding compound (SMC), tailored continuous carbon fibre reinforcements and metallic components in a one-shot pressing and curing process. This paper deals with a new hybrid composite technology for the aerospace industry, which was developed with the help of a universal innovation and development system. This system supports the management of idea generation, the methodical development of innovative technologies and the achievement of industrial readiness of these technologies.

Keywords: development system, hybrid composite, innovation system, prepreg, sheet moulding compound

Procedia PDF Downloads 320
3871 Heroin and Opiates Metabolites Tracing by Gas-Chromatography Isotope Ratio Mass Spectrometry

Authors: Yao-Te Yen, Chao-Hsin Cheng, Meng-Shun Huang, Shan-Zong Cyue

Abstract:

The 'poppy-seed defense' has been a serious problem all over the world because it is difficult to determine precisely where the opiate metabolites in urine come from. In this research, a powerful analytical method has been developed to trace opiate metabolites in urine by gas chromatography isotope ratio mass spectrometry (GC-IRMS). In order to eliminate the interference of heroin synthesis or of metabolism through the human body, the opiate metabolites in urine and the seized heroin were both hydrolyzed to morphine. Morphine is the key compound for tracing between opiate metabolites and seized heroin in this research. By matching δ13C and δ15N values of morphine, it is possible to distinguish whether the opiate metabolites come from heroin or from medicine. We tested seven heroin abusers' metabolites against heroin seized at crime sites; the results showed that for opiate metabolites coming from seized heroin, the variations of δ13C and δ15N for morphine are within 0.2‰ and 2.5‰, respectively. These variations are consistent with the results of matrix-match experiments. Above all, the uncertainty of the 'poppy-seed defense' can be resolved easily by this analytical method; it provides direct evidence for judges to reach accurate convictions without hesitation.
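The matching criterion described, δ13C within 0.2‰ and δ15N within 2.5‰, amounts to a simple tolerance check on the two isotope ratios; in the sketch below the δ values and the helper name are invented for illustration:

```python
def matches_seized_sample(metabolite, seized, tol_d13c=0.2, tol_d15n=2.5):
    """Compare (δ13C, δ15N) values in ‰ of morphine derived from a urine
    metabolite against morphine from a seized heroin sample.
    Tolerances follow the variations reported in the abstract
    (0.2‰ for δ13C, 2.5‰ for δ15N); both must be satisfied."""
    d13c_met, d15n_met = metabolite
    d13c_sei, d15n_sei = seized
    return (abs(d13c_met - d13c_sei) <= tol_d13c and
            abs(d15n_met - d15n_sei) <= tol_d15n)

# Hypothetical values: a metabolite consistent with the seized heroin...
print(matches_seized_sample((-25.31, 2.1), (-25.20, 3.4)))   # True
# ...and one whose carbon signature is too far off.
print(matches_seized_sample((-24.10, 2.1), (-25.20, 3.4)))   # False
```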

Keywords: poppy-seed defense, heroin, opiates metabolites, isotope ratio mass spectrometry

Procedia PDF Downloads 225
3870 Direct Visualization of Shear Induced Structures in Wormlike Micellar Solutions by Microfluidics and Advanced Microscopy

Authors: Carla Caiazza, Valentina Preziosi, Giovanna Tomaiuolo, Denis O'Sullivan, Vincenzo Guida, Stefano Guido

Abstract:

In the last decades, wormlike micellar solutions have been extensively used to tune the rheological behavior of home care and personal care products. This and other successful applications underlie the growing attention that both basic and applied research are devoting to these systems, and to their unique rheological and flow properties. One of the key research topics is the occurrence of flow instabilities at high shear rates (such as shear banding), with the possible appearance of flow-induced structures. In this scenario, microfluidics is a powerful tool to get a deeper insight into the flow behavior of a wormlike micellar solution, as the high confinement of a microfluidic device facilitates the onset of the flow instabilities; furthermore, thanks to its small dimensions, it can be coupled with optical microscopy, allowing a direct visualization of flow structuring phenomena. Here, the flow of a widely used wormlike micellar solution through a glass capillary has been studied by coupling the microfluidic device with μPIV techniques. The direct visualization of flow-induced structures and the flow visualization analysis highlight a relationship between solution structuring and the onset of discontinuities in the velocity profile.

Keywords: flow instabilities, flow-induced structures, μPIV, wormlike micelles

Procedia PDF Downloads 333
3869 An Evaluation of a First Year Introductory Statistics Course at a University in Jamaica

Authors: Ayesha M. Facey

Abstract:

The evaluation sought to determine the factors associated with the high failure rate among students taking a first-year introductory statistics course. Utilizing Tyler's objective-based model, the main objectives were: to assess the effectiveness of the lecturer's teaching strategies; to determine the proportion of students who attend lectures and tutorials frequently and the impact of infrequent attendance on performance; to determine how the assigned activities assisted students' understanding of the course content; to ascertain the issues faced by students in understanding the course material and obtain possible solutions to these challenges; and to determine whether the learning outcomes had been achieved, based on an assessment of the second in-course examination. A quantitative survey research strategy was employed, and the study population was students enrolled in semester one of the academic year 2015/2016. A convenience sampling approach yielded a sample of 98 students. Primary data were collected using self-administered questionnaires over a one-week period. Secondary data were obtained from the results of the second in-course examination. Data were entered and analyzed in SPSS version 22, and both univariate and bivariate analyses were conducted on the information obtained from the questionnaires. Univariate analyses provided a description of the sample through means, standard deviations and percentages, while bivariate analyses were done using Spearman's rho correlation coefficient and chi-square tests. For the secondary data, an item analysis was performed to obtain the reliability of the examination questions, the difficulty index and the discrimination index. The examination results also provided information on the weak areas of the students and highlighted the learning outcomes that were not achieved.
Findings revealed that students were more likely to participate in lectures than tutorials and that attendance was high for both. There was a significant relationship between participation in lectures and performance on the examination. However, a high proportion of students had been absent from three or more tutorials as well as lectures. A higher proportion of students indicated that they sometimes completed the assignments obtained from lectures, while they rarely completed tutorial worksheets. Students who were more likely to complete their assignments were significantly more likely to perform well on the examination. Additionally, students faced a number of challenges in understanding the course content; probability, the binomial distribution and the normal distribution were the most challenging topics, and the item analysis also highlighted these as problem areas. Difficulties with mathematics and with application and analysis were the major challenges faced by students, and most indicated that these could be alleviated if additional examples were worked in lectures and more time were given to solve questions. Analysis of the examination results showed that a number of learning outcomes were not achieved for several topics. Based on the findings, recommendations were made to adjust grade allocations, the delivery of lectures and the methods of assessment.
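The item analysis mentioned above conventionally reports a difficulty index (the proportion of students answering an item correctly) and a discrimination index (the difference in that proportion between the strongest and weakest students). The following is a generic sketch of those classical formulas on toy data, not the study's actual computation:

```python
def item_analysis(responses):
    """Classical item analysis for a 0/1-scored exam matrix.
    responses: list of per-student lists of item scores (1 = correct).
    Returns (difficulty, discrimination) per item: difficulty is the
    proportion answering correctly; discrimination is the difference in
    that proportion between the top and bottom 27% of students ranked
    by total score."""
    n_items = len(responses[0])
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, round(0.27 * len(ranked)))
    top, bottom = ranked[:k], ranked[-k:]
    difficulty, discrimination = [], []
    for j in range(n_items):
        difficulty.append(sum(s[j] for s in responses) / len(responses))
        p_top = sum(s[j] for s in top) / k
        p_bot = sum(s[j] for s in bottom) / k
        discrimination.append(p_top - p_bot)
    return difficulty, discrimination

# Toy data: item 0 is easy for everyone; item 2 separates strong
# students from weak ones.
scores = [[1, 1, 1], [1, 1, 1], [1, 0, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
diff, disc = item_analysis(scores)
print(diff[0], disc[2])
```

A low discrimination index on an otherwise reasonable item is what flags the "problem areas" the evaluation refers to.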

Keywords: evaluation, item analysis, Tyler’s objective based model, university statistics

Procedia PDF Downloads 178
3868 Findings in Vascular Catheter Cultures at the Laboratory of Microbiology of General Hospital during One Year

Authors: P. Christodoulou, M. Gerasimou, S. Mantzoukis, N. Varsamis, G. Kolliopoulou, N. Zotos

Abstract:

Purpose: The Intensive Care Unit (ICU) environment is conducive to the growth of microorganisms. A variety of microorganisms gain access to the intravascular area and are transported throughout the circulatory system. Therefore, examination of the catheters used in ICU patients is of paramount importance. Material and Method: The culture medium is a catheter tip, which is enriched with Tryptic soy broth (TSB). After one day of incubation, the broth is passaged onto the following selective media: blood, MacConkey No. 2, chocolate, Mueller-Hinton, Chapman, and Sabouraud agar. These selective media are incubated for 2 days. After this period, if any microbial colonies are detected, Gram staining is performed, and the microorganisms are identified by biochemical techniques in the automated MicroScan (Siemens) system, followed by a sensitivity test in the same system using the minimum inhibitory concentration (MIC) technique. The sensitivity test is verified by a Kirby-Bauer test. Results: In 2017, the Microbiology Laboratory received 84 catheters from the ICU, of which 42 were found positive. Of these, S. epidermidis was identified in 8, A. baumannii in 10, K. pneumoniae in 6, P. aeruginosa in 6, P. mirabilis in 3, S. simulans in 1, S. haemolyticus in 4, S. aureus in 3 and S. hominis in 1. Conclusions: The results show that the placement and maintenance of catheters in ICU patients are relatively successful, despite the unfavorable environment of the unit.

Keywords: culture, intensive care unit, microorganisms, vascular catheters

Procedia PDF Downloads 272
3867 Atomic Decomposition Audio Data Compression and Denoising Using Sparse Dictionary Feature Learning

Authors: T. Bryan, V. Kepuska, I. Kostnaic

Abstract:

A method of data compression and denoising is introduced that is based on atomic decomposition of audio data using “basis vectors” that are learned from the audio data itself. The basis vectors are shown to achieve higher data compression and better signal-to-noise enhancement than the Gabor and gammatone “seed atoms” that were used to generate them. The basis vectors are the input weights of a Sparse AutoEncoder (SAE) that is trained using “envelope samples” of windowed segments of the audio data. The envelope samples are extracted by identifying, via matching pursuit, segments of audio data that are locally coherent with the Gabor or gammatone seed atoms; each envelope sample is formed by taking the Kronecker product of the atomic envelope with the locally coherent data segment. Oracle signal-to-noise ratio (SNR) versus data compression curves are generated for the seed atoms as well as for the basis vectors learned from Gabor and gammatone seed atoms. SNR data compression curves are generated for speech signals as well as early American music recordings. The basis vectors are shown to have higher denoising capability for data compression rates ranging from 90% to 99.84% for speech as well as music. Envelope samples are displayed as images by folding the time series into column vectors; this display method is used to compare the output of the SAE with the envelope samples that produced it. The basis vectors are also displayed as images. Sparsity is shown to play an important role in producing the best denoising basis vectors.
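Matching pursuit, used above to find locally coherent segments, greedily projects the residual onto the dictionary atom with the largest inner product and subtracts that projection. The minimal sketch below uses a tiny orthonormal toy dictionary rather than actual Gabor or gammatone atoms:

```python
def matching_pursuit(signal, atoms, n_iter=10, tol=1e-6):
    """Greedy matching pursuit over a dictionary of unit-norm atoms:
    at each step pick the atom with the largest |inner product| with
    the residual, record (atom index, coefficient), and subtract the
    projection from the residual."""
    residual = list(signal)
    decomposition = []
    for _ in range(n_iter):
        # Inner products of the residual with every dictionary atom.
        scores = [sum(r * a for r, a in zip(residual, atom)) for atom in atoms]
        best = max(range(len(atoms)), key=lambda i: abs(scores[i]))
        coeff = scores[best]
        if abs(coeff) < tol:
            break
        residual = [r - coeff * a for r, a in zip(residual, atoms[best])]
        decomposition.append((best, coeff))
    return decomposition, residual

# Orthonormal toy dictionary: the signal is recovered exactly in two steps.
atoms = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
decomp, res = matching_pursuit([3.0, 0.0, -2.0], atoms)
print(decomp)  # [(0, 3.0), (2, -2.0)]
```

With overcomplete, correlated atoms (the realistic Gabor/gammatone case) the residual shrinks but rarely vanishes, and the selected (atom, coefficient) pairs are what the method above folds into envelope samples.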

Keywords: sparse dictionary learning, autoencoder, sparse autoencoder, basis vectors, atomic decomposition, envelope sampling, envelope samples, Gabor, gammatone, matching pursuit

Procedia PDF Downloads 237
3866 Nursing Experience of Helping the Mother of a Dying Baby by Applying Watson's Theory of Human Caring

Authors: Ya-Ping Chang

Abstract:

Starting from the early stages of pregnancy, parents begin to form hopes and dreams about the future of their child. They will think about the appearance and personality of their child and may even develop many expectations. The patient in this study experienced a successful pregnancy following multiple attempts at artificial insemination. Due to arrested embryonic development, and based on the physician’s evaluation, a caesarean section was performed at week 25; the baby, however, suffered from infections and subsequently died of multiple organ failure. This study collected and analyzed objective and subjective data through observation, interviews, recording, and interactions with the patient. The following nursing issues were identified: anxiety, anticipatory grief, and adjustment disorder. The psychology of caring as proposed in Watson’s theory was applied to address these nursing issues. Comprehensive and continuous care was provided to the patient on the basis of mutual trust and individual nursing guidelines in order to alleviate the patient’s anxiety, help her to cope with grief, and prepare her for the eventual death of her child. The author helped the patient to say goodbye to her child and accept the child’s death calmly, such that she had no regrets about the experience. This nursing experience may serve as a reference for nurses managing similar cases in the future.

Keywords: dying baby, mother, grief, Watson’s theory

Procedia PDF Downloads 154
3865 Neural Network Modelling for Turkey Railway Load Carrying Demand

Authors: Humeyra Bolakar Tosun

Abstract:

The transport sector has an undisputed place in human life: the need for transport access increases continuously day by day with a growing population. Factors such as the rail network, urban transport planning, infrastructure improvements, and transportation management make it quite necessary to improve transportation in our country, and in this context domestic rail freight demand planning plays an important role. In this study, rail freight net demand was modeled by multiple regression and artificial neural network (ANN) methods, using variables that are well known and used in prior studies: rail freight transport, railway line length, population, and energy consumption. The dependent variable (output) of the model is rail freight net demand, and six input variables were determined. Outcome values were extracted from the ANN model and the regression model. In the regression model, some parameters are considered determinative, and the coefficients of these determinants give meaningful results. As a result, the ANN model was shown to be more successful than the traditional regression model.

Keywords: railway load carrying, neural network, modelling transport, transportation

Procedia PDF Downloads 134
3864 Player Experience: A Research on Cross-Platform Supported Games

Authors: Salih Akkemik

Abstract:

User Experience has a characterized perspective based on two fundamentals: the usage process and the product. Digital games can be considered a special kind of interactive system whose very specific purpose is to make the player feel good while playing. At this point, Player Experience (PX) and User Experience (UX) are similar: UX focuses on making the user feel good, PX on making the player feel good. The most important difference between the two is the action taken: using versus playing. This study primarily examines player experience, which may differ across platforms. Nowadays, companies release their successful, high-revenue games with cross-platform support. Cross-platform is the common term for an application that can run on different operating systems, in other words, one developed to support different operating systems. In terms of digital games, cross-platform support means that a game can be played on a computer, console or mobile device; more specifically, the game is designed and programmed to be played in the same way on at least two different platforms, such as Windows, macOS, Linux, iOS, Android, Orbis OS or Xbox OS. Different platforms also accommodate different player groups, profiles and preferences. This study aims to examine these different player profiles in terms of player experience and to determine the effects of cross-platform support on player experience.

Keywords: cross-platform, digital games, player experience, user experience

Procedia PDF Downloads 192
3863 Juvenile Justice in Maryland: The Evidence Based Approach to Youth with History of Victimization and Trauma

Authors: Gabriela Wasileski, Debra L. Stanley

Abstract:

Maryland's efforts to decrease juvenile criminality and recidivism have shifted towards evidence-based sentencing. While in theory evidence-based sentencing reduces juvenile delinquency and drug abuse, the assessment of juveniles' risks and needs usually lacks crucial information about a juvenile's prior victimization. The Maryland Comprehensive Assessment and Service Planning (MCASP) Initiative is the primary tool for developing and delivering a treatment service plan for juveniles at risk. Even though it consists of evidence-based screening and assessment instruments, very little is currently known regarding the effectiveness and impact of the assessment in general. In keeping with Maryland's priority to develop successful evidence-based recidivism reduction programs, this study examined assessment results based on MCASP using a representative sample of juveniles at risk. Specifically, it examined: (1) the results of the assessments in an electronic database, (2) areas of need that are more frequent among delinquent youth in a system/agency, (3) the overall progress of youth in an agency's care, and (4) the impact of child victimization and trauma experiences reported in the assessment. The project will identify challenges regarding the use of MCASP in Maryland and will provide a knowledge base to support future research and practices.

Keywords: juvenile justice, assessment of risk and need, victimization and crime, recidivism

Procedia PDF Downloads 301
3862 Pavement Management for a Metropolitan Area: A Case Study of Montreal

Authors: Luis Amador Jimenez, Md. Shohel Amin

Abstract:

Pavement performance models are based on projections of observed traffic loads, which makes it uncertain to study funding strategies in the long run if history does not repeat itself. Neural networks can be used to estimate deterioration rates, but the learning rate and momentum have not been properly investigated; in addition, economic developments could change traffic flows. This study addresses both issues through a case study of the roads of Montreal that simulates traffic for a period of 50 years and deals with the measurement error of the pavement deterioration model. Travel demand models are applied to simulate annual average daily traffic (AADT) every 5 years. Accumulated equivalent single axle loads (ESALs) are calculated from the predicted AADT and locally observed truck distributions combined with truck factors. A backpropagation neural network (BPN) with a Generalized Delta Rule (GDR) learning algorithm is applied to estimate pavement deterioration models capable of overcoming measurement errors. Linear programming for lifecycle optimization is applied to identify M&R strategies that ensure good pavement condition while minimizing the budget. It was found that CAD 150 million is the minimum annual budget needed to keep arterial and local roads in Montreal in good condition. Montreal drivers prefer public transportation for work and education purposes. Vehicle traffic is expected to double within 50 years, and ESALs are expected to double every 15 years. Roads on the island of Montreal need to undergo a stabilization period of about 25 years, after which a steady state seems to be reached.
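The Generalized Delta Rule mentioned above is ordinary gradient descent augmented with a momentum term: delta_w(t) = -eta * gradient + alpha * delta_w(t-1), where eta is the learning rate and alpha the momentum. A minimal sketch on a toy one-weight error surface (not the pavement deterioration model itself) shows the roles of the two hyperparameters:

```python
def gdr_minimize(grad, w0, eta, alpha, steps):
    """Gradient descent with momentum (the Generalized Delta Rule)."""
    w, dw = w0, 0.0
    for _ in range(steps):
        dw = -eta * grad(w) + alpha * dw   # GDR weight update
        w += dw
    return w

# Toy error surface E(w) = (w - 3)^2 with gradient 2*(w - 3); minimum at w = 3.
grad = lambda w: 2.0 * (w - 3.0)

plain    = gdr_minimize(grad, w0=0.0, eta=0.05, alpha=0.0, steps=100)
momentum = gdr_minimize(grad, w0=0.0, eta=0.05, alpha=0.8, steps=100)
print(round(plain, 4), round(momentum, 4))
```

Both runs approach the minimum; on ill-conditioned, noisy surfaces such as those fitted from pavement condition data, the momentum term damps oscillation and speeds convergence, which is why the choice of eta and alpha matters.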

Keywords: pavement management system, traffic simulation, backpropagation neural network, performance modeling, measurement errors, linear programming, lifecycle optimization

Procedia PDF Downloads 446
3861 Hierarchical Clustering Algorithms in Data Mining

Authors: Z. Abdullah, A. R. Hamdan

Abstract:

Clustering is a process of grouping objects and data into clusters so that data objects in the same cluster are similar to each other. Clustering is one of the areas of data mining, and its algorithms can be classified into partitioning, hierarchical, density-based, and grid-based approaches. In this paper, we survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON, and BIRCH. The resulting state of the art of these algorithms will help in eliminating current problems, as well as in deriving more robust and scalable clustering algorithms.
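The bottom-up merge loop shared by these hierarchical algorithms can be sketched naively; CURE, ROCK, CHAMELEON, and BIRCH are far more scalable refinements of this basic idea. The data and single-linkage choice below are illustrative:

```python
def single_linkage(points, k):
    """Agglomerative clustering: merge the two closest clusters until k remain."""
    clusters = [[p] for p in points]           # start: every point is a cluster
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between the closest pair of members
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)          # merge the closest pair
    return [sorted(c) for c in clusters]

data = [1.0, 1.2, 1.1, 8.0, 8.3, 15.0]
print(single_linkage(data, k=3))
```

This naive loop is O(n^3); the surveyed algorithms exist precisely because such complexity does not scale to data-mining workloads.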

Keywords: clustering, unsupervised learning, algorithms, hierarchical

Procedia PDF Downloads 866
3860 Tip60 Histone Acetyltransferase Activators as Neuroepigenetic Therapeutic Modulators for Alzheimer’s Disease

Authors: Akanksha Bhatnagar, Sandhya Kortegare, Felice Elefant

Abstract:

Context: Alzheimer's disease (AD) is a neurodegenerative disorder characterized by progressive cognitive decline and memory loss. The cause of AD is not fully understood, but it is thought to arise from a combination of genetic, environmental, and lifestyle factors. One of the hallmarks of AD is the loss of neurons in the hippocampus, a brain region that is important for memory and learning. This loss of neurons is thought to be caused by a decrease in histone acetylation, a process that regulates gene expression. Research Aim: The aim of the study was to develop small-molecule compounds that can enhance the activity of Tip60, a histone acetyltransferase that is important for memory and learning. Methodology/Analysis: The researchers used in silico structural modeling and a pharmacophore-based virtual screening approach to design and synthesize small-molecule compounds strongly predicted to target and enhance Tip60's HAT activity. The compounds were then tested in vitro and in vivo to assess their ability to enhance Tip60 activity and rescue cognitive deficits in AD models. Findings: The researchers found that several of the compounds were able to enhance Tip60 activity and rescue cognitive deficits in AD models. The compounds were also developed to cross the blood-brain barrier, an important factor for the development of potential AD therapeutics. Theoretical Importance: The findings of this study suggest that Tip60 HAT activators have the potential to be developed as therapeutic agents for AD. The compounds are specific to Tip60, which suggests that they may have fewer side effects than other HDAC inhibitors. Additionally, the compounds are able to cross the blood-brain barrier, which is a major hurdle for the development of AD therapeutics. Data Collection: The study collected data from a variety of sources, including in vitro assays and animal models.
The in vitro assays assessed the ability of compounds to enhance Tip60 activity using histone acetyltransferase (HAT) enzyme assays and chromatin immunoprecipitation assays. Animal models were used to assess the ability of the compounds to rescue cognitive deficits in AD models using a variety of behavioral tests, including locomotor ability, sensory learning, and recognition tasks. Future human clinical trials will be used to assess the safety and efficacy of the compounds in humans. Questions: The question addressed by this study was whether Tip60 HAT activators could be developed as therapeutic agents for AD. Conclusions: The findings of this study suggest that Tip60 HAT activators have the potential to be developed as therapeutic agents for AD. The compounds are specific to Tip60, which suggests that they may have fewer side effects than other HDAC inhibitors. Additionally, the compounds are able to cross the blood-brain barrier, which is a major hurdle for the development of AD therapeutics. Further research is needed to confirm the safety and efficacy of these compounds in humans.

Keywords: Alzheimer's disease, cognition, neuroepigenetics, drug discovery

Procedia PDF Downloads 58
3859 Signal Processing of Barkhausen Noise Signal for Assessment of Increasing Down Feed in Surface Ground Components with Poor Micro-Magnetic Response

Authors: Tanmaya Kumar Dash, Tarun Karamshetty, Soumitra Paul

Abstract:

The Barkhausen Noise Analysis (BNA) technique has been utilized to assess the surface integrity of steels, but it is not very successful in evaluating the surface integrity of ground steels that exhibit a poor micro-magnetic response. A new approach has been proposed in which the BN signal is processed with fast Fourier transforms, while wavelet transforms, with a judicious choice of the threshold value, are used to remove noise from the BN signal when the micro-magnetic response of the work material is poor. In the present study, the effect of down feed in conventional plunge surface grinding of hardened bearing steel has been investigated, with an ultrasonically cleaned, wet-polished sample and a sample ground with the spark-out technique used for benchmarking. Moreover, the FFT analysis was carried out at different sets of applied voltages and applied frequencies, and the pattern of the BN signal in the frequency domain was analyzed. The study also applies the wavelet transform with different levels of decomposition and different mother wavelets to reduce the noise in BN signals of materials with a poor micro-magnetic response, in order to standardize the procedure for all BN signals depending on the frequency of the applied voltage.
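The wavelet thresholding step described above can be illustrated with the simplest possible case: one level of a Haar decomposition with soft thresholding of the detail coefficients. The signal and threshold value are invented for illustration; the study itself uses deeper decompositions and other mother wavelets (in practice via a library such as PyWavelets):

```python
import math

def haar_forward(x):
    """One-level Haar transform: pairwise (scaled) sums and differences."""
    approx = [(a + b) / math.sqrt(2) for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / math.sqrt(2) for a, b in zip(x[::2], x[1::2])]
    return approx, detail

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero; small (noise-like) details vanish."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def haar_inverse(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
    return out

signal = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]   # step signal + small noise
approx, detail = haar_forward(signal)
denoised = haar_inverse(approx, soft_threshold(detail, t=0.2))
print([round(v, 2) for v in denoised])
```

The choice of threshold t is exactly the "judicious choice" the abstract refers to: too low and noise survives, too high and genuine Barkhausen events are suppressed.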

Keywords: barkhausen noise analysis, grinding, magnetic properties, signal processing, micro-magnetic response

Procedia PDF Downloads 654
3858 A Review on Using Executive Function to Understand the Limited Efficacy of Weight-Loss Interventions

Authors: H. Soltani, Kevin Laugero

Abstract:

Obesity is becoming an increasingly critical issue in the United States due to the steady and substantial increase in prevalence over the last 30 years. Existing interventions have been able to help participants achieve short-term weight loss but have failed to show long-term results. The complex nature of behavioral change remains one of the most difficult barriers to promoting sustainable weight loss in overweight individuals. Research suggests that the 'intention-behavior gap' can be explained by a person's ability to regulate higher-order thinking, or Executive Function (EF). A review of 63 research articles was completed in the fall of 2017 to identify the role of EF in regulating eating behavior and to determine whether there is potential for improving dietary quality by enhancing EF. Results showed that poor EF is positively associated with obesogenic behavior, namely increased consumption of highly palatable foods, eating in the absence of hunger, high saturated fat intake, and low fruit and vegetable consumption. Recent research has indicated that interventions targeting an improvement in EF can be successful in helping promote healthy behaviors. Furthermore, interventions of longer duration have a more lasting and versatile effect on weight loss and maintenance. This may present an opportunity for the increasingly ubiquitous mobile application technology.

Keywords: eating behavior, executive function, nutrition, obesity, weight-loss

Procedia PDF Downloads 150
3857 Evaluation of Critical Success Factors in Public-Private Partnership Projects Based on Structural Equation Model

Authors: Medya Fathi

Abstract:

Today, success in the construction industry is not merely about completing a project in a timely manner, within an established budget, and meeting required quality considerations; management practices and partnerships need to be emphasized as well. In this regard, critical success factors (CSFs) cover the considerations necessary for a successful project beyond the traditional definition of success, and they vary depending on project outcomes, delivery methods, project types, and partnering processes. Despite the extensive research on CSFs, there is a paucity of studies that examine CSFs for public-private partnership (PPP), a delivery method that has gained increasing attention from researchers and practitioners over the last decade, with a slow but growing adoption in transportation infrastructure, particularly the highway industry. To fill this knowledge gap, data were collected through questionnaire surveys among private and public parties involved in PPP transportation projects in the United States. The collected data were then analyzed to explore the causal relationships between CSFs and PPP project success using a structural equation model and to provide a list of the factors with the greatest influence. This study advocates adopting a critical success factor approach to enhance PPP success in the U.S. transportation industry and identifies the elements essential for public and private organizations to achieve this success.

Keywords: project success, critical success factors, public-private partnership, transportation

Procedia PDF Downloads 79
3856 Governance, Risk Management, and Compliance Factors Influencing the Adoption of Cloud Computing in Australia

Authors: Tim Nedyalkov

Abstract:

A business decision to move to the cloud brings fundamental changes in how an organization develops and delivers its Information Technology solutions. The accelerated pace of digital transformation across businesses and government agencies increases the reliance on cloud-based services, and the collection, management, and retention of large amounts of data in cloud environments make information security and data privacy protection essential. It becomes even more important to understand what key factors drive successful cloud adoption following the commencement of the Privacy Amendment Notifiable Data Breaches (NDB) Act 2017 in Australia, as the regulatory changes impact many organizations and industries. This quantitative correlational research investigated the governance, risk management, and compliance factors contributing to cloud security success and influencing the adoption of cloud computing within an organizational context after the commencement of the NDB scheme. The results and findings demonstrated that corporate information security policies, data storage location, management understanding of data governance responsibilities, and regular compliance assessments are the factors influencing cloud computing adoption. The research has implications for organizations, future researchers, practitioners, policymakers, and cloud computing providers seeking to meet rapidly changing regulatory and compliance requirements.

Keywords: cloud compliance, cloud security, data governance, privacy protection

Procedia PDF Downloads 103
3855 Prioritizing the TQM Enablers and IT Resources in the ICT Industry: An AHP Approach

Authors: Suby Khanam, Faisal Talib, Jamshed Siddiqui

Abstract:

Total Quality Management (TQM) is a managerial approach that improves the competitiveness of an industry, while Information Technology (IT) has been introduced alongside TQM, with the support of quality experts, to handle the technical issues involved in fulfilling customers' requirements. The present paper uses the Analytic Hierarchy Process (AHP) methodology to prioritize and rank the hierarchy levels of TQM enablers and IT resources together for their successful implementation in the Information and Communication Technology (ICT) industry. A total of 17 TQM enablers (nine) and IT resources (eight) were identified, partitioned into three categories, and prioritized by the AHP approach. The findings indicate that the 17 sub-criteria can be grouped into three main categories: organizing, tools and techniques, and culture and people. Further, of the 17 sub-criteria, three (top management commitment and support, total employee involvement, and continuous improvement) received the highest priority, whereas three others (structural equation modelling, culture change, and customer satisfaction) received the lowest priority. The result suggests a hierarchy model for the ICT industry to prioritize its enablers and resources and to improve TQM and IT performance. The paper has managerial implications, suggesting that managers in the ICT industry implement TQM and IT together in their organizations to obtain maximum benefit from the available resources. Conclusions, limitations, and the future scope of the study are presented at the end.
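The core AHP prioritization step, deriving a priority vector from a pairwise comparison matrix, can be sketched with the geometric-mean method. The 3x3 matrix below is illustrative, not the paper's 17-criterion survey data:

```python
import math

def ahp_priorities(matrix):
    """Row geometric means of a pairwise comparison matrix, normalized to sum to 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical reciprocal comparison of three criteria on Saaty's 1-9 scale:
# e.g. criterion A is moderately preferred (3) over B and strongly (5) over C.
pairwise = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     3.0],
    [1 / 5.0, 1 / 3.0, 1.0],
]
weights = ahp_priorities(pairwise)
print([round(w, 3) for w in weights])   # priority of A, B, C
```

In a full AHP study these local weights are computed at every level of the hierarchy and aggregated, and a consistency ratio check guards against contradictory judgments.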

Keywords: analytic hierarchy process, information technology, information and communication technology, prioritization, total quality management

Procedia PDF Downloads 331
3854 Field-Programmable Gate Array Based Tester for Protective Relay

Authors: H. Bentarzi, A. Zitouni

Abstract:

The reliability of the power grid depends on the successful operation of thousands of protective relays. The failure of one relay to operate as intended may lead the entire power grid to blackout. In fact, major power system failures during transient disturbances may be caused by unnecessary protective relay tripping rather than by the failure of a relay to operate. Adequate relay testing provides a first defense against false trips and hence improves power grid stability and prevents catastrophic bulk power system failures. The goal of this research project is to design and enhance a relay tester using the Field-Programmable Gate Array (FPGA) card NI 7851. A PC-based tester framework has been developed using a Simulink power system model for generating signals under different conditions (faults or transient disturbances) and LabVIEW for developing the graphical user interface and configuring the FPGA. In addition, an interface system has been developed for outputting and amplifying the signals without distortion. These signals should resemble those generated by the real power system and be large enough to test the relay's functionality. The generated signals displayed on the scope are satisfactory. Furthermore, the proposed testing system can be used to improve the performance of protective relays.

Keywords: amplifier class D, field-programmable gate array (FPGA), protective relay, tester

Procedia PDF Downloads 198
3853 An Approach to Autonomous Drones Using Deep Reinforcement Learning and Object Detection

Authors: K. R. Roopesh Bharatwaj, Avinash Maharana, Favour Tobi Aborisade, Roger Young

Abstract:

Presently, there are few cases of complete automation of drones and their allied intelligence capabilities; in essence, the potential of the drone has not yet been fully utilized. This paper presents feasible methods to build an intelligent drone with smart capabilities such as self-driving and obstacle avoidance through advanced reinforcement learning techniques, performing object detection with recent algorithms capable of processing lightweight models with fast training in real time. For the scope of this paper, after researching and comparing the various algorithms, we implemented the Deep Q-Network (DQN) algorithm in the AirSim simulator. In future work, we plan to implement further advanced self-driving and object detection algorithms, as well as voice-based speech recognition for the entire drone operation, which would provide an option for speech communication between users and the drone in unavoidable circumstances, making drones interactive, intelligent, voice-enabled robotic service assistants. The proposed drone has a wide scope of usability and is applicable in scenarios such as disaster management, air transport of essentials, agriculture, manufacturing, monitoring people's movements in public areas, and defense. Also discussed is drone communication based on satellite broadband Internet technology for faster computation and seamless, uninterrupted communication service during disasters and remote-location operations. This paper explains the feasible algorithms required to achieve this goal and serves as a reference for future researchers going down this path.
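The DQN update used in the AirSim experiments can be illustrated in tabular miniature: Q-learning on a toy one-dimensional corridor, where the agent moves left or right to reach a goal. DQN replaces the explicit Q-table below with a neural network; everything here (states, rewards, hyperparameters) is an invented simplification, not the paper's implementation:

```python
import random

random.seed(0)
N_STATES, GOAL = 5, 4            # states 0..4, reward on reaching state 4
ACTIONS = [-1, +1]               # move left / move right
Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, eps = 0.5, 0.9, 0.2

for episode in range(200):
    s = 0
    while s != GOAL:
        # epsilon-greedy action selection
        if random.random() < eps:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda i: Q[s][i])
        s2 = min(max(s + ACTIONS[a], 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else 0.0
        # Q-learning / DQN target: r + gamma * max_a' Q(s', a')
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

policy = [max((0, 1), key=lambda i: Q[s][i]) for s in range(N_STATES)]
print(policy)   # "move right" (index 1) should dominate in states 0-3
```

In the drone setting, the state would be an image or sensor reading and the network would generalize across states, but the bootstrapped target in the update line is the same.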

Keywords: convolution neural network, natural language processing, obstacle avoidance, satellite broadband technology, self-driving

Procedia PDF Downloads 230
3852 A Longitudinal Case Study of Greek as a Second Language

Authors: M. Vassou, A. Karasimos

Abstract:

A primary concern in the field of Second Language Acquisition (SLA) research is to determine the innate mechanisms of second language learning and acquisition through the systematic study of a learner's interlanguage. Errors emerge while a learner attempts to communicate in the target language and can be seen either as the observable linguistic product of latent cognitive and language processes of mental representations or as an indispensable learning mechanism. Therefore, the study of the learner's erroneous forms may depict the various strategies and mechanisms that take place during the language acquisition process, resulting in deviations from target-language norms and difficulties in communication. Mapping the erroneous utterances of a late adult learner in the process of acquiring Greek as a second language constitutes one of the main aims of this study. For our research purposes, we created an error-tagged learner corpus composed of the participant's written texts produced throughout a 4-year period of instructed language acquisition. Error analysis and interlanguage theory constitute the methodological and theoretical framework, respectively. The research questions pertain to the learner's most frequent errors per linguistic category and per year, as well as his choices concerning the Greek article system. According to the quantitative analysis of the data, the most frequent errors are observed in the categories of the stress system and syntax, whereas a significant fluctuation and/or gradual reduction throughout the 4 years of instructed acquisition indicates the emergence of developmental stages. The findings with regard to article usage bespeak fossilization of erroneous structures in certain contexts. In general, our results point towards the existence and further development of an established learner (inter)language system governed not only by mother-tongue and target-language influences but also by the learner's own assumptions and set of rules, the result of a complex cognitive process. It is expected that this study will contribute not only to knowledge in the field of Greek as a second language and SLA in general, but also provide insight into the cognitive mechanisms and strategies developed by multilingual learners in late adulthood.
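The per-category and per-year bookkeeping behind such an error-tagged corpus can be sketched in a few lines; the tags and counts below are invented for illustration, not the study's data:

```python
from collections import Counter

# (year, error_category) pairs as they might come from an error-tagged corpus.
tagged_errors = [
    (1, "stress"), (1, "stress"), (1, "syntax"), (1, "article"),
    (2, "stress"), (2, "syntax"), (2, "article"),
    (3, "stress"), (3, "article"),
    (4, "article"),
]

by_category = Counter(cat for _, cat in tagged_errors)
by_year = Counter(year for year, _ in tagged_errors)

print(by_category.most_common())   # most frequent error types overall
print(sorted(by_year.items()))     # error counts per year of instruction
```

A declining per-year count like the one this toy data produces is the kind of pattern that, in the real corpus, signals developmental stages; persistent counts in one category (here, articles) would instead suggest fossilization.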

Keywords: Greek as a second language, error analysis, interlanguage, late adult learner

Procedia PDF Downloads 115