Search results for: complex network platform
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11227

1897 Simulation of the FDA Centrifugal Blood Pump Using High Performance Computing

Authors: Mehdi Behbahani, Sebastian Rible, Charles Moulinec, Yvan Fournier, Mike Nicolai, Paolo Crosetto

Abstract:

Computational Fluid Dynamics (CFD) blood-flow simulations are increasingly used to develop and validate blood-contacting medical devices. This study shows that numerical simulations can provide additional, accurate estimates of relevant hemodynamic indicators (e.g., recirculation zones or wall shear stresses) that may be difficult and expensive to obtain from in-vivo or in-vitro experiments. The most recent FDA (Food and Drug Administration) benchmark consists of a simplified centrifugal blood pump model that contains fluid-flow features commonly found in these devices, with a clear focus on highly turbulent phenomena. The FDA centrifugal blood pump study comprises six test cases with different volumetric flow rates (ranging from 2.5 to 7.0 liters per minute), pump speeds, and Reynolds numbers ranging from 210,000 to 293,000. Within the frame of this study, different turbulence models were tested, including RANS models (e.g., k-omega, k-epsilon, and a Reynolds Stress Model (RSM)) as well as LES. The partitioners Hilbert, METIS, ParMETIS, and SCOTCH were used to create an unstructured mesh of 76 million elements and were compared in terms of efficiency. Computations were performed on the JUQUEEN Blue Gene/Q architecture using the highly parallel flow solver Code_Saturne, typically on 32,768 or more processors in parallel. Visualisations were performed with ParaView. All six flow situations could be successfully analysed with the different turbulence models and validated against analytical considerations and comparison with other databases. The results showed that an RSM is an appropriate choice for modeling high-Reynolds-number flow cases; in particular, the Rij-SSG (Speziale, Sarkar, Gatski) variant turned out to be a good approach. Visualisations of complex flow features were obtained, and the flow situation inside the pump could be characterized.
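The Reynolds numbers quoted above can be reproduced, to a first approximation, from the benchmark's operating conditions. The sketch below assumes the rotational definition Re = ρωD²/μ, nominal blood properties (ρ ≈ 1056 kg/m³, μ ≈ 3.5 mPa·s), a 52 mm rotor diameter, and pump speeds of 2500 and 3500 rpm; none of these specific values appear in the abstract, so they are illustrative assumptions only.

```python
import math

# Assumed nominal values (not stated in the abstract):
RHO = 1056.0   # blood density, kg/m^3
MU = 3.5e-3    # blood dynamic viscosity, Pa*s
D = 0.052      # assumed rotor diameter, m

def rotational_reynolds(rpm: float) -> float:
    """Rotational Reynolds number Re = rho * omega * D^2 / mu."""
    omega = rpm * 2.0 * math.pi / 60.0  # angular speed, rad/s
    return RHO * omega * D**2 / MU

# Two assumed pump speeds roughly bracket the quoted Re range
# of about 210,000 to 293,000.
re_low = rotational_reynolds(2500)
re_high = rotational_reynolds(3500)
```

Under these assumptions the two speeds give Reynolds numbers of roughly 2.1e5 and 3.0e5, consistent with the range reported in the abstract.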

Keywords: blood flow, centrifugal blood pump, high performance computing, scalability, turbulence

Procedia PDF Downloads 383
1896 A Corpus Study of English Verbs in Chinese EFL Learners’ Academic Writing Abstracts

Authors: Shuaili Ji

Abstract:

The correct use of verbs is an important element of high-quality research articles; thus, it is important for Chinese EFL learners to master the characteristics of verbs and to use them precisely. However, some studies have shown that there are differences between learners' and native speakers' verb use and that learners have difficulty using English verbs. This corpus-based quantitative research can enhance learners' knowledge of English verbs and improve the quality of research-article abstracts, and of academic writing as a whole. The aim of this study is to identify the differences between learners' and native speakers' use of verbs and to study the factors that contribute to those differences. To this end, the research question is as follows: What are the differences between the verbs most frequently used by learners and those used by native speakers? The question is answered through a corpus-based, data-driven analysis of the verbs learners use in their abstract writing, in terms of collocation, colligation, and semantic prosody. The results show that: (1) EFL learners markedly overused 'be, can, find, make' and underused 'investigate, examine, may'. Among modal verbs, learners markedly overused 'can' while underusing 'may'. (2) Learners markedly overused 'we find + object clause' while underusing 'nouns (results, findings, data) + suggest/indicate/reveal + object clause' when expressing research results. (3) Learners tended to transfer the collocation, colligation, and semantic prosody of shǐ and zuò to 'make'. (4) Learners markedly overused 'BE + V-ed' and used BE as a main verb. They also markedly overused the base forms of BE (be, is, are) while underusing its inflected forms (was, were). These results indicate learners' lack of accuracy and idiomaticity in verb usage. Owing to conceptual transfer from Chinese, the verbs in learners' abstracts showed obvious mother-tongue transfer. In addition, learners had not fully mastered the use of verbs, avoiding complex colligations to prevent errors. Based on these findings, the present study has implications for English teaching and for English academic abstract writing in China. Further research could examine verb use across whole dissertations to determine whether the characteristics of verbs in abstracts hold for the dissertation as a whole.
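Over- and underuse of the kind reported here is conventionally quantified with a keyness statistic such as Dunning's log-likelihood (G²), which compares a word's frequency in the learner corpus against a reference corpus. The sketch below is a minimal illustration with invented toy frequencies, not the study's actual corpus counts.

```python
import math

def log_likelihood_keyness(freq_a: int, size_a: int,
                           freq_b: int, size_b: int) -> float:
    """Dunning's G2 keyness for a word occurring freq_a times in a
    corpus of size_a tokens vs freq_b times in size_b tokens."""
    total = freq_a + freq_b
    e_a = size_a * total / (size_a + size_b)  # expected frequency, corpus A
    e_b = size_b * total / (size_a + size_b)  # expected frequency, corpus B
    g2 = 0.0
    if freq_a > 0:
        g2 += freq_a * math.log(freq_a / e_a)
    if freq_b > 0:
        g2 += freq_b * math.log(freq_b / e_b)
    return 2.0 * g2

# Toy example: 'can' appears 100 times per 10,000 learner tokens but
# only 50 times per 10,000 native tokens. G2 > 3.84 corresponds to
# p < 0.05 on one degree of freedom, i.e., significant overuse.
g2_can = log_likelihood_keyness(100, 10_000, 50, 10_000)
overused = g2_can > 3.84
```

The direction of the difference (over- vs underuse) is read off from whether the learner relative frequency exceeds the native one; G² itself only measures the strength of the difference.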

Keywords: academic writing abstracts, Chinese EFL learners, corpus-based, data-driven, verbs

Procedia PDF Downloads 339
1895 Linking the Built Environment, Activities and Well-Being: Examining the Stories among Older Adults during Ageing-in-Place

Authors: Wenquan Gan, Peiyu Zhao, Xinyu Zhao

Abstract:

Against the background of the rapid ageing of China's population, ageing-in-place has become the primary coping strategy promoted by the Chinese government. However, most of the old residential communities in which older adults currently live are insufficient to support ageing-in-place. Exploring how to retrofit existing communities to ageing-friendly standards is therefore essential for healthy ageing. To address this issue, this study aims to shed light on the inter-relationships among the built environment, daily activities, and the well-being of older adults in urban China. Mixed research methods, including GPS tracking, structured observation, and in-depth interviews, are used to examine: (a) which specific places or facilities are most commonly used by older adults during ageing-in-place; (b) which specific built-environment characteristics attract older adults to these frequently used places; and (c) how the use of these spaces has affected older adults' well-being. Specifically, structured observation and GPS are used to record and map older residents' behaviour and movement in Suzhou, China, a city with a highly aged population that is suitable as a research case. Follow-up interviews then explore the impact of activities and the built environment on well-being. Results showed that for older adults with good functional ability, the facilities promoted by the Chinese government to support ageing-in-place, such as community nursing homes, day-care centres, and activity centres for the aged, are rarely used. Moreover, older adults have preferred activities and built-environment characteristics that contribute to their well-being. Our findings indicate that a complex interrelationship between the built environment and activities can influence the well-being of older adults. Further investigation is needed into how to support healthy ageing-in-place: beyond providing permanent elderly-care facilities, design interventions should enhance these particular built-environment characteristics to facilitate a healthy lifestyle in later life.

Keywords: older adults, built environment, spatial behavior, community activity, healthy ageing

Procedia PDF Downloads 111
1894 The Molecular Analysis of Effect of Phytohormones and Spermidine on Tomato Growth under Biotic Stress

Authors: Rumana Keyani, Haleema Sadia, Asia Nosheen, Rabia Naz, Humaira Yasmin, Sidra Zahoor

Abstract:

Tomato is a significant crop worldwide and one of the staple foods of Pakistan. A vast number of plant pathogens, from simple viruses to complex parasites, cause diseases in tomato, and the incidence of fungal infection in our country is quite high; sometimes the symptoms are harsh enough to destroy the crop altogether. Countries like ours, with a continuously growing population and limited resources, cannot afford such economic losses. An array of morphological, genetic, biochemical, and molecular processes is involved in plant resistance to biotic stress. The study of metabolic pathways such as the jasmonic acid (JA) pathway and, most importantly, of signaling molecules such as ROS/RNS and their redoxin enzymes (TRX and NRX) is crucial to disease management and contributes to healthy plant growth. Improving the tolerance of crop plants to biotic stresses is thus a dire need for our country and for the world as a whole. In the current study, the fungal pathogens Alternaria solani and Rhizoctonia solani were used to inoculate tomato in order to examine the plant's defense responses at the molecular and phenotypic levels, with jasmonic acid and spermidine pretreatments. All growth parameters (root and shoot length, and root and shoot dry weight), measured 7 days post-inoculation, showed that infection drastically reduced plant growth, whereas jasmonic acid and spermidine helped the plants cope with the infection; thus, JA and spermidine treatments maintained comparatively better growth. Antioxidant assays and expression analysis by real-time quantitative PCR in a time-course experiment (at 24-, 48- and 72-hour intervals) also showed that activation of JA defense genes, and the polyamine spermidine, help mediate tomato responses to fungal infection when applied alone, whereas the two treatments combined mask each other's effect.

Keywords: fungal infection, jasmonic acid defence, tomato, spermidine

Procedia PDF Downloads 129
1893 The Effect of the Computer-Based Method on Repetitive Behaviors and Communication Skills

Authors: Hoorieh Darvishi, Rezaei

Abstract:

Introduction: This study investigates the efficacy of computer-based interventions for children with Autism Spectrum Disorder, specifically targeting communication deficits and repetitive behaviors. The research evaluates novel software applications designed to enhance narrative capabilities and sensory integration through structured, progressive intervention protocols. Method: The study evaluated two intervention software programs designed for children with autism, focusing on narrative speech and sensory integration. Twelve children aged 5-11 participated in the two-month intervention, attending three 45-minute sessions weekly, with pre- and post-tests measuring speech, communication, and behavioral outcomes. The narrative-speech software incorporated 14 stories using the Cohen model. It progressively reduced software assistance as children improved their storytelling abilities, ultimately enabling independent narration; the process involved story-comprehension questions and guided story-completion exercises. The sensory-integration software featured approximately 100 exercises progressing from basic classification to complex cognitive tasks. The program included attention exercises, auditory memory training (advancing from single- to four-syllable words), problem-solving, decision-making, reasoning, working memory, and emotion-recognition activities. Each module was accompanied by frequency- and pitch-adjusted music that the child enjoys, to enhance learning through multiple sensory channels (visual, auditory, and tactile). Conclusion: The results indicated that use of these software programs significantly improved children's communication and narrative-speech scores while also reducing scores related to repetitive behaviors. Findings: These findings highlight the positive impact of computer-based interventions on enhancing communication skills and reducing repetitive behaviors in children with autism.

Keywords: autism, narrative speech, Persian, sensory integration (SI), repetitive behaviors, communication

Procedia PDF Downloads 17
1892 Integrating Data Mining within a Strategic Knowledge Management Framework: A Platform for Sustainable Competitive Advantage within the Australian Minerals and Metals Mining Sector

Authors: Sanaz Moayer, Fang Huang, Scott Gardner

Abstract:

In today's highly leveraged business world, an organisation's success depends on how it manages and organises its traditional and intangible assets. In the knowledge-based economy, knowledge is a valuable asset that gives enduring capability to firms competing in rapidly shifting global markets. It can be argued that the ability to create unique knowledge assets by configuring ICT and human capabilities will be a defining factor for international competitive advantage in the mid-21st century. The concept of KM is recognised in the strategy literature, and increasingly by senior decision-makers (particularly in large firms, which can achieve scalable benefits), as an important vehicle for stimulating innovation and organisational performance in the knowledge economy. This thinking has been evident in professional services and other knowledge-intensive industries for over a decade. It highlights the importance of social capital and the value of the intellectual capital embedded in social and professional networks, complementing the traditional focus on the creation of intellectual-property assets. Despite the growing interest in KM within professional services, there has been limited discussion of multinational resource-based industries such as mining and petroleum, where the focus has been principally on global portfolio optimisation through economies of scale, process efficiencies, and cost reduction. The Australian minerals and metals mining industry, although traditionally viewed as capital-intensive, employs a significant number of knowledge workers, notably engineers, geologists, highly skilled technicians, and legal, finance, accounting, ICT, and contracts specialists working in projects or functions, representing potential knowledge silos within the organisation. This silo effect arguably inhibits knowledge sharing and retention by disaggregating corporate memory, with increased operational and project-continuity risk. It may also limit the potential for process, product, and service innovation. In this paper, the strategic application of knowledge management incorporating contemporary ICT platforms and data-mining practices is explored as an important enabler of knowledge discovery, risk reduction, and retention of corporate knowledge in resource-based industries. With reference to the relevant strategy, management, and information-systems literature, this paper highlights possible connections (currently undergoing empirical testing) between a Strategic Knowledge Management (SKM) framework incorporating supportive Data Mining (DM) practices and competitive advantage for multinational firms operating in the Australian resource sector. Based on a review of the relevant literature, we also propose that more effective management of soft- and hard-systems knowledge is crucial for major Australian firms in all sectors seeking to improve organisational performance through the human and technological capability captured in organisational networks.

Keywords: competitive advantage, data mining, mining organisation, strategic knowledge management

Procedia PDF Downloads 418
1891 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton

Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani

Abstract:

Extreme values in a set of observations can occur due to unusual circumstances. Such data can provide important information that other data cannot, so their occurrence needs to be investigated further. One method for obtaining extreme data is the block-maxima method; the distribution of extreme data sets taken with the block-maxima method is called an extreme-value distribution. One such extreme-value distribution is the Gumbel distribution, which has two parameters. The maximum-likelihood (ML) estimates of the Gumbel parameters cannot be determined exactly in closed form, so a numerical approach is necessary. The purpose of this study was to estimate the parameters of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimisation, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has a double-exponential form. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the change in parameter values at each iteration, and is commonly modified by adding a step length to guarantee convergence; however, the second derivative can require complex calculations. In the quasi-Newton BFGS method, Newton's method is modified by building up an approximation to the second derivative from gradient information updated at each iteration. Parameter estimation of the Gumbel distribution by this numerical approach is done by finding the parameter values that maximise the likelihood function; the method requires the gradient vector and the Hessian matrix (or its approximation). This research combines theory and application, drawing on several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and estimates of the Gumbel distribution parameters. The estimation method was then applied to daily rainfall data from Purworejo District to estimate the distribution parameters. The results indicate that the intensity of the high rainfall occurring in Purworejo District has decreased, and the range of rainfall has narrowed.
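The procedure described above can be sketched numerically. The following is a minimal, self-contained illustration on synthetic block-maxima data (not the Purworejo rainfall series): the Gumbel negative log-likelihood is minimized with a hand-written BFGS update and Armijo backtracking, optimizing over (μ, ln β) so the scale parameter stays positive.

```python
import numpy as np

def gumbel_nll_and_grad(theta, x):
    """Negative log-likelihood of the Gumbel (maxima) distribution and
    its gradient, parameterized as theta = (mu, log beta)."""
    mu, logb = theta
    beta = np.exp(logb)
    z = (x - mu) / beta
    ez = np.exp(-z)
    n = x.size
    nll = n * logb + np.sum(z) + np.sum(ez)
    d_mu = (np.sum(ez) - n) / beta
    d_beta = n / beta - np.sum(z * (1.0 - ez)) / beta
    return nll, np.array([d_mu, d_beta * beta])  # chain rule for log beta

def fit_gumbel_bfgs(x, max_iter=100, tol=1e-8):
    """ML fit via the BFGS quasi-Newton method with Armijo backtracking."""
    # Method-of-moments starting point.
    beta0 = np.std(x) * np.sqrt(6.0) / np.pi
    theta = np.array([np.mean(x) - 0.5772 * beta0, np.log(beta0)])
    H = np.eye(2)                         # inverse-Hessian approximation
    f, g = gumbel_nll_and_grad(theta, x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                        # quasi-Newton search direction
        t = 1.0
        while True:                       # Armijo backtracking line search
            f_new, g_new = gumbel_nll_and_grad(theta + t * p, x)
            if f_new <= f + 1e-4 * t * (g @ p) or t < 1e-12:
                break
            t *= 0.5
        s, y = t * p, g_new - g
        sy = s @ y
        if sy > 1e-12:                    # BFGS inverse-Hessian update
            rho = 1.0 / sy
            V = np.eye(2) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        theta, f, g = theta + s, f_new, g_new
    return theta[0], np.exp(theta[1])     # (mu_hat, beta_hat)

rng = np.random.default_rng(42)
sample = rng.gumbel(loc=5.0, scale=2.0, size=5000)
mu_hat, beta_hat = fit_gumbel_bfgs(sample)
```

In practice scipy.optimize.minimize(..., method='BFGS') would give the same result; the explicit update is written out here only to mirror the algorithm the abstract describes.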

Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton

Procedia PDF Downloads 330
1890 Memory Retrieval and Implicit Prosody during Reading: Anaphora Resolution by L1 and L2 Speakers of English

Authors: Duong Thuy Nguyen, Giulia Bencini

Abstract:

The present study examined structural and prosodic factors in the computation of antecedent-reflexive relationships and in sentence comprehension by native English speakers (L1) and Vietnamese-English bilinguals (L2). Participants read sentences presented on a computer screen in one of three presentation formats aimed at manipulating prosodic parsing: word-by-word (RSVP), phrase-segment (self-paced), or whole-sentence (self-paced), then completed a grammaticality rating and a comprehension task (following Pratt & Fernandez, 2016). The design crossed three factors: syntactic structure (simple; complex), grammaticality (target-match; target-mismatch), and presentation format. An example item is provided in (1): (1) The actress that (Mary/John) interviewed at the awards ceremony (about two years ago/organized outside the theater) described (herself/himself) as an extreme workaholic. Results showed that, overall, both L1 and L2 speakers made use of a good-enough processing strategy at the expense of more detailed syntactic analysis. L1 and L2 speakers' comprehension and grammaticality judgements were negatively affected by the most prosodically disruptive condition (word-by-word). However, the two groups differed in the other two reading conditions. For L1 speakers, the whole-sentence and phrase-segment formats were both facilitative in the grammaticality-rating and comprehension tasks; for L2 speakers, the phrase-segment paradigm did not significantly improve accuracy or comprehension compared with the whole-sentence condition. These findings are consistent with those of Pratt & Fernandez (2016), who found a similar pattern of results in the processing of subject-verb agreement relations using the same experimental paradigm and prosodic manipulation with L1 English and L2 English-Spanish speakers. The results provide further support for a Good-Enough cue model of sentence processing that integrates cue-based retrieval and implicit prosodic parsing (Pratt & Fernandez, 2016) and highlight similarities and differences between L1 and L2 sentence processing and comprehension.

Keywords: anaphora resolution, bilingualism, implicit prosody, sentence processing

Procedia PDF Downloads 155
1889 Extraction and Quantification of Triclosan in Wastewater Samples Using Molecularly Imprinted Membrane Adsorbent

Authors: Siyabonga Aubrey Mhlongo, Linda Lunga Sibali, Phumlane Selby Mdluli, Peter Papoh Ndibewu, Kholofelo Clifford Malematja

Abstract:

This paper reports the successful extraction and quantification of triclosan (C₁₂H₇Cl₃O₂), an antibacterial and antifungal agent present in consumer products such as toothpaste, soaps, detergents, toys, and surgical cleaning treatments, from wastewater and effluents using a molecularly imprinted membrane adsorbent (MIMs), followed by quantification by high-performance liquid chromatography (HPLC). The MIMs was fabricated from polyvinylidene fluoride (PVDF) polymer with selective microcomposite particles known as molecularly imprinted polymers (MIPs) via a phase-inversion (immersion-precipitation) technique, which improved the hydrophilicity and mechanical behaviour of the membranes. Wastewater samples were collected from the central effluent treatment plant of the Umbogintwini Industrial Complex (UIC), on the south coast of Durban, KwaZulu-Natal, South Africa, and pre-treated before analysis. Experimental parameters such as sample size, contact time, and stirring speed were optimised. The resultant MIMs had a TCS adsorption efficiency of 97%, compared with 92% and 88% for the non-imprinted membranes (NIMs) and the bare membrane, respectively. The analytical method used in this study had limits of detection (LoD) and quantification (LoQ) of 0.22 and 0.71 µg L⁻¹ in wastewater effluent, respectively, and the percentage recovery for the effluent samples was 68%. TCS was monitored for 10 consecutive days; the highest TCS concentration detected in the treated wastewater was 55.0 µg/L, on day 9, while the lowest was 6.0 µg/L. Given the narrow range of analyte concentrations found in the effluent samples, this study suggests that MIMs could be a strong candidate adsorbent for continued development in membrane technology and environmental science, with potential applicability to desalination.

Keywords: molecularly imprinted membrane, triclosan, phase inversion, wastewater

Procedia PDF Downloads 128
1888 Global Winners versus Local Losers: Globalization, Identity and Tradition in Spanish Club Football

Authors: Jim O'Brien

Abstract:

Contemporary global representation and consumption of La Liga across a plethora of media platforms has had significant implications for the historical, political, and cultural developments that shaped Spanish club football. It has established and reinforced a hierarchy of a small number of clubs belonging, or aspiring to belong, to a cluster of global elite clubs seeking to imitate the English Premier League's blueprint for corporate branding and marketing in order to secure a global fan base through success and exposure in La Liga itself and in the Champions League. The synthesis between globalization, global sport, and the status of high-profile clubs has created radical change within the folkloric iconography of Spanish football. The main focus of this paper is to critically evaluate the consequences of globalization for the rich tapestry at the core of the game's distinctive history in Spain. The seminal debate underpinning the study considers whether the divergent aspects of globalization have acted as a malevolent force, eroding tradition, causing financial meltdown, and reducing much of the fabric of club football to the status of bystanders, or have instead promoted a renaissance of these traditions, securing their legacies through new fans and audiences. The study draws on extensive sources on the history, politics, and culture of Spanish football, in both English and Spanish. It also uses primary and archive material derived from interviews and fieldwork undertaken with scholars, media professionals, and club representatives in Spain. The paper has four main themes. Firstly, it contextualises the key historical, political, and cultural forces that shaped the landscape of Spanish football from the late nineteenth century; the seminal notions of region, locality, and cultural divergence are pivotal to this discourse. The study then considers the relationship between football, ethnicity, and identity as a barometer of continuity and change, suggesting that tradition is being reinvented and reframed to reflect the shifting demographic and societal patterns within the Spanish state. Consideration is then given to the paradoxical function of 'El Clasico' and the dominant FC Barcelona-Real Madrid duopoly in both eroding tradition within the global nexus of football's commodification and protecting historic political rivalries. To most global consumers of La Liga, the mega-spectacle and hyperbole of 'El Clasico' is the essence of Spanish football, with cultural misrepresentation and distortion catapulting the event to the global media audience. Finally, the paper examines La Liga as a sporting phenomenon in which elite clubs, cult managers, and galacticos serve as commodities on the altar of mass consumption in football's global entertainment matrix. These processes accentuate a homogeneous mosaic of cultural conformity that obscures local, regional, and national identities and, paradoxically, fuses the global with the local to maintain the distinctive hue of La Liga, as witnessed by the extraordinary successes of Atlético Madrid and SD Eibar in recent seasons.

Keywords: Spanish football, globalization, cultural identity, tradition, folklore

Procedia PDF Downloads 307
1887 Measurement of Ionospheric Plasma Distribution over Myanmar Using Single Frequency Global Positioning System Receiver

Authors: Win Zaw Hein, Khin Sandar Linn, Su Su Yi Mon, Yoshitaka Goto

Abstract:

The Earth's ionosphere extends from an altitude of about 70 km to several hundred km above the ground and is composed of ions and electrons, i.e., plasma. This plasma delays GPS (Global Positioning System) signals and reflects radio waves. The delay along the signal path from satellite to receiver is directly proportional to the total electron content (TEC) of the plasma, and this delay is the largest error factor in satellite positioning and navigation. Sounding observations from the top and bottom of the ionosphere have long been used to investigate ionospheric plasma. Recently, continuous monitoring of TEC using networks of GNSS (Global Navigation Satellite System) observation stations, which are basically built for land survey, has been conducted in several countries. In these stations, however, multi-frequency receivers are installed to estimate the plasma delay from its frequency dependence, and such receivers cost much more than single-frequency GPS receivers. In this research, a single-frequency GPS receiver was used instead of expensive multi-frequency GNSS receivers to measure ionospheric plasma variation, such as the vertical TEC distribution. A single-frequency u-blox GPS receiver was used to probe ionospheric TEC, with the observation site at Mandalay Technological University in Myanmar. In this method, the ionospheric TEC distribution is represented by polynomial functions of latitude and longitude, and the parameters of the functions are determined by least-squares fitting to pseudorange data obtained at a known location under a thin-layer ionosphere assumption. The validity of the method was evaluated against measurements from the Japanese GNSS observation network GEONET, and the single-frequency results were compared with dual-frequency measurements.
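The fitting step described above can be illustrated with synthetic data. The sketch below is a simplified stand-in, with invented coefficients and geometry: vertical TEC around the receiver is modeled as a quadratic polynomial in latitude/longitude offsets, slant observations are generated under a standard thin-shell mapping function, and the polynomial coefficients are recovered by least squares.

```python
import numpy as np

def design_row(dlat, dlon):
    """Quadratic polynomial basis in latitude/longitude offsets (deg)."""
    return np.array([1.0, dlat, dlon, dlat**2, dlat * dlon, dlon**2])

def thin_shell_mapping(elevation_deg, shell_km=350.0, re_km=6371.0):
    """Slant-to-vertical mapping factor for a thin-layer ionosphere."""
    e = np.radians(elevation_deg)
    sin_zp = re_km * np.cos(e) / (re_km + shell_km)  # sin of zenith angle
    return 1.0 / np.sqrt(1.0 - sin_zp**2)            # at the pierce point

# Invented 'true' VTEC polynomial coefficients (TEC units).
true_coef = np.array([20.0, 1.5, -0.8, 0.05, 0.02, -0.03])

rng = np.random.default_rng(1)
dlat = rng.uniform(-5, 5, 200)     # ionospheric pierce-point offsets, deg
dlon = rng.uniform(-5, 5, 200)
elev = rng.uniform(15, 85, 200)    # satellite elevation angles, deg

A = np.array([design_row(la, lo) for la, lo in zip(dlat, dlon)])
m = thin_shell_mapping(elev)
stec = m * (A @ true_coef)         # synthetic slant TEC observations

# Least squares: each basis column is scaled by the mapping factor,
# so solving recovers the vertical-TEC polynomial directly.
coef_hat, *_ = np.linalg.lstsq(m[:, None] * A, stec, rcond=None)
```

With noise-free synthetic data the fit recovers the generating coefficients exactly; with real pseudorange data, receiver bias terms would also have to be estimated alongside the polynomial.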

Keywords: ionosphere, global positioning system, GPS, ionospheric delay, total electron content, TEC

Procedia PDF Downloads 142
1886 Early Impact Prediction and Key Factors Study of Artificial Intelligence Patents: A Method Based on LightGBM and Interpretable Machine Learning

Authors: Xingyu Gao, Qiang Wu

Abstract:

Patents play a crucial role in protecting innovation and intellectual property. Early prediction of the impact of artificial intelligence (AI) patents helps researchers and companies allocate resources and make better decisions, and understanding the key factors that influence patent impact can give researchers a better understanding of the evolution of AI technology and innovation trends. Identifying highly impactful patents early and supporting them therefore holds immeasurable value in accelerating technological progress, reducing research and development costs, and mitigating market-positioning risks. Despite extensive research on AI patents, accurately predicting their early impact remains a challenge: traditional methods often consider only single factors or simple combinations, failing to comprehensively and accurately reflect the actual impact of patents. This paper used the artificial intelligence patent database of the United States Patent and Trademark Office and the Lens.org patent retrieval platform to obtain information on 35,708 AI patents. Using six machine learning models, namely Multiple Linear Regression, Random Forest Regression, XGBoost Regression, LightGBM Regression, Support Vector Machine Regression, and K-Nearest Neighbors Regression, with early patent indicators as features, the paper comprehensively predicted patent impact along three dimensions: technical, social, and economic. These cover the technical leadership of patents, the number of citations they receive, and their shared value. The SHAP (SHapley Additive exPlanations) method was used to explain the predictions of the best model, quantifying each feature's contribution to the model's predictions. The experimental results on the AI patent dataset indicate that, for all three target variables, LightGBM regression shows the best predictive performance. Specifically, patent novelty has the greatest (and a positive) effect in predicting the technical impact of patents; the number of owners, the number of backward citations, and the number of independent claims are also crucial, each with a positive influence on predicted technical impact. In predicting the social impact of patents, the number of applicants is the most critical input variable, but it has a negative effect on social impact, while the number of independent claims, the number of owners, and the number of backward citations are also important predictors with positive effects. For predicting economic impact, the number of independent claims is the most important factor and has a positive effect; the number of owners, the number of sibling countries or regions, and the size of the extended patent family also have positive influences. The study relies primarily on USPTO data for artificial intelligence patents; future research could consider more comprehensive data sources from a global perspective. While the study takes various factors into account, other important features may remain unconsidered; in the future, factors such as patent implementation and market applications could be included, as they may affect patent influence.
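As a minimal runnable counterpart, the sketch below fits the simplest of the six models listed (multiple linear regression) on invented early-indicator data and decomposes each prediction into per-feature contributions. For a linear model with independent features, this decomposition coincides with the SHAP values the study uses, so it illustrates the interpretation step without the LightGBM dependency; all feature names, weights, and numbers here are synthetic assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
# Invented early indicators: novelty score, backward citations,
# independent claims, number of owners (all synthetic).
X = np.column_stack([
    rng.normal(0.5, 0.2, n),
    rng.poisson(10, n).astype(float),
    rng.poisson(15, n).astype(float),
    (rng.poisson(2, n) + 1).astype(float),
])
# Synthetic "technical impact" target built from known weights plus noise.
y = (3.0 * X[:, 0] + 0.4 * X[:, 1] + 0.2 * X[:, 2] + 0.5 * X[:, 3]
     + rng.normal(0.0, 0.5, n))

# Multiple linear regression fit by least squares.
A = np.column_stack([np.ones(n), X])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, weights = w[0], w[1:]

def linear_contributions(x_row):
    """Per-feature contribution w_j * (x_j - mean_j). For a linear model
    with independent features this equals the feature's SHAP value: the
    contributions plus the average prediction reconstruct the output."""
    return weights * (x_row - X.mean(axis=0))

x0 = X[0]
pred0 = intercept + weights @ x0
base = intercept + weights @ X.mean(axis=0)  # average model prediction
```

The sign of each contribution mirrors the positive/negative effects reported above; a tree ensemble like LightGBM would need TreeSHAP instead, since its contributions are not simple weight products.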

Keywords: patent influence, interpretable machine learning, predictive models, SHAP

Procedia PDF Downloads 53
1885 Realizing Teleportation Using Black-White Hole Capsule Constructed by Space-Time Microstrip Circuit Control

Authors: Mapatsakon Sarapat, Mongkol Ketwongsa, Somchat Sonasang, Preecha Yupapin

Abstract:

Preliminary tests of a space-time control circuit, using a two-level system circuit with a 4-5 cm diameter microstrip for realistic teleportation, have been designed and performed. The work begins by calculating the parameters that allow the circuit to use an alternating current (AC) at a specified frequency as the input signal. A method is applied that causes electrons to move along the circuit perimeter starting at the speed of light, which was found to be satisfactory on the basis of wave-particle duality. It is able to establish a superluminal speed (faster than light) for the electron cloud in the middle of the circuit, creating a timeline and a propulsive force as well. The timeline is formed by the cancellation of time stretching and shrinking in the relativistic regime, in which absolute time has vanished. Both black holes and white holes are created from time signals at the beginning, where the electrons travel close to the speed of light. They entangle together like a capsule until they reach the point where they collapse and cancel each other out, which is controlled by the frequency of the circuit. Therefore, this method can be applied to large-scale circuits such as potassium, from which the same method can be applied to form a system to teleport living things. In effect, the black hole is a hibernation-system environment that allows living things to live and travel to the destination of teleportation, which can be controlled in position and time relative to the speed of light. When the capsule reaches its destination, the frequency is increased so that the black holes and white holes cancel each other out into a balanced environment. Life can then safely teleport to the destination. The same system must therefore exist at both the origin and the destination, which could form a network; the approach can also be applied to space travel.
The designed system will be tested on a small scale using a microstrip circuit system that can be created in the laboratory on a limited budget and used in both wired and wireless systems.

Keywords: quantum teleportation, black-white hole, time, timeline, relativistic electronics

Procedia PDF Downloads 77
1884 Predictions for the Anisotropy in Thermal Conductivity in Polymers Subjected to Model Flows by Combination of the eXtended Pom-Pom Model and the Stress-Thermal Rule

Authors: David Nieto Simavilla, Wilco M. H. Verbeeten

Abstract:

The viscoelastic behavior of polymeric flows under isothermal conditions has been extensively researched. However, most of the processing of polymeric materials occurs under non-isothermal conditions and understanding the linkage between the thermo-physical properties and the process state variables remains a challenge. Furthermore, the cost and energy required to manufacture, recycle and dispose polymers is strongly affected by the thermo-physical properties and their dependence on state variables such as temperature and stress. Experiments show that thermal conductivity in flowing polymers is anisotropic (i.e. direction dependent). This phenomenon has been previously omitted in the study and simulation of industrially relevant flows. Our work combines experimental evidence of a universal relationship between thermal conductivity and stress tensors (i.e. the stress-thermal rule) with differential constitutive equations for the viscoelastic behavior of polymers to provide predictions for the anisotropy in thermal conductivity in uniaxial, planar, equibiaxial and shear flow in commercial polymers. A particular focus is placed on the eXtended Pom-Pom model which is able to capture the non-linear behavior in both shear and elongation flows. The predictions provided by this approach are amenable to implementation in finite elements packages, since viscoelastic and thermal behavior can be described by a single equation. Our results include predictions for flow-induced anisotropy in thermal conductivity for low and high density polyethylene as well as confirmation of our method through comparison with a number of thermoplastic systems for which measurements of anisotropy in thermal conductivity are available. Remarkably, this approach allows for universal predictions of anisotropy in thermal conductivity that can be used in simulations of complex flows in which only the most fundamental rheological behavior of the material has been previously characterized (i.e. 
there is no need for adjustable parameters beyond those in the constitutive model). Accounting for polymers' anisotropy in thermal conductivity in industrially relevant flows benefits the optimization of manufacturing processes as well as the mechanical and thermal performance of finished plastic products during use.
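For readers unfamiliar with the stress-thermal rule, a minimal numerical sketch may help. The snippet below uses one common linear form of the rule, in which the anisotropic part of the conductivity tensor is taken proportional to the deviatoric extra-stress tensor; the conductivity and stress-thermal coefficient values are invented for illustration and are not taken from the paper.

```python
import numpy as np

def conductivity_tensor(stress, k_eq, C_t):
    """Return k_ij = k_eq * delta_ij + C_t * dev(stress)_ij.

    A simple linear form of the stress-thermal rule: the deviatoric part of
    the extra-stress tensor sets the anisotropy of thermal conductivity.
    """
    dev = stress - np.trace(stress) / 3.0 * np.eye(3)
    return k_eq * np.eye(3) + C_t * dev

# Extra stress for simple shear flow (only the xy component is nonzero);
# the magnitude and coefficients are made-up values.
tau = np.zeros((3, 3))
tau[0, 1] = tau[1, 0] = 1.0e4               # Pa, arbitrary

k = conductivity_tensor(tau, k_eq=0.2, C_t=3.0e-7)  # W/(m K), 1/Pa
```

Because the deviatoric stress is traceless, the mean conductivity tr(k)/3 equals the equilibrium value, while shear induces the off-diagonal anisotropy that the paper predicts for flowing polymers.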

Keywords: anisotropy, differential constitutive models, flow simulations in polymers, thermal conductivity

Procedia PDF Downloads 186
1883 Exploring the Unintended Consequences of Loyalty Programs in the Gambling Sector

Authors: Violet Justine Mtonga, Cecilia Diaz

Abstract:

This paper explores the prevalence of loyalty programs in the UK gambling industry and their association with unintended consequences and harm amongst program members. The use of loyalty programs within the UK gambling industry has risen significantly, with over 40 million cards in circulation. Some research suggests that as of 2013-2014, nearly 95% of UK consumers had at least one loyalty card, with 78% being members of two or more programs, and the average household possessed ‘22 loyalty programs’, nearly half of which tended to be used actively. The core design of loyalty programs is to create a relational ‘win-win’ approach where value is jointly created between the parties involved through repetitive engagement. However, a main concern about the diffusion of gambling organisations’ loyalty programs amongst consumers is that organisations within the gambling industry may use them to unduly influence customer engagement and potentially cause unintended harm. To help understand the complex phenomenon of the diffusion and adaptation of loyalty programs in the gambling industry, and the potential unintended outcomes, this study is theoretically underpinned by the social exchange theory of relationships, entrenched in processes of social exchange of resources, rewards, and costs for long-term interactions and mutual benefits. Qualitative data were collected via in-depth interviews with 14 customers and 12 employees within UK land-based gambling firms. Data were analysed using a combination of thematic and clustering analysis to help reveal the emerging themes regarding the use of loyalty cards by gambling companies and to explore subgroups within the sample. The study’s results indicate that there are different unintended consequences and harms of loyalty program engagement and usage, such as maladaptive gambling behaviours, risk of compulsiveness, and loyalty programs promoting gambling from home.
Furthermore, there is a strong indication of a rite of passage among loyalty program members. There is also strong evidence to support other unfavorable behaviors such as amplified gambling habits and risk-taking practices. Additionally, in pursuit of rewards, loyalty program incentives effectuate overconsumption and heighten expenditure. Overall, the primary findings of this study show that loyalty programs in the gambling industry should be designed with an ethical perspective and practice.

Keywords: gambling, loyalty programs, social exchange theory, unintended harm

Procedia PDF Downloads 95
1882 Examining Experiences of QTBIPOC Disabled Students in Canadian Post-Secondary Institutions

Authors: Manchari Paranthahan

Abstract:

Higher education has often presented barriers to many communities as a result of its colonial roots. While higher education was initially created for white cis-males, student populations have become more diverse in the past few decades. Despite this increase in diversity, barriers like rising costs and hostile education settings continue to make higher education hard to access for certain demographics. These barriers and limitations are compounded for students who are intersectionally marginalized, such as Queer and Trans Black, Indigenous and People of Colour (QTBIPOC) Disabled students. As of 2021-2022, only 57.5% of the Canadian population between the ages of 25 - 64 held a college or university credential, with only 32.9% holding a bachelor’s degree or higher. In that same time frame, only 0.64% of the students who successfully completed a higher education program identified as transgender or nonbinary. QTBIPOC Disabled students experience diverse forms of oppression while navigating education systems, often preventing them from completing their education successfully. This research project will investigate the complex experiences of intersectional marginalization of QTBIPOC Disabled students in Canadian post-secondary education systems. Through this investigation, this research seeks to reimagine more inclusive and accessible education systems in Canada and beyond. The social and academic experiences of QTBIPOC Disabled students in education systems are largely absent from scholarly literature, speaking to their continued marginalization and erasure from academic discourses. The lack of representation for this community in academia reinforces the idea that there is no space for marginalized bodies in further education, a discriminatory belief that this research project aims to investigate and reframe. This research study will be informed by Critical Race Theory, Queer Theory and Critical Disability Theories.
The study will employ a blend of critical narrative ethnography and ethnodrama as its methodological framing. These methodologies speak to the intersecting factors that shape the experiences QTBIPOC Disabled students have in education systems, while offering space to analyze and create new systems of learning that benefit all students.

Keywords: QTBIPOC, queer, disability, pedagogy

Procedia PDF Downloads 29
1881 Day-Case Ketamine Infusions in Patients with Chronic Pancreatitis

Authors: S. M. C. Kelly, M. Goulden

Abstract:

Introduction: Chronic pancreatitis is an increasing problem worldwide. Pain is the main symptom and the main reason for hospital readmission following diagnosis, despite the use of strong analgesics, including opioids. Ketamine infusions reduce pain in complex regional pain syndrome and other neuropathic pain conditions. Our centre has trialled the use of ketamine infusions in patients with chronic pancreatitis. We have evaluated this service to assess whether ketamine reduces emergency department admissions and analgesia requirements. Methods: This study collected retrospective data from 2010 onwards on all patients who received a ketamine infusion for chronic pain secondary to a diagnosis of chronic pancreatitis. The day-case ketamine infusions were initiated in theatre by an anaesthetist, with standard monitoring and the assistance of an anaesthetic practitioner. A bolus dose of 0.5 milligrams/kilogram was given in theatre. An infusion of 0.5 milligrams/kilogram per hour was then administered over a 6-hour period in the theatre recovery area. A study proforma detailed the medical history, analgesic use and admissions to hospital. Patients received a telephone follow-up consultation. Results: Over the last eight years, a total of 30 patients have received intravenous ketamine infusions, with a total of 92 infusions administered. 53% of the patients were male, with an average age of 47. A total of 27 patients participated in the telephone consultation. A third of patients reported a reduction in hospital admissions with pain following the ketamine infusion. Analgesia requirements were reduced by an average of 48.3% (range 0-100%) for an average duration of 69.6 days (range 0-180 days). Discussion: This service evaluation illustrates that ketamine infusions can reduce analgesic requirements and the number of hospital admissions in patients with chronic pancreatitis.
In light of increasing pressures on emergency departments and the growing evidence of the dangers of long-term opioid use, this is clearly a useful finding. We are now performing a prospective study to assess the long-term effectiveness of ketamine infusions in reducing analgesia requirements and improving patients’ quality of life.

Keywords: acute-on-chronic pain, intravenous analgesia infusion, ketamine, pancreatitis

Procedia PDF Downloads 139
1880 Need for Contemporization of Craft for Sustenance: A Study on Solapur Wall Hanging

Authors: Reena Aggarwal

Abstract:

Wall art is a manifestation of the human mind and an absorbing form of cultural expression. Solapur wall hanging making reflects cultural values, regional sensibilities, beliefs, and identity, and helps to preserve many different communities. The tango of warp and weft, in more ways than one, tells the story of civilization itself. The Solapur wall hanging is a poem in multicolor, written with the warp and weft, with a long, rich, and complex history of indigenous design vocabularies made by the Padmasali communities. The wall-hanging weaving of Solapur has remained unaltered for years: very basic and monochrome, depicting landscapes and portraits, and catering only to the local market, even though it is a potential tool for family income generation. The study focuses on the need for contemporization of the Solapur wall hanging and also deliberates on the fact that wherever the culture of native people has been aided by intervention, in nearly every case the quality of their craft has begun to improve. The study found that the underlying reasons for diminishing sales were a declining market, lack of innovation in design, and weak product development. Keeping in mind that the artisans of Solapur have always heroically held on to their ancient beliefs and practices, which give them strength, identity, and a sense of pride, an intervention program was developed with the objective of widening the market to include urban consumers and helping artisans earn a sustainable income by creating designs suitable for the urban market. The process of defining and measuring the advantages of design intervention was carried out using qualitative research methods. An ethnographic research methodology was adopted, including six months of close interface with artisans from ten families engaged in making wall hangings in Solapur. Design solutions were proposed in terms of product diversification and design extensions of the existing product line for increased variety.
A collection of contemporary wall arts (wall decor) and room dividers were designed and developed.

Keywords: wall hanging, Solapur, contemporization, traditional, sustainable

Procedia PDF Downloads 329
1879 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty

Abstract:

Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on verification and validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady state and transient conditions. The process of verification and validation helps to qualify the process simulator for its intended purpose, whether that is providing comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A full scope replica operator training simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, with the main participants being engineers/experts from the Modeling, Process Design, and Instrumentation and Control Design teams. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on experts’ comments, final qualification of the simulator for its intended purpose, and the difficulties faced while coordinating the various activities.
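As a hypothetical illustration of the kind of validation testing described above, the sketch below compares a simulated transient against reference plant data and accepts the model if the relative root-mean-square deviation stays within a tolerance band. The data series and the 2% acceptance criterion are invented for illustration and are not taken from the KALBR-SIM procedure.

```python
import math

def rms_deviation(simulated, reference):
    """Relative root-mean-square deviation between two equal-length series."""
    assert len(simulated) == len(reference)
    num = sum((s - r) ** 2 for s, r in zip(simulated, reference))
    den = sum(r ** 2 for r in reference)
    return math.sqrt(num / den)

# Invented transient: e.g. coolant temperature (K) at successive time steps.
reference = [500.0, 505.0, 512.0, 520.0, 524.0]   # plant measurement
simulated = [500.5, 504.2, 512.8, 519.1, 524.6]   # model output

deviation = rms_deviation(simulated, reference)
accepted = deviation <= 0.02    # pass if within an assumed 2% band
```

Real simulator qualification compares many such parameters under both steady-state and transient conditions, against documented acceptance criteria rather than a single ad hoc band.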

Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state

Procedia PDF Downloads 268
1878 Planning the Participation of Units Bound to Demand Response Programs with Regard to Ancillary Services in the PQ Power Market

Authors: Farnoosh Davarian

Abstract:

The present research focuses on organizing the cooperation of units constrained by demand response (DR) programs, considering ancillary services in the P-Q power market. Moreover, it provides a comprehensive exploration of the effects of demand reduction and redistribution across several predefined scenarios (three pre-designed demand response programs, ranging from 5% to 20%) on system voltage and losses in a smart distribution system. In the studied network, distributed energy resources (DERs) such as synchronous distributed generators and wind turbines offer their active and reactive power to the proposed market. GAMS, specialized software for high-level algebraic modeling, is used for optimizing linear, nonlinear, and integer programming problems. A notable feature of GAMS is that the model formulation is separate from the solution method; thus, by changing the solver, the same model can be solved using various methods (linear, nonlinear, integer, etc.). Finally, the combined active and reactive market problem in smart distribution systems, considering renewable distributed sources and demand response programs, is evaluated in GAMS. The active and reactive power trading by the distribution company is carried out in the wholesale market, where the demand itself is for active power. Through the buy-back/payment program, responsive loads or aggregators can participate in the market. The objective function of the proposed market is to minimize the cost of active and reactive power from distributed generation sources and distribution companies, the penalty cost of carbon dioxide (CO2) emissions, and the cost of the buy-back/payment program. The effectiveness of the proposed method has been evaluated in a case study.
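The paper's market model is formulated in GAMS and is not reproduced here; as a rough, language-agnostic illustration of the same kind of cost-minimising dispatch, the toy linear program below (with invented costs, capacities, and demand) covers a fixed demand from two DERs plus a capped demand-response buy-back.

```python
from scipy.optimize import linprog

# Toy cost-minimising dispatch in the spirit of the market described above.
# Decision variables: active power from two DERs and a demand-response
# reduction bought back from consumers. All numbers are invented.
cost = [20.0, 30.0, 25.0]          # $/MWh: DER1, DER2, buy-back payment

# Supplied power plus demand reduction must cover the 100 MW demand.
A_eq = [[1.0, 1.0, 1.0]]
b_eq = [100.0]

# Capacity limits; demand response is capped at 15% of demand.
bounds = [(0.0, 60.0), (0.0, 70.0), (0.0, 15.0)]

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
# Cheapest resources are dispatched first: DER1 at its 60 MW limit, the
# full 15 MW of demand response, and DER2 covering the remaining 25 MW.
```

The actual model in the paper additionally co-optimises reactive power, includes a CO2 penalty term, and is solved with GAMS solvers rather than SciPy, but the structure (linear cost, balance constraint, capacity bounds) is of the same family.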

Keywords: consumer behavior, demand response, pollution cost, combined active and reactive market

Procedia PDF Downloads 17
1877 Features of Technological Innovation Management in Georgia

Authors: Ketevan Goletiani, Parmen Khvedelidze

Abstract:

The paper discusses the importance of the topic: in advanced and developed countries, the formation of a new innovative stage of development has become a distinctive mark of the modern world. This stage involves building an economy based on the generation, accumulation, and use of knowledge. Intensified production and use of the results of new scientific and technical innovation have led to a sharp reduction in cycle times and an accelerated pace of product and technology updates. The world's leading countries have developed innovation management systems that provide conditions for long-term, stable socio-economic development. The last years of the 20th century brought modification of social and economic relations, accelerating economic reforms, and profound changes in the system of the time. At the same time, the country must find its own place in the world geopolitical and economic space. The tasks of accelerated economic development, membership in the World Trade Organization, the deep and comprehensive trade agreement with the European Union, a new system of economic management, technical and technological renewal of production potential, and growth in the share of scientific fields in total GDP all require new approaches. Georgia's socio-economic changes at the turn of the 20th and 21st centuries gave rise to an urgent need for change: moving, at an accelerated pace, from a natural resource-based economy to an innovative, dynamic economy based on the latest scientific and technical achievements. But many methodological, theoretical, and practical problems relating to the management of the economy in various fields, and to the optimal implementation of innovation systems, still remain unresolved in Georgia.
Therefore, the formation of such an innovation system is a complex, multi-faceted problem, which is reflected in the following: the country should have higher growth rates than the neighboring countries in its geopolitical space that are its competitors. The formation of such a system is possible only on the basis of deep theoretical research and the creation of multi-level (micro-, meso-, and macro-level) management of innovative processes.

Keywords: Georgia, innovation, socio-economic development, innovation management

Procedia PDF Downloads 125
1876 Biotechnology Sector in the Context of National Innovation System: The Case of Norway

Authors: Parisa Afshin, Terje Grønning

Abstract:

Norway, similar to many other countries, has in recent years focused its policies on creating new, strong, and highly innovative sectors, as the profitability of the oil and gas sector declines. The biotechnology sector in Norway has great potential, especially in marine biotech and cancer medicine. However, as a peripheral country, Norway faces special challenges on the path to creating an internationally well-known biotech sector and an international knowledge hub. The aim of this article is to analyze the progress of the Norwegian biotechnology industry and its pathway to building an innovation network and conducting collaborative innovation, based on its initial conditions and its own advantages and disadvantages. The findings have important implications not only for politicians and academics in understanding the infrastructure of the biotechnology sector in the country, but also hold important lessons for other peripheral countries or regions aiming to create a strong biotechnology sector and catch up with strong, internationally recognized regions. Data and methodology: To achieve the main goal of this study, information was collected via secondary sources such as web pages and annual reports published by officials and mass media, along with interviews. The data were collected with the goal of shedding light on the brief history and current status of the Norwegian biotechnology sector, as well as the geographic distribution of the biotech industry, followed by the role of academia-industry collaboration and public policies in Norwegian biotech. As knowledge is the key input to innovation, the knowledge perspective of the system, such as knowledge flow in the sector with regard to the national and regional innovation systems, has been studied. Primary results: Internationalization has been an important element in the development of peripheral regions' innovativeness, enabling them to overcome their weaknesses while putting more weight on the importance of regional policies.
Following these findings, suggestions on policy decisions and international collaboration, regarding national and regional systems of innovation, are offered as a means of promoting a strong innovative sector.

Keywords: biotechnology sector, knowledge-based industry, national innovation system, regional innovation system

Procedia PDF Downloads 227
1875 A Retrospective Analysis of the Impact of the Choosing Wisely Canada Campaign on Emergency Department Imaging Utilization for Head Injuries

Authors: Sameer Masood, Lucas Chartier

Abstract:

Head injuries are a commonly encountered presentation in emergency departments (ED), and the Choosing Wisely Canada (CWC) campaign was released in June 2015 in an attempt to decrease imaging utilization for patients with minor head injuries. The impact of the CWC campaign on imaging utilization for head injuries has not been explored in the ED setting. In our study, we describe the characteristics of patients with head injuries presenting to a tertiary care academic ED and the impact of the CWC campaign on CT head utilization. This retrospective cohort study used linked databases from the province of Ontario, Canada to assess emergency department visits with a primary diagnosis of head injury made between June 1, 2014 and Aug 31, 2016 at the University Health Network in Toronto, Canada. We examined the number of visits during the study period, the proportion of patients that had a CT head performed before and after the release of the CWC campaign, as well as mode of arrival and disposition. There were 4,322 qualifying visits at our site during the study period. The median presenting age was 44.12 years (IQR 27.83-67.45), the median GCS was 15 (IQR 15-15), and the majority of patients presenting had intermediate acuity (CTAS 3). Overall, 43.17% of patients arrived via ambulance, 49.24% of patients received a CT head, and 10.46% of patients were admitted. Compared to patients presenting before the CWC campaign release, there was no significant difference in the rate of CT head scans after the CWC campaign release (50.41% vs 47.68%, P = 0.07). There were also no significant differences between the two groups in mode of arrival (ambulance vs ambulatory) (42.94% vs 43.48%, P = 0.72) or admission rates (9.85% vs 11.26%, P = 0.15). However, more patients belonged to the high acuity groups (CTAS 1 or 2) in the post-CWC campaign release group (12.98% vs 8.11%, P < 0.001).
Visits for head injuries make up a significant proportion of total ED visits and approximately half of these patients receive CT imaging in the ED. The CWC campaign did not seem to impact imaging utilization for head injuries in the 14 months following its launch. Further efforts, including local quality improvement initiatives, are likely needed to increase adherence to its recommendation and reduce imaging utilization for head injuries.
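The before/after comparison of imaging rates reported above is a standard two-proportion comparison; the sketch below reproduces the style of that analysis on invented counts (the abstract does not report the raw numbers), chosen so the proportions resemble the reported 50.41% vs 47.68% figures.

```python
from scipy.stats import chi2_contingency

# Invented 2x2 contingency table, NOT the study's actual data.
# Rows: before / after the campaign; columns: CT performed / not performed.
table = [[620, 610],    # before: 620 of 1230 scanned (~50.4%)
         [740, 812]]    # after:  740 of 1552 scanned (~47.7%)

chi2, p, dof, expected = chi2_contingency(table)
significant = p < 0.05  # as in the study, a small drop need not be significant
```

With counts of this size, a difference of a few percentage points does not reach significance, which mirrors the study's non-significant result (P = 0.07) despite the slightly lower post-campaign rate.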

Keywords: choosing wisely, emergency department, head injury, quality improvement

Procedia PDF Downloads 228
1874 Geomorphometric Analysis of the Hydrologic and Topographic Parameters of the Katsina-Ala Drainage Basin, Benue State, Nigeria

Authors: Oyatayo Kehinde Taofik, Ndabula Christopher

Abstract:

Drainage basins are a central theme in the green economy. Rising challenges in flooding, erosion, sediment transport and sedimentation threaten the green economy. This has led to increasing emphasis on quantitative analysis of drainage basin parameters for better understanding, estimation and prediction of fluvial responses and thus of associated hazards or disasters. This can be achieved through direct measurement, characterization, parameterization, or modeling. This study applied a Remote Sensing and Geographic Information System approach to the parameterization and characterization of the morphometric variables of the Katsina-Ala basin, using a 30 m resolution Shuttle Radar Topographic Mission (SRTM) Digital Elevation Model (DEM). This was complemented with topographic and hydrological maps of Katsina-Ala at a scale of 1:50,000. Linear, areal and relief parameters were characterized. The results of the study show that the Ala and Udene sub-watersheds are 4th and 5th order basins, respectively. The stream network shows a dendritic pattern, indicating homogeneity in texture and a lack of structural control in the study area. The Ala and Udene sub-watersheds have the following values for elongation ratio, circularity ratio, form factor and relief ratio: 0.48 / 0.39 / 0.35 / 9.97 and 0.40 / 0.35 / 0.32 / 6.0, respectively. They also have drainage texture and ruggedness index values of 0.86 / 0.011 and 1.57 / 0.016. The study concludes that the two sub-watersheds are elongated, suggesting that they are susceptible to erosion and thus to higher sediment loads in the river channels, which will predispose the watersheds to higher flood peaks. The study also concludes that the sub-watersheds have a very coarse texture, with good permeability of subsurface materials and infiltration capacity, which significantly recharges the groundwater.
The study recommends that efforts be put in place by the Local and State Governments to reduce the size of paved surfaces in these sub-watersheds by implementing a robust agroforestry program at the grassroots level.
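The dimensionless shape indices quoted above (elongation ratio, circularity ratio, form factor) have standard definitions due to Schumm, Miller, and Horton; the sketch below applies those formulas to invented basin measurements, not to the study's actual Katsina-Ala data.

```python
import math

def elongation_ratio(A, Lb):
    """Schumm: diameter of a circle with the basin's area over basin length."""
    return (2.0 / Lb) * math.sqrt(A / math.pi)

def circularity_ratio(A, P):
    """Miller: basin area over the area of a circle with the same perimeter."""
    return 4.0 * math.pi * A / P ** 2

def form_factor(A, Lb):
    """Horton: basin area over the square of basin length."""
    return A / Lb ** 2

# Hypothetical sub-watershed: area (km^2), perimeter (km), basin length (km).
A, P, Lb = 1500.0, 260.0, 90.0

Re = elongation_ratio(A, Lb)   # low values indicate an elongated basin
Rc = circularity_ratio(A, P)
Ff = form_factor(A, Lb)
```

For these invented measurements the elongation ratio comes out below 0.5, i.e. in the elongated class, comparable to the 0.48 and 0.40 values the study reports for Ala and Udene.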

Keywords: erosion, flood, mitigation, morphometry, watershed

Procedia PDF Downloads 93
1873 Interactive Virtual Patient Simulation Enhances Pharmacology Education and Clinical Practice

Authors: Lyndsee Baumann-Birkbeck, Sohil A. Khan, Shailendra Anoopkumar-Dukie, Gary D. Grant

Abstract:

Technology-enhanced education tools are being rapidly integrated into health programs globally. These tools provide an interactive platform for students and can be used to deliver topics in various modes, including games and simulations. Simulations are of particular interest to healthcare education, where they are employed to enhance clinical knowledge and help to bridge the gap between theory and practice. Simulations will often assess competencies for practical tasks, yet limited research examines the effects of simulation on student perceptions of their learning. The aim of this study was to determine the effects of an interactive virtual patient simulation for pharmacology education and clinical practice on student knowledge, skills and confidence. Ethics approval for the study was obtained from the Griffith University Research Ethics Committee (PHM/11/14/HREC). The simulation was intended to replicate the pharmacy environment and patient interaction. The content was designed to enhance knowledge of proton-pump inhibitor pharmacology, its role in therapeutics and safe supply to patients. The tool was deployed into a third-year clinical pharmacology and therapeutics course. A number of core practice areas were examined, including the competency domains of questioning, counselling, referral and product provision. Baseline measures of student self-reported knowledge, skills and confidence were taken prior to the simulation using a specifically designed questionnaire. A more extensive questionnaire was deployed following the virtual patient simulation, which also included measures of student engagement with the activity. A quiz assessing student factual and conceptual knowledge of proton-pump inhibitor pharmacology and related counselling information was also included in both questionnaires. Sixty-one students (response rate >95%) from two cohorts (2014 and 2015) participated in the study. Chi-square analyses were performed, and data were analysed using Fisher's exact test.
Results demonstrate that student knowledge, skills and confidence within the competency domains of questioning, counselling, referral and product provision showed improvement following the implementation of the virtual patient simulation. Statistically significant (p<0.05) improvement occurred in ten of the possible twelve self-reported measurement areas. The greatest magnitude of improvement occurred in the area of counselling (student confidence, p<0.0001). Student confidence in all domains (questioning, counselling, referral and product provision) showed a marked increase. Student performance in the quiz also improved, demonstrating a 10% overall improvement in pharmacology knowledge and clinical practice following the simulation. Overall, 85% of students reported the simulation to be engaging, and 93% of students felt the virtual patient simulation enhanced learning. The data suggest that the interactive virtual patient simulation developed for clinical pharmacology and therapeutics education enhanced students' knowledge, skills and confidence with respect to the competency domains of questioning, counselling, referral and product provision. These self-reported measures appear to translate to learning outcomes, as demonstrated by the improved student performance in the quiz assessment item. Future research on education using virtual simulation should seek to incorporate modern quantitative measures of student learning and engagement, such as eye tracking.
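For readers interested in the statistical analysis named above, the sketch below runs Fisher's exact test on an invented 2x2 table (the study's raw counts are not given in the abstract).

```python
from scipy.stats import fisher_exact

# Invented 2x2 table, NOT the study's actual data.
# Rows: quiz item answered correctly / incorrectly;
# columns: after simulation / before simulation.
table = [[8, 2],
         [1, 5]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
improved = p_value < 0.05
```

Fisher's exact test is the usual choice over the chi-square approximation when, as here, some cell counts are small.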

Keywords: clinical simulation, education, pharmacology, simulation, virtual learning

Procedia PDF Downloads 343
1872 Impact of 99mTc-MDP Bone SPECT/CT Imaging in Failed Back Surgery Syndrome

Authors: Ching-Yuan Chen, Lung-Kwang Pan

Abstract:

Objective: Back pain is a major health problem, costing the Taiwanese health budget billions annually. Thousands of back surgeries are performed each year, with up to 40% of patients complaining of back pain after surgery, a condition termed failed back surgery syndrome (FBSS); diagnosis in these patients may be complex. The aim of this study was to assess the feasibility of using bone SPECT/CT imaging to localize the active lesions causing persistent, recurrent or new backache after spine surgery. Materials and Methods: Bone SPECT/CT imaging was performed after intravenous injection of 20 mCi of 99mTc-MDP in all patients with a diagnosis of FBSS. Patients were evaluated for subjective pain relief, functional improvement and degree of satisfaction by reviewing medical records and questionnaires over a follow-up of more than two years. Results: A total of 16 patients surveyed in our hospital from Jan. 2015 to Dec. 2016 were enrolled. Four patients with significant lesions confirmed on SPECT/CT imaging underwent revision surgery (surgical treatment group); in this group, the mean visual analogue scale (VAS) score decreased by 5.3 points and the mean Oswestry disability index (ODI) improved by 38 points. The remaining 12 patients, whose SPECT/CT imaging showed no significant lesions, received drug treatment (medical treatment group); their mean VAS decreased by only 2.1 points and mean ODI improved by 12.6 points. In the post-therapeutic evaluation, the pain of the surgical treatment group showed satisfactory improvement. In the medical treatment group, 10 of the 12 patients were also satisfied with their symptom relief, while the other 2 did not improve significantly. Conclusions: Findings on SPECT/CT imaging readily explained the patients' pain. We recommend SPECT/CT imaging as a feasible and useful clinical tool for improving diagnostic confidence and specificity when evaluating patients with FBSS.

Keywords: failed back surgery syndrome, Oswestry disability index, SPECT/CT imaging, 99mTc-MDP, visual analogue scale

Procedia PDF Downloads 178
1871 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea

Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi

Abstract:

Synoptic patterns from the surface up to the tropopause are very important for forecasting weather and atmospheric conditions, and many tools exist to prepare and analyze these maps. Reanalysis data, the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in forecasting centers worldwide to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is particularly challenging due to the complex topography and the variety of climate types in these areas. In this research, we used two reanalysis datasets, the ECMWF Reanalysis 5th Generation (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest ECMWF reanalysis; its temporal resolution is hourly, while NCEP/NCAR data are provided every six hours. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature were selected and analyzed, and different precipitation types (rain and snow) were examined. The results showed that NCEP/NCAR better demonstrates the intensity of atmospheric systems, whereas ERA5 is better suited to extracting parameter values at a specific point and to analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it; the sea surface temperature in the NCEP/NCAR product has low resolution near the coast. Both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS; however, due to their time lag they are not suitable for forecast centers, and their application lies in research and the verification of meteorological models. Finally, ERA5 has better resolution than the NCEP/NCAR reanalysis, but NCEP/NCAR data are available from 1948 and are appropriate for long-term research.
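The point-extraction use case described for ERA5 can be sketched with xarray. The grid below is synthetic and the variable name "t2m" is only an assumption modeled on typical ERA5 NetCDF downloads, not the study's actual files:

```python
# Sketch: nearest-grid-point extraction from a gridded reanalysis, plus
# downsampling hourly data to the 6-hourly NCEP/NCAR cadence.
import numpy as np
import pandas as pd
import xarray as xr

# Synthetic "reanalysis" grid standing in for an ERA5 subset over the
# southern Caspian coast (0.25-degree spacing, hourly time axis).
times = pd.date_range("2020-01-01", periods=48, freq="h")
lats = np.arange(39.0, 35.9, -0.25)
lons = np.arange(48.0, 52.1, 0.25)
data = np.random.rand(len(times), len(lats), len(lons))
ds = xr.Dataset(
    {"t2m": (("time", "latitude", "longitude"), data)},
    coords={"time": times, "latitude": lats, "longitude": lons},
)

# Extract the grid point nearest a hypothetical coastal station.
point = ds["t2m"].sel(latitude=37.28, longitude=49.59, method="nearest")

# Average hourly values into 6-hourly bins to match NCEP/NCAR output times.
six_hourly = point.resample(time="6h").mean()
```

`method="nearest"` is what makes ERA5 convenient for station-style point values; the same call against the coarser NCEP/NCAR grid would snap to a more distant grid cell.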

Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow

Procedia PDF Downloads 126
1870 A Comprehensive Survey on Machine Learning Techniques and User Authentication Approaches for Credit Card Fraud Detection

Authors: Niloofar Yousefi, Marie Alaghband, Ivan Garibay

Abstract:

With the increase of credit card usage, the volume of credit card misuse has also increased significantly, which may cause appreciable financial losses for both credit card holders and the financial organizations issuing the cards. As a result, financial organizations are working hard on developing and deploying credit card fraud detection methods, in order to adapt to ever-evolving, increasingly sophisticated defrauding strategies and to identify illicit transactions as quickly as possible to protect themselves and their customers. Compounding the complex nature of such adverse strategies, credit card fraudulent activities are rare events compared to the number of legitimate transactions. Hence, the challenge of developing fraud detection methods that are accurate and efficient is substantially intensified and, as a consequence, credit card fraud detection has lately become a very active area of research. In this work, we provide a survey of current techniques most relevant to the problem of credit card fraud detection. We carry out our survey in two main parts. In the first part, we focus on studies utilizing classical machine learning models, which mostly employ traditional transactional features to make fraud predictions. These models typically rely on static characteristics, such as what the user knows (knowledge-based methods) or what he/she has access to (object-based methods). In the second part of our survey, we review more advanced techniques of user authentication, which use behavioral biometrics to identify an individual based on his/her unique behavior while interacting with his/her electronic devices. These approaches rely on how people behave (rather than what they know or possess), which cannot be easily forged. By providing an overview of current approaches and the results reported in the literature, this survey aims to drive the future research agenda for the community in order to develop more accurate, reliable and scalable credit card fraud detection models.
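The rarity of fraudulent transactions highlighted above is the central modelling difficulty for the classical machine learning approaches surveyed. A minimal sketch of one standard mitigation, class reweighting, on a synthetic and deliberately imbalanced dataset (all names and parameters here are illustrative, not from any study in the survey):

```python
# Sketch: handling extreme class imbalance in fraud detection by
# reweighting the rare positive ("fraud") class during training.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

# Synthetic "transactions": roughly 0.5% of samples are fraudulent.
X, y = make_classification(
    n_samples=20_000, n_features=10, weights=[0.995, 0.005], random_state=0
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" scales each class inversely to its frequency,
# so the optimizer cannot minimize loss by ignoring the rare fraud class.
clf = LogisticRegression(class_weight="balanced", max_iter=1000)
clf.fit(X_tr, y_tr)

# Recall on the fraud class is the metric that imbalance silently destroys.
fraud_recall = recall_score(y_te, clf.predict(X_te))
```

An unweighted classifier on data this skewed can reach 99.5% accuracy by predicting "legitimate" for everything, which is why recall on the minority class, rather than accuracy, is the meaningful measure here.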

Keywords: credit card fraud detection, user authentication, behavioral biometrics, machine learning, literature survey

Procedia PDF Downloads 126
1869 Esophageal Premalignant and Malignant Epithelial Lesions: Pathological Characteristics and Value of Cyclooxygenase-2 Expression

Authors: Hanan Mohamed Abd Elmoneim, Rawan Saleh AlJawi, Razan Saleh AlJawi, Aseel Abdullah AlMasoudi , Zyad Adnan Turkistani, Anas Abdulkarim Alkhoutani , Ohood Musaed AlJuhani , Hanan Attiyah AlZahrani

Abstract:

Background: Esophageal cancer is the eighth most common cancer worldwide. More than 90% of esophageal cancers are either squamous cell carcinoma or adenocarcinoma. Squamous dysplasia is the precancerous lesion for squamous cell carcinoma, and Barrett's esophagus is the precancerous lesion for adenocarcinoma. Gastro-esophageal reflux disease (GERD) is the initiating factor for Barrett's esophagus. Cyclooxygenase-2 (COX-2) is a key enzyme in arachidonic acid metabolism and appears to play an important role in gastrointestinal carcinogenesis. COX-2 activity may be a potential target for the prevention of cancer progression by selective COX-2 inhibitors, which decrease proliferation and increase apoptosis. Objectives: To assess COX-2 expression in premalignant and malignant esophageal epithelial lesions and to determine its role in the progression of these lesions. Materials and Methods: We analyzed COX-2 expression immunohistochemically in 40 esophageal biopsies using the streptavidin-biotin-peroxidase complex method on archival formalin-fixed, paraffin-embedded blocks. Histopathologically, 17 cases (42.5%) were non-malignant, comprising GERD, Barrett's esophagus and squamous dysplasia; the 23 malignant cases (57.5%) comprised squamous cell carcinoma, adenocarcinoma and undifferentiated carcinoma. Results: Among the non-malignant cases, 7 of 17 (41.2%) showed high COX-2 expression. In squamous cell carcinoma, 10 of 12 cases (83.3%) showed high COX-2 expression, and expression was high in all 9 (100%) adenocarcinoma cases. COX-2 expression was significantly increased in squamous cell carcinoma and adenocarcinoma (P=0.005 and P=0.0001, respectively), and there was a significant difference in COX-2 immunoreactivity between malignant and non-malignant lesions (P=0.0003). Conclusion: COX-2 is involved in the progression of esophageal disease from benign to malignant. We recommend that COX-2 immunohistochemistry be performed routinely for premalignant and malignant esophageal lesions, as selective COX-2 inhibitors may be helpful in treatment. Further studies on the molecular and genetic basis of COX-2 expression are needed to clarify its role in the progression of esophageal lesions.
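The malignant versus non-malignant comparison can be re-derived from the counts itemised in the abstract. This sketch uses only the squamous cell carcinoma and adenocarcinoma counts, because the COX-2 status of the undifferentiated cases is not itemised, and it applies SciPy's Fisher's exact test since the abstract does not state which test produced P=0.0003; the result will therefore not exactly match the reported value:

```python
# Contingency analysis of high vs low COX-2 expression, rebuilt from the
# per-group counts given in the abstract (undifferentiated cases omitted).
from scipy.stats import fisher_exact

#                high COX-2  low COX-2
malignant     = [19, 2]    # 10/12 SCC high + 9/9 adenocarcinoma high
non_malignant = [7, 10]    # 7/17 GERD / Barrett's / dysplasia high

odds_ratio, p = fisher_exact([malignant, non_malignant])
print(f"odds ratio = {odds_ratio:.2f}, p = {p:.4g}")
```

Even on this reduced table the association is strongly significant, consistent with the direction of the reported result.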

Keywords: COX-2, esophageal adenocarcinoma, esophageal squamous cell carcinoma, immunohistochemistry

Procedia PDF Downloads 355
1868 An Advanced Image-Based Intelligent System for Enhancing Construction Site Safety Monitoring and Analysis

Authors: Hijratullah Sharifzada, You Wang, Said Ikram Sadat, Hamza Javed, Khalid Akhunzada, Sidra Javed, Sadiq Khan

Abstract:

In the construction industry, safety is of paramount importance given the complex and dynamic nature of construction sites, which are prone to various hazards like falls from heights, being hit by falling objects, and structural collapses. Traditional safety management strategies such as manual inspections and safety training have shown significant limitations. This study presents an intelligent monitoring and analysis system for construction site safety based on an image dataset. A specifically designed Construction Site Safety Image Dataset, comprising 10 distinct classes of objects commonly found on sites, is utilized and divided into training, validation, and test subsets. InceptionV3 and MobileNetV2 are chosen as pre-trained models for feature extraction and are modified through truncation and compression to better suit the task. A Feature Fusion architecture is introduced, integrating these modified models along with a Squeeze-and-Excitation block. Experimental results demonstrate that the proposed model achieves a mean Average Precision (mAP) of 0.81 at an IoU threshold of 0.5, with high accuracies for classes like "Safety Cone" (91%) and "Machinery" (93%) but relatively lower accuracy for "Vehicle" (57%). The training process exhibits smooth convergence, and compared to prior methods such as YOLOv4 and SSD, the proposed framework shows superiority in precision and recall. Despite its achievements, the system has limitations, including reliance on visual data and dataset imbalance. Future research directions involve incorporating multi-modal data, conducting real-world deployments, and optimizing for edge deployment, aiming to further enhance construction site safety.
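The reported mAP at an IoU threshold of 0.5 rests on the intersection-over-union measure for predicted versus ground-truth boxes. A minimal generic reference implementation for axis-aligned boxes (a standard formulation, not the authors' code):

```python
# Intersection-over-union for two axis-aligned boxes in (x1, y1, x2, y2) form.
def iou(box_a, box_b):
    """Return the IoU of two boxes; 0.0 when they do not overlap."""
    # Coordinates of the intersection rectangle.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])

    # Clamp to zero so disjoint boxes contribute no intersection area.
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

Under the mAP@0.5 criterion used above, a detection is counted as a true positive only when its IoU with a ground-truth box of the same class reaches at least 0.5.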

Keywords: construction site safety, intelligent monitoring system, image dataset, InceptionV3, MobileNetV2, feature fusion, squeeze-and-excitation block, mean average precision, object detection

Procedia PDF Downloads 11