Search results for: long memory
5709 Efficient Motion Estimation by Fast Three Step Search Algorithm
Authors: S. M. Kulkarni, D. S. Bormane, S. L. Nalbalwar
Abstract:
Rapid technological development has had a dramatic impact on the medical health care field. Medical databases obtained with the latest machines, such as CT and MRI scanners, require a large amount of memory for storage as well as large bandwidth for transmission in telemedicine applications; hence, there is a need for video compression. Because a medical image database contains many frames (slices), coding these images requires motion estimation. Motion estimation finds the movement of objects in an image sequence and produces motion vectors that represent the estimated motion of objects in the frame. Motion compensation is then performed to reduce the temporal redundancy between successive frames of the video sequence. In this paper, the three step search (TSS) block matching algorithm is implemented on different types of video sequences. It is shown that the three step search algorithm produces better quality performance and lower computational time compared with the exhaustive full search algorithm. Keywords: block matching, exhaustive search motion estimation, three step search, video compression
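Not part of the abstract, but as an illustration of the search pattern it describes, the following Python sketch implements a basic three step search over a sum-of-absolute-differences cost; the block size, starting step, and cost function are assumptions chosen for the example, not details taken from the paper.

```python
import numpy as np

def sad(cur_block, ref, x, y, block=8):
    """Sum of absolute differences between the current block and the candidate
    reference block whose top-left corner is (x, y); inf if out of bounds."""
    h, w = ref.shape
    if x < 0 or y < 0 or x + block > w or y + block > h:
        return np.inf
    return np.abs(cur_block.astype(int) - ref[y:y + block, x:x + block].astype(int)).sum()

def three_step_search(cur, ref, bx, by, block=8, step=4):
    """Return the motion vector (dx, dy) for the block at (bx, by) of the current frame."""
    cur_block = cur[by:by + block, bx:bx + block]
    cx, cy = bx, by                               # search centre in the reference frame
    best = sad(cur_block, ref, cx, cy, block)
    while step >= 1:
        best_x, best_y = cx, cy
        for dy in (-step, 0, step):               # centre + 8 neighbours at this step size
            for dx in (-step, 0, step):
                cost = sad(cur_block, ref, cx + dx, cy + dy, block)
                if cost < best:
                    best, best_x, best_y = cost, cx + dx, cy + dy
        cx, cy = best_x, best_y
        step //= 2                                # halve the step: 4 -> 2 -> 1 -> stop
    return cx - bx, cy - by
```

With the initial step of 4, the search examines at most 25 candidate positions per block instead of the 225 a full ±7 exhaustive search would visit, which is the source of the speed-up the abstract reports.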
Procedia PDF Downloads 489
5708 Glaucoma Detection in Retinal Tomography Using the Vision Transformer
Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan
Abstract:
Glaucoma is a chronic eye condition that causes vision loss that is irreversible. Early detection and treatment are critical to prevent vision loss because it can be asymptomatic. For the identification of glaucoma, multiple deep learning algorithms are used. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire extremely expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. The aforementioned statements inspire this thesis to look at transformer-based solutions and investigate the viability of adopting transformer-based network designs for glaucoma detection. Using retinal fundus images of the optic nerve head to develop a viable algorithm to assess the severity of glaucoma necessitates a large number of well-curated images. Initially, data is generated by augmenting ocular pictures. After that, the ocular images are pre-processed to make them ready for further processing. The system is trained using pre-processed images, and it classifies the input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this situation, as it allows the self-attention mechanism to utilise structural modeling. Extensive experiments are run on the common dataset, and the results are thoroughly validated and visualized.Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning
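The abstract gives no implementation details; purely as a hedged illustration of the kind of ViT-style classifier it describes, here is a minimal PyTorch sketch of patch embedding, a class token, transformer encoder layers, and a binary (normal vs. glaucoma) head. The patch size, embedding width, and depth are arbitrary assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class TinyViT(nn.Module):
    """Minimal ViT-style classifier: patch embedding + class token + transformer encoder."""
    def __init__(self, img_size=224, patch=16, dim=192, depth=4, heads=4, classes=2):
        super().__init__()
        n_patches = (img_size // patch) ** 2
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, n_patches + 1, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           dim_feedforward=4 * dim, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, classes)   # normal vs. glaucoma

    def forward(self, x):                     # x: (batch, 3, 224, 224) fundus images
        tokens = self.patch_embed(x).flatten(2).transpose(1, 2)   # (batch, patches, dim)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        tokens = torch.cat([cls, tokens], dim=1) + self.pos_embed
        encoded = self.encoder(tokens)        # self-attention over all patches
        return self.head(encoded[:, 0])       # classify from the class token

logits = TinyViT()(torch.randn(2, 3, 224, 224))  # -> shape (2, 2)
```

The self-attention layers are what give the model the long-range dependencies the abstract contrasts with convolutional inductive biases.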
Procedia PDF Downloads 189
5707 Tip-Apex Distance as a Long-Term Risk Factor for Hospital Readmission Following Intramedullary Fixation of Intertrochanteric Fractures
Authors: Brandon Knopp, Matthew Harris
Abstract:
Purpose: Tip-apex distance (TAD) has long been discussed as a metric for determining risk of failure in the fixation of peritrochanteric fractures. TAD measurements over 25 millimeters (mm) have been associated with higher rates of screw cut out and other complications in the first several months after surgery. However, there is limited evidence for the efficacy of this measurement in predicting the long-term risk of negative outcomes following hip fixation surgery. The purpose of our study was to investigate risk factors including TAD for hospital readmission, loss of pre-injury ambulation and development of complications within 1 year after hip fixation surgery. Methods: A retrospective review of proximal hip fractures treated with single screw intramedullary devices between 2016 and 2020 was performed at a 327-bed regional medical center. Patients included had a postoperative follow-up of at least 12 months or surgery-related complications developing within that time. Results: 44 of the 67 patients in this study met the inclusion criteria with adequate follow-up post-surgery. There was a total of 10 males (22.7%) and 34 females (77.3%) meeting inclusion criteria with a mean age of 82.1 (± 12.3) at the time of surgery. The average TAD in our study population was 19.57mm and the average 1-year readmission rate was 15.9%. 3 out of 6 patients (50%) with a TAD > 25mm were readmitted within one year due to surgery-related complications. In contrast, 3 out of 38 patients (7.9%) with a TAD < 25mm were readmitted within one year due to surgery-related complications (p=0.0254). Individual TAD measurements, averaging 22.05mm in patients readmitted within 1 year of surgery and 19.18mm in patients not readmitted within 1 year of surgery, were not significantly different between the two groups (p=0.2113). Conclusions: Our data indicate a significant improvement in hospital readmission rates up to one year after hip fixation surgery in patients with a TAD < 25mm with a decrease in readmissions of over 40% (50% vs 7.9%). This result builds upon past investigations by extending the follow-up time to 1 year after surgery and utilizing hospital readmissions as a metric for surgical success. With the well-documented physical and financial costs of hospital readmission after hip surgery, our study highlights a reduction of TAD < 25mm as an effective method of improving patient outcomes and reducing financial costs to patients and medical institutions. No relationship was found between TAD measurements and secondary outcomes, including loss of pre-injury ambulation and development of complications.Keywords: hip fractures, hip reductions, readmission rates, open reduction internal fixation
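As a check on the arithmetic reported above, the 2x2 readmission table (3 of 6 patients readmitted with TAD > 25 mm versus 3 of 38 with TAD < 25 mm) can be tested with Fisher's exact test in SciPy; a one-sided test gives roughly 0.025, consistent with the reported p = 0.0254. The abstract does not state which test or sidedness the authors used, so this is only an illustrative reconstruction.

```python
from scipy.stats import fisher_exact

# Rows: TAD > 25 mm, TAD < 25 mm; columns: readmitted within 1 year, not readmitted
table = [[3, 3],
         [3, 35]]

odds_ratio, p_one_sided = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_one_sided:.4f}")  # ~0.0254
```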
Procedia PDF Downloads 144
5706 Immediate and Long-Term Effect of the Sawdust Usage on Shear Strength of the Clayey Silt Soil
Authors: Dogan Cetin, Omar Hamdi Jasim
Abstract:
Using additives is a very common method to improve soil properties such as shear strength and bearing capacity and to reduce settlement and lateral deformation. Soil reinforcement with natural materials is an attractive way to improve soil properties because of its low cost; however, studies conducted with natural additives are very limited. This paper presents the results of an investigation into the immediate and long-term effects of sawdust on the shear strength behavior of a clayey silt soil obtained in Arnavutkoy, Istanbul. Firstly, compaction tests were conducted to determine the optimum moisture content for every percentage of sawdust, and samples were obtained from soil compacted at optimum moisture content. UU triaxial tests were conducted to evaluate the effect of randomly distributed sawdust on the strength of the low-plasticity clayey silt soil. The specimens were tested with 1%, 2% and 3% sawdust content. It was found that the undrained shear strength of the clay soil with 1%, 2% and 3% sawdust was respectively 4.65%, 27.9% and 39.5% higher than that of the soil without additive, while at 5% sawdust the shear strength decreased by 3.8%. After a 90-day cure period, the shear strength of the soil with 1%, 2%, 3% and 5% sawdust increased by 251%, 302%, 260% and 153%, respectively. It can be said that sawdust usage has a remarkable effect on the undrained shear strength of the soil. Besides increasing the undrained shear strength, it was also found that the sawdust decreases the liquid limit, plastic limit and plasticity index by 5.5%, 2.9% and 10.9%, respectively. Keywords: compaction test, sawdust, shear strength, UU Triaxial Test
Procedia PDF Downloads 352
5705 A Review of Encryption Algorithms Used in Cloud Computing
Authors: Derick M. Rakgoale, Topside E. Mathonsi, Vusumuzi Malele
Abstract:
Cloud computing offers distributed online and on-demand computational services from anywhere in the world. Cloud computing services have grown immensely over the past years, especially in the past year due to the Coronavirus pandemic. Cloud computing has changed the working environment and introduced the work-from-home phenomenon, which drove the adoption of technologies, including cloud service offerings, to support the new ways of working. The increased cloud computing adoption has come with new challenges regarding data privacy and its integrity in the cloud environment. Previously proposed advanced encryption algorithms failed to reduce the memory space required for cloud computing performance, thus increasing the computational cost. This paper reviews the existing encryption algorithms used in cloud computing. In future work, an artificial neural network (ANN) algorithm design will be presented as a security solution to ensure data integrity, confidentiality, privacy, and availability of user data in cloud computing. Moreover, MATLAB will be used to evaluate the proposed solution, and simulation results will be presented. Keywords: cloud computing, data integrity, confidentiality, privacy, availability
Procedia PDF Downloads 130
5704 Examination of Media and Electoral Violence in Kogi State, Nigeria
Authors: Chris Ogwu Attah, Okpanachi Linus Odiji
Abstract:
An election is no doubt a universally accepted means of resolving societal problems, particularly those with political connotations. While the process has often been conducted in advanced democracies without attacks on opponents and the populace, that ambiance of political tranquillity has hardly been enjoyed in many African states. While the violent nature of polls in this part of the globe has long been linked, among other things, to monetization and the zero-sum character of politics, emerging trends show how the increasing rate of electoral violence may not be unconnected to the broadcasting of violent acts in the media. Anchored on the age-long complaints about the possible deleterious effects of mass media and Plato’s concern about the effects of plays on the youth, this study aims to interrogate the relationship between the media and electoral violence in Nigeria, using Kogi State as a case study. While Social Cognitive Theory is adopted to guide the study, data were elicited primarily from a multi-stage sampling arrangement in which respondents from three purposively selected locations (Anyigba, Lokoja, and Okene) were randomly selected. Using chi-square to test the assumption that media violence catalyzes electoral violence in Kogi State, it was discovered, among other revelations, that electoral violence increases numerically with the depiction of violence in the media. As a recommendation, therefore, this paper advocates that Civil Society Organisations, as well as relevant governmental agencies, should carry out mass political education aimed at instilling political morals in the populace, especially the youths. Keywords: electoral violence, media, media violence, violence
Procedia PDF Downloads 152
5703 A Study of Evolutional Control Systems
Authors: Ti-Jun Xiao, Zhe Xu
Abstract:
Controllability is one of the fundamental issues in control systems. In this paper, we study the controllability of second order evolutional control systems in Hilbert spaces with memory and boundary controls, which model dynamic behaviors of some viscoelastic materials. Transferring the control problem into a moment problem and showing the Riesz property of a family of functions related to Cauchy problems for some integrodifferential equations, we obtain a general boundary controllability theorem for these second order evolutional control systems. This controllability theorem is applicable to various concrete 1D viscoelastic systems and recovers some previous related results. It is worth noting that Riesz sequences can be used for numerical computations of the control functions and the identification of new Riesz sequence is of independent interest for the basis-function theory. Moreover, using the Riesz sequences, we obtain the existence and uniqueness of (weak) solutions to these second order evolutional control systems in Hilbert spaces. Finally, we derive the exact boundary controllability of a viscoelastic beam equation, as an application of our abstract theorem.Keywords: evolutional control system, controllability, boundary control, existence and uniqueness
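The abstract does not reproduce the equations; for orientation only, a second order evolution system with memory and boundary control of the general type studied for viscoelastic materials can be written as (this generic form is an assumption for illustration, not the authors' exact formulation):

$$u''(t) + A\,u(t) - \int_0^t g(t-s)\,A\,u(s)\,ds = 0, \qquad u(0)=u_0,\quad u'(0)=u_1,$$

where $A$ is a self-adjoint operator on a Hilbert space $H$, $g$ is a memory kernel, and the control $f(t)$ acts through the boundary condition, e.g. $u|_{\Gamma_0}(t) = f(t)$. Exact boundary controllability then asks whether, for any target pair $(v_0, v_1)$, a control $f$ can be chosen on $(0,T)$ so that $u(T)=v_0$ and $u'(T)=v_1$; recasting this as a moment problem is where the Riesz sequence property mentioned in the abstract enters.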
Procedia PDF Downloads 220
5702 Causality between Stock Indices and Cryptocurrencies during the Russia-Ukraine War
Authors: Nidhal Mgadmi, Abdelhafidh Othmani
Abstract:
This article examines the causal relationship between stock indices and cryptocurrencies during the current war between Russia and Ukraine. The econometric investigation runs from February 24, 2022, to April 12, 2023, focusing on seven stock market indices (S&P500, DAX, CAC40, Nikkei, TSX, MOEX, and PFTS) and seven cryptocurrencies (Bitcoin, Ethereum, Litecoin, Dash, Ripple, DigiByte and XEM). In this article, we try to understand how investors react to fluctuations in financial assets and seek safe havens in cryptocurrencies. We used dynamic causality to detect a possible short-term causal relationship and seven models to estimate the long-term relationship between cryptocurrencies and financial assets. The causal relationship between financial market indices and cryptocurrency coins in the short run indicates that three famous cryptocurrencies (Bitcoin, Ethereum, Ripple) and two less popular digital assets (XEM, DigiByte) are impacted by the German, Russian, and Ukrainian stock markets. In the long run, we found a positive and significant effect of the American, Canadian, French, and Ukrainian stock market indices on Bitcoin. Thus, the stability of the traditional financial markets during the current war period can be explained, on the one hand, by investors’ fears of an unstable business climate and, on the other hand, by speculators’ sentiment towards new electronic products, which are perceived as hedging instruments and a safe haven in the face of the conflict between Ukraine and Russia. Keywords: causality, stock indices, cryptocurrency, war, Russia, Ukraine
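The abstract does not specify the exact short-run estimator; as a hedged sketch of the kind of pairwise causality testing it describes, Granger-causality tests can be run with statsmodels. The series names and the synthetic data below are placeholders, not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 300                                               # placeholder daily returns
dax = rng.normal(0, 1, n)
btc = 0.3 * np.roll(dax, 1) + rng.normal(0, 1, n)     # BTC partly driven by lagged DAX

data = pd.DataFrame({"btc_return": btc, "dax_return": dax})

# H0: "dax_return does NOT Granger-cause btc_return", tested at lags 1..5.
# The first column is the variable being predicted; the second is the candidate cause.
results = grangercausalitytests(data[["btc_return", "dax_return"]], maxlag=5)
```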
Procedia PDF Downloads 66
5701 The End Justifies the Means: Using Programmed Mastery Drill to Teach Spoken English to Spanish Youngsters, without Relying on Homework
Authors: Robert Pocklington
Abstract:
Most current language courses expect students to be ‘vocational’, sacrificing their free time in order to learn. However, pupils with a full-time job, or bringing up children, hardly have a spare moment. Others just need the language as a tool or a qualification, as if it were book-keeping or a driving license. Then there are children in unstructured families whose stressful life makes private study almost impossible. And the countless parents whose evenings and weekends have become a nightmare, trying to get the children to do their homework. There are many arguments against homework being a necessity (rather than an optional extra for more ambitious or dedicated students), making a clear case for teaching methods which facilitate full learning of the key content within the classroom. A methodology which could be described as Programmed Mastery Learning has been used at Fluency Language Academy (Spain) since 1992, to teach English to over 4000 pupils yearly, with a staff of around 100 teachers, barely requiring homework. The course is structured according to the tenets of Programmed Learning: small manageable teaching steps, immediate feedback, and constant successful activity. For the Mastery component (not stopping until everyone has learned), the memorisation and practice are entrusted to flashcard-based drilling in the classroom, leading all students to progress together and develop a permanently growing knowledge base. Vocabulary and expressions are memorised using flashcards as stimuli, obliging the brain to constantly recover words from long-term memory and convert them into reflex knowledge before they are deployed in sentence building. The use of grammar rules is practised with ‘cue’ flashcards: the brain refers consciously to the grammar rule each time it produces a phrase until it comes easily. This automation of lexicon and correct grammar use greatly facilitates all other language and conversational activities. The full B2 course consists of 48 units, each of which takes a class an average of 17.5 hours to complete, allowing the vast majority of students to reach B2 level in 840 class hours, which is corroborated by an 85% pass rate in the Cambridge University B2 exam (First Certificate). In the past, studying for qualifications was just one of many different options open to young people. Nowadays, youngsters need to stay at school and obtain qualifications in order to get any kind of job. There are many students in our classes who have little intrinsic interest in what they are studying; they just need the certificate. In these circumstances and with increasing government pressure to minimise failure, teachers can no longer think ‘If they don’t study, and fail, it’s their problem’. It is now becoming the teacher’s problem. Teachers are ever more in need of methods which make their pupils successful learners; this means assuring learning in the classroom. Furthermore, homework is arguably the main divider between successful middle-class schoolchildren and failing working-class children who drop out: if everything important is learned at school, the latter will have a much better chance, favouring inclusiveness in the language classroom. Keywords: flashcard drilling, fluency method, mastery learning, programmed learning, teaching English as a foreign language
Procedia PDF Downloads 109
5700 A Unique Multi-Class Support Vector Machine Algorithm Using MapReduce
Authors: Aditi Viswanathan, Shree Ranjani, Aruna Govada
Abstract:
With data sizes constantly expanding, and with classical machine learning algorithms that analyze such data requiring larger and larger amounts of computation time and storage space, the need to distribute computation and memory requirements among several computers has become apparent. Although substantial work has been done in developing distributed binary SVM algorithms and multi-class SVM algorithms individually, the field of multi-class distributed SVMs remains largely unexplored. This research seeks to develop an algorithm that implements the Support Vector Machine over a multi-class data set and is efficient in a distributed environment. For this, we recursively choose the best binary split of a set of classes using a greedy technique, much like the divide-and-conquer approach. Our algorithm has shown better computation time during the testing phase than the traditional sequential SVM methods (One vs. One, One vs. Rest) and outperforms them as the size of the data set grows. This approach also classifies the data with higher accuracy than the traditional multi-class algorithms. Keywords: distributed algorithm, MapReduce, multi-class, support vector machine
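As an illustration of the recursive binary-split idea only (without the MapReduce distribution layer, and with a simple class-size balancing heuristic standing in for the authors' greedy split criterion, which the abstract does not define), a binary tree of two-class SVMs can be sketched as follows.

```python
import numpy as np
from sklearn.svm import SVC

class SVMTree:
    """Multi-class classification via a binary tree of two-class SVMs.
    Each node splits the remaining classes into two groups and trains one SVC."""

    def fit(self, X, y):
        classes = sorted(set(y))
        if len(classes) == 1:
            self.label, self.clf = classes[0], None   # leaf: single class left
            return self
        # Placeholder split heuristic: balance total sample counts between the groups.
        left, right, n_l, n_r = set(), set(), 0, 0
        for c in sorted(classes, key=lambda c: -np.sum(y == c)):
            if n_l <= n_r:
                left.add(c);  n_l += np.sum(y == c)
            else:
                right.add(c); n_r += np.sum(y == c)
        target = np.isin(y, list(left)).astype(int)   # 1 = left group, 0 = right group
        self.clf = SVC(kernel="rbf").fit(X, target)
        mask = target == 1
        self.children = (SVMTree().fit(X[~mask], y[~mask]),   # subtree for group 0
                         SVMTree().fit(X[mask], y[mask]))     # subtree for group 1
        return self

    def predict_one(self, x):
        node = self
        while node.clf is not None:                   # descend until a leaf class is reached
            node = node.children[int(node.clf.predict(x.reshape(1, -1))[0])]
        return node.label
```

Each test point then traverses at most log2(k) binary classifiers for k classes, which is the kind of testing-phase saving over One-vs-One and One-vs-Rest that the abstract reports.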
Procedia PDF Downloads 399
5699 Continuous Functions Modeling with Artificial Neural Network: An Improvement Technique to Feed the Input-Output Mapping
Authors: A. Belayadi, A. Mougari, L. Ait-Gougam, F. Mekideche-Chafa
Abstract:
The artificial neural network is one of the interesting techniques that have been advantageously used to deal with modeling problems. In this study, computing with an artificial neural network (CANN) is proposed. The model is applied to modulate the information processing of a one-dimensional task. We aim to integrate a new method based on a new coding approach for generating the input-output mapping, which relies on increasing the number of neuron units in the last layer. Accordingly, to show the efficiency of the approach under study, a comparison is made between the proposed method of generating the input-output set and the conventional method. The results illustrate that increasing the neuron units in the last layer makes it possible to find the optimal network parameters that fit the mapping data. Moreover, it decreases the training time during the computation process, which avoids the need for computers with large memory. Keywords: neural network computing, continuous functions generating the input-output mapping, decreasing the training time, machines with big memories
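The abstract describes the approach only at a high level; as a generic, hedged illustration of fitting a one-dimensional continuous input-output mapping with a small feed-forward network (the architecture, target function, and training settings below are arbitrary assumptions, not the authors' coding scheme), a PyTorch sketch might look like this.

```python
import math
import torch
import torch.nn as nn

# Target: a one-dimensional continuous function sampled on [-pi, pi]
x = torch.linspace(-math.pi, math.pi, 200).unsqueeze(1)
y = torch.sin(x) + 0.3 * x                      # arbitrary smooth mapping for the example

# Small MLP; the width of the final hidden layer is the knob the abstract discusses
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 64), nn.Tanh(),   # "last layer" units increased here
                    nn.Linear(64, 1))

opt = torch.optim.Adam(net.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for epoch in range(2000):                        # plain full-batch training loop
    opt.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()
    opt.step()
print(f"final MSE: {loss.item():.5f}")
```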
Procedia PDF Downloads 282
5698 A Corpus-Based Study on the Lexical, Syntactic and Sequential Features across Interpreting Types
Authors: Qianxi Lv, Junying Liang
Abstract:
Among the various modes of interpreting, simultaneous interpreting (SI) is regarded as a ‘complex’ and ‘extreme condition’ of cognitive tasks while consecutive interpreters (CI) do not have to share processing capacity between tasks. Given that SI exerts great cognitive demand, it makes sense to posit that the output of SI may be more compromised than that of CI in the linguistic features. The bulk of the research has stressed the varying cognitive demand and processes involved in different modes of interpreting; however, related empirical research is sparse. In keeping with our interest in investigating the quantitative linguistic factors discriminating between SI and CI, the current study seeks to examine the potential lexical simplification, syntactic complexity and sequential organization mechanism with a self-made inter-model corpus of transcribed simultaneous and consecutive interpretation, translated speech and original speech texts with a total running word of 321960. The lexical features are extracted in terms of the lexical density, list head coverage, hapax legomena, and type-token ratio, as well as core vocabulary percentage. Dependency distance, an index for syntactic complexity and reflective of processing demand is employed. Frequency motif is a non-grammatically-bound sequential unit and is also used to visualize the local function distribution of interpreting the output. While SI is generally regarded as multitasking with high cognitive load, our findings evidently show that CI may impose heavier or taxing cognitive resource differently and hence yields more lexically and syntactically simplified output. In addition, the sequential features manifest that SI and CI organize the sequences from the source text in different ways into the output, to minimize the cognitive load respectively. We reasoned the results in the framework that cognitive demand is exerted both on maintaining and coordinating component of Working Memory. On the one hand, the information maintained in CI is inherently larger in volume compared to SI. On the other hand, time constraints directly influence the sentence reformulation process. The temporal pressure from the input in SI makes the interpreters only keep a small chunk of information in the focus of attention. Thus, SI interpreters usually produce the output by largely retaining the source structure so as to relieve the information from the working memory immediately after formulated in the target language. Conversely, CI interpreters receive at least a few sentences before reformulation, when they are more self-paced. CI interpreters may thus tend to retain and generate the information in a way to lessen the demand. In other words, interpreters cope with the high demand in the reformulation phase of CI by generating output with densely distributed function words, more content words of higher frequency values and fewer variations, simpler structures and more frequently used language sequences. We consequently propose a revised effort model based on the result for a better illustration of cognitive demand during both interpreting types.Keywords: cognitive demand, corpus-based, dependency distance, frequency motif, interpreting types, lexical simplification, sequential units distribution, syntactic complexity
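For readers unfamiliar with the lexical indices named above, the short sketch below computes two of them, type-token ratio and a crude lexical density, over a whitespace-tokenised text. The toy function-word list and sample sentence are assumptions; the study's actual tooling is not specified in the abstract.

```python
def type_token_ratio(tokens):
    """Number of distinct word forms divided by total number of tokens."""
    return len(set(tokens)) / len(tokens)

def lexical_density(tokens, function_words):
    """Share of tokens that are content words (i.e., not in the function-word list)."""
    content = [t for t in tokens if t not in function_words]
    return len(content) / len(tokens)

FUNCTION_WORDS = {"the", "a", "an", "of", "to", "and", "in", "that", "is", "it"}  # toy list

sample = "the interpreter kept the structure of the source text in the output".split()
print(type_token_ratio(sample), lexical_density(sample, FUNCTION_WORDS))
```

Lower type-token ratios and denser function-word distributions are the kind of lexical simplification signals the study attributes to the higher-demand mode.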
Procedia PDF Downloads 176
5697 The Analysis of the Effect of Brand Image on Creating Brand Loyalty with the Structural Equation Model: A Research Study on the Sports Equipment Brand Users
Authors: Murat Erdoğdu, Murat Koçyiğit
Abstract:
Brand image and brand loyalty are among the most important relational marketing elements for brand owners seeking to set up long-term relationships with their customers and to maintain these relationships. Brand owners improve their brand images through the positive perceptions remaining in consumers’ minds. In addition, they try to find customers that are both emotionally and behaviourally faithful to them in order to set up long-term relationships. Therefore, the aim of this study is to analyse the effects of brand image, which has a very important role among relational marketing elements, on brand loyalty in terms of variables such as perceived value, trust in the brand and brand satisfaction. In this context, a conceptual model was created to determine the effect of brand image on brand loyalty using the structural equation model (SEM). According to this aim and this model, the study was carried out using data collected through questionnaires in Konya with the convenience sampling method. The results of the research showed that brand image has positive significant effects on perceived value and trust in the brand, that trust in the brand has positive significant effects on brand satisfaction, and that brand satisfaction has positive significant effects on brand loyalty. Thus, the hypotheses that brand image has direct effects on perceived value and trust in the brand, that trust in the brand has direct effects on brand satisfaction, and that brand satisfaction has direct effects on brand loyalty were supported. In addition, findings on whether perceived value has a significant effect on brand satisfaction were also obtained. Keywords: brand image, brand loyalty, perceived value, satisfaction, trust
Procedia PDF Downloads 439
5696 Resource Orchestration Based on Two-Sides Scheduling in Computing Network Control Systems
Authors: Li Guo, Jianhong Wang, Dian Huang, Shengzhong Feng
Abstract:
Computing networks, as a new network architecture, have shown great promise in boosting the utilization of different resources, such as computing, caching, and communications. To maximise the efficiency of resource orchestration in computing network control systems (CNCSs), this work proposes a dynamic orchestration strategy for different resources based on task requirements from computing power requestors (CPRs). Specifically, computing power providers (CPPs) in CNCSs could share information with each other, especially their current idle resources, through communication channels on the basis of blockchain technology. This dynamic process is modeled as a cooperative game in which CPPs have the same target of maximising long-term rewards by improving the resource utilization ratio. Meanwhile, the task requirements from CPRs, including size, deadline, and calculation, are simultaneously considered in this paper. According to task requirements, the proposed orchestration strategy could schedule the best-fitting resource in CNCSs, achieving the maximum long-term rewards of CPPs and the best quality of experience (QoE) of CPRs at the same time. Based on the EdgeCloudSim simulation platform, the efficiency of the proposed strategy is demonstrated from the sides of both CPRs and CPPs. Besides, experimental results show that the proposed strategy outperforms the compared strategies in all cases. Keywords: computing network control systems, resource orchestration, dynamic scheduling, blockchain, cooperative game
Procedia PDF Downloads 114
5695 Investigation of Martensitic Transformation Zone at the Crack Tip of NiTi under Mode-I Loading Using Microscopic Image Correlation
Authors: Nima Shafaghi, Gunay Anlaş, C. Can Aydiner
Abstract:
A realistic understanding of martensitic phase transition under complex stress states is key for accurately describing the mechanical behavior of shape memory alloys (SMAs). Particularly regarding the sharply changing stress fields at the tip of a crack, the size, nature and shape of transformed zones are of great interest. There is significant variation among various analytical models in their predictions of the size and shape of the transformation zone. As the fully transformed region remains inside a very small boundary at the tip of the crack, experimental validation requires microscopic resolution. Here, the crack tip vicinity of NiTi compact tension specimen has been monitored in situ with microscopic image correlation with 20x magnification. With nominal 15 micrometer grains and 0.2 micrometer per pixel optical resolution, the strains at the crack tip are mapped with intra-grain detail. The transformation regions are then deduced using an equivalent strain formulation.Keywords: digital image correlation, fracture, martensitic phase transition, mode I, NiTi, transformation zone
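The abstract does not state which equivalent strain measure is used to delineate the transformed zone; one common von Mises-type choice for the in-plane strain components measured by DIC (assuming incompressibility, shown here purely for orientation and not as the authors' exact formulation) is:

$$\varepsilon_{eq} = \frac{2}{\sqrt{3}}\sqrt{\varepsilon_{xx}^{2} + \varepsilon_{xx}\varepsilon_{yy} + \varepsilon_{yy}^{2} + \varepsilon_{xy}^{2}}$$

Thresholding such an equivalent strain map near the transformation strain of NiTi is one way the fully transformed region at the crack tip could be deduced from the measured fields.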
Procedia PDF Downloads 351
5694 Measuring Greenhouse Gas Exchange from Paddy Field Using Eddy Covariance Method in Mekong Delta, Vietnam
Authors: Vu H. N. Khue, Marian Pavelka, Georg Jocher, Jiří Dušek, Le T. Son, Bui T. An, Ho Q. Bang, Pham Q. Huong
Abstract:
Agriculture is an important economic sector of Vietnam, the most popular of which is wet rice cultivation. These activities are also known as the main contributor to the national greenhouse gas. In order to understand more about greenhouse gas exchange in these activities and to investigate the factors influencing carbon cycling and sequestration in these types of ecosystems, since 2019, the first eddy covariance station has been installed in a paddy field in Long An province, Mekong Delta. The station was equipped with state-of-the-art equipment for CO₂ and CH₄ gas exchange and micrometeorology measurements. In this study, data from the station was processed following the ICOS recommendations (Integrated Carbon Observation System) standards for CO₂, while CH₄ was manually processed and gap-filled using a random forest model from methane-gapfill-ml, a machine learning package, as there is no standard method for CH₄ flux gap-filling yet. Finally, the carbon equivalent (Ce) balance based on CO₂ and CH₄ fluxes was estimated. The results show that in 2020, even though a new water management practice - alternate wetting and drying - was applied to reduce methane emissions, the paddy field released 928 g Cₑ.m⁻².yr⁻¹, and in 2021, it was reduced to 707 g Cₑ.m⁻².yr⁻¹. On a provincial level, rice cultivation activities in Long An, with a total area of 498,293 ha, released 4.6 million tons of Cₑ in 2020 and 3.5 million tons of Cₑ in 2021.Keywords: eddy covariance, greenhouse gas, methane, rice cultivation, Mekong Delta
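The provincial totals quoted at the end follow directly from scaling the per-square-metre annual carbon-equivalent balance by the cultivated area; the short check below reproduces the 4.6 and 3.5 million-ton figures. This is only the area scaling, not the eddy-covariance flux processing or gap-filling itself.

```python
area_m2 = 498_293 * 10_000                        # 498,293 ha of rice in Long An, in m^2

for year, ce_g_per_m2 in [(2020, 928), (2021, 707)]:
    total_tonnes = ce_g_per_m2 * area_m2 / 1e6    # grams -> tonnes
    print(year, f"{total_tonnes / 1e6:.1f} million tons Ce")   # 4.6 and 3.5
```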
Procedia PDF Downloads 141
5693 Integration GIS–SCADA Power Systems to Enclosure Air Dispersion Model
Authors: Ibrahim Shaker, Amr El Hossany, Moustafa Osman, Mohamed El Raey
Abstract:
This paper explores an integration model between a GIS–SCADA system and an enclosure quantification model to assess the impact of a failure-safe event. There are real demands to identify spatial objects and improve control system performance. The employed methodology predicts electro-mechanical operations and the corresponding time to environmental incident variations. Open processing, as an object systems technology, is presented for integrating the enclosure database with minimal memory size and computation time via connectivity drivers such as ODBC and JDBC during the main stages of the GIS–SCADA connection. The function of the Geographic Information System is to manage power distribution in relation to developing issues. In other words, GIS–SCADA systems integration requires numerical objects of the process to enable system model calibration and estimation demands, determination of past events for analysis, and prediction of emergency situations for response training. Keywords: air dispersion model, environmental management, SCADA systems, GIS system, integration power system
Procedia PDF Downloads 366
5692 Performance Evaluation of Hemispherical Basin Type Solar Still
Authors: Husham Mahmood Ahmed
Abstract:
For many reasons, fresh water scarcity is one of the major problems facing the world, particularly in the third world: in Northern Africa, the Middle East, Southwest Asia, and many other desert areas. Solar distillation offers one of the most promising renewable energy solutions to this aggravated situation. The main obstacle hindering the spread of solar technology for fresh water production is its low efficiency. Therefore, enhancing solar still performance by studying the parameters affecting productivity and implementing new ideas and different designs has been the main goal of investigators in recent years. The present research is experimental work that tests a new design of solar still with a hemispherical top cover for water desalination, with and without external reflectors, under the climate of the Kingdom of Bahrain during the autumn season. The hemispherical cover has a base diameter of 1 m and a depth of 0.4 m, die cast from a 6 mm thick Lexan plastic sheet. The net effective area was 0.785 m2. It has been found that the average daily production rate obtained from the hemispherical top cover solar still is 3.610 liter/day. This yield is 11.1% higher than the yield of a conventional simple-type single-slope solar still having a 20° slope glass cover and a larger effective area of 1 m2, obtained in previous research under similar climatic conditions. It has also been found that adding 1.2 m long by 0.15 m curved reflectors increased the yield of the hemispherical solar still by 5.5%, while 1.2 m long by 0.3 m curved reflectors increased the yield by about 8%. Keywords: hemispherical solar still, solar desalination, solar energy, Northern Africa
Procedia PDF Downloads 392
5691 Energy Mutual Funds: The Behavior of Environmental, Social and Governance Funds
Authors: Anna Paola Micheli, Anna Maria Calce, Loris Di Nallo
Abstract:
Sustainable finance identifies the process that leads, in the adoption of investment decisions, to taking environmental and social factors into account, with the aim of orienting investments towards sustainable and long-term activities. Considering that the topic is at the center of interest of national agendas, long-term investments will no longer be analyzed only by looking at financial data; environmental, social, and governance (ESG) factors will be increasingly important and will play a fundamental role in determining the risk and return of an investment. Although this perspective does not deny the orientation to profit, ESG mutual funds represent sustainable finance applied to the world of mutual funds. The goal of this paper is therefore to verify this attitude, in particular in the energy sector. The choice of the sector is not casual: ESG is the acronym for environmental, social, and governance, and energy companies are strictly related to the environmental theme. The methodology adopted leads to a comparison between a sample of ESG funds and a sample of non-ESG funds with similar characteristics, using the most important indicators in the literature: yield, standard deviation, and the Sharpe index. The analysis is focused on equity funds. The results, which are partial due to the lack of historical data, show a good performance of ESG funds, testifying that a sustainable approach does not necessarily mean lower profits. Clearly, these first findings do not imply an absolute preference for ESG funds in terms of performance, because persistence of the results is required. Furthermore, these findings are to be verified in other sectors and in bond funds. Keywords: mutual funds, ESG, performance, energy
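Of the three indicators listed, the Sharpe index is the only composite one; as a reminder of how it is computed (the return series below is synthetic and the risk-free rate and annualisation period are assumptions, not the paper's data), a minimal sketch:

```python
import numpy as np

def sharpe_ratio(returns, risk_free=0.0, periods_per_year=12):
    """Annualised Sharpe ratio of a series of periodic (e.g. monthly) returns."""
    excess = np.asarray(returns) - risk_free / periods_per_year
    return np.sqrt(periods_per_year) * excess.mean() / excess.std(ddof=1)

monthly_returns = np.random.default_rng(1).normal(0.006, 0.04, 36)  # synthetic ESG fund
print(f"Sharpe: {sharpe_ratio(monthly_returns, risk_free=0.01):.2f}")
```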
Procedia PDF Downloads 110
5690 The 10,000 Fold Effect of Retrograde Neurotransmission, a New Concept for Stroke Revival: Use of Intracarotid Sodium Nitroprusside
Authors: Vinod Kumar
Abstract:
Background: Tissue Plasminogen Activator (tPA) showed a level 1 benefit in acute stroke (within 3-6 hrs). Intracarotid sodium nitroprusside (ICSNP) has been studied in this context with a wide treatment window, fast recovery and affordability. This work proposes two mechanisms for acute cases and one mechanism for chronic cases, which are interrelated, for physiological recovery. a) Retrograde Neurotransmission (acute cases): 1) Normal excitatory impulse: at the synaptic level, glutamate activates NMDA receptors, with nitric oxide synthetase (NOS) on the postsynaptic membrane, for further propagation by the calcium-calmodulin complex. Nitric oxide (NO, produced by NOS) travels backward across the chemical synapse and binds the axon-terminal NO receptor/sGC of a presynaptic neuron, regulating anterograde neurotransmission (ANT) via retrograde neurotransmission (RNT). Heme is the ligand-binding site of the NO receptor/sGC. Heme exhibits > 10,000-fold higher affinity for NO than for oxygen (the 10,000-fold effect) and binding is completed in 20 msec. 2) Pathological conditions: normal synaptic activity, including both ANT and RNT, is absent. A NO donor (SNP) releases NO in the postsynaptic region. NO travels backward across the chemical synapse to bind to the heme of a NO receptor in the axon terminal of a presynaptic neuron, generating an impulse, as under normal conditions. b) Vasospasm (acute cases): perforators show vasospastic activity. NO vasodilates the perforators via the NO-cAMP pathway. c) Long-Term Potentiation (LTP) (chronic cases): the NO–cGMP pathway plays a role in LTP at many synapses throughout the CNS and at the neuromuscular junction. LTP has been reviewed both generally and with respect to brain regions specific for memory/learning. Aims/Study Design: The principles of “generation of impulses from the presynaptic region to the postsynaptic region by very potent RNT (10,000-fold effect)” and “vasodilation of arteriolar perforators” are the basis of the authors’ hypothesis to treat stroke cases. Case-control prospective study. Materials and Methods: The experimental population included 82 stroke patients (10 patients were given control treatments without superfusion or with 5% dextrose superfusion, and 72 patients comprised the ICSNP group). The mean time for superfusion was 9.5 days post-stroke. Pre- and post-ICSNP status was monitored by NIHSS, MRI and TCD. Results: After 90 seconds in the ICSNP group, the mean change in the NIHSS score was a decrease of 1.44 points, or 6.55%; after 2 h, there was a decrease of 1.16 points; after 24 h, there was an increase of 0.66 points, or 2.25%, compared to the control-group increase of 0.7 points, or 3.53%; at 7 days, there was an 8.61-point decrease, or 44.58%, compared to the control-group increase of 2.55 points, or 22.37%; at 2 months in the ICSNP group, there was a 6.94-point decrease, or 62.80%, compared to the control-group decrease of 2.77 points, or 8.78%. TCD was documented and improvements were noted. Conclusions: ICSNP is a swift-acting drug in the treatment of stroke, acting within 90 seconds on day 9.5 post-stroke, with a small decline after 24 hours. The drug recovers from this decline quickly. Keywords: brain infarcts, intracarotid sodium nitroprusside, perforators, vasodilations, retrograde transmission, the 10,000-fold effect
Procedia PDF Downloads 306
5689 The “Ecological Approach” to GIS Implementation in Low Income Countries and the Role of Universities: Union of Municipalities of Joumeh Case Study
Authors: A. Iaaly, O. Jadayel, R. Jadayel
Abstract:
This paper explores the effectiveness of approaches used for the implementation of technology within central governments specifically Geographic Information Systems (GIS). It examines the extent to which various strategies to GIS implementation and its roll out to users within an organization is crucial for its long term assimilation. Depending on the contextual requirements, various implementation strategies exist spanning from the most revolutionary to the most evolutionary, which have an influence on the success of GIS projects and the realization of resulting business benefits within the central governments. This research compares between two strategies of GIS implementation within the Lebanese Municipalities. The first strategy is the “Technological Approach” which is focused on technology acquisition, overlaid on existing governmental frameworks. This approach gives minimal attention to capability building and the long term sustainability of the implemented program. The second strategy, referred to as the “Ecological Approach”, is naturally oriented to the function of the organization. This approach stresses on fostering the evolution of the program and on building the human capabilities. The Union of the Joumeh Municipalities will be presented as a case study under the “Ecological Approach” and the role of the GIS Center at the University of Balamand will be highlighted. Thus, this research contributes to the development of knowledge on technology implementation and the vital role of academia in the specific context of the Lebanese public sector so that this experience may pave the way for further applications.Keywords: ecological approach GIS, low income countries, technological approach
Procedia PDF Downloads 305
5688 Civic Engagement and Political Participation in Bangladesh
Authors: Syeda Salina Aziz, Tanvir Ahmed Mozumder
Abstract:
Citizenship is an important concept of democracy which broadly defines the relationship between the state and its citizens; at the same time, it analyzes the rights and duties of a citizen. The universal citizenship principle demands that citizens should be aware of the political system, possess democratic attitudes, and join the political activity. Bangladesh presents an interesting case for democracy; the democratic practices in the country have been long introduced, have been interrupted several times, and the democratic values and practices have yet to be established in the country. These transitions have influenced citizens’ ideologies and participation in decision-making and also shaped their expectations differently. In this backdrop, this paper aims to understand and explain the citizenship behavior of Bangladeshi nationals. Based on nationally representative household survey data of 4000 respondents, this paper creates a composite citizenship index which is a combination of three separate indices, including participation index, knowledge and awareness index, and ideology index. The paper then tries to explain the factors that affect the citizenship index. Using fixed effect regression analysis, the paper intends to explore the association between citizenship and socio-demographic variables, including education, location, gender, and exposure to the media of respondents. Additionally, using national election polls, the paper creates a variable to measure long-term support towards the current ruling party and tests whether and how this affects the citizenship variables.Keywords: citizenship, political participation, Bangladesh, stronghold
Procedia PDF Downloads 81
5687 A Survey on Speech Emotion-Based Music Recommendation System
Authors: Chirag Kothawade, Gourie Jagtap, PreetKaur Relusinghani, Vedang Chavan, Smitha S. Bhosale
Abstract:
Psychological research has proven that music relieves stress, elevates mood, and is responsible for the release of “feel-good” chemicals like oxytocin, serotonin, and dopamine. It comes as no surprise that music has been a popular tool in rehabilitation centers and therapy for various disorders, thus with the interminably rising numbers of people facing mental health-related issues across the globe, addressing mental health concerns is more crucial than ever. Despite the existing music recommendation systems, there is a dearth of holistically curated algorithms that take care of the needs of users. Given that, an undeniable majority of people turn to music on a regular basis and that music has been proven to increase cognition, memory, and sleep quality while reducing anxiety, pain, and blood pressure, it is the need of the hour to fashion a product that extracts all the benefits of music in the most extensive and deployable method possible. Our project aims to ameliorate our users’ mental state by building a comprehensive mood-based music recommendation system called “Viby”.Keywords: language, communication, speech recognition, interaction
Procedia PDF Downloads 63
5686 Closing the Gap: Efficient Voxelization with Equidistant Scanlines and Gap Detection
Authors: S. Delgado, C. Cerrada, R. S. Gómez
Abstract:
This research introduces an approach to voxelizing the surfaces of triangular meshes with efficiency and accuracy. Our method leverages parallel equidistant scan-lines and introduces a Gap Detection technique to address the limitations of existing approaches. We present a comprehensive study showcasing the method's effectiveness, scalability, and versatility in different scenarios. Voxelization is a fundamental process in computer graphics and simulations, playing a pivotal role in applications ranging from scientific visualization to virtual reality. Our algorithm focuses on enhancing the voxelization process, especially for complex models and high resolutions. One of the major challenges in voxelization in the Graphics Processing Unit (GPU) is the high cost of discovering the same voxels multiple times. These repeated voxels incur in costly memory operations with no useful information. Our scan-line-based method ensures that each voxel is detected exactly once when processing the triangle, enhancing performance without compromising the quality of the voxelization. The heart of our approach lies in the use of parallel, equidistant scan-lines to traverse the interiors of triangles. This minimizes redundant memory operations and avoids revisiting the same voxels, resulting in a significant performance boost. Moreover, our method's computational efficiency is complemented by its simplicity and portability. Written as a single compute shader in Graphics Library Shader Language (GLSL), it is highly adaptable to various rendering pipelines and hardware configurations. To validate our method, we conducted extensive experiments on a diverse set of models from the Stanford repository. Our results demonstrate not only the algorithm's efficiency, but also its ability to produce 26 tunnel free accurate voxelizations. The Gap Detection technique successfully identifies and addresses gaps, ensuring consistent and visually pleasing voxelized surfaces. Furthermore, we introduce the Slope Consistency Value metric, quantifying the alignment of each triangle with its primary axis. This metric provides insights into the impact of triangle orientation on scan-line based voxelization methods. It also aids in understanding how the Gap Detection technique effectively improves results by targeting specific areas where simple scan-line-based methods might fail. Our research contributes to the field of voxelization by offering a robust and efficient approach that overcomes the limitations of existing methods. The Gap Detection technique fills a critical gap in the voxelization process. By addressing these gaps, our algorithm enhances the visual quality and accuracy of voxelized models, making it valuable for a wide range of applications. In conclusion, "Closing the Gap: Efficient Voxelization with Equidistant Scan-lines and Gap Detection" presents an effective solution to the challenges of voxelization. Our research combines computational efficiency, accuracy, and innovative techniques to elevate the quality of voxelized surfaces. With its adaptable nature and valuable innovations, this technique could have a positive influence on computer graphics and visualization.Keywords: voxelization, GPU acceleration, computer graphics, compute shaders
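The paper's implementation is a GLSL compute shader; as a language-agnostic illustration of the scan-line idea only (no GPU, no Gap Detection, and uniform parameter sampling rather than strictly equidistant spacing), the sketch below rasterises one triangle by sweeping parallel lines across its interior and recording the voxels they hit. The voxel size and sampling densities are assumptions.

```python
import numpy as np

def voxelize_triangle_scanlines(v0, v1, v2, voxel_size=1.0, lines=64, samples=64):
    """Approximate surface voxelization of one triangle with parallel scan-lines.
    Each scan-line i runs from a point on edge v0->v1 to the matching point on v0->v2,
    so together the lines sweep the whole triangle interior."""
    voxels = set()
    for i in range(lines + 1):
        t = i / lines
        a = v0 + t * (v1 - v0)          # start of the scan-line on edge v0-v1
        b = v0 + t * (v2 - v0)          # end of the scan-line on edge v0-v2
        for j in range(samples + 1):
            p = a + (j / samples) * (b - a)
            voxels.add(tuple(np.floor(p / voxel_size).astype(int)))
    return voxels

tri = [np.array([0.0, 0.0, 0.0]), np.array([8.0, 1.0, 0.5]), np.array([2.0, 7.0, 3.0])]
print(len(voxelize_triangle_scanlines(*tri)))
```

Because every sample falls on a distinct position along its own line, a voxel is typically touched far fewer times than with naive per-fragment rasterisation, which is the redundancy the paper's equidistant scan-lines are designed to eliminate; the gaps that can open between adjacent lines are what the Gap Detection step then closes.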
Procedia PDF Downloads 70
5685 When Messages Cause Distraction from Advertising: An Eye-Tracking Study
Authors: Nilamadhab Mohanty
Abstract:
It is essential to use message formats that make communication understandable and correct, because the information format can influence consumers’ decisions on the purchase of a product. This study combines information from qualitative inquiry, media trend analysis, an eye-tracking experiment, and questionnaire data to examine the impact of specific message formats and consumer perceived risk on attention to the information and risk retention. We investigated the influence of message framing (goal framing, attribute framing, and mix framing) on consumer memory, study time, and decisional uncertainty while deciding on the purchase of drugs. Furthermore, we explored the impact of consumer perceived risk (risk associated with the use of the drug, i.e., RISK-AB, and perceived risk associated with the non-use of the drug, i.e., RISK-EB) on message format preference. The study used eye-tracking methods to understand the differences in message processing. The findings suggest that the message format influences information processing and that participants’ risk perception impacts message format preference. Eye tracking can be used to understand format differences and design effective advertisements. Keywords: message framing, consumer perceived risk, advertising, eye tracking
Procedia PDF Downloads 121
5684 Effects of Aging on Ultra-Triathlon Performance
Authors: Richard S. Jatau, Kankanala Venkateswarlu, Bulus Kpame
Abstract:
The purpose of this critical review is to find out what is known and what is unknown about the effects of aging on endurance performance, especially on ultra-triathlon performance. It has been shown that among masters athletes, peak levels of performance decrease by 50% by age 50. It has also been clearly revealed that age-associated atrophy, weakness and fatigability cannot be halted, although year-round athletic training can slow down this age-associated decline. Studies have further revealed a 30% to 50% decrease in skeletal muscle mass between the ages of 40 and 80 years, which is accompanied by an equal or even greater decline in strength and power and an increase in muscle weakness and fatigability. Studies on ultra-triathlon athletes revealed that 30 to 39 year olds showed the fastest times, while athletes in younger and older age groups were slower. The length of the event appears to influence the age-related decline in endurance performance: in short-distance triathlons a significant decline seems to start at the age of 40 to 50 years, whereas in long-distance triathlons this decline seems to start after the age of 65 years. However, it is not clear whether this decline is related in any way to the training methods used, the duration of training, or the frequency of training. It is also not clear whether triathlon athletes experience more injuries due to long hours of training, whether these athletes used performance-enhancing drugs to enhance their performance, or why there has been a tremendous increase in the number of athletes specializing in triathlon. On the basis of our experience and the available research evidence, we have provided answers to some of these questions. We conclude that the aging-associated decline in ultra-endurance performance is inevitable, although it can be slowed down. Keywords: aging, triathlon, atrophy, endurance
Procedia PDF Downloads 372
5683 The Efficacy of Video Education to Improve Treatment or Illness-Related Knowledge in Patients with a Long-Term Physical Health Condition: A Systematic Review
Authors: Megan Glyde, Louise Dye, David Keane, Ed Sutherland
Abstract:
Background: Typically patient education is provided either verbally, in the form of written material, or with a multimedia-based tool such as videos, CD-ROMs, DVDs, or via the internet. By providing patients with effective educational tools, this can help to meet their information needs and subsequently empower these patients and allow them to participate within medical-decision making. Video education may have some distinct advantages compared to other modalities. For instance, whilst eHealth is emerging as a promising modality of patient education, an individual’s ability to access, read, and navigate through websites or online modules varies dramatically in relation to health literacy levels. Literacy levels may also limit patients’ ability to understand written education, whereas video education can be watched passively by patients and does not require high literacy skills. Other benefits of video education include that the same information is provided consistently to each patient, it can be a cost-effective method after the initial cost of producing the video, patients can choose to watch the videos by themselves or in the presence of others, and they can pause and re-watch videos to suit their needs. Health information videos are not only viewed by patients in formal educational sessions, but are increasingly being viewed on websites such as YouTube. Whilst there is a lot of anecdotal and sometimes misleading information on YouTube, videos from government organisations and professional associations contain trustworthy and high-quality information and could enable YouTube to become a powerful information dissemination platform for patients and carers. This systematic review will examine the efficacy of video education to improve treatment or illness-related knowledge in patients with various long-term conditions, in comparison to other modalities of education. Methods: Only studies which match the following criteria will be included: participants will have a long-term physical health condition, video education will aim to improve treatment or illness related knowledge and will be tested in isolation, and the study must be a randomised controlled trial. Knowledge will be the primary outcome measure, with modality preference, anxiety, and behaviour change as secondary measures. The searches have been conducted in the following databases: OVID Medline, OVID PsycInfo, OVID Embase, CENTRAL and ProQuest, and hand searching for relevant published and unpublished studies has also been carried out. Screening and data extraction will be conducted independently by 2 researchers. Included studies will be assessed for their risk of bias in accordance with Cochrane guidelines, and heterogeneity will also be assessed before deciding whether a meta-analysis is appropriate or not. Results and Conclusions: Appropriate synthesis of the studies in relation to each outcome measure will be reported, along with the conclusions and implications.Keywords: long-term condition, patient education, systematic review, video
Procedia PDF Downloads 111
5682 Family Caregiver Transitions and Health in Old Age: A Longitudinal Perspective
Authors: Cecilia Fagerstrom, Solve Elmstahl, Lena S. Wranker
Abstract:
Increased morbidity in an aging population causes the need for family care to become more common at an advanced age. The role of family caregivers may well last for a long time but may also change over time, from being caregivers to being non-caregivers or vice versa. Although the demands associated with family caring change as individuals enter into, engage with, and exit from this role, the evidence regarding the impact of family caregiving transitions on the health of older carers is still limited. This study comprised individuals (n=2294, 60+ years) from the southern part of Sweden included in the project Swedish National Study of Aging and Care. Caregiving transitions during a six-year period are discussed in the categories of entering, exiting, and continuing. Individuals who exited caregiving during this time were older than those who continued or entered into the role of caregiving. At the six-year follow-up, caregivers who were continuing or had exited caregiving were more often worried about their own health compared to baseline. Similar findings were not found in those who entered caregiving. Family caregiving transitions of exiting, entering or continuing had no effect on the individuals’ functional, physical and mental health, except for participants who entered caregiving. For them, entering the role of family caregiving was associated with an improvement in physical health during the six-year follow-up period. Conclusion: Although the health impact of different caregiving transitions in late life does not differ, individual conditions and health at baseline are important parameters to take into consideration to improve long-term health in family caregivers. Keywords: family caregiving, health, old age, transition
Procedia PDF Downloads 218
5681 An Inviscid Compressible Flow Solver Based on Unstructured OpenFOAM Mesh Format
Authors: Utkan Caliskan
Abstract:
Two numerical codes based on the finite volume method are developed to solve the compressible Euler equations and simulate the flow through a forward-facing step channel. Both algorithms use the AUSM+-up (Advection Upstream Splitting Method) scheme for flux splitting and a two-stage Runge-Kutta scheme for time stepping. In this study, the flux calculation distinguishes the algorithm based on the OpenFOAM mesh format, called the 'face-based' algorithm, from the basic algorithm, called the 'element-based' algorithm. The face-based algorithm avoids redundant flux computations and is also more flexible with hybrid grids. Moreover, some of OpenFOAM’s preprocessing utilities can be used on the mesh. Parallelization of the face-based algorithm, for which atomic operations are needed due to the shared-memory model, is also presented. For several mesh sizes, a 2.13x speed-up is obtained with the face-based approach over the element-based approach. Keywords: cell centered finite volume method, compressible Euler equations, OpenFOAM mesh format, OpenMP
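As a schematic illustration of the difference described above (not the authors' code), a face-based finite-volume update visits each interior face exactly once and scatters the same flux to its owner and neighbour cells, whereas an element-based loop would recompute that flux from both sides. The numerical flux below is a trivial placeholder, not AUSM+-up, and the scalar cell states are an assumption for brevity.

```python
import numpy as np

def face_based_residual(n_cells, faces, U, num_flux):
    """Accumulate residuals by looping over faces once.
    faces: list of (owner_cell, neighbour_cell, face_area); each interior flux is
    computed a single time, added to the owner and subtracted from the neighbour -
    the saving over an element-based loop that revisits every face from both cells."""
    R = np.zeros(n_cells)
    for owner, neigh, s in faces:
        f = num_flux(U[owner], U[neigh], s)   # placeholder numerical flux (not AUSM+-up)
        R[owner] += f
        R[neigh] -= f                          # would need an atomic update if threaded
    return R

# Toy 1-D chain of 4 cells with a central (average) flux as the placeholder
central = lambda ul, ur, s: 0.5 * (ul + ur) * s
U = np.array([1.0, 0.8, 0.6, 0.4])
faces = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0)]
print(face_based_residual(4, faces, U, central))
```

The parallelization caveat in the abstract corresponds to the `R[owner] += f` / `R[neigh] -= f` scatter: when faces are processed by different threads over shared memory, two faces of the same cell can race, which is why atomic operations are needed.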
Procedia PDF Downloads 318
5680 Characteristics of Domestic Sewage in Small Urban Communities
Authors: Shohreh Azizi, Memory Tekere, Wag Nel
Abstract:
An evaluation of the characteristics of wastewater generated from small communities was carried out in relation to a decentralized approach for domestic sewage treatment and the design of a biological nutrient removal system. The study included a survey of the waste from various individual communities, such as a hotel, a residential complex, office premises, and an educational institute. The results indicate that the concentration of organic pollutants in wastewater from the residential complex is higher than in the waste from all the other communities, with COD 664 mg/l, BOD 370.2 mg/l and TSS 248.8 mg/l, while the wastewater from the office premises shows a low organic load, with COD 428 mg/l, BOD 232 mg/l and TSS 157 mg/l. The wastewater from the residential complex was studied under the activated sludge process to evaluate this technology for decentralized wastewater treatment. The activated sludge process was operated at hydraulic retention times ranging from 12 to 4 hrs, and the optimum 6 hr HRT was selected, at which average reductions of COD (85.92%) and BOD (91.28%) were achieved. The issues of sludge recycling, maintenance of biomass concentration, and the large reactor volume (10 L) required at high HRT make the system impractical for smaller communities. Keywords: wastewater, small communities, activated sludge process, decentralized system
Procedia PDF Downloads 357