Search results for: wavelet coherence
132 A Multi Sensor Monochrome Video Fusion Using Image Quality Assessment
Authors: M. Prema Kumar, P. Rajesh Kumar
Abstract:
The increasing interest in image fusion (combining images of two or more modalities, such as infrared and visible light) has led to a need for accurate and reliable image assessment methods. This paper presents a novel approach for merging the information content of several videos taken of the same scene in order to produce a combined video that retains the finest information from the different source videos. This process, known as video fusion, provides an image of superior quality relative to the source images, where "quality" denotes a measurement specific to the particular application. Different sensors are used with the cameras that capture the required images, and the fusion step reduces the redundant information they share. The image fusion technique used in this paper is based on multi-resolution singular value decomposition (MSVD), which is closely analogous to wavelet-based fusion: the idea behind MSVD is to replace the FIR filters of the wavelet transform with singular value decomposition (SVD). It is computationally very simple and is well suited to real-time applications such as remote sensing and astronomy.
Keywords: multi sensor image fusion, MSVD, image processing, monochrome video
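The core MSVD idea the abstract describes (SVD in place of the wavelet transform's FIR filters) can be sketched as follows. This is a hypothetical one-level illustration, not the authors' implementation: 2×2 pixel blocks are vectorised into columns, projected onto an SVD eigen-basis shared by both inputs, and the frames are fused by averaging the approximation row and keeping the larger-magnitude detail coefficients.

```python
import numpy as np

def msvd_decompose(img, U=None):
    """One-level MSVD-style transform: stack each 2x2 block as a
    length-4 column, then project onto the SVD eigen-basis."""
    h, w = img.shape
    blocks = img.reshape(h // 2, 2, w // 2, 2).transpose(1, 3, 0, 2).reshape(4, -1)
    if U is None:
        U, _, _ = np.linalg.svd(blocks @ blocks.T)  # 4x4 orthogonal basis
    return U, U.T @ blocks  # coefficients: row 0 ~ approximation, rows 1-3 ~ details

def msvd_reconstruct(U, coeffs, shape):
    h, w = shape
    blocks = U @ coeffs
    return blocks.reshape(2, 2, h // 2, w // 2).transpose(2, 0, 3, 1).reshape(h, w)

def msvd_fuse(a, b):
    # share one basis so coefficients of both frames are comparable
    U, _ = msvd_decompose((a + b) / 2.0)
    _, ca = msvd_decompose(a, U)
    _, cb = msvd_decompose(b, U)
    fused = np.empty_like(ca)
    fused[0] = 0.5 * (ca[0] + cb[0])                   # average approximations
    pick_a = np.abs(ca[1:]) >= np.abs(cb[1:])
    fused[1:] = np.where(pick_a, ca[1:], cb[1:])       # max-abs rule for details
    return msvd_reconstruct(U, fused, a.shape)
```

Fusing a frame with itself reproduces it exactly, which is a quick sanity check on the transform/inverse pair.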
Procedia PDF Downloads 570
131 Infilling Strategies for Surrogate Model Based Multi-disciplinary Analysis and Applications to Velocity Prediction Programs
Authors: Malo Pocheau-Lesteven, Olivier Le Maître
Abstract:
Engineering and optimisation of complex systems is often achieved through multi-disciplinary analysis, in which each subsystem is modeled and interacts with the other subsystems to model the complete system. Coherence between the outputs of the different subsystems is achieved through compatibility constraints, which enforce the coupling between subsystems. Due to the complexity of some subsystems and the computational cost of evaluating their models, it is often necessary to build surrogate models of these subsystems to allow their repeated evaluation at relatively low computational cost. In this paper, Gaussian processes are used, as their probabilistic nature can be leveraged to evaluate the likelihood of satisfying the compatibility constraints. The paper presents infilling strategies that build accurate surrogate models of the subsystems in the regions where they are likely to meet the compatibility constraints, and it is shown that these strategies can reduce the computational cost of building surrogate models for a given level of accuracy. An application to velocity prediction programs used in offshore racing naval architecture further demonstrates the methods' applicability in a real engineering context. Some examples of the application of uncertainty quantification to the field of naval architecture are also presented.
Keywords: infilling strategy, Gaussian process, multi disciplinary analysis, velocity prediction program
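The infill idea sketched in the abstract, sampling where the Gaussian-process surrogate is both uncertain and likely to satisfy a compatibility constraint, can be illustrated as below. The kernel, constraint tolerance, and acquisition score here are invented for illustration and are in the spirit of the paper's strategy, not its exact criterion.

```python
import math
import numpy as np

def rbf(x1, x2, ell=0.3):
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / ell) ** 2)

def gp_posterior(xt, yt, xs, noise=1e-8):
    # standard GP regression with zero prior mean and unit prior variance
    K = rbf(xt, xt) + noise * np.eye(len(xt))
    Ks = rbf(xt, xs)
    mu = Ks.T @ np.linalg.solve(K, yt)
    var = np.clip(1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0), 0.0, None)
    return mu, var

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def next_infill(xt, yt, cands, target=0.0, tol=0.1):
    """Pick the candidate maximising (posterior sd) x (probability that the
    constraint g(x) lies within tol of target) -- a hypothetical criterion."""
    mu, var = gp_posterior(xt, yt, cands)
    sd = np.sqrt(var) + 1e-12
    p_feas = np.array([norm_cdf((target + tol - m) / s) - norm_cdf((target - tol - m) / s)
                       for m, s in zip(mu, sd)])
    return cands[np.argmax(sd * p_feas)]
```

With samples of g(x) = x - 0.5 at x = 0, 0.25, 1, the criterion proposes a new point inside the unexplored gap near the constraint boundary rather than re-sampling a training point.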
Procedia PDF Downloads 156
130 Shifting Paradigms of Culture: Rise of Secular Sensibility in Indian Literature
Authors: Nidhi Chouhan
Abstract:
The burgeoning demand for 'secularism' has shaken the pillars of cultural studies in contemporary literature. The perplexity of the culturally estranged term 'secular' gives rise to temporal ideologies across the world. Hence, it is high time to examine this concept in the context of the Indian lifestyle, which is a blend of assimilated cultures woven into multiple religious fabrics. The infliction of such secular taste is depicted in literary productions such as 'Satanic Verses' and 'An Area of Darkness'. The paper makes a conceptual, cross-cultural analysis of anti-religious Indian literary texts, assessing their revitalization in current times, and studies the increasing popularity of secular sensibility in contemporary times. The mushrooming elements of secularism, such as abstraction, spirituality, liberation, and individualism, give rise to a seemingly newer idea, 'plurality', making the literature highly hybrid. This approach has been used to study Indian modernity as reflected in its literature, and seminal works of stalwarts are used to understand the consequences of this cultural synthesis. Conclusively, this theoretical research inspects the efficiency of secular culture, intertwined with internal coherence, and throws light on the plurality of texts in Indian literature.
Keywords: culture, Indian, literature, plurality, secular, secularism
Procedia PDF Downloads 102
129 POD and Wavelets Application for Aerodynamic Design Optimization
Authors: Bonchan Koo, Junhee Han, Dohyung Lee
Abstract:
This research evaluates the accuracy and efficiency of a design optimization procedure that combines a wavelet-based solution algorithm with a proper orthogonal decomposition (POD) database management technique. Aerodynamic design calls for high-fidelity computational fluid dynamics (CFD) simulations and the consideration of a large number of flow conditions and design constraints. Even with significant advances in computing power, the current level of integrated design process requires substantial computing time and resources. POD reduces the degrees of freedom of the full system by conducting singular value decomposition on various field simulations. For additional efficiency, an adaptive wavelet technique is also employed during the POD training period. The proposed design procedure was applied to the optimization of wing aerodynamic performance. Throughout the research, it was confirmed that the POD/wavelets design procedure can significantly reduce the total design turnaround time while still capturing the detailed complex flow features of the full-order analysis.
Keywords: POD (Proper Orthogonal Decomposition), wavelets, CFD, design optimization, ROM (Reduced Order Model)
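The POD step the abstract describes, singular value decomposition of a collection of field simulations, can be sketched on synthetic snapshot data. The two-mode "field" and the parameter sweep below are invented for illustration; a real CFD snapshot matrix would simply replace `snapshots`.

```python
import numpy as np

# synthetic "field simulations": each snapshot mixes two spatial modes,
# so the snapshot matrix has rank 2 by construction
x = np.linspace(0.0, 1.0, 50)
snapshots = np.stack(
    [np.sin(np.pi * x) + a * np.sin(2 * np.pi * x) for a in np.linspace(0.5, 2.0, 10)],
    axis=1,  # columns are snapshots
)

# POD: the left singular vectors of the snapshot matrix are the POD modes
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 2                                  # retain the two dominant modes
basis = U[:, :r]
coeffs = basis.T @ snapshots           # reduced coordinates (r x n_snapshots)
recon = basis @ coeffs                 # reduced-order reconstruction

# fraction of "energy" (squared singular values) captured by the r modes
energy = (s[:r] ** 2).sum() / (s ** 2).sum()
```

Because the synthetic data are exactly rank 2, two modes capture essentially all the energy and the reduced-order reconstruction is exact; for real CFD data one chooses r by the same energy criterion.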
Procedia PDF Downloads 463
128 Cladding Technology for Metal-Hybrid Composites with Network-Structure
Authors: Ha-Guk Jeong, Jong-Beom Lee
Abstract:
The cladding process is a typical technology for manufacturing composite materials by hydrostatic extrusion. Because there is no friction between the metal and the container, a uniform flow is easily obtained during deformation. The general solid-state manufacturing process for a metal-matrix composite is to mix metal powders and ceramic powders in a suitable volume ratio and then compress or extrude the mixture in a can under cold or hot conditions. Because it disperses and physically mixes materials with very different characteristics through a number of unit processing steps, this route is complicated and leads to non-uniform dispersion of the ceramics, and it is hard to reach uniform, ideal properties because of coherence problems at the interface between the metal and the ceramic reinforcements. The metal hybrid composites presented in this report are manufactured through traditional plastic deformation processes such as hydrostatic extrusion, caliber rolling, and drawing, by which uniform macro- and microstructures can surely be realized. In this study, aluminum, copper, and titanium were used as constituent materials; by adjusting the component ratio, it was possible to produce metal hybrid composites that maximize the excellent characteristics of each material. MgB₂ superconductor wire was also fabricated via the same process. The unique electronic and thermal characteristics of these materials will be introduced.
Keywords: cladding process, metal-hybrid composites, hydrostatic extrusion, electronic/thermal characteristics
Procedia PDF Downloads 177
127 Narrative Point of View in Nature Documentary Films: A Study of The Cove (2009), Tale of a Forest (2012), and Before the Flood (2016)
Authors: Sakshi Yadav, Sushila Shekhawat
Abstract:
This study addresses the different types of point of view seen in nature documentary films with the help of three eco-documentaries, and it is significant for understanding the role of the narrative point of view as a tool for showing and telling in documentaries. Narrative analysis of a film forms an essential aspect of the discourse of scholarship in film studies. Narration is the chain of events occurring in time and space, and the notion of narrative provides the idea of coherence and wholeness to the story. Among the various components that narration carries is perspective, or point of view. The narrator plays the role of mediator between the film and the audience; thus, his perspective influences the way the audience interprets the film. Feature films have been analyzed through narrative points of view; this research, however, conducts such an analysis from the angle of the nature documentary film. The study examines narrative viewpoints unique to nature documentary films using three ecological documentary films: The Cove (2009), Tale of a Forest (2012), and Before the Flood (2016). The research applies the framework of narrative theory and investigates the impact of the different types of narrative point of view, as each portrays the human-nature relationship from a different standpoint, and it also studies the effect that the narrative point of view has on the mode of these eco-documentaries.
Keywords: ecodocumentary, narrative, human-nature relationship, point of view
Procedia PDF Downloads 88
126 Diminishing Voices of Children in Mandatory Mediation Schemes
Authors: Yuliya Radanova, Agnė Tvaronavičienė
Abstract:
With the growing trend of mandating the parties to family conflicts into out-of-court processes, the adopted statutory regulations often remain silent on how the voice of the child is integrated into the procedure. The Convention on the Rights of the Child (Art. 12) clearly states the obligation to assure to the child who is capable of forming his or her own views the right to express those views freely in all matters affecting him or her. This article explores how children participate in the mandatory mediation schemes applicable to family disputes in the European Union. A review of scientific literature and empirical data on those EU Member States that compel parties to family mediation establishes that different models of practice are deployed and that there is a lack of synchronicity in how children's role in mediation is viewed. Child-inclusive mediation processes are deemed to produce results that are sustainable over time, but they require professional qualifications and skills if mediators are to ensure that such discussions are aligned with the best interests of the child. However, there is no unanimous guidance, and there are no standards or protocols, on the particular characteristics and manner through which children are involved in mediation. It is suggested that the lack of such rigorous approaches and coherence, in an ever-changing mediation setting transitioning towards mandatory models, jeopardizes the place of children's voices in the process. There is therefore a need to consider the adoption of uniform guidelines on the specific role children have in mediation, particularly in its mandatory models.
Keywords: family mediation, child involvement, mandatory mediation, child-inclusive, child-focused
Procedia PDF Downloads 73
125 Application of Computational Fluid Dynamics (CFD) Analysis for Surge Inception and Propagation for Low Head Hydropower Projects
Authors: M. Mohsin Munir, Taimoor Ahmad, Javed Munir, Usman Rashid
Abstract:
Determining the maximum elevation reached by a flowing fluid after a sudden load rejection in a hydropower facility is of great interest to hydraulic engineers in ensuring the safety of hydraulic structures. Several mathematical models exist that employ one-dimensional modeling for the determination of surge, but none of them perfectly simulates real-world conditions. This paper investigates surge inception and propagation for a low-head hydropower project using computational fluid dynamics (CFD) analysis in the FLOW-3D software package. The fluid dynamic model carries out its surge analysis by employing the Reynolds-averaged Navier-Stokes equations (RANSE). The CFD model was built for a case study at the Taunsa hydropower project in Pakistan, and various scenarios were run through the model with appropriate upstream boundary conditions. The numerical results were then compared with the results of physical model testing for the same scenarios. The numerical model proved to be in close agreement with the physical model testing, offers insight into phenomena that are not apparent in the physical model, and may be adopted in the future for similar low-head projects, limiting the delays and costs incurred in physical model testing.
Keywords: surge, FLOW-3D, numerical model, Taunsa, RANSE
Procedia PDF Downloads 357
124 Emergentist Metaphorical Creativity: Towards a Model of Analysing Metaphorical Creativity in Interactive Talk
Authors: Afef Badri
Abstract:
Metaphorical creativity is not a static property of discourse; it is a dynamic, interactive process created online, and research on metaphorical creativity produced online has been lacking. This paper accounts for metaphorical creativity in online talk-in-interaction as a dynamic process that emerges as discourse unfolds. It brings together insights from the emergentist approach to the study of metaphor in verbal interactions and insights from the conceptual blending approach as a model for analysing online metaphorical constructions, in order to propose a model for studying metaphorical creativity in interactive talk. The model is based on three focal points. First, metaphorical creativity is a dynamic, emergent, and open-to-change process that evolves in real time as interlocutors constantly blend and re-blend previous metaphorical contributions. Second, it is not a product of isolated individual minds but a joint achievement that is co-constructed and co-elaborated by the interlocutors. Third, and most important, the emergent process of metaphorical creativity is tightly shaped by the contextual variables surrounding talk-in-interaction: it is grounded in the interlocutors' framework of interpretation; it is constrained by preceding contributions in a way that creates textual cohesion in the verbal exchange; and it is a goal-oriented process predefined by the communicative intention of each participant in a way that reveals the ideological coherence or incoherence of the entire conversation.
Keywords: communicative intention, conceptual blending, the emergentist approach, metaphorical creativity
Procedia PDF Downloads 257
123 Visualization and Performance Measure to Determine Number of Topics in Twitter Data Clustering Using Hybrid Topic Modeling
Authors: Moulana Mohammed
Abstract:
Topic models have been widely used to build clusters of documents for more than a decade, yet problems remain in choosing the optimal number of topics. The main problem is the lack of a stable metric of the quality of the topics obtained during the construction of topic models. From previous works, the authors observe that most of the models used for determining the number of topics are non-parametric, with topic quality judged by perplexity and coherence measures, and conclude that these are not applicable to solving this problem. In this paper, we use a parametric method, an extension of the traditional topic model with visual access tendency, to visualize the number of topics (clusters), to complement clustering, and to choose the optimal number of topics based on the results of cluster validity indices. The developed hybrid topic models are demonstrated on different Twitter datasets covering various topics, both for obtaining the optimal number of topics and for measuring the quality of the clusters. The experimental results show that the visual non-negative matrix factorization (VNMF) topic model performs well in determining the optimal number of topics with interactive visualization and in measuring cluster quality with validity indices.
Keywords: interactive visualization, visual non-negative matrix factorization model, optimal number of topics, cluster validity indices, Twitter data clustering
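The underlying task, fitting an NMF topic model for several candidate topic counts and scoring each fit, can be sketched as follows. This is a hypothetical, numpy-only illustration using Lee-Seung multiplicative updates and reconstruction error as a stand-in for the paper's validity indices; the toy term-document matrix has two planted topics.

```python
import numpy as np

def nmf(V, k, iters=300, seed=0):
    # multiplicative-update NMF (Lee-Seung); keeps W, H non-negative
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], k)) + 0.1
    H = rng.random((k, V.shape[1])) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# toy corpus: terms 0-2 belong to topic A, terms 3-5 to topic B
topics = np.array([[1, 1, 1, 0, 0, 0],
                   [0, 0, 0, 1, 1, 1]], dtype=float)
doc_weights = np.array([[3, 0], [2, 0], [4, 0],
                        [0, 3], [0, 2], [0, 4]], dtype=float)
V = (doc_weights @ topics).T      # 6 terms x 6 documents, rank 2 by construction

# score each candidate topic count by reconstruction error
errors = {}
for k in (1, 2, 3):
    W, H = nmf(V, k)
    errors[k] = np.linalg.norm(V - W @ H)
```

On this data the error drops sharply at k = 2 and little thereafter, which is the kind of signal a validity index formalises when picking the optimal number of topics.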
Procedia PDF Downloads 133
122 Embodying the Ecological Validity in Creating the Sustainable Public Policy: A Study in Strengthening the Green Economy in Indonesia
Authors: Gatot Dwi Hendro, Hayyan ul Haq
Abstract:
This work explores a strategy for embodying ecological validity in creating sustainable public policy, particularly in strengthening the green economy in Indonesia. The green economy plays an important role in supporting national development in Indonesia, as it is a part of the national policy that holds primary priority in Indonesian governance. The green economy refers to national development involving strategic natural resources, such as mining, gold, oil, coal, forests, water, and marine resources, together with the supporting infrastructure for production and distribution, such as factories, roads, and bridges. All activities in this national development should therefore consider sustainability. Sustainability requires a strong commitment from the national and regional governments, as well as the local governments, to make ecology the main requirement for issuing any policy, such as licences for mining production or for developing and building new production and supporting infrastructure to optimise national resources. For that reason, this work focuses on a strategy for embodying ecological values and norms in public policy. In detail, it offers a method, i.e. legal techniques, for visualising and embodying norms and public policy that are ecologically valid. This ecological validity is required in order to maintain and sustain our collective life.
Keywords: ecological validity, sustainable development, coherence, Indonesian Pancasila values, environment, marine
Procedia PDF Downloads 484
121 Design of Enhanced Adaptive Filter for Integrated Navigation System of FOG-SINS and Star Tracker
Authors: Nassim Bessaad, Qilian Bao, Zhao Jiangkang
Abstract:
The fiber optic gyroscope in a strap-down inertial navigation system (FOG-SINS) suffers from precision degradation due to the influence of random errors. In this work, an enhanced Allan variance (AV) stochastic modeling method, combined with discrete wavelet transform (DWT) signal denoising, is implemented to estimate the random process in the FOG signal. Furthermore, we devise a measurement-based iterative adaptive Sage-Husa nonlinear filter with augmented states to integrate a star tracker sensor with the SINS. The proposed filter adapts the measurement noise covariance matrix based on the available data, while the enhanced stochastic modeling scheme is used to tune the process noise covariance matrix and the parameters of the augmented-state Gauss-Markov process. Finally, the effectiveness of the proposed filter is investigated using data collected under laboratory conditions. The results show the filter's improved accuracy in comparison with the conventional Kalman filter (CKF).
Keywords: inertial navigation, adaptive filtering, star tracker, FOG
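The Allan variance step named in the abstract, the standard tool for characterising gyro random errors, can be sketched as below. This is a simplified non-overlapped version on a synthetic white-noise record (the signal, sample rate, and seed are invented for illustration); for pure white noise the Allan variance falls as 1/tau.

```python
import numpy as np

def allan_variance(y, fs, cluster_sizes):
    """Non-overlapped Allan variance of rate samples y (sampling rate fs)
    for the given cluster sizes; returns {tau: AVAR}."""
    out = {}
    for n in cluster_sizes:
        m = len(y) // n                          # number of clusters of size n
        means = y[: m * n].reshape(m, n).mean(axis=1)
        out[n / fs] = 0.5 * np.mean(np.diff(means) ** 2)
    return out

# synthetic angle-random-walk-like (white) gyro noise, unit variance
rng = np.random.default_rng(1)
y = rng.normal(0.0, 1.0, 100_000)
av = allan_variance(y, fs=100.0, cluster_sizes=[1, 10, 100])
```

On a real FOG record one would read the noise coefficients off the slopes of the log-log Allan deviation plot, optionally after the DWT denoising the abstract describes.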
Procedia PDF Downloads 79
120 Detecting HCC Tumor in Three Phasic CT Liver Images with Optimization of Neural Network
Authors: Mahdieh Khalilinezhad, Silvana Dellepiane, Gianni Vernazza
Abstract:
The aim of the present work is to build a model, based on tissue characterization, that is able to discriminate pathological from non-pathological regions in three-phasic CT images. Based on feature selection in the different phases, we design a neural network system with an optimal number of neurons in its hidden layer. Our approach consists of three steps: feature selection, feature reduction, and classification. For each region of interest (ROI), six distinct sets of texture features are extracted, namely first-order histogram parameters, absolute gradient, run-length matrix, co-occurrence matrix, autoregressive model, and wavelet features, for a total of 270 texture features. We show that, with the injection of contrast liquid and the analysis of additional phases, the most relevant features in each region change. Our results show that phase 3 is the best for detecting HCC tumors for most of the features supplied to the classification algorithm. Using the first-order histogram parameters, the detection accuracy between the two classes with our method is 85% in phase 1, 95% in phase 2, and 95% in phase 3.
Keywords: multi-phasic liver images, texture analysis, neural network, hidden layer
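One of the six feature families listed above, co-occurrence matrix features, can be sketched as follows; this is a hypothetical minimal illustration (one pixel offset, two Haralick-style features), not the paper's full 270-feature extraction.

```python
import numpy as np

def glcm(img, levels=8, dx=1, dy=0):
    """Grey-level co-occurrence matrix for a single pixel offset
    (simplified sketch; img values assumed in [0, 1))."""
    q = np.clip((img * levels).astype(int), 0, levels - 1)  # quantise grey levels
    P = np.zeros((levels, levels))
    h, w = q.shape
    for i in range(h - dy):
        for j in range(w - dx):
            P[q[i, j], q[i + dy, j + dx]] += 1.0
    return P / P.sum()

def texture_features(P):
    # two classic co-occurrence features: contrast and energy
    idx = np.arange(P.shape[0])
    contrast = float((((idx[:, None] - idx[None, :]) ** 2) * P).sum())
    energy = float((P ** 2).sum())
    return contrast, energy
```

A flat ROI yields zero contrast and maximal energy, while a high-frequency pattern such as a checkerboard yields high contrast; it is differences of this kind, across the three contrast phases, that feed the classifier.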
Procedia PDF Downloads 260
119 Text as Reader Device: Improving Subjectivity on the Role of Attestation between Interpretative Semiotics and Discursive Linguistics
Authors: Marco Castagna
Abstract:
The proposed paper inquires into the relation between text and reader, focusing on the concept of 'attestation'. Despite being widely accepted in semiotic research, even today the concept of text remains uncertainly defined. It seems undeniable that what is called a 'text' offers an image of internal cohesion and coherence that makes it possible to analyze it as an object; nevertheless, this same object becomes problematic when it is pragmatically activated by the act of reading. Like the T.A.R.D.I.S., the unique space-time vehicle used by the well-known BBC character Doctor Who in his adventures, every text appears to its readers not only 'bigger inside than outside' but also as offering spaces that change according to the traveller standing in it. This singular condition raises questions about the gnosiological relation between text and reader. How can a text be considered the 'same' even if it can be read in different ways by different subjects? How can readers be provided in advance with the knowledge required for 'understanding' a text and yet, at the same time, learn something more from it? To explain this singular condition, it seems useful to start thinking about the text as a device rather than an object. In other words, this unique status becomes clearer when the 'text' ceases to be considered a box designed to move meaning from a sender to a recipient (marking the semiotic priority of the 'code') and starts to be recognized as a performative meaning hypothesis, one that is discursively configured by one or more forms and empirically perceivable by means of one or more substances. A text thus appears as a 'semantic hanger', potentially offered to the 'unending deferral of the interpretant' and from time to time fixed as an 'instance of discourse'. In this perspective, every reading can be considered an answer to the continuous request to confirm or deny the meaning configuration (the meaning hypothesis) expressed by the text. Finally, 'attestation' is exactly what regulates this dynamic of request and answer, through which the reader is able to confirm his previous hypotheses about reality or perhaps acquire new ones.
Keywords: attestation, meaning, reader, text
Procedia PDF Downloads 236
118 Spirituality in Education (Enhance the Human Mind Competencies)
Authors: Kshama Sharma
Abstract:
Education is one of the most powerful tools for transforming the world into a just, sustainable, and more peaceful place for all lives across the globe. However, its recent objective approach, focused on materialistic, factual, existing knowledge, constrains human experience to certain dimensions only and leads to a materialistic world that, deprived of spiritual approaches, is less compassionate and more grades-oriented. To become more comprehensive, education should explore subjective approaches to spiritualism that connect lives with the greater self and with the consciousness of cosmic intelligence. This approach would bring a major shift in the orientation of the pedagogical processes, assessment strategies, and administrative management of the present education system. Spirituality is often related to the religious aspect of human civilization and development; however, when universal consciousness or cosmic intelligence (sometimes likened to dark energy) and the competencies of the human mind work in coherence and coordination, the efficiency of the human mind reaches a different dimension and achieves an extraordinary level of human understanding. Existing secondary data from agencies working in the field of meditation were quantitatively analyzed to determine the implications of meditation for the human mind and how it can be used effectively in education to bring the desired and expected results. Any kind of meditation practice affects the cognitive, mental, physical, emotional, and conscious state of the mind; if aligned with teaching and learning methodology, it will lead to conscious learners and a peaceful world.
Keywords: spirituality, cosmic intelligence, consciousness, mind competencies
Procedia PDF Downloads 52
117 Comparison of Corneal Curvature Measurements Conducted with Tomey AO-2000® and the Current Standard Biometer IOL Master®
Authors: Mohd Radzi Hilmi, Khairidzan Mohd Kamal, Che Azemin Mohd Zulfaezal, Ariffin Azrin Esmady
Abstract:
Purpose: Corneal curvature (CC) is an important anterior segment parameter. This study compared CC measurements made with two optical devices in phakic eyes. Methods: Sixty phakic eyes of 30 patients were enrolled. CC was measured three times with the optical biometer and topography-keratometer Tomey AO-2000 (Tomey Corporation, Nagoya, Japan) and then with the standard partial coherence interferometry (PCI) biometer IOL Master (Carl Zeiss Meditec, Dublin, CA), and the data were statistically analysed. Results: The measurements gave a mean CC of 43.86 ± 1.57 D with the Tomey AO-2000 and 43.84 ± 1.55 D with the IOL Master. The data were normally distributed, and no significant difference in CC values was detected between the two devices (P = 0.952). The correlation between CC measurements was highly significant (r = 0.99; P < 0.0001). The mean difference in CC between the devices was 0.017 D, and the 95% limits of agreement were -0.088 to 0.12 D. Measurement took longer with the standard biometer IOL Master (55.17 ± 2.24 seconds) than with the Tomey AO-2000 in automatic mode (39.88 ± 2.38 seconds); measurement with the Tomey AO-2000 in manual mode was the shortest (28.57 ± 2.71 seconds). Conclusion: In phakic eyes, CC measured with the Tomey AO-2000 and the IOL Master showed similar values, and a high correlation was observed between the two devices, which can therefore be used interchangeably. The Tomey AO-2000 is faster to operate and has its own topography system.
Keywords: corneal topography, corneal curvature, IOL Master, Tomey AO2000
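The agreement statistics quoted in the results (mean difference 0.017 D, 95% limits of agreement -0.088 to 0.12 D) follow the standard Bland-Altman computation, sketched here on synthetic paired readings; the noise level and random seed are invented for illustration, not the study's data.

```python
import numpy as np

def limits_of_agreement(a, b):
    # Bland-Altman: bias = mean paired difference, LoA = bias +/- 1.96 * sd
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = d.mean()
    sd = d.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# synthetic paired keratometry readings for 60 eyes, built to mimic a
# small fixed offset of 0.017 D between devices plus measurement noise
rng = np.random.default_rng(7)
iol_master = rng.normal(43.84, 1.55, 60)
tomey = iol_master + 0.017 + rng.normal(0.0, 0.05, 60)
bias, lo, hi = limits_of_agreement(tomey, iol_master)
```

Narrow limits of agreement relative to clinical tolerance, not the correlation coefficient alone, are what justify the "interchangeable devices" conclusion.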
Procedia PDF Downloads 384
116 Machine Learning for Rational Decision-Making: Introducing Creativity to Teachers within a School System
Authors: Larry Audet
Abstract:
Creativity is suddenly, and fortunately, a new educational focus in the United Arab Emirates and around the world, yet many leaders of creativity are still unsure how to introduce it to their teachers. It is impossible to introduce every aspect of creativity into a work climate simultaneously and reach any degree of organizational coherence: the number of alternatives to explore is so great, and the information teachers need to learn so vast, that even an approximation to including every concept and theory of creativity in the school organization is hard to conceive. Effective leaders of creativity need evidence-based, practical guidance for introducing and stimulating creativity in others. Machine learning models reveal new findings from KEYS Survey© data about teachers' perceptions of the stimulants of, and barriers to, their individual and collective creativity. Findings from predictive and causal models provide leaders with a rationale for decision-making when introducing creativity into their organization. Leaders should focus on management practices first: analyses reveal that creative outcomes are more likely when teachers perceive supportive management practices, such as providing teachers with challenging work that calls for their best efforts, allowing freedom and autonomy in how they practice their work, allowing teachers to form creative work groups, and recognizing them for their efforts. Once management practices are in place, leaders should focus their efforts on modeling risk-taking, providing optimal amounts of preparation time, and evaluating teachers fairly.
Keywords: creativity, leadership, KEYS survey, teaching, work climate
Procedia PDF Downloads 165
115 Community Perceptions and Attitudes Regarding Wildlife Crime in South Africa
Authors: Louiza C. Duncker, Duarte Gonçalves
Abstract:
Wildlife crime is a complex problem with many interconnected facets, which are generally responded to in parts or fragments in an effort to "break down" the complexity into manageable components. However, fragmentation increases complexity as coherence and cooperation become diluted. A whole-of-society approach has been developed to find a common goal and an integrated approach to preventing wildlife crime. As part of this development, research was conducted in rural communities adjacent to conservation areas in South Africa to define and comprehend the challenges they face and to understand their perceptions of wildlife crime. The results showed that the perceptions of community members varied: most were in favor of conservation and of protecting rhinos, but only if they derived adequate benefit from it. Regardless of gender, income level, education level, or access to services, conservation was perceived as both good and bad by the same people. Even though people in these communities are poor, a willingness to stop rhino poaching does exist among them, but their perception that the parks do not care about people triggers an unwillingness to stop, prevent, or report poaching. Understanding the nuances, the history, the interests and values of community members, and the drivers behind poaching mindsets (whether intrinsic or driven by transnational organized crime) is imperative to creating sustainable and resilient communities, on multiple levels, that make a substantial positive impact on people's lives while also conserving wildlife for posterity.
Keywords: community perceptions, conservation, rhino poaching, whole-of-society approach, wildlife crime
Procedia PDF Downloads 230114 Importance of Developing a Decision Support System for Diagnosis of Glaucoma
Authors: Murat Durucu
Abstract:
Glaucoma is a condition leading to irreversible blindness; early diagnosis and appropriate intervention can help patients retain their sight for longer. This study addresses the importance of developing a decision support system for glaucoma diagnosis. Glaucoma occurs when elevated pressure around the eyes damages the optic nerve and causes deterioration of vision. The disease progresses through different levels of vision loss, up to blindness. Diagnosis at an early stage allows a chance for therapies that slow the progression of the disease. In recent years, imaging technologies such as Heidelberg Retinal Tomography (HRT), Stereoscopic Disc Photography (SDP), and Optical Coherence Tomography (OCT) have been used for the diagnosis of glaucoma. Owing to its better accuracy and faster imaging, OCT has become the most common method used by experts. Despite the precision and speed of OCT and HRT imaging, difficulties and diagnostic errors still occur, especially in the early stages, and it is difficult for doctors to reach objective results during diagnosis and staging. It therefore seems very important to develop an objective decision support system for diagnosing and grading glaucoma. By using OCT images and pattern recognition systems, it is possible to develop a support system that helps doctors make their decisions on glaucoma. Thus, in this study, we develop an evaluation and support system for the use of doctors. Pattern-recognition-based computer software would help doctors make an objective evaluation of their patients. After the development and evaluation of the software, the system is planned to serve doctors in different hospitals.Keywords: decision support system, glaucoma, image processing, pattern recognition
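As a toy illustration of the pattern-recognition idea described above (not the authors' system), the sketch below classifies a feature vector by its nearest class centroid; the feature names (mean RNFL thickness, cup-to-disc ratio) and all training values are hypothetical:

```python
def nearest_centroid(train, query):
    """Classify a feature vector by the nearest class centroid.

    train: {label: list of feature vectors}; features here are assumed to be
    [mean RNFL thickness (um), cup-to-disc ratio] -- purely illustrative.
    """
    centroids = {}
    for label, vecs in train.items():
        n = len(vecs)
        # mean of each feature across the class's training vectors
        centroids[label] = [sum(v[j] for v in vecs) / n for j in range(len(vecs[0]))]

    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    return min(centroids, key=lambda lbl: dist2(centroids[lbl], query))
```

A real system would, of course, learn from clinician-labeled OCT-derived features and use a stronger classifier; this only shows the shape of the classification stage.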
Procedia PDF Downloads 300113 Optimal Pressure Control and Burst Detection for Sustainable Water Management
Authors: G. K. Viswanadh, B. Rajasekhar, G. Venkata Ramana
Abstract:
Water distribution networks play a vital role in ensuring a reliable supply of clean water to urban areas. However, they face several challenges, including pressure control, pump speed optimization, and burst event detection. This paper combines insights from two studies to address these critical issues in water distribution networks, focusing on the specific context of Kapra Municipality, India. The first part of this research concentrates on optimizing pressure control and pump speed in complex water distribution networks. It utilizes the EPANET-MATLAB Toolkit to integrate EPANET functionalities into the MATLAB environment, offering a comprehensive approach to network analysis. By optimizing Pressure Reducing Valves (PRVs) and variable speed pumps (VSPs), this study achieves remarkable results. In the benchmark water distribution system (WDS), the proposed PRV optimization algorithm reduces average leakage by 20.64%, surpassing the previous achievement of 16.07%. When applied to the South-Central and East zone WDS of Kapra Municipality, it identifies PRV locations that were previously missed by existing algorithms, resulting in average leakage reductions of 22.04% and 10.47%. These reductions translate to significant daily water savings, enhancing water supply reliability and reducing energy consumption. The second part of this research addresses the pressing issue of burst event detection and localization within the water distribution system. Burst events are a major contributor to water losses and repair expenses. The study employs wireless sensor technology to monitor pressure and flow rate in real time, enabling the detection of pipeline abnormalities, particularly burst events. The methodology relies on transient analysis of pressure signals, utilizing Cumulative Sum (CUSUM) and wavelet analysis techniques to robustly identify burst occurrences.
To enhance precision, burst event localization is achieved through meticulous analysis of time differentials in the arrival of negative pressure waveforms across distinct pressure sensing points, aided by nodal matrix analysis. To evaluate the effectiveness of this methodology, a PVC water pipeline test bed is employed, demonstrating the algorithm's success in detecting pipeline burst events at flow rates of 2-3 l/s. Remarkably, the algorithm achieves a localization error of merely 3 meters, outperforming previously established algorithms. This research presents a significant advancement in efficient burst event detection and localization within water pipelines, holding the potential to markedly curtail water losses and the concomitant financial implications. In conclusion, this combined research addresses critical challenges in water distribution networks, offering solutions for optimizing pressure control, pump speed, burst event detection, and localization. These findings contribute to the enhancement of water distribution systems, resulting in improved water supply reliability, reduced water losses, and substantial cost savings. The integrated approach presented in this paper holds promise for municipalities and utilities seeking to improve the efficiency and sustainability of their water distribution networks.Keywords: pressure reducing valve, complex networks, variable speed pump, wavelet transform, burst detection, CUSUM (cumulative sum), water pipeline monitoring
Procedia PDF Downloads 84112 The Effect of Self and Peer Assessment Activities in Second Language Writing: A Washback Effect Study on the Writing Growth during the Revision Phase in the Writing Process: Learners’ Perspective
Authors: Musbah Abdussayed
Abstract:
The washback effect refers to the influence of assessment on teaching and learning, and this effect can be either positive or negative. This study implemented, sequentially, self-assessment (SA) and peer assessment (PA) and examined the washback effect of self and peer assessment (SPA) activities on writing growth during the revision phase of the writing process. Twenty advanced learners of Arabic as a second language from a private school in the USA participated in the study. The participants composed and then revised a short Arabic story as part of a midterm grade. Qualitative data were collected, analyzed, and synthesized from ten interviews with the learners and from the twenty learners’ post-reflective journals. The findings indicate positive washback effects on the learners’ writing growth. The PA activity enhanced descriptions and meaning, promoted creativity, and improved textual coherence, whereas the SA activity led to detecting editing issues. Furthermore, both SPA activities had washback effects in common, including helping the learners meet writing genre conventions and develop metacognitive awareness. However, the findings also demonstrate negative washback effects on the learners’ attitudes during the revision phase, including bias toward self-evaluation during the SA activity and reluctance to rate peers’ writing performance during the PA activity. The findings suggest that self- and peer assessment activities are essential teaching and learning tools that can be utilized sequentially to help learners tackle multiple writing areas during the revision phase of the writing process.Keywords: self assessment, peer assessment, washback effect, second language writing, writing process
Procedia PDF Downloads 65111 Exploring Teacher Verbal Feedback on Postgraduate Students' Performances in Presentations in English
Authors: Nattawadee Sinpattanawong, Yaowaret Tharawoot
Abstract:
This is an analytic and descriptive classroom-centered study, the purpose of which is to explore teacher verbal feedback on postgraduate students’ performances in presentations in English in an English for Specific Purposes (ESP) postgraduate classroom. The participants are a Thai female teacher, two Thai female postgraduate students, and two foreign male postgraduate students. The study draws on both classroom observation and interview data. The class, which focused on the students’ presentations and the teacher’s verbal feedback on them, was observed nine times with audio recording and note-taking. In the interviews, the teacher was asked about the linkages between her verbal feedback and each student’s presentation skills in English. For the data analysis, the audio files from the observations were transcribed and analyzed both quantitatively and qualitatively. The quantitative analysis addressed the frequencies and percentages of the content of the teacher’s verbal feedback on each student’s performance based on eight presentation factors (content, structure, grammar, coherence, vocabulary, speaking skills, involving the audience, and self-presentation). Building on the quantitative results and the interview data, a qualitative analysis of the transcripts described the occurrences of the several types of verbal feedback content for each student’s presentation performance. The study’s findings may help teachers reflect on the verbal feedback they provide on various students’ presentation performances. They may also help students with characteristics similar to those in the present study improve their performance when giving a presentation in English by applying the teacher’s verbal feedback.Keywords: teacher verbal feedback, presentation factors, presentation in English, presentation performances
Procedia PDF Downloads 147110 Solar Radiation Time Series Prediction
Authors: Cameron Hamilton, Walter Potter, Gerrit Hoogenboom, Ronald McClendon, Will Hobbs
Abstract:
A model was constructed to predict the amount of solar radiation that will reach the surface of the earth at a given location an hour into the future. This project was supported by the Southern Company to determine at what specific times during a given day of the year solar panels could be relied upon to produce energy in sufficient quantities. Owing to their capacity as universal function approximators, artificial neural networks were used to estimate the nonlinear pattern of solar radiation, with measurements of weather conditions collected at the Griffin, Georgia weather station as inputs. A number of network configurations and training strategies were evaluated, though a multilayer perceptron with a variety of hidden nodes trained with the resilient propagation algorithm consistently yielded the most accurate predictions. In addition, a modeled DNI field and adjacent weather station data were used to bolster prediction accuracy. In later trials, the solar radiation field was preprocessed with a discrete wavelet transform with the aim of removing noise from the measurements. The current model provides predictions of solar radiation with a mean square error of 0.0042, though ongoing efforts are being made to further improve the model’s accuracy.Keywords: artificial neural networks, resilient propagation, solar radiation, time series forecasting
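The wavelet-denoising preprocessing step mentioned above can be illustrated with a single-level Haar transform and soft thresholding of the detail coefficients. This is a generic sketch of the technique, not the project's actual pipeline (which would typically use a library such as PyWavelets with a deeper decomposition):

```python
import math

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform (even-length input)."""
    s = 1 / math.sqrt(2)
    approx = [(x[i] + x[i + 1]) * s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) * s for i in range(0, len(x), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt: perfectly reconstructs the signal."""
    s = 1 / math.sqrt(2)
    x = []
    for a, d in zip(approx, detail):
        x += [(a + d) * s, (a - d) * s]
    return x

def soft_threshold(coeffs, t):
    # shrink small (noise-dominated) detail coefficients toward zero
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def denoise(x, t=0.5):
    approx, detail = haar_dwt(x)
    return haar_idwt(approx, soft_threshold(detail, t))
```

With the threshold at zero the round trip reproduces the input exactly; with a positive threshold, small high-frequency wiggles are suppressed while the local averages survive.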
Procedia PDF Downloads 384109 High-Rises and Urban Design: The Reasons for Unsuccessful Placemaking with Residential High-Rises in England
Authors: E. Kalcheva, A. Taki, Y. Hadi
Abstract:
High-rises and placemaking are an understudied combination that is receiving increasing interest with the proliferation of this typology in many British cities. The reason for studying three major cities in England (London, Birmingham and Manchester) is to learn from the latest advances in urban design in well-developed and prominent urban environments. The analysis of several high-rise sites reveals the weaknesses in the urban design of contemporary British cities and presents an opportunity to learn from implemented examples. Therefore, the purpose of this research is to analyze design approaches towards creating a sustainable and varied urban environment when high-rises are involved. The research questions raised by the study are: what is the quality of the high-rises and their surroundings; what facilities and features are deployed in the research area; what is the role of the high-rise buildings in the placemaking process; and what urban design principles are applicable in this context. The methodology utilizes observation of the research area guided by structured questions, developed by the author to evaluate the outdoor qualities of the high-rise surroundings. In this context, the paper argues that the quality of the public realm around the high-rises is quite low, missing basic but vital elements such as plazas, public art, and seating, along with landscaping and pocket parks. There is a lack of coherence, the rhythm of the streets is often disrupted, and even though the high-rises are very aesthetically appealing, they fail to create a sense of place on their own. The implications of the study are that future planning can take into consideration the critique in this article and provide more opportunities for urban design interventions around high-rise buildings in British cities.Keywords: high-rises, placemaking, urban design, townscape
Procedia PDF Downloads 321108 Spatial Analysis of Flood Vulnerability in Highly Urbanized Area: A Case Study in Taipei City
Authors: Liang Weichien
Abstract:
Without adequate information and mitigation plans for natural disasters, the risk to urban populated areas will increase in the future as populations grow, especially in Taiwan. Taiwan is recognized as one of the world's high-risk areas, experiencing an average of 5.7 floods per year, and should seek to strengthen coherence and consensus in how its cities plan for floods and climate change. Therefore, this study aims at understanding the vulnerability to flooding in Taipei City, Taiwan, by creating indicators and calculating the vulnerability of each study unit. The indicators were grouped into sensitivity and adaptive capacity based on the definition of vulnerability of the Intergovernmental Panel on Climate Change, and were weighted using Principal Component Analysis. However, previous research was based on the assumption that the composition and influence of the indicators were the same in different areas. This disregarded spatial correlation, which might result in inaccurate explanations of local vulnerability. The study therefore used Geographically Weighted Principal Component Analysis (GWPCA), adding a geographic weighting matrix to capture the different dominant flood impact characteristics in different areas. Cross-validation and the Akaike Information Criterion were used to select the bandwidth, with a Gaussian kernel as the bandwidth weighting scheme. The ultimate outcome can be used to reduce damage potential by integrating the outputs into local mitigation plans and urban planning.Keywords: flood vulnerability, geographically weighted principal components analysis, GWPCA, highly urbanized area, spatial correlation
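The geographic weighting at the heart of geographically weighted methods can be sketched as follows: each local analysis weights observations with a Gaussian kernel of their distance to the focal point, with the bandwidth controlling how fast the influence decays. This is an illustrative sketch of the kernel idea only, not the study's GWPCA implementation:

```python
import math

def gaussian_weights(points, center, bandwidth):
    """Gaussian kernel weights for a geographically weighted analysis:
    observations near `center` get weights near 1, distant ones decay to 0."""
    cx, cy = center
    return [math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * bandwidth ** 2))
            for x, y in points]

def local_weighted_mean(values, weights):
    """Locally weighted mean of an indicator (e.g. a sensitivity score)."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)
```

In GWPCA the same kernel weights enter a weighted covariance matrix at each focal location, so the principal components (and hence the indicator weights) vary over space.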
Procedia PDF Downloads 284107 Configuring Systems to Be Viable in a Crisis: The Role of Intuitive Decision-Making
Authors: Ayham Fattoum, Simos Chari, Duncan Shaw
Abstract:
Volatile, uncertain, complex, and ambiguous (VUCA) conditions threaten system viability with emerging and novel events requiring immediate and localized responses. Such responsiveness is only possible through devolved freedom and emancipated decision-making. The Viable System Model (VSM) recognizes this need and suggests maximizing autonomy to localize decision-making and minimize residual complexity. However, exercising delegated autonomy under VUCA conditions requires confidence and knowledge to use intuition, and guidance to maintain systemic coherence. This paper explores the role of intuition as an enabler of emancipated decision-making and autonomy under VUCA conditions. Intuition allows decision-makers to use their knowledge and experience to respond rapidly to novel events. This paper offers three contributions to the VSM. First, it designs a system model that illustrates the role of intuitive decision-making in managing complexity and maintaining viability. Second, it takes a black-box approach to theory development in the VSM to model the role of autonomy and intuition. Third, the study uses a multi-stage discovery-oriented approach (DOA) to develop theory, with each stage combining literature, data analysis, and model/theory development and identifying further questions for the subsequent stage. We synthesize literature (e.g., VSM, complexity management) with seven months of field-based insights (interviews, workshops, and observation of a live disaster exercise) to develop an intuitive complexity management framework and VSM models. The results have practical implications for enhancing the resilience of organizations and communities.Keywords: intuition, complexity management, decision-making, viable system model
Procedia PDF Downloads 66106 Artificial Intelligence and Law
Authors: Mehrnoosh Abouzari, Shahrokh Shahraei
Abstract:
With the development of artificial intelligence in the present age, intelligent machines and systems have proven their actual and potential capabilities and are increasingly present in various fields of human life: industry, financial transactions, marketing, manufacturing, services, politics, economics, and various branches of the humanities. Therefore, despite the conservatism and prudence of law enforcement, the traces of artificial intelligence can be seen in various areas of law. These include estimating the capabilities of judicial robotics, intelligent judicial decision-making systems, intelligent adjustment of defense and attorney strategies, and the consolidation and regulation of different and scattered laws in each case to achieve judicial coherence, reduce divergence of opinion, and shorten prolonged hearings, easing discontent with the current legal system. Designing rule-based, case-based, and knowledge-based systems, among others, are efforts to apply AI in law. In this article, we identify the ways in which AI is applied in law and regulation, identify the dominant concerns in this area, and outline the relationship between these two fields, in order to answer the question of how artificial intelligence can be used in different areas of law and what the implications of this application will be. The authors believe that the use of artificial intelligence in the three areas of legislative, judicial, and executive power can be very effective in governments' decisions and smart governance, helping to reach smart communities across human and geographical boundaries: humanity's long-held dream of a global village free of violence, favoritism, and human error.
Therefore, in this article, we analyze how artificial intelligence can be used in the legislative, judicial, and executive branches of government in order to realize its application.Keywords: artificial intelligence, law, intelligent system, judge
Procedia PDF Downloads 117105 ALEF: An Enhanced Approach to Arabic-English Bilingual Translation
Authors: Abdul Muqsit Abbasi, Ibrahim Chhipa, Asad Anwer, Saad Farooq, Hassan Berry, Sonu Kumar, Sundar Ali, Muhammad Owais Mahmood, Areeb Ur Rehman, Bahram Baloch
Abstract:
Accurate translation between structurally diverse languages, such as Arabic and English, presents a critical challenge in natural language processing due to significant linguistic and cultural differences. This paper investigates the effectiveness of Facebook’s mBART model, fine-tuned specifically for sequence-to-sequence (seq2seq) translation tasks between Arabic and English, and enhanced through advanced refinement techniques. Our approach leverages the Alef Dataset, a meticulously curated parallel corpus spanning various domains to capture the linguistic richness, nuances, and contextual accuracy essential for high-quality translation. We further refine the model’s output using advanced language models such as GPT-3.5 and GPT-4, which improve fluency and coherence and correct grammatical errors in translated texts. The fine-tuned model demonstrates substantial improvements, achieving a BLEU score of 38.97, a METEOR score of 58.11, and a TER score of 56.33, surpassing widely used systems such as Google Translate. These results underscore the potential of mBART, combined with refinement strategies, to bridge the translation gap between Arabic and English, providing a reliable, context-aware machine translation solution that is robust across diverse linguistic contexts.Keywords: natural language processing, machine translation, fine-tuning, Arabic-English translation, transformer models, seq2seq translation, translation evaluation metrics, cross-linguistic communication
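The BLEU score reported above is, at its core, the geometric mean of n-gram precisions multiplied by a brevity penalty. The sketch below is a simplified sentence-level variant with add-one smoothing, for intuition only; published evaluations like this one normally use a standard tool such as sacreBLEU:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token sequence."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU with add-one smoothing on the n-gram precisions."""
    cand, ref = candidate.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        c_ngrams, r_ngrams = ngrams(cand, n), ngrams(ref, n)
        # clipped overlap: each candidate n-gram counts at most as often
        # as it appears in the reference
        overlap = sum(min(c, r_ngrams[g]) for g, c in c_ngrams.items())
        total = max(sum(c_ngrams.values()), 1)
        # add-one smoothing so one empty n-gram order does not zero the score
        log_prec += math.log((overlap + 1) / (total + 1))
    # brevity penalty punishes candidates shorter than the reference
    bp = min(1.0, math.exp(1 - len(ref) / max(len(cand), 1)))
    return bp * math.exp(log_prec / max_n)
```

A perfect match scores 1.0; a partial match lands strictly between 0 and 1, lower still when the candidate is shorter than the reference.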
Procedia PDF Downloads 6104 High-Resolution ECG Automated Analysis and Diagnosis
Authors: Ayad Dalloo, Sulaf Dalloo
Abstract:
Electrocardiogram (ECG) recordings are prone to noise and artifacts, which create ambiguity during analysis by physicians and can lead to diagnostic errors. Such drawbacks may be overcome with high-resolution methods such as discrete wavelet analysis and digital signal processing (DSP) techniques. The ECG signal analysis is implemented in three stages: ECG preprocessing, feature extraction, and classification, with the aim of realizing high-resolution ECG diagnosis and improved detection of abnormal conditions in the heart. The preprocessing stage involves removing spurious artifacts (noise) due to factors such as muscle contraction, motion, and respiration. ECG features are extracted by applying DSP and the suggested sloping method techniques. These measured features represent the peak amplitudes and intervals of the P, Q, R, S, R’, and T waves on the ECG, along with other features such as ST elevation, QRS width, heart rate, electrical axis, and QR and QT intervals. The classification is performed using these extracted features and the criteria for cardiovascular diseases. The ECG diagnostic system was successfully applied to 12-lead ECG recordings for 12 cases. The system is provided with information enabling it to diagnose 15 different diseases. The physician’s and computer’s diagnoses were compared, showing 90% agreement with respect to the physician’s diagnosis, and the time taken for diagnosis is 2 seconds. All of these operations are programmed in the Matlab environment.Keywords: ECG diagnostic system, QRS detection, ECG baseline removal, cardiovascular diseases
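QRS detection, one of the listed keywords, is commonly approached with a derivative-squaring-integration chain in the spirit of the Pan-Tompkins algorithm. The sketch below is a heavily simplified, illustrative version of that chain; the window length and threshold fraction are assumptions, not the paper's method:

```python
def detect_qrs(signal, fs, window_ms=150, thresh_frac=0.5):
    """Toy QRS detector: derivative -> squaring -> moving-window
    integration -> fixed-fraction threshold (a simplified Pan-Tompkins
    sketch; fs is the sampling rate in Hz)."""
    # the derivative emphasizes the steep slopes of the QRS complex
    deriv = [signal[i + 1] - signal[i] for i in range(len(signal) - 1)]
    sq = [d * d for d in deriv]  # squaring makes all energy positive
    w = max(1, int(fs * window_ms / 1000))
    # moving-window integration smears QRS energy into detectable bumps
    integ = [sum(sq[max(0, i - w + 1):i + 1]) / w for i in range(len(sq))]
    thresh = thresh_frac * max(integ)
    peaks, i = [], 0
    while i < len(integ):
        if integ[i] >= thresh:
            j = i
            while j < len(integ) and integ[j] >= thresh:
                j += 1
            # take the maximum of each supra-threshold burst as one beat
            peaks.append(max(range(i, j), key=integ.__getitem__))
            i = j
        else:
            i += 1
    return peaks
```

A production detector would add bandpass filtering, adaptive thresholds, and refractory-period logic; this only shows the stage ordering.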
Procedia PDF Downloads 295103 Decision Support System Based On GIS and MCDM to Identify Land Suitability for Agriculture
Authors: Abdelkader Mendas
Abstract:
The integration of multicriteria decision-making (MCDM) approaches in a geographical information system (GIS) provides a powerful spatial decision support system, which offers the opportunity to efficiently produce land suitability maps for agriculture. Indeed, GIS is a powerful tool for analyzing spatial data and establishing a process for decision support. Because of their spatial aggregation functions, MCDM methods can facilitate decision-making in situations where several solutions are available, various criteria have to be taken into account, and decision-makers are in conflict. The parameters and the classification system used in this work are inspired by the FAO (Food and Agriculture Organization) approach dedicated to sustainable agriculture. A spatial decision support system has been developed for establishing the land suitability map for agriculture. It incorporates the multicriteria analysis method ELECTRE Tri (ELimination Et Choix Traduisant la REalité) within a GIS software environment. The main purpose of this research is to propose a conceptual and methodological framework for the combination of GIS and multicriteria methods in a single coherent system that takes into account the whole process, from the acquisition of spatially referenced data to decision-making. In this context, a spatial decision support system for developing land suitability maps for agriculture has been developed. The algorithm of ELECTRE Tri is incorporated into a GIS environment and added to the other analysis functions of the GIS. This approach has been tested on an area in Algeria, and a land suitability map for durum wheat has been produced. The obtained results show that the ELECTRE Tri method, integrated into a GIS, is well suited to the problem of land suitability for agriculture.
The coherence of the obtained maps confirms the system's effectiveness.Keywords: multicriteria decision analysis, decision support system, geographical information system, land suitability for agriculture
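The pessimistic assignment rule of ELECTRE Tri can be sketched as follows, using concordance only. Note that full ELECTRE Tri also includes indifference, preference, discordance, and veto thresholds; the weights, boundary profiles, and cutting level below are illustrative assumptions, not the study's calibration:

```python
def assign_category(scores, boundaries, weights, lam=0.65):
    """Pessimistic ELECTRE Tri assignment, concordance only (no vetoes).

    scores:     criterion performance vector of one land unit
    boundaries: list of boundary-profile vectors, ordered worst to best;
                len(boundaries) + 1 suitability categories result
    lam:        cutting level (majority threshold) for outranking
    Returns the category index, 0 = least suitable.
    """
    total_w = sum(weights)

    def outranks(a, b):
        # concordance: weighted share of criteria where a is at least as good
        c = sum(w for aj, bj, w in zip(a, b, weights) if aj >= bj)
        return c / total_w >= lam

    # scan boundaries from best to worst; the first one the unit outranks
    # fixes its category
    for h in range(len(boundaries) - 1, -1, -1):
        if outranks(scores, boundaries[h]):
            return h + 1
    return 0
```

With two boundary profiles this yields three classes, e.g. unsuitable, moderately suitable, and suitable, which is how per-pixel criterion scores become a suitability map once the rule is applied across the raster.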
Procedia PDF Downloads 635