Search results for: flood features
3167 A Survey of Novel Opportunistic Routing Protocols in Mobile Ad Hoc Networks
Authors: R. Poonkuzhali, M. Y. Sanavullah, M. R. Gurupriya
Abstract:
Opportunistic routing is used where the network exhibits features such as dynamic topology changes and intermittent connectivity. Opportunistic forwarding is widely used in Delay Tolerant Networks (also called Disruption Tolerant Networks). The key idea of opportunistic routing is to select forwarding nodes to carry data and to coordinate among these nodes to avoid duplicate transmissions. This paper analyses the pros and cons of various opportunistic routing techniques used in MANETs.
Keywords: ETX, opportunistic routing, PSR, throughput
Procedia PDF Downloads 494
3166 Democratic Action as Insurgency: On Claude Lefort's Concept of the Political Regime
Authors: Lorenzo Buti
Abstract:
This paper investigates the nature of democratic action through a critical reading of Claude Lefort’s notion of the democratic ‘regime’. Lefort provides one of the most innovative accounts of the essential features of a democratic regime. According to him, democracy is a political regime that acknowledges the indeterminacy of a society and stages it as a contestation between competing political actors. As such, democracy provides the symbolic markers of society’s openness towards the future. However, despite their democratic features, the recent decades in late capitalist societies attest to a sense of the future becoming fixed and predetermined. This suggests that Lefort’s conception of democracy harbours a misunderstanding of the character and experience of democratic action. This paper examines this underlying tension in Lefort’s work. It claims that Lefort underestimates how a democratic regime, next to its symbolic function, also takes a materially constituted form with its particular dynamics of power relations. Lefort’s systematic dismissal of this material dimension for democratic action can lead to the contemporary paradoxical situation where democracy’s symbolic markers are upheld (free elections, public debate, the dynamic between government and opposition in parliament, etc.) but the room for political decision-making is constrained due to a myriad of material constraints (e.g., market pressures, institutional inertias). The paper draws out the implications for the notion of democratic action. Contra Lefort, it argues that democratic action necessarily targets the material conditions that impede the capacity for decision-making on the basis of equality and liberty. This analysis shapes our understanding of democratic action in two ways. First, democratic action takes an asymmetrical, insurgent form, as a contestation of material power relations from below. Second, it reveals an ambivalent position vis-à-vis the political regime: democratic action is symbolically made possible by the democratic dispositive, but it contests the constituted form that the democratic regime takes.
Keywords: Claude Lefort, democratic action, material constitution, political regime
Procedia PDF Downloads 141
3165 A Study of Flooding Detention Space Efficiency in Different Land Uses: The Case of the Zhuoshui River Downstream Catchment in Taiwan
Authors: Jie-Ying Wu, Kuo-Hao Weng, Jin-Cheng Fu
Abstract:
This study proposes changes to land use for the purposes of water retention and runoff reduction, with the aim of reducing the frequency of flooding. The Zhuoshui River in Taiwan is used as a case study, for which different land use planning strategies are designed and various detention spaces are set up. The HEC-HMS model, developed by the Hydrologic Engineering Center of the U.S. Army Corps of Engineers, is used to calculate the decrease in runoff under the various planning strategies during five precipitation events of increasing return periods. The study finds that a maximum decrease in runoff of 14 million cubic meters can result from changing the form of land cover and storm detention in non-urban agricultural and river zones, because non-urban land accounts for 96% of the area under study. The greatest efficacy was demonstrated for the two-year return period, with results ranging from 16% to 52%; the efficacy for the 100-year return period ranged from 3% to 8%. Urban detention measures consist of agricultural paddy fields, stormwater ponds and rainwater retention systems in building basements. Although urban areas can provide one million cubic meters of runoff storage, this contribution is insignificant because urban land constitutes only 4% of the study area. By changing land cover alone, the 2-year return period shows a 9% efficacy and the 100-year return period a 2% efficacy.
Keywords: flood detention space, land use, spatial planning, Zhuoshui River, Taiwan
Procedia PDF Downloads 379
3164 Visualization Tool for EEG Signal Segmentation
Authors: Sweeti, Anoop Kant Godiyal, Neha Singh, Sneh Anand, B. K. Panigrahi, Jayasree Santhosh
Abstract:
This work concerns the development of a tool for visualization and segmentation of electroencephalograph (EEG) signals based on frequency-domain features. Changes in frequency-domain characteristics are correlated with changes in the mental state of the subject under study. The proposed algorithm represents these changes in mental state through the powers of different frequency bands, displayed as a segmented EEG signal. Many segmentation algorithms have been proposed in the literature for data classification in brain-computer interface, epilepsy and cognition studies, but the proposed method focuses mainly on a better presentation of the signal, which makes it a useful tool for clinicians. The algorithm performs basic filtering using band-pass and notch filters in the range of 0.1-45 Hz. Advanced filtering is then performed by principal component analysis and a wavelet-transform-based de-noising method. Frequency-domain features are used for segmentation, on the premise that the spectral power of different frequency bands describes the mental state of the subject. Two sliding windows are used for segmentation: one provides the time scale and the other applies the segmentation rule. The segmented data are displayed second by second with different color codes, and the segment length can be selected according to the objective. The proposed algorithm has been tested on an EEG data set obtained from the University of California, San Diego online data repository. The tool gives a better visualization of the signal in the form of segmented epochs of the desired length, representing the power spectrum variation in the data. Because the algorithm processes the data points for each time frame with respect to the sampling frequency, it can be extended to real-time visualization with a desired epoch length.
Keywords: de-noising, multi-channel data, PCA, power spectra, segmentation
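As an illustration of the kind of processing the abstract describes, the sketch below band-pass and notch filters a single EEG channel and then assigns a coarse label to each one-second window from its dominant band power. It is a minimal stand-in, not the authors' implementation; the sampling rate, band edges and labelling rule are assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch, welch

FS = 256  # assumed sampling rate (Hz)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def preprocess(x, fs=FS):
    """Band-pass 0.1-45 Hz, then notch out 50 Hz mains interference."""
    b, a = butter(4, [0.1, 45], btype="bandpass", fs=fs)
    x = filtfilt(b, a, x)
    b, a = iirnotch(50.0, Q=30.0, fs=fs)
    return filtfilt(b, a, x)

def segment_labels(x, fs=FS, win_sec=1.0):
    """Label each non-overlapping window by its dominant frequency band."""
    step = int(win_sec * fs)
    labels = []
    for start in range(0, len(x) - step + 1, step):
        f, pxx = welch(x[start:start + step], fs=fs, nperseg=step)
        powers = {name: pxx[(f >= lo) & (f < hi)].sum() for name, (lo, hi) in BANDS.items()}
        labels.append(max(powers, key=powers.get))
    return labels

if __name__ == "__main__":
    t = np.arange(0, 10, 1 / FS)
    toy = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)  # alpha-dominated toy signal
    print(segment_labels(preprocess(toy))[:5])
```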
Procedia PDF Downloads 397
3163 Local Directional Encoded Derivative Binary Pattern Based Coral Image Classification Using Weighted Distance Gray Wolf Optimization Algorithm
Authors: Annalakshmi G., Sakthivel Murugan S.
Abstract:
This paper presents a local directional encoded derivative binary pattern (LDEDBP) feature extraction method that can be applied to the classification of submarine coral reef images. Classifying coral reef images using texture features is difficult because of the dissimilarities among class samples. In coral reef image classification, texture features are extracted using the proposed LDEDBP method. The approach extracts the complete structural arrangement of the local region using the local binary pattern (LBP) and also extracts edge information using the local directional pattern (LDP) from the edge response of a particular region, thereby achieving an additional discriminative feature value. Typically, the LDP extracts edge details in all eight directions. Integrating the edge responses with the local binary pattern yields a more robust texture descriptor than the other descriptors used in texture feature extraction methods. Finally, the proposed technique is applied to an extreme learning machine (ELM) with a meta-heuristic algorithm known as the weighted distance grey wolf optimizer (GWO) to optimize the input weights and biases of single-hidden-layer feed-forward neural networks (SLFN). In the empirical results, ELM-WDGWO demonstrated better performance in terms of accuracy on all coral datasets, namely RSMAS, EILAT, EILAT2, and MLC, compared with other state-of-the-art algorithms, achieving the highest overall classification accuracy of 94%.
Keywords: feature extraction, local directional pattern, ELM classifier, GWO optimization
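For readers unfamiliar with the LBP component that the descriptor builds on, the following sketch computes a plain 8-neighbour local binary pattern histogram for a grayscale patch. It illustrates only the basic LBP step, not the directional encoding or the ELM-WDGWO classifier described in the abstract; the patch size and neighbourhood are assumptions for the example.

```python
import numpy as np

def lbp_histogram(patch):
    """Basic 8-neighbour LBP: threshold each neighbour against the centre pixel
    and accumulate the resulting 8-bit codes into a 256-bin histogram."""
    h, w = patch.shape
    # offsets of the 8 neighbours, ordered clockwise from the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    hist = np.zeros(256, dtype=int)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            centre = patch[i, j]
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if patch[i + di, j + dj] >= centre:
                    code |= 1 << bit
            hist[code] += 1
    return hist / hist.sum()  # normalised histogram used as the texture feature

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patch = rng.integers(0, 256, size=(32, 32))  # stand-in for a coral image patch
    print(lbp_histogram(patch)[:8])
```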
Procedia PDF Downloads 163
3162 Geotechnical Properties and Compressibility Behavior of Organic Dredged Soils
Authors: Inci Develioglu, Hasan Firat Pulat
Abstract:
Sustainable development is one of the most important topics in today's world, and it is also an important research topic for geoenvironmental engineering. Dredging is performed to widen river and port channels, to control floods and to provide access to harbors, and every year large amounts of sediment are dredged for these purposes. Dredged marine soils can be reused as filling materials, road and foundation embankments, construction materials and wildlife habitat developments. In this study, the geotechnical engineering properties and compressibility behavior of dredged soil obtained from Izmir Bay were investigated. Samples with four different organic matter contents were obtained, and particle size distribution, consistency limit, pH and specific gravity tests were performed. Consolidation tests were conducted to examine the effect of organic matter content (OMC) on the compressibility behavior of the dredged soil. The study shows that OMC has an important effect on the engineering properties of dredged soils. The liquid and plastic limits increased with increasing OMC. The lowest specific gravity belonged to the sample with the maximum OMC, with specific gravity values ranging between 2.76 and 2.52. The maximum void ratio difference also belongs to the sample with the highest OMC (Δe at 11% OMC = 0.38). As the organic matter content of the samples increases, the change in void ratio increases, and the compression index increases with increasing OMC.
Keywords: compressibility, consolidation, geotechnical properties, organic matter content, dredged soil
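A short numerical illustration of the compression index the abstract refers to is given below: it estimates Cc from two points on the virgin portion of an oedometer e-log σ' curve. The void ratios and stresses are invented example values, not the study's data.

```python
import math

def compression_index(e1, sigma1, e2, sigma2):
    """Cc = -(e2 - e1) / (log10(sigma2) - log10(sigma1)) on the virgin compression line."""
    return -(e2 - e1) / (math.log10(sigma2) - math.log10(sigma1))

# hypothetical oedometer readings: void ratio at 100 kPa and 400 kPa effective stress
cc = compression_index(e1=1.10, sigma1=100.0, e2=0.86, sigma2=400.0)
print(f"Compression index Cc = {cc:.3f}")  # ~0.399 for these example values
```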
Procedia PDF Downloads 258
3161 Feature Evaluation Based on Random Subspace and Multiple-K Ensemble
Authors: Jaehong Yu, Seoung Bum Kim
Abstract:
Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis in high-dimensional datasets, unsupervised dimensionality reduction is an important task. Unsupervised dimensionality reduction can generally be achieved by feature extraction or feature selection. In many situations, feature selection methods are more appropriate than feature extraction methods because of their clear interpretation with respect to the original features. Unsupervised feature selection can be categorized into feature subset selection and feature ranking methods; we focus on unsupervised feature ranking methods, which evaluate features based on importance scores. Recently, several unsupervised feature ranking methods were developed based on ensemble approaches to achieve higher accuracy and stability. However, most ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate feature importance depending on the ensemble clustering solution, and they produce undesirable evaluation results if the clustering solutions are inaccurate. To address these limitations, we propose an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature with the random subspace ensemble, and all evaluation results are combined into ensemble importance scores. Moreover, through the use of the multiple-k ensemble idea, FRRM does not require the true number of clusters to be determined in advance. Experiments on various benchmark datasets were conducted to examine the properties of the proposed FRRM algorithm and to compare its performance with that of existing feature ranking methods. The experimental results demonstrate that the proposed FRRM outperforms the competitors.
Keywords: clustering analysis, multiple-k ensemble, random subspace-based feature evaluation, unsupervised feature ranking
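The sketch below illustrates the general idea of ranking features with a random-subspace, multiple-k clustering ensemble: each ensemble member clusters a random feature subset with a randomly chosen k, the subset's features are scored by how well they separate the resulting clusters, and the scores are averaged. It is a simplified illustration of this family of approaches, not the authors' FRRM algorithm; the scoring function and parameter ranges are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_wine
from sklearn.feature_selection import f_classif

def ensemble_feature_ranking(X, n_members=50, k_range=(2, 6), subspace_frac=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    scores = np.zeros(n_features)
    counts = np.zeros(n_features)
    for _ in range(n_members):
        # random subspace of features and a random number of clusters
        subset = rng.choice(n_features, size=max(2, int(subspace_frac * n_features)), replace=False)
        k = int(rng.integers(k_range[0], k_range[1] + 1))
        labels = KMeans(n_clusters=k, n_init=10,
                        random_state=int(rng.integers(1_000_000))).fit_predict(X[:, subset])
        # score each feature in the subset by its ANOVA F statistic against the cluster labels
        f_stat, _ = f_classif(X[:, subset], labels)
        scores[subset] += np.nan_to_num(f_stat)
        counts[subset] += 1
    return scores / np.maximum(counts, 1)  # ensemble importance score per feature

if __name__ == "__main__":
    X = load_wine().data
    importance = ensemble_feature_ranking(X)
    print(np.argsort(importance)[::-1][:5])  # indices of the five top-ranked features
```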
Procedia PDF Downloads 339
3160 Characterization of Chest Pain in Patients Consulting the Emergency Department of a High-Complexity Health Institution during 2014-2015, Medellin, Colombia
Authors: Jorge Iván Bañol-Betancur, Lina María Martínez-Sánchez, María de los Ángeles Rodríguez-Gázquez, Estefanía Bahamonde-Olaya, Ana María Gutiérrez-Tamayo, Laura Isabel Jaramillo-Jaramillo, Camilo Ruiz-Mejía, Natalia Morales-Quintero
Abstract:
Acute chest pain is a distressing sensation between the diaphragm and the base of the neck, and it represents a diagnostic challenge for any physician in the emergency department. Objective: To establish the main clinical and epidemiological characteristics of patients who presented with chest pain to the emergency department of a private clinic in the city of Medellin during 2014-2015. Methods: Cross-sectional retrospective observational study. The population and sample were patients who consulted for chest pain in the emergency department and met the eligibility criteria. The information was analyzed in SPSS vr. 21; qualitative variables were described through relative frequencies, and quantitative variables through the mean and standard deviation or medians, according to their distribution in the study population. Results: A total of 231 patients were evaluated; the mean age was 49.5 ± 19.9 years, and 56.7% were female. The most frequent pathological antecedents were hypertension 35.5%, diabetes 10.8%, dyslipidemia 10.4% and coronary disease 5.2%. Regarding pain features, in 40.3% of the patients the pain began abruptly, in 38.2% it had a precordial location, in 20% of cases physical activity acted as a trigger, and in 60.6% it was oppressive. Costochondritis was the most common cause of chest pain among patients with an established etiologic diagnosis, representing 18.2%. Conclusions: Although the reported clinical features of the pain coincide with the clinical presentation of an acute coronary syndrome, the most common cause of chest pain in the study population was instead costochondritis, indicating that it is a differential diagnosis in the approach to patients with acute chest pain.
Keywords: acute coronary syndrome, chest pain, epidemiology, osteochondritis
Procedia PDF Downloads 343
3159 Synthesis and Characterization of Silver/Graphene Oxide Co-Decorated TiO2 Nanotubular Arrays for Biomedical Applications
Authors: Alireza Rafieerad, Bushroa Abd Razak, Bahman Nasiri Tabrizi, Jamunarani Vadivelu
Abstract:
Recently, reports on the fabrication of nanotubular arrays have generated considerable scientific interest, owing to the broad range of applications of oxide nanotubes in solar cells, orthopedic and dental implants, photocatalytic devices and lithium-ion batteries. An attractive approach for fabricating oxide nanotubes with controllable morphology is the electrochemical anodization of a substrate in a fluoride-containing electrolyte. Consequently, titanium dioxide nanotubes (TiO2 NTs) have received considerable attention as an applicable material, particularly in the field of artificial implants. Given the long-term efficacy concerns and the causes of failure and infection after surgery with currently used dental implants, enhancing the cytocompatibility of Ti-based implant surfaces toward bone-like tissue is required. Graphene oxide (GO), with its favorable biocompatibility at tissue sites, osseointegration and drug-delivery functionalization, is well understood, while the notable antibacterial ability of silver (Ag) makes it attractive for implantable devices free of infection symptoms. Here, surface modification of Ti–6Al–7Nb implants (Ti67IMP) through the development of Ag/GO co-decorated TiO2 NTs was examined. Initially, the anodic TiO2 nanotubes obtained at a constant potential of 60 V were annealed at 600 °C for 2 h to improve the adhesion of the coating. Afterward, the Ag/GO co-decorated TiO2 NTs were developed by spin coating on Ti67IMP. The microstructural features, phase composition and wettability behavior of the nanostructured coating were characterized comparatively. In summary, the results of the present study may contribute to the development of nanostructured Ti67IMP with improved surface properties.
Keywords: anodic TiO2 nanotube, biomedical applications, graphene oxide, silver, spin coating
Procedia PDF Downloads 325
3158 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models
Authors: Ainouna Bouziane
Abstract:
The ability of Electron Tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in the last decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (Compressed Sensing-total variation minimization) algorithms to reveal more reliable quantitative information from the 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is also an important issue that has not been properly addressed yet, because a perfectly known reference is needed. The problem becomes particularly complicated in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction/segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters like the range of tilt angles, image noise level or object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.
Keywords: electron tomography, supported catalysts, nanometrology, error assessment
Procedia PDF Downloads 88
3157 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia
Authors: Carol Anne Hargreaves
Abstract:
A key issue in stock investment is how to select representative features for stock selection. The first objective of this paper is to determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock's price is likely to go up, and to identify the most important factors and their contribution to predicting the likelihood of the stock price going up. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks that was likely to go up in price (portfolio 1). Next, principal component analysis was used to select stocks rated highly on component one and component two (portfolio 2). Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up (portfolio 3). The predictive models were validated with metrics such as sensitivity (recall), specificity and overall accuracy for all models; all accuracy measures were above 70%, and all portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three portfolios and traded in the market for one month, after which the return for each portfolio was computed and compared with the stock market index return. The returns were 23.87% for the principal component analysis portfolio, 11.65% for the logistic regression portfolio and 8.88% for the K-means cluster portfolio, while the stock market return was 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top-performing stock portfolios that outperform the stock market.
Keywords: machine learning, stock market trading, logistic regression, cluster analysis, factor analysis, decision trees, neural networks, automated stock investment system
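To make the three-portfolio workflow concrete, the sketch below builds simplified versions of the selection steps the abstract describes: k-means clustering, principal component scores and a logistic regression probability of the price going up, each used to shortlist stocks from a feature matrix. The feature names, thresholds and synthetic data are assumptions for illustration, not the study's features or results.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# hypothetical technical features for 200 stocks
X = pd.DataFrame(rng.normal(size=(200, 4)),
                 columns=["momentum", "volatility", "volume_trend", "earnings_growth"])
y = (rng.random(200) < 0.5).astype(int)          # 1 = price went up next month (synthetic label)
Z = StandardScaler().fit_transform(X)

# Portfolio 1: cluster the stocks and keep the cluster with the best average momentum
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
best_cluster = max(range(3), key=lambda c: X.loc[clusters == c, "momentum"].mean())
portfolio1 = np.where(clusters == best_cluster)[0]

# Portfolio 2: stocks rated highly on the first two principal components
scores = PCA(n_components=2).fit_transform(Z)
portfolio2 = np.where((scores[:, 0] > 1.0) & (scores[:, 1] > 0.0))[0]

# Portfolio 3: stocks with a high predicted probability of the price going up
model = LogisticRegression().fit(Z, y)
portfolio3 = np.where(model.predict_proba(Z)[:, 1] > 0.7)[0]

print(len(portfolio1), len(portfolio2), len(portfolio3))
```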
Procedia PDF Downloads 157
3156 Elements of Successful Commercial Streets: A Socio-Spatial Analysis of Commercial Streets in Cairo
Authors: Toka Aly
Abstract:
Historically, marketplaces were the most important nodes and focal points of cities, where different activities took place. Commercial streets offer more than just spaces for shopping; they also offer choices for social activities and cultural exchange, and they are considered the backbone of a city's vibrancy and vitality. Despite that, public life in Cairo's commercial streets has deteriorated, with shopping activities becoming reliant mainly on 'planned formal places', largely privatized or indoor spaces such as shopping malls. The main aim of this paper is to explore the key elements and tools for assessing the successfulness of commercial streets in Cairo. The methodology is a multiple-case-study approach based on assessing and analyzing the physical and social elements of historical and contemporary commercial streets in Cairo: El Muiz Street and Baghdad Street. Data collection is based on personal observations, photographs, maps and street sections. Findings indicate that the key factors for analyzing commercial streets are factors affecting the sensory experience, factors affecting social behavior, and general aspects that attract people. Findings also indicate that urban features have a clear influence on pedestrian shopping activities in both streets. Moreover, for a commercial street to be successful, shopping patterns must provide people with a quality public space that offers easy navigation and accessibility, good visual continuity, well-designed urban features and spaces for social gathering. The outcomes of this study provide a good background for urban designers in analyzing and assessing the successfulness of commercial streets, and the study will also help in understanding the different physical and social patterns of vending activities taking place in Cairo.
Keywords: activities, commercial street, marketplace, successful, vending
Procedia PDF Downloads 302
3155 Numerical Approach for Characterization of Flow Field in Pump Intake Using Two Phase Model: Detached Eddy Simulation
Authors: Rahul Paliwal, Gulshan Maheshwari, Anant S. Jhaveri, Channamallikarjun S. Mathpati
Abstract:
Large pumping facilities are a necessary requirement of cooling water systems for power plants, process and manufacturing facilities, flood control, and water or wastewater treatment plants. With large capacities of a few hundred to 50,000 m3/hr, care must be taken to ensure uniform flow to the pump to limit vibration, flow-induced cavitation and performance problems caused by the formation of air-entraining vortices and swirl flow. Successful prediction of these phenomena requires a numerical method and turbulence model that characterize the dynamics of these flows. In past years, single-phase shear stress transport (SST) and other Reynolds-averaged Navier-Stokes models (like k-ε, k-ω and RSM) were used to predict the flow behavior. A literature study showed that a two-phase model is more accurate than a single-phase model. In this paper, a 3D geometry simulated using detached eddy simulation (DES) is used to predict the behavior of the fluid, and the results are compared with experimental results. The effects of different grid structures and boundary conditions are also studied. It is observed that the two-phase flow model can predict the mean flow and turbulence statistics more accurately than the steady SST model. This validated model will be used for further analysis of vortex structures in a lab-scale model to generate their frequency plots and intensities at different locations in the set-up. This study will help in minimizing the adverse effects of vortices on pump performance.
Keywords: grid structure, pump intake, simulation, vibration, vortex
Procedia PDF Downloads 175
3154 Influence of a Company’s Dynamic Capabilities on Its Innovation Capabilities
Authors: Lovorka Galetic, Zeljko Vukelic
Abstract:
The advanced concepts of strategic and innovation management in the sphere of company dynamic and innovation capabilities, and achieving their mutual alignment and a synergy effect, are important elements in business today. This paper analyses the theory and empirically investigates the influence of a company's dynamic capabilities on its innovation capabilities. A new multidimensional model of dynamic capabilities is presented, consisting of five factors appropriate to real-time requirements, while innovation capabilities are considered pursuant to the official OECD and Eurostat standards. After an examination of dynamic and innovation capabilities indicated their theoretical links, an empirical study testing the model and examining the influence of a company's dynamic capabilities on its innovation capabilities showed significant results. In the study, a research model was posed to relate company dynamic and innovation capabilities. One side of the model features the variables that are the determinants of dynamic capabilities defined through their factors, while the other side features the determinants of innovation capabilities pursuant to the official standards. With regard to the research model, five hypotheses were set. The study was performed in late 2014 on a representative sample of large and very large Croatian enterprises with a minimum of 250 employees. The research instrument was a questionnaire administered to company top management. For both variables, the position of the company was assessed in comparison to industry competitors on a five-point scale. In order to test the hypotheses, correlation tests were performed to determine whether there is a correlation between each individual factor of company dynamic capabilities and the existence of its innovation capabilities, in line with the research model. The results indicate a strong correlation between a company's possession of dynamic capabilities, in terms of the factors of the new multidimensional model presented in this paper, and its possession of innovation capabilities. Based on the results, all five hypotheses were accepted. Ultimately, it was concluded that there is a strong association between the dynamic and innovation capabilities of a company.
Keywords: dynamic capabilities, innovation capabilities, competitive advantage, business results
Procedia PDF Downloads 305
3153 Features of Testing of the Neuronetwork Converter Biometrics-Code with Correlation Communications between Bits of the Output Code
Authors: B. S. Akhmetov, A. I. Ivanov, T. S. Kartbayev, A. Y. Malygin, K. Mukapil, S. D. Tolybayev
Abstract:
The article examines the testing of a neural network biometrics-to-code converter. It identifies the main reasons why the classical binomial law, adopted in the works of foreign researchers, cannot be used to describe the distribution of Hamming measures of "Alien" code responses.
Keywords: biometrics, testing, neural network, converter of biometrics-code, Hamming's measure
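As background for the Hamming measure discussed in the abstract, the sketch below computes the Hamming distances between an enrolled binary code and a set of "Alien" code responses and compares their empirical distribution with a binomial law for independent bits. It is a generic illustration under an assumed code length and assumed bit statistics, not the converter or test procedure from the article.

```python
import numpy as np
from scipy.stats import binom

N_BITS = 256          # assumed output code length
rng = np.random.default_rng(1)

template = rng.integers(0, 2, N_BITS)                 # enrolled "Own" code
aliens = rng.integers(0, 2, (10_000, N_BITS))         # synthetic "Alien" code responses

# Hamming distance between the template and every alien response
hamming = (aliens != template).sum(axis=1)

# Binomial reference: independent, unbiased bits would give Binomial(N_BITS, 0.5)
mean_obs, var_obs = hamming.mean(), hamming.var()
mean_bin, var_bin = binom.mean(N_BITS, 0.5), binom.var(N_BITS, 0.5)
print(f"observed mean/var: {mean_obs:.1f}/{var_obs:.1f}")
print(f"binomial mean/var: {mean_bin:.1f}/{var_bin:.1f}")
# Correlated output bits would inflate the observed variance relative to the binomial value.
```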
Procedia PDF Downloads 1138
3152 Reduplication in Dhiyan: An Indo-Aryan Language of Assam
Authors: S. Sulochana Singha
Abstract:
Dhiyan or Dehan is the name of the community and language spoken by the Koch-Rajbangshi people of the Barak Valley of Assam. Ethnically, they are Mongoloids, and their language belongs to the Indo-Aryan language family. However, Dhiyan is absent from existing classifications of Indo-Aryan languages, so its classification under the Indo-Aryan family is based entirely on the typological features it shares with other Indo-Aryan languages. Typologically, Dhiyan is an agglutinating language, and it shares many features of Indo-Aryan languages such as the presence of aspirated voiced stops, the absence of tone, verb-person agreement, adjectives as a distinct word class, prominent tense, and subject-object-verb word order. Reduplication is a productive word-formation process in Dhiyan; it also expresses plurality, intensification, and distribution. Generally, reduplication in Dhiyan can occur at the morphological or the lexical level. Morphological reduplication in Dhiyan involves expressives, which include onomatopoeia, sound symbolism, ideophones, and imitatives. Lexical reduplication in the language can be formed by echo formation and word reduplication. Echo formation in Dhiyan is formed by partial repetition of the base word, through either consonant alternation or vowel alternation: consonant alternation is mainly found in onset position, while vowel alternation is mainly found in open syllables, particularly the final syllable. Word reduplication involves reduplication of nouns, interrogatives, adjectives, and numerals, and it can further be class-changing or class-maintaining. The process of reduplication can be partial or complete, whether it is lexical or morphological. The present paper is an attempt to describe some aspects of the formation, function, and usage of reduplication in Dhiyan, which is mainly spoken in ten villages on the eastern side of the Barak River in the Cachar District of Assam.
Keywords: Barak Valley, Dhiyan, Indo-Aryan, reduplication
Procedia PDF Downloads 217
3151 Oneness of Scriptures and Oneness of God
Authors: Shyam Sunder Gupta
Abstract:
GOD is an infinite source of knowledge. From time to time, as per the need of mankind, GOD reveals some small, selected part of HIS knowledge as WORDS to a chosen entity whose responsibility is to function as Messenger and to share the WORDS, in the form of verses, with the common masses. GOD has confirmed that the Messenger may not understand every WORD revealed to him, and HE directs the Messenger to learn from persons who have knowledge of WORDS revealed in earlier times, as some revealed content is identical and some different by design. In due course of time, the orally communicated verses are collected and edited, either in a planned manner by an individual or unintentionally by a group of individuals, and are converted into the form of Scripture. Depending on the knowledge of the editor(s), some errors, scientific and of other kinds, get into the Scripture. In the present world, there are three major religions, Christianity, Islam and Hinduism, accounting for more than two-thirds of the world's population. Each of these religions has its own Scripture, namely the Bible, the Quran, and the Veda. Since the source of the WORDS for each of these Scriptures is the same, there is ONENESS of all Scriptures. There are amazing similarities between the events described, like the flood during the time of Noah and King Satyavara. The descriptions of the creation of man and woman are identical, and the descriptions of the Last Day, the categorization of human beings, identical names, etc., show remarkable similarities. Ram, the hero of the Ramayana, is a common name in Hinduism; two of Jesus' ancestors were named Ram, and many names in the Bible are derived from Ram. The attributes of GOD are common in all Scriptures, namely, GOD is Eternal, Unborn, Immortal, Creator of Universe(s) and everything that exists within the Universe, Omnipotent, Omnipresent, Omniscient, Subtlest of all, Unchangeable, Unique, Always Works, Source of Eternal Bliss, etc. There is Oneness of GOD.
Keywords: GOD, scriptures, oneness, WORDS, Jesus, Ram
Procedia PDF Downloads 62
3150 Students’ Opinions Related to Virtual Classrooms within the Online Distance Education Graduate Program
Authors: Secil Kaya Gulen
Abstract:
Face-to-face and virtual classrooms, which arose under different conditions and environments but serve similar purposes, have different characteristics. Although virtual classrooms share some facilities with face-to-face classes, such as a program, students, and administrators, they have no walls or corridors. Therefore, students can attend courses from a distance and can control their own learning spaces. Virtual classrooms are defined as synchronous online environments where students in different places come together at the same time under the guidance of a teacher. Distance education and virtual classes require different intellectual and managerial skills and models; therefore, for effective use of virtual classrooms, their virtual character should be taken into consideration. One of the most important factors affecting the spread and effective use of virtual classrooms is the perceptions and opinions of students, as one of the main participant groups. Student opinions and recommendations provide information about whether expectations are fulfilled, which helps to improve applications and contributes to more efficient implementations. In this context, this study determined students' general ideas and perceptions of virtual classrooms. The advantages and disadvantages of virtual classrooms, their expected contributions to the educational system, and their expected characteristics were examined. Students of an online distance education graduate program, in which all courses are offered through virtual classrooms, were asked for their opinions. The program has a total of 19 students. A questionnaire consisting of open-ended and multiple-choice questions was sent to these 19 students, and 12 of them answered it. The data are presented as frequencies and percentages for each item; SPSS was used to analyze the multiple-choice questions and NVivo the open-ended questions. According to the results, participants stated that they did not receive any training on virtual classes before the courses, but they emphasized that newly enrolled students should be educated about virtual classrooms. In addition, all participants mentioned that virtual classrooms contribute to their personal development and that they want to improve their skills by gaining more experience. The participants, who mainly emphasized the advantages of virtual classrooms, expressed that the dissemination of virtual classrooms will contribute to the Turkish education system. Among the advantages of virtual classrooms, 'recordable and repeatable lessons' and 'eliminating access and transportation costs' were the most commonly cited, while 'technological features and keyboard usage skills affect attendance' was the most commonly mentioned disadvantage. The participants' most obvious problem during virtual lectures is a 'lack of technical support'. Finally, 'ease of use', 'support possibilities', 'communication level' and 'flexibility' come to the forefront among the expected features of virtual classrooms. Overall, students' opinions about virtual classrooms seem to be generally positive. Designing and managing virtual classrooms according to the prioritized features will increase student satisfaction and contribute to more effective applications.
Keywords: distance education, virtual classrooms, higher education, e-learning
Procedia PDF Downloads 269
3149 Climate Change in Awash River Basin of Ethiopia: A Projection Study Using Global and Regional Climate Model Simulations
Authors: Mahtsente Tadese, Lalit Kumar, Richard Koech
Abstract:
The aim of this study was to project and analyze climate change in the Awash River Basin (ARB) using bias-corrected Global and Regional Climate Model simulations. The analysis included a baseline period from 1986-2005 and two future scenarios (the 2050s and 2070s) under two representative concentration pathways (RCP4.5 and RCP8.5). Bias correction methods were evaluated using graphical and statistical methods; following this evaluation, Distribution Mapping (DM) and Power Transformation (PT) were used for the temperature and precipitation projections, respectively. The 2050s and 2070s RCP4.5 simulations showed an increase in precipitation during half of the months, by 32% and 10%, respectively. In contrast, the 2050s and 2070s RCP8.5 simulations indicated a decrease in precipitation of 18% and 26%, respectively, with a significant decrease in four of the months (February/March to May) and a highest decreasing rate of 34.7%. The 2050s and 2070s RCP4.5 simulations showed an increase of 0.48-2.6 °C in maximum temperature; in the case of RCP8.5, the increase reached 3.4 °C and 4.1 °C in the 2050s and 2070s, respectively. The changes in precipitation and temperature might worsen water stress, flooding, and drought in the ARB, so critical focus should be given to mitigation strategies and management options to reduce the negative impact. The findings of this study provide valuable information on future precipitation and temperature change in the ARB, which will help in the planning and design of sustainable mitigation approaches in the basin.
Keywords: variability, climate change, Awash River Basin, precipitation
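The sketch below shows the general idea behind the distribution-mapping style of bias correction mentioned in the abstract: model values are corrected by mapping their empirical quantiles onto the observed distribution from the baseline period. This is a generic empirical quantile-mapping illustration with synthetic data, not the specific DM or PT implementation used in the study.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Map each future model value through the empirical CDFs of the baseline period:
    value -> quantile in the historical model distribution -> same quantile of the observations."""
    q = np.linspace(0.01, 0.99, 99)
    model_q = np.quantile(model_hist, q)
    obs_q = np.quantile(obs_hist, q)
    # locate each future value on the model CDF, then read off the observed quantile
    ranks = np.interp(model_future, model_q, q)
    return np.interp(ranks, q, obs_q)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    obs_hist = rng.gamma(2.0, 8.0, 7300)       # synthetic daily precipitation, baseline period (mm)
    model_hist = rng.gamma(2.0, 5.0, 7300)     # biased (too dry) model baseline
    model_future = rng.gamma(2.0, 5.5, 7300)   # raw model projection
    corrected = quantile_map(model_hist, obs_hist, model_future)
    print(f"raw mean {model_future.mean():.1f} mm, corrected mean {corrected.mean():.1f} mm")
```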
Procedia PDF Downloads 174
3148 Research and Implementation of Cross-domain Data Sharing System in Net-centric Environment
Authors: Xiaoqing Wang, Jianjian Zong, Li Li, Yanxing Zheng, Jinrong Tong, Mao Zhan
Abstract:
With the rapid development of network and communication technology, a great deal of data has been generated in the different domains of a network. These data show a trend of increasing scale and more complex structure. Therefore, an effective and flexible cross-domain data-sharing system is needed. The Cross-domain Data Sharing System (CDSS) in a net-centric environment is composed of three sub-systems. The data distribution sub-system provides a data exchange service through publish-subscribe technology that supports asynchronism and multi-to-multi communication, which adapts to the needs of dynamic and large-scale distributed computing environments. The access control sub-system adopts Attribute-Based Access Control (ABAC) technology to uniformly model various data attributes such as subject, object, permission and environment; it effectively monitors the activities of users accessing resources and ensures that legitimate users obtain effective access rights within a valid time period. The cross-domain access security negotiation sub-system automatically determines the access rights between different security domains in the process of interactive disclosure of digital certificates and access control policies, through trust policy management and negotiation algorithms, which provides an effective means for establishing cross-domain trust relationships and access control in a distributed environment. The CDSS's asynchronous, multi-to-multi and loosely coupled communication features can adapt well to data exchange and sharing in dynamic, distributed and large-scale network environments. Next, we will give the CDSS new features to support the mobile computing environment.
Keywords: data sharing, cross-domain, data exchange, publish-subscribe
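To illustrate the attribute-based access control idea used by the access control sub-system, the sketch below evaluates a request by matching subject, object and environment attributes against a small policy list. The attribute names, policy rules and time window are invented for the example and are not taken from the CDSS design.

```python
from datetime import datetime

# hypothetical ABAC policies: every attribute condition must hold for access to be granted
POLICIES = [
    {
        "subject": {"role": "analyst", "clearance": "secret"},
        "object": {"domain": "domain-B", "classification": "secret"},
        "environment": {"within_hours": (8, 18)},
        "permission": "read",
    },
]

def is_permitted(subject, obj, permission, now=None):
    now = now or datetime.now()
    for policy in POLICIES:
        if policy["permission"] != permission:
            continue
        if any(subject.get(k) != v for k, v in policy["subject"].items()):
            continue
        if any(obj.get(k) != v for k, v in policy["object"].items()):
            continue
        start, end = policy["environment"]["within_hours"]
        if not (start <= now.hour < end):
            continue
        return True  # all subject, object and environment conditions matched
    return False

subject = {"role": "analyst", "clearance": "secret", "home_domain": "domain-A"}
obj = {"domain": "domain-B", "classification": "secret"}
print(is_permitted(subject, obj, "read"))
```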
Procedia PDF Downloads 124
3147 Implications of Learning Resource Centre in a Web Environment
Authors: Darshana Lal, Sonu Rana
Abstract:
Learning Resource Centres (LRCs) acquire different kinds of documents, such as books, journals, theses, dissertations, standards and databases, in print and electronic form. This article deals with the different types of sources available in an LRC. It also discusses the concept of the web as a tool and as a multimedia system, and the different interfaces available on the web. The reasons for establishing an LRC are highlighted, along with its assignments. Different features of LRCs, such as self-learning and group learning, are described, together with the activities they support, such as reading, learning and education. The use of the LRC by students and faculty is presented, and the article concludes with its benefits.
Keywords: internet, search engine, resource centre, OPAC, self-learning, group learning
Procedia PDF Downloads 378
3146 Winning the “Culture War”: Greater Hungary and the American Confederacy as Sites of Nostalgia, Mythology, and Problem-Making for the Far Right in the US and Hungary
Authors: Grace Rademacher
Abstract:
This article compares the “Trianon Trauma” of the Kingdom of Hungary and the “Lost Cause” of the American Confederacy. Applying Nicole Maurantonio’s articulation of “confederate exceptionalism” and Svetlana Boym’s definition of “restorative nostalgia”, this article argues that, via memorialization and public discourse, both far-right bodies flood their constituencies with narratives of nostalgia and martyrdom to sow existential anxieties about past and prophetic victimhood, all under the guise of protecting or restoring heritage. Linking this practice to gamification and conspiracy theorizing and following the work of Patrick Jagoda, this article identifies such industries of nostalgia as means by which the far right in both nations can partake in the “immanent and improvisational process of problem making.” Reified through monuments and references to the Trianon Trauma and the American Confederacy, political actors “problem make” by alleging that they are victims of the West or the Left, subject to the cruel whims of liberalism and the denial of historical legitimacy. In both nations, relying on their victimhood, pundits and politicians can appeal to white supremacists and distract citizens from legitimate active conflicts, such as wars or democratic rollbacks, redirecting them to fictional, mythical attacks on Hungarian or American society and civilization. This article examines memorials and monuments as “lieux de mémoire” and identifies the purposeful similarities between the discourse of public figures and politicians such as María Schmidt, János Lázár, and Viktor Orbán and that of Donald Trump and pundits such as Tucker Carlson.
Keywords: nationalism, political memory, white supremacy, Trianon
Procedia PDF Downloads 76
3145 Survey of Web Service Composition
Authors: Wala Ben Messaoud, Khaled Ghedira, Youssef Ben Halima, Henda Ben Ghezala
Abstract:
A web service (WS) is called compound or composite when its execution involves interactions with other web services in order to use their features. WS composition specifies which services need to be invoked, in what order, and how exception conditions are handled. This paper gives an overview of research efforts on WS composition. The approaches proposed in the literature are diverse and interesting and have opened important research areas. Based on many studies, we extract the most important roles of WS composition in order to facilitate its adoption within the web service concept.
Keywords: SOA, web services, composition approach, composite WS
Procedia PDF Downloads 308
3144 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas
Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards
Abstract:
Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, spatial analysis, etc. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data consist mainly of ground points (which represent the bare earth) and non-ground points (which represent buildings, trees, cars, etc.). Removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data, and the presented filter utilizes a weight function to allocate a weight to each point of the data. Furthermore, unlike most existing methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) value is 0.35 m.
Keywords: airborne laser scanning, digital terrain models, filtering, forested areas
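The following sketch illustrates the weighted smoothing-spline idea described in the abstract on a simplified 1D profile: a spline is fitted to the point cloud, points lying well above the fitted surface (likely vegetation) are down-weighted, and the fit is repeated. It is a toy illustration under assumed thresholds and synthetic data, not the authors' filtering algorithm.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def ground_filter_1d(x, z, n_iter=5, above_tol=0.3, smoothing=50.0):
    """Iteratively re-weighted spline fit: points far above the current surface get low weight."""
    order = np.argsort(x)
    x, z = x[order], z[order]
    w = np.ones_like(z)
    for _ in range(n_iter):
        spline = UnivariateSpline(x, z, w=w, s=smoothing)
        residual = z - spline(x)
        # likely canopy/building returns sit well above the surface: suppress them
        w = np.where(residual > above_tol, 0.01, 1.0)
    return spline, w > 0.5  # fitted terrain surface and a mask of accepted ground points

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    x = rng.uniform(0, 100, 500)
    terrain = 0.05 * x + np.sin(x / 10.0)             # synthetic bare-earth profile
    canopy = rng.random(500) < 0.4                     # 40% of returns hit vegetation
    z = terrain + rng.normal(0, 0.05, 500) + canopy * rng.uniform(2, 15, 500)
    spline, ground_mask = ground_filter_1d(x, z)
    xs = np.sort(x)
    rmse = np.sqrt(np.mean((spline(xs) - (0.05 * xs + np.sin(xs / 10.0))) ** 2))
    print(f"ground points kept: {ground_mask.sum()}, surface RMSE: {rmse:.2f} m")
```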
Procedia PDF Downloads 139
3143 Evaluation of the Beach Erosion Process in Varadero, Matanzas, Cuba: Effects of Different Hurricane Trajectories
Authors: Ana Gabriela Diaz, Luis Fermín Córdova, Jr., Roberto Lamazares
Abstract:
The island of Cuba, the largest of the Greater Antilles, is located in the tropical North Atlantic. It is affected annually by numerous weather events, which have caused severe damage to its coastal areas. Like many other coastlines around the world, the beautiful beaches of the Hicacos Peninsula also suffer from erosion, which leads to a structural regression of the coastline. If measures are not taken, the hotels will be exposed to the advance of the sea, and this will be a serious problem for the economy. With the aim of studying the intensity of this type of process, specialists from the coastal and marine engineering group of CIH, within the framework of the research conducted in the MEGACOSTAS 2 project, contribute their work to simulate extreme events and assess their impact on coastal areas, mainly regarding the definition of flood volumes and morphodynamic changes on sandy beaches. The main objective of this work is the evaluation of the beach erosion process in Varadero (a coastal sector with an important impact on the country's economy) on the Hicacos Peninsula for different hurricane trajectories. The mathematical model XBeach, integrated into the coastal engineering system introduced by the MEGACOSTAS 2 project, was applied to determine the area and the most critical profiles for the hurricane trajectories under study. The results have shown that the central area is the most dynamic area in the simulation of the three hurricane trajectories under study, showing high erosion volumes and the greatest average regression of the coastline, ranging from 15 to 22 m.
Keywords: beach, erosion, mathematical model, coastal areas
Procedia PDF Downloads 230
3142 Modeling Sediment Transports under Extreme Storm Situation along Persian Gulf North Coast
Authors: Majid Samiee Zenoozian
Abstract:
The Persian Gulf is a marginal sea with an average depth of 35 m and a maximum depth of 100 m near its narrow entrance. Its elongated bathymetric axis separates two main geological provinces, the stable Arabian Foreland and the unstable Iranian Fold Belt, which are reflected in the contrasting coastal and bathymetric morphologies of Arabia and Iran. Sediments were sampled at 72 offshore stations during an oceanographic cruise in the winter of 2018. Throughout the observation period, several storms and river discharge events occurred, as well as the largest flood on record since 1982. We used hydrological models to evaluate and compare the wave height and inundation distance required to transport the boulders inland. Our results establish that no known or plausible storm occurring on the Makran coast is capable of detaching and transporting the boulders. The fluid mud is consequently conveyed seaward by gravitational forcing, and the measured sediment concentration and velocity profiles on the shelf provide strong evidence supporting this assumption. The sediment model is coupled with a 3D hydrodynamic module in the Environmental Fluid Dynamics Code (EFDC) model, which provides data on estuarine circulation and salinity transport under normal temperature conditions. The 3D sediment transport results from the model simulations indicate dynamic sediment resuspension and transport near zones of highly productive oyster beds.
Keywords: sediment transport, storm, coast, fluid dynamics
Procedia PDF Downloads 115
3141 Traumatic Brain Injury Induced Lipid Profiling of Lipids in Mice Serum Using UHPLC-Q-TOF-MS
Authors: Seema Dhariwal, Kiran Maan, Ruchi Baghel, Apoorva Sharma, Poonam Rana
Abstract:
Introduction: Traumatic brain injury (TBI) is defined as a temporary or permanent alteration in brain function and pathology caused by an external mechanical force, and it represents a leading cause of mortality and morbidity among children and young adults. Various models of TBI in rodents have been developed in the laboratory to mimic the scenario of injury. Blast overpressure injury, following accidents or explosive devices, is common among civilians and military personnel. In addition, the lateral controlled cortical impact (CCI) model mimics blunt, penetrating injury. Method: In the present study, we developed two different mild TBI models using blast and CCI injury. In the blast model, helium gas was used to create an overpressure of 130 kPa (±5) via a shock tube, and CCI injury was induced with an impact depth of 1.5 mm, creating diffuse and focal injury, respectively. C57BL/6J male mice (10-12 weeks) were divided into three groups, (1) control, (2) blast-treated, and (3) CCI-treated, and were exposed to the corresponding injury models. Serum was collected on day 1 and day 7, followed by biphasic extraction using MTBE/methanol/water. Prepared samples were separated on a Charged Surface Hybrid (CSH) C18 column and acquired on a UHPLC-Q-TOF-MS using an ESI probe with an in-house optimized method and parameters. The MS peak list was generated using MarkerView™. Data were normalized, Pareto-scaled, and log-transformed, followed by multivariate and univariate analysis in MetaboAnalyst. Results and discussion: Untargeted profiling of lipids generated extensive data features, which were annotated through LIPID MAPS® based on their m/z and further confirmed from their fragmentation patterns by LipidBlast. In total, 269 features were annotated in the positive and 182 features in the negative mode of ionization. PCA and PLS-DA score plots showed clear segregation of the injury groups from the controls. Among the various lipids in mild blast and CCI, five lipids (the glycerophospholipids PC 30:2, PE O-33:3, PG 28:3;O3 and PS 36:1, and the fatty acyl FA 21:3;O2) were significantly altered in both injury groups at day 1 and day 7 and also had VIP scores >1. Pathway analysis by BioPAN also showed hampered synthesis of glycerolipids and glycerophospholipids, which coincides with earlier reports; this could be a direct result of alteration in the acetylcholine signaling pathway in response to TBI. Understanding the metabolism, regulation and transport of specific lipid classes could be beneficial to TBI research, since it could provide new targets and help determine the best therapeutic intervention. This study demonstrates potential lipid biomarkers which can be used for injury severity diagnosis and injury identification irrespective of injury type (diffuse or focal).
Keywords: LipidBlast, lipidomic biomarker, LIPID MAPS®, TBI
Procedia PDF Downloads 113
3140 Simulation of Optimal Runoff Hydrograph Using Ensemble of Radar Rainfall and Blending of Runoffs Model
Authors: Myungjin Lee, Daegun Han, Jongsung Kim, Soojun Kim, Hung Soo Kim
Abstract:
Recently, localized heavy rainfall and typhoons have occurred more frequently due to climate change, and the resulting damage is increasing. Therefore, more accurate prediction of rainfall and runoff is needed. However, gauge rainfall has limited spatial accuracy. Radar rainfall explains the spatial variability of rainfall better than gauge rainfall, but it is usually underestimated and involves uncertainty. Therefore, an ensemble of radar rainfall was simulated using an error structure to overcome this uncertainty relative to gauge rainfall. The simulated ensemble was used as input data for rainfall-runoff models to obtain an ensemble of runoff hydrographs. Previous studies have discussed the accuracy of rainfall-runoff models: even if the same input data, such as rainfall, are used for runoff analysis in the same basin, the models can produce different results because of the uncertainty involved in the models. Therefore, we used two models, the SSARR model, which is a lumped model, and the Vflo model, which is a distributed model, and tried to simulate the optimum runoff considering the uncertainty of each rainfall-runoff model. The study basin is located in the Han River basin, and one integrated, optimum runoff hydrograph was obtained using blending methods such as the Multi-Model Super Ensemble (MMSE), Simple Model Average (SMA), and Mean Square Error (MSE) weighting. From this study, we could confirm the accuracy of the rainfall and rainfall-runoff models using ensemble scenarios and various rainfall-runoff models, and this result can be used to study flood control measures under climate change. Acknowledgements: This work is supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 18AWMP-B083066-05).
Keywords: radar rainfall ensemble, rainfall-runoff models, blending method, optimum runoff hydrograph
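The sketch below illustrates two of the blending ideas named in the abstract, a simple model average and an inverse-MSE weighted blend, applied to two simulated hydrographs and an observed one. The synthetic hydrographs and the inverse-MSE weighting rule are assumptions for illustration; they are not the study's data or its exact MMSE formulation.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(48)                                             # hourly time steps
observed = 50 + 400 * np.exp(-0.5 * ((t - 20) / 5.0) ** 2)    # synthetic observed hydrograph (m3/s)
lumped = observed * 0.9 + rng.normal(0, 15, t.size)           # e.g. a lumped-model simulation
distributed = observed * 1.1 + rng.normal(0, 25, t.size)      # e.g. a distributed-model simulation

# Simple Model Average (SMA): equal weights
sma = (lumped + distributed) / 2.0

# MSE-based blend: weight each model by the inverse of its mean square error
mse = np.array([np.mean((m - observed) ** 2) for m in (lumped, distributed)])
weights = (1.0 / mse) / np.sum(1.0 / mse)
blended = weights[0] * lumped + weights[1] * distributed

for name, sim in [("lumped", lumped), ("distributed", distributed), ("SMA", sma), ("MSE blend", blended)]:
    rmse = np.sqrt(np.mean((sim - observed) ** 2))
    print(f"{name:12s} RMSE = {rmse:6.1f} m3/s")
```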
Procedia PDF Downloads 280
3139 Multimodal Rhetoric in the Wildlife Documentary, “My Octopus Teacher”
Authors: Visvaganthie Moodley
Abstract:
While rhetoric goes back as far as Aristotle, who focalised its meaning as the “art of persuasion”, most scholars have focused on the elocutio and dispositio canons, neglecting the rhetorical impact of multimodal texts such as documentaries. Film documentaries are becoming increasingly rhetorical and are often used by wildlife conservationists for influencing people to become more mindful about humanity’s connection with nature. This paper examines the award-winning film documentary, “My Octopus Teacher”, which depicts naturalist Craig Foster’s unique discovery of and relationship with a female octopus at the southern tip of Africa, the Cape of Storms in South Africa. It is anchored in Leech and Short’s (2007) framework of linguistic and stylistic categories – comprising lexical items, grammatical features, figures of speech and other rhetorical features, and cohesiveness – with particular foci on diction, anthropomorphic language, metaphors and symbolism. It also draws on Kress and van Leeuwen’s (2006) multimodal analysis to show how verbal cues (the narrator’s commentary), visual images in motion, visual images as metaphors and symbolism, and aural sensory images such as music and sound synergise for rhetorical effect. In addition, the analysis of “My Octopus Teacher” is guided by Nichols’ (2010) narrative theory; the features of a documentary, which foreground the credibility of the narrative as a text that represents real events with real people; and its modes of construction, viz., the poetic mode, the expository mode, the observational mode and the participatory mode, and their integration – forging documentaries as multimodal texts. This paper presents a multimodal rhetoric discussion of the sequence of salient episodes captured in the slow-moving one-and-a-half-hour documentary. These are: (i) The prologue: on the brink of something extraordinary; (ii) The day it all started; (iii) The narrator’s turmoil: getting back into the ocean; (iv) The incredible encounter with the octopus; (v) Establishing a relationship; (vi) Outwitting the predatory pyjama shark; (vii) The cycle of life; and (viii) The conclusion: lessons from an octopus. The paper argues that wildlife documentaries, which are characterized by plausibility and provide researchers with a lens to examine ideologies about animals and humans, offer an assimilation of the various senses (vocal, visual and aural) for engaging viewers in a stylized, compelling way; they have the ability to persuade people to think and act in particular ways. As multimodal texts, with their use of lexical items; diction; anthropomorphic language; linguistic, visual and aural metaphors and symbolism; and depictions of anthropocentrism, wildlife documentaries are powerful resources for promoting wildlife conservation and conscientizing people of the need to establish a harmonious relationship with nature and humans alike.
Keywords: documentaries, multimodality, rhetoric, style, wildlife, conservation
Procedia PDF Downloads 94
3138 AI-Enhanced Self-Regulated Learning: Proposing a Comprehensive Model with 'Studium' to Meet a Student-Centric Perspective
Authors: Smita Singh
Abstract:
Objective: The Faculty of Chemistry Education at Humboldt University has developed ‘Studium’, a web application designed to enhance long-term self-regulated learning (SRL) and academic achievement. Leveraging advanced generative AI, ‘Studium’ offers a dynamic and adaptive educational experience tailored to individual learning preferences and languages. The application includes evolving tools for personalized notetaking from preferred sources, customizable presentation capabilities, and AI-assisted guidance from academic documents or textbooks. It also features workflow automation and seamless integration with collaborative platforms like Miro, powered by AI. This study aims to propose a model that combines generative AI with traditional features and customization options, empowering students to create personalized learning environments that effectively address the challenges of SRL. Method: The study included graduate and undergraduate students from diverse subject streams, with 15 participants each from Germany and India, ensuring a diverse educational background. An exploratory design was employed using a speed-dating method with enactment, in which different scenario sessions were created to allow participants to experience various features of ‘Studium’. Each session lasted 50 minutes, providing an in-depth exploration of the platform's capabilities. Participants interacted with Studium’s features via Zoom conferencing and then took part in semi-structured interviews lasting 10-15 minutes to gain deeper insights into the effectiveness of ‘Studium’. Additionally, online questionnaire surveys were conducted before and after the session to gather feedback and evaluate satisfaction with self-regulated learning after using ‘Studium’; the response rate of this survey was 100%. Results: The findings indicate that students widely acknowledged the positive impact of ‘Studium’ on their learning experience, particularly its adaptability and intuitive design, and they expressed a desire for more tools like ‘Studium’ to support self-regulated learning in the future. The application significantly fostered students' independence in organizing information and planning study workflows, which in turn enhanced their confidence in mastering complex concepts. Additionally, ‘Studium’ promoted strategic decision-making and helped students overcome various learning challenges, reinforcing their self-regulation, organization, and motivation skills. Conclusion: The proposed model emphasizes the need for effective integration of personalized AI tools into active learning and SRL environments. By addressing key research questions, our framework aims to demonstrate how AI-assisted platforms like ‘Studium’ can facilitate deeper understanding, maintain student motivation, and support the achievement of academic goals. Thus, our ideal model for AI-assisted educational platforms provides a strategic approach to enhance students' learning experiences and promote their development as self-regulated learners.
Keywords: self-regulated learning (SRL), generative AI, AI-assisted educational platforms
Procedia PDF Downloads 29