Search results for: well data integration
22228 A Review of Material and Methods Used in Liner Layers in Various Landfills
Authors: S. Taghvamanesh
Abstract:
Modern landfills are highly engineered containment systems that are designed to reduce the environmental and human health impacts of solid waste (trash). In modern landfills, waste is contained by a liner system. The primary goal of the liner system is to isolate the landfill contents from the environment, thereby protecting the soil and groundwater from pollution caused by landfill leachate. Landfill leachate is the most serious threat to groundwater, and it is therefore necessary to design a system that prevents the penetration of this dangerous substance into the environment. These layers are made up of two basic elements: clay and geosynthetics. Low hydraulic conductivity and flexibility are two desirable properties of these materials. There are three different types of liner systems, which will be discussed in this paper. Based on the available data, the current article analyses the materials and methods used to construct liner layers at landfills from around the world that produce distinct leachates containing various harmful components and heavy metals. This study also attempted to gather data on the leachate of each of the sites discussed. In conclusion, every landfill requires a specific type of liner, which depends on the type of leachate it produces daily. It should also be emphasized that, based on the available data, this article considered the number of landfills that each country or continent possesses.
Keywords: landfill, liner layer, impervious layer, barrier layer
Procedia PDF Downloads 77
22227 The Economics of Ecosystem Services and Biodiversity: Valuing Ecotourism-Local Perspectives to Global Discourses-Stakeholders’ Analysis
Authors: Diptimayee Nayak
Abstract:
Ecotourism has been recognised as a popular component of alternative tourism, which claims to safeguard the host environment and local economy. The concept of ecological tourism (eco-tourism) has become more meaningful in evaluating the recreational function and services of any pristine ecosystem in the context of 'The Economics of Ecosystems and Biodiversity (TEEB)'. Ecotourism is said to be a local solution to the global problem of conserving ecosystems and optimising the use of their services. This paper takes the case of the recreational services of an Indian protected-area ecosystem, the Bhitarakanika mangrove protected area, and discusses how ecotourism functions from the perspectives of different stakeholders. Specific stakeholders are taken for analysis, viz., tourists and local people, as they are believed to be the major beneficiaries of ecotourism. The stakeholder analysis is based on the travel cost technique (using a truncated Poisson model) for tourists and on descriptive and analytical tools for local people. The evaluation of stakeholder perspectives on ecotourism has gained impetus since the formulation of the Ecotourism Guidelines by the Ministry of Environment and Forest (MoEF), Government of India. The paper concludes that ecotourism issues and challenges are site-specific and region-specific; without critically focusing on the challenges ecotourism faces at the local level, the global discourses on ecotourism cannot be addressed. Merely integrating and replicating global-level policies for local implementation (top-down policies) will not be successful. Rather, mainstreaming local decision-making within the global policy architecture helps to solve global issues to a greater extent (bottom-up).
Keywords: ecosystem services, ecotourism, TEEB, economic valuation, stakeholders, travel cost techniques
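For readers unfamiliar with the travel cost step mentioned above, the sketch below illustrates, in general terms, how trip counts from an on-site visitor survey can be fitted with a zero-truncated Poisson model and converted into a consumer surplus estimate. The data, covariates and coefficients are entirely hypothetical and are not taken from the Bhitarakanika study; the code only demonstrates the generic technique.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

# Hypothetical survey data: trips per visitor, travel cost, income (illustrative only).
rng = np.random.default_rng(0)
n = 200
travel_cost = rng.uniform(5, 120, n)            # monetary units per trip
income = rng.uniform(10, 80, n)                 # thousands per year
lam_true = np.exp(1.2 - 0.02 * travel_cost + 0.01 * income)
trips = rng.poisson(lam_true)
mask = trips > 0                                # on-site samples only observe visitors (y >= 1)
y = trips[mask]
X = np.column_stack([np.ones(mask.sum()), travel_cost[mask], income[mask]])

def negloglik(beta):
    """Zero-truncated Poisson: P(y | y > 0) = exp(-lam) lam^y / (y! (1 - exp(-lam)))."""
    lam = np.exp(X @ beta)
    ll = -lam + y * np.log(lam) - gammaln(y + 1) - np.log1p(-np.exp(-lam))
    return -ll.sum()

res = minimize(negloglik, x0=np.zeros(X.shape[1]), method="BFGS")
beta = res.x
# In the standard travel cost model, consumer surplus per trip = -1 / beta_travel_cost.
print("coefficients:", beta)
print("consumer surplus per trip:", -1.0 / beta[1])
```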
Procedia PDF Downloads 249
22226 The Event of Extreme Precipitation Occurred in the Metropolitan Mesoregion of the Capital of Para
Authors: Natasha Correa Vitória Bandeira, Lais Cordeiro Soares, Claudineia Brazil, Luciane Teresa Salvi
Abstract:
The intense rain event that occurred between February 16 and 18, 2018, in the city of Barcarena in Pará, located in the North region of Brazil, demonstrates the importance of analyzing this type of event. The metropolitan mesoregion of Belém was severely affected by rainfall well above the averages normally expected for that time of year; the phenomenon affected, in addition to the capital, the municipalities of Barcarena, Murucupi and Muruçambá, resulting in major flooding of the rivers of the region, whose basins received precipitation of great intensity. This caused concern for the local population because companies that accumulate ore tailings are located in the region and, in this specific case, a dam belonging to one of these companies leached ore into the water bodies of the Murucupi River Basin. This article aims to characterize the phenomenon through a spatial analysis of the distribution of rainfall, using data from atmospheric soundings, satellite images, radar images and the GPCP (Global Precipitation Climatology Project), in addition to rainfall stations located in the study region. The results of the work demonstrate a dissociation between the data measured at the meteorological stations and the other forms of analysis of this extreme event. Monitoring carried out solely on the basis of data from pluviometric stations is not sufficient for monitoring and/or diagnosing extreme weather events, and investment by the competent bodies is important to install a larger network of pluviometric stations sufficient to meet the demand in a given region.
Keywords: extreme precipitation, great flood, GPCP, ore dam
Procedia PDF Downloads 108
22225 Comparative Operating Speed and Speed Differential Day and Night Time Models for Two Lane Rural Highways
Authors: Vinayak Malaghan, Digvijay Pawar
Abstract:
Speed is the independent parameter which plays a vital role in highway design. The design consistency of highways is checked based on the variation in operating speed. Often the design consistency fails to meet drivers' expectations, which results in a difference between the operating and design speeds. Literature reviews have shown that a significant number of crashes take place on horizontal curves due to a lack of design consistency. The paper focuses on a continuous speed profile study of the tangent-to-curve transition for both daytime and nighttime. Data were collected using a GPS device, which gives a continuous speed profile; other parameters such as acceleration and deceleration were analyzed along with the tangent-to-curve transition. In the present study, models were developed to predict operating speed on tangents and horizontal curves, as well as a model indicating the speed reduction from tangent to curve based on continuous speed profile data. It is observed from the study that vehicles tend to decelerate from the approach tangent to a point between the beginning and the midpoint of the curve, and then accelerate in the curve-to-tangent transition. The models generated were compared for day and night and can be used in road safety improvement by evaluating geometric design consistency.
Keywords: operating speed, design consistency, continuous speed profile data, day and night time
Procedia PDF Downloads 157
22224 The Importance of Awareness and Appropriate Management in Inclusive Education in India
Authors: Lusia Ndahafa Nghitotelwa
Abstract:
India is home to many languages, cultures, traditions, castes and religions. This diversity, when observed in education, appears challenging and difficult to manage with respect to including everyone in the educational system. In order to achieve this, attempts to understand the complexity of the issue and to find solutions for including everyone in education, regardless of the students' background, have been made in India since independence. Despite that, the challenge is still topical. Many students are left out of the system due to the lack of awareness and appropriate management of these diversities. Therefore, the present paper attempts to study the awareness and management of diversity in Indian schools. Existing studies on diversity in Indian schools, along with which measures have been taken, and how, to accommodate and retain everyone in school, have been reviewed, and a thorough critical analysis of the findings is narrated. It was found that considerable efforts have been made to include and educate children of all castes, religions, and linguistic backgrounds. Furthermore, the awareness of inclusive education among teachers and society members is moderate, but teachers lack the necessary skills and knowledge to deal with students with special educational needs in regular classes. Also, the management is aware of inclusive education but does not include teachers in decision-making. Moreover, it was found that the poor management of inclusion services and of the retention of special-needs students in Indian schools results in their poor integration into the workforce. Finally, the management was found to have stringent admission criteria, which has the effect of hindering some students from entering the educational system. Based on the results of the study, it is clear that the implementation of inclusive education is still a challenge in India. However, there are promising results in tackling the issue. All children should be given an opportunity to learn together with other children in order to broaden their interests and challenge their potential.
Keywords: awareness, management, inclusive education, students
Procedia PDF Downloads 230
22223 Improvement of Analysis Vertical Oil Exploration Wells (Case Study)
Authors: Azza Hashim Abbas, Wan Rosli Wan Suliman
Abstract:
In the old school of well-test studies, reservoir engineers used transient pressure analysis to obtain certain parameters and variable factors describing the reservoir's physical properties, such as the permeability-thickness product. Recently, the difficulty facing newly discovered areas is the convincing fact that the exploration and production (E&P) team should have sufficiently accurate and appropriate data to work with, given the different sources of error. The well-test analyst often does the work without well-informed and reliable data from colleagues, which may consequently cause immense environmental damage and unnecessary financial losses, as well as opportunity losses to the project. In 2003, in the newly discovered potential oil field (Moga), well-22, which faced a circulation problem, was safely completed. However, the high mud density caused extensive damage to the near-well area, which also distorted the hypothetical oil flow rate, so that it was not representative of the real reservoir characteristics. This paper presents methods to analyze and interpret the production rate and pressure data of an oil field, specifically for well-22, using the deconvolution technique to enhance the transient pressure analysis. Deconvolution was applied to obtain the best range of certainty in the results needed for the next operation. The range determined and the analysis of the skin factor range were reasonable.
Keywords: well testing, exploration, deconvolution, skin factor, uncertainty
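As background to the deconvolution step referred to above, the sketch below shows, on purely synthetic data, how a constant-unit-rate pressure response can be recovered from a variable-rate drawdown by regularized least squares applied to a simplified discrete superposition (convolution) equation. It is a minimal illustration of the general idea, not the algorithm or data used for well-22; the rate history, response shape and regularization weight are assumptions.

```python
import numpy as np

# Synthetic example (not field data): recover the unit-rate pressure response g(t)
# from a variable-rate drawdown via discrete convolution  dp[k] = sum_j q[k-j] * g[j] * dt.
dt, nt = 0.5, 120                                     # hours, number of time steps
t = dt * np.arange(1, nt + 1)
g_true = 8.0 * np.log(t + 1.0) / (t + 1.0) ** 0.05    # assumed "true" unit-rate response
q = np.where(t < 30, 150.0, 90.0)                     # rate history (two-rate test)

# Superposition matrix: Q[k, j] = q[k - j] * dt for k >= j (lower triangular).
Q = np.array([[q[k - j] * dt if k >= j else 0.0 for j in range(nt)] for k in range(nt)])
dp_meas = Q @ g_true + np.random.default_rng(1).normal(0, 2.0, nt)   # noisy pressure drop

# Tikhonov-regularized deconvolution: min ||Q g - dp||^2 + lam * ||D g||^2
D = np.diff(np.eye(nt), axis=0)                       # first-difference operator (smoothness)
lam = 50.0
A = np.vstack([Q, np.sqrt(lam) * D])
b = np.concatenate([dp_meas, np.zeros(nt - 1)])
g_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print("relative error of recovered response:",
      np.linalg.norm(g_est - g_true) / np.linalg.norm(g_true))
```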
Procedia PDF Downloads 445
22222 Investigation of Engineers' and Student Engineers' University Choices Effect over Professional Expectations
Authors: Alev Erenler, Yeliz Yazici
Abstract:
It is undeniable that developments in technology have been increasing the importance of engineering day by day, along with interest in the profession. As in any other field, success in an engineering career is directly related to the degree of satisfaction with the profession. Satisfaction is an important factor for both psychological health and efficiency. In this context, engineers at all stages, including students from different grades, people working in related professions and candidates for engineering, have been included in order to define the expectations of the profession and the levels of professional satisfaction. Within the scope of the study, factors such as the university graduated from, the university currently attended, the grades of the participants, the reasons behind choosing the university, the order of the choices and demographic values are examined. It is thought that these factors have a meaningful effect on professional expectations. Comparable participants from working life are also included, and their data are compared with those of the engineering candidates in terms of the differentiation of expectations. The related data will be gathered with the help of a scale prepared and developed by the researchers specifically for this study, titled 'The Professional Expectation Scale for Engineers'. The data will be analyzed in SPSS, and the results will be interpreted in relation to the literature.
Keywords: engineering education, engineers' professional expectations, engineering students' professional expectations, students' university choices
Procedia PDF Downloads 336
22221 The DC Behavioural Electrothermal Model of Silicon Carbide Power MOSFETs under SPICE
Authors: Lakrim Abderrazak, Tahri Driss
Abstract:
This paper presents a new behavioural electrothermal model of the power Silicon Carbide (SiC) MOSFET under SPICE. The model is based on the SPICE level-1 MOS model, in which phenomena such as the drain leakage current IDSS, the on-state resistance RDSon, the gate threshold voltage VGSth, the transconductance (gfs), the I-V characteristics of the body diode, temperature dependence and self-heating are included and represented using behavioural ABM (Analog Behavioural Models) blocks from the SPICE library. This ultimately makes the model flexible, so that it can easily be integrated into the various SPICE-based simulation packages. The internal junction temperature of the component is calculated from the thermal model, using the electric power dissipated inside the device and its thermal impedance expressed as a lumped Foster canonical network. The model parameters are extracted from manufacturers' data (datasheet curves) using polynomial interpolation together with simulated annealing (SA) and weighted least squares (WLS). The model takes into account the most important phenomena within the transistor. The effectiveness of the presented model has been verified by SPICE simulation results as well as by measured data for the SiC MOS transistor C2M0025120D from CREE (1200 V, 90 A).
Keywords: SiC power MOSFET, DC electro-thermal model, ABM Spice library, SPICE modelling, behavioural model, C2M0025120D CREE
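To make the thermal side of such a model concrete, the sketch below fits a three-cell Foster network to a handful of transient thermal impedance points by weighted least squares, using simulated annealing as the global optimizer, in the spirit of the SA/WLS extraction mentioned above. The data points, cell count and parameter bounds are hypothetical placeholders, not values taken from the C2M0025120D datasheet.

```python
import numpy as np
from scipy.optimize import dual_annealing

# Transient thermal impedance of a 3-cell Foster network:
#   Zth(t) = sum_i R_i * (1 - exp(-t / tau_i))
def zth_foster(t, params):
    r, tau = params[:3], params[3:]
    return sum(ri * (1.0 - np.exp(-t / ti)) for ri, ti in zip(r, tau))

# Hypothetical datasheet points (time in s, Zth in K/W); replace with digitized curve data.
t_data = np.array([1e-4, 1e-3, 1e-2, 1e-1, 1.0, 10.0])
z_data = np.array([0.004, 0.02, 0.08, 0.20, 0.33, 0.36])
w = 1.0 / z_data                        # weighted least squares: weight small-Zth points more

def wls_cost(params):
    return np.sum(w * (zth_foster(t_data, params) - z_data) ** 2)

bounds = [(1e-4, 1.0)] * 3 + [(1e-5, 50.0)] * 3     # R_i in K/W, tau_i in s
result = dual_annealing(wls_cost, bounds, seed=42)
print("fitted R_i:", result.x[:3], "tau_i:", result.x[3:])

# The fitted pairs map onto the R/C values of the Foster ABM block (C_i = tau_i / R_i)
# used to feed the computed junction temperature back into the electrical model.
```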
Procedia PDF Downloads 581
22220 Arousal, Encoding, and Intrusive Memories
Authors: Hannah Gutmann, Rick Richardson, Richard Bryant
Abstract:
Intrusive memories following a traumatic event are not uncommon. In some individuals, however, these memories become maladaptive and lead to prolonged stress reactions. A seminal model of PTSD holds that aberrant processing during trauma may lead to prolonged stress reactions and intrusive memories: elevated arousal at the time of the trauma promotes data-driven processing, leading to fragmented and intrusive memories. This study investigated the role of elevated arousal in the development of intrusive memories. We measured salivary markers of arousal and investigated what impact this had on data-driven processing, memory fragmentation, and, subsequently, the development of intrusive memories. We assessed 100 healthy participants to understand their processing style, arousal, and experience of intrusive memories. Participants were randomised to a control or experimental condition, the latter of which was designed to increase their arousal. Based on current theory, participants in the experimental condition were expected to engage in more data-driven processing and experience more intrusive memories than participants in the control condition. This research aims to shed light on the mechanisms underlying the development of intrusive memories, to illustrate ways in which therapeutic approaches for PTSD may be augmented for greater efficacy.
Keywords: stress, cortisol, SAA, PTSD, intrusive memories
Procedia PDF Downloads 197
22219 Efficacy Study of Post-Tensioned I Girder Made of Ultra-High Performance Fiber Reinforced Concrete and Ordinary Concrete for IRC Loading
Authors: Ayush Satija, Ritu Raj
Abstract:
Escalating demand for elevated structures as a remedy for traffic congestion has led to a surge in the construction of viaducts and bridges, predominantly employing prestressed beams. Post-tensioned I-girder superstructures are gaining traction for attributes such as structural efficiency, cost-effectiveness, and ease of construction. Recently, ultra-high-performance fiber-reinforced concrete (UHPFRC) has emerged as a revolutionary material reshaping conventional infrastructure engineering. UHPFRC offers exceptional properties, including high compressive and tensile strength alongside enhanced durability. Its adoption in bridges yields benefits, notably a remarkable strength-to-weight ratio that enables the design of lighter and more slender structural elements, enhancing functionality and sustainability. Despite its myriad advantages, the integration of UHPFRC in construction is still evolving, hindered by factors such as cost, material availability, and design standardization. Consequently, there is a need to assess the feasibility of substituting ordinary concrete (OC) with UHPFRC in bridges, focusing on economic considerations. This research undertakes an efficacy study comparing post-tensioned I-girders fabricated from UHPFRC and OC, evaluating the cost parameters associated with concrete production, reinforcement, and erection. The study reveals that UHPFRC becomes economically viable for spans exceeding 40.0 m. This shift in cost-effectiveness is attributed to factors such as reduced girder depth, elimination of un-tensioned steel, a diminished need for shear reinforcement and decreased erection costs.
Keywords: post tensioned I girder, superstructure, ultra-high-performance fiber reinforced concrete, ordinary concrete
Procedia PDF Downloads 40
22218 Transparency of Algorithmic Decision-Making: Limits Posed by Intellectual Property Rights
Authors: Olga Kokoulina
Abstract:
Today, algorithms are assuming a leading role in various areas of decision-making. Prompted by a promise to provide increased economic efficiency and fuel solutions for pressing societal challenges, algorithmic decision-making is often celebrated as an impartial and constructive substitute for human adjudication. But in the face of this implied objectivity and efficiency, the application of algorithms is also marred with mounting concerns about embedded biases, discrimination, and exclusion. In Europe, vigorous debates on the risks and adverse implications of algorithmic decision-making largely revolve around the potential of data protection laws to tackle some of the related issues. For example, one of the often-cited avenues to mitigate the impact of potentially unfair decision-making practices is the so-called 'right to explanation'. In essence, the overall right is derived from the provisions of the General Data Protection Regulation ('GDPR') ensuring the right of data subjects to access and mandating the obligation of data controllers to provide the relevant information about the existence of automated decision-making and meaningful information about the logic involved. Taking the corresponding rights and obligations in the context of the specific provision on automated decision-making in the GDPR, the debates mainly focus on the efficacy and the exact scope of the 'right to explanation'. In essence, the underlying logic of the argued remedy lies in a transparency imperative: allowing data subjects to acquire as much knowledge as possible about the decision-making process means empowering individuals to take control of their data and take action. In other words, forewarned is forearmed. The related discussions and debates are ongoing, comprehensive, and, often, heated. However, they are also frequently misguided and isolated: embracing data protection law as the ultimate and sole lens is often not sufficient. Mandating the disclosure of the technical specifications of employed algorithms in the name of transparency for, and empowerment of, data subjects potentially encroaches on the interests and rights of IPR holders, i.e., the business entities behind the algorithms. The study aims at pushing the boundaries of the transparency debate beyond the data protection regime. By systematically analysing legal requirements and current judicial practice, it assesses the limits placed on the transparency requirement and the right to access by intellectual property law, namely by copyright and trade secrets. It is asserted that trade secrets, in particular, present an often-insurmountable obstacle to realising the potential of the transparency requirement. In reaching that conclusion, the study explores the limits of the protection afforded by the European Trade Secrets Directive and contrasts them with the scope of the respective rights and obligations related to data access and portability enshrined in the GDPR. As shown, the far-reaching scope of the protection under trade secrecy is evidenced both through the assessment of its subject matter and through the exceptions to such protection. As a way forward, the study scrutinises several possible legislative solutions, such as a flexible interpretation of the public interest exception in trade secrets law, as well as the introduction of a strict liability regime in cases of non-transparent decision-making.
Keywords: algorithms, public interest, trade secrets, transparency
Procedia PDF Downloads 124
22217 Exploring the Role of Media Activity Theory as a Conceptual Basis for Advancing Journalism Education: A Comprehensive Analysis of Its Impact on News Production and Consumption in the Digital Age
Authors: Shohnaza Uzokova Beknazarovna
Abstract:
This research study provides a comprehensive exploration of the Theory of Media Activity and its relevance as a conceptual framework for journalism education. The author offers a thorough review of existing literature on media activity theory, emphasizing its potential to enhance the understanding of the evolving media landscape and its implications for journalism practice. Through a combination of theoretical analysis and practical examples, the paper elucidates the ways in which the Theory of Media Activity can inform and enrich journalism education, particularly in relation to the interactive and participatory nature of contemporary media. The author presents a compelling argument for the integration of media activity theory into journalism curricula, emphasizing its capacity to equip students with a nuanced understanding of the reciprocal relationship between media producers and consumers. Furthermore, the paper discusses the implications of technological advancements on media production and consumption, highlighting the need for journalism educators to prepare students to navigate and contribute to the future of journalism in a rapidly changing media environment. Overall, this research paper offers valuable insights into the potential benefits of embracing the Theory of Media Activity as a foundational framework for journalism education. Its thorough analysis and practical implications make it a valuable resource for educators, researchers, and practitioners seeking to enhance journalism pedagogy in response to the dynamic nature of contemporary media.
Keywords: theory of media activity, journalism education, media landscape, media production, media consumption, interactive media, participatory media, technological advancements, media producers, media consumers, journalism practice, contemporary media environment, journalism pedagogy, media theory, media studies
Procedia PDF Downloads 47
22216 Analyses of the Constitutional Identity in Hungary: A Case Study on the Concept of Constitutionalism and Legal Continuity in New Fundamental Law of Hungary
Authors: Zsuzsanna Fejes
Abstract:
The aim of this paper is to provide an overview of the legal history of constitutionalism in Hungary, focusing on the democratic transition of 1989-1990, describing the historical and political background of the changes and presenting the main and most important features of the new democracy and its institutional and legal order. In Hungary, the political, economic and moral crisis that evolved prior to the constitutional years 2010-11 was such a constitutional moment, which led to an opportune and, at the same time, unavoidable change. The Hungarian constitutional power intended to adopt a new constitution capable of creating a common constitutional identity and expressing national unity. On 18 April 2011, the Hungarian Parliament passed the new Fundamental Law. The new Fundamental Law, rich in national values, posed a new challenge for academics, lawyers, and political scientists. Not only classical political science but also constitutional law and theory have to struggle with the interpretation of the new declarations of national constitutional values in the Fundamental Law. The main features and structure of the new Fundamental Law are analysed, and a detailed interpretation of the Preamble as a declaration of constitutional values is given. The examination of the Preamble clarifies the components of Hungarian statehood and national unity, individual and collective human rights, the practical and theoretical claims to national sovereignty, and the content and possible interpretations of the achievements of the historical Constitution. These problems are presented through an examination of the text of the National Avowal, the preamble of the Fundamental Law. It is examined whether the Fundamental Law itself is a suitable and sufficient means for the citizens of Hungary to express the ideas therein as their own, and it is analysed how the common national and European traditions, values and principles stated in the Fundamental Law can sustain Hungary's participation in European integration.
Keywords: common constitutional values, constitutionalism, national identity, national sovereignty, national unity, statehood
Procedia PDF Downloads 294
22215 Variable Refrigerant Flow (VRF) Zonal Load Prediction Using a Transfer Learning-Based Framework
Authors: Junyu Chen, Peng Xu
Abstract:
In the context of global efforts to enhance building energy efficiency, accurate thermal load forecasting is crucial for both device sizing and predictive control. Variable Refrigerant Flow (VRF) systems are widely used in buildings around the world, yet VRF zonal load prediction has received limited attention. Because building-level prediction methods do not capture the differences between VRF zones, zone-level load forecasting can significantly enhance accuracy. Given that modern VRF systems generate high-quality data, this paper introduces transfer learning to leverage these data and further improve prediction performance. The framework also addresses the challenge of predicting the load of building zones with no historical data, offering greater accuracy and usability compared with pure white-box models. The study first establishes an initial variable set for VRF zonal building loads and generates a foundational white-box database using EnergyPlus. Key variables for VRF zonal loads are identified using methods including SRRC, PRCC, and Random Forest. XGBoost and LSTM are employed to generate pre-trained black-box models based on the white-box database. Finally, real-world data are incorporated into the pre-trained model using transfer learning to enhance its performance in operational buildings. In this paper, zone-level load prediction is thus integrated with transfer learning, and a framework is proposed to improve the accuracy and applicability of VRF zonal load prediction.
Keywords: zonal load prediction, variable refrigerant flow (VRF) system, transfer learning, EnergyPlus
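The pre-train/fine-tune step described above can be pictured with the following minimal sketch: an LSTM regressor is first trained on a large simulated (white-box) dataset and then fine-tuned on a small set of real measurements with the recurrent layer frozen. The array shapes, architecture and hyperparameters are assumptions for illustration only and are not taken from the paper's EnergyPlus database or model.

```python
import numpy as np
import tensorflow as tf

# Shapes only for illustration: 24-step hourly windows of 6 zone features -> next-hour load.
def make_dataset(n):                      # stand-in for EnergyPlus (white-box) or metered data
    x = np.random.rand(n, 24, 6).astype("float32")
    y = x[:, -1, :2].sum(axis=1, keepdims=True).astype("float32")
    return x, y

x_sim, y_sim = make_dataset(5000)         # large simulated (white-box) database
x_real, y_real = make_dataset(200)        # small set of real operational measurements

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(24, 6)),
    tf.keras.layers.LSTM(32, name="lstm_base"),
    tf.keras.layers.Dense(16, activation="relu", name="head_hidden"),
    tf.keras.layers.Dense(1, name="head_out"),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x_sim, y_sim, epochs=5, batch_size=64, verbose=0)       # pre-training

# Transfer step: freeze the recurrent feature extractor, fine-tune only the head.
model.get_layer("lstm_base").trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss="mse")
model.fit(x_real, y_real, epochs=20, batch_size=16, verbose=0)    # fine-tuning on real data

print("fine-tuned MSE on real data:", model.evaluate(x_real, y_real, verbose=0))
```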
Procedia PDF Downloads 28
22214 Enhancing Power System Resilience: An Adaptive Under-Frequency Load Shedding Scheme Incorporating PV Generation and Fast Charging Stations
Authors: Sami M. Alshareef
Abstract:
In the rapidly evolving energy landscape, the integration of renewable energy sources and the electrification of transportation are essential steps toward achieving sustainability goals. However, these advancements introduce new challenges, particularly in maintaining frequency stability due to variable photovoltaic (PV) generation and the growing demand for fast charging stations. The variability of PV generation due to weather conditions can disrupt the balance between generation and load, resulting in frequency deviations. To ensure the stability of power systems, it is imperative to develop effective under-frequency load-shedding schemes. This research proposal presents an adaptive under-frequency load-shedding scheme based on the power swing equation, designed explicitly for the IEEE 9-bus test system including PV generation and fast charging stations. The scheme dynamically disconnects fast charging stations based on power imbalances and prioritizes the disconnection of stations near the affected areas to expedite system frequency stabilization. To achieve these goals, the research project leverages the power swing equation, a widely recognized model for analyzing system dynamics during under-frequency events. Using this equation, the proposed scheme adaptively adjusts the load-shedding process in real time to maintain frequency stability and prevent blackouts. The research findings will support the transition towards sustainable energy systems by ensuring a reliable and uninterrupted electricity supply while enhancing the resilience and stability of power systems during under-frequency events.
Keywords: load shedding, fast charging stations, PV generation, power system resilience
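The swing equation mentioned above relates the active-power imbalance to the rate of change of frequency, df/dt = f0 * dP / (2 * H * S_base), for an aggregated single-machine equivalent. The sketch below uses that relation to simulate a generation deficit and sheds hypothetical fast-charging stations, nearest first, whenever frequency crosses a threshold. All system parameters and station data are invented for illustration; this is not the IEEE 9-bus implementation described in the abstract.

```python
import numpy as np

f0, H, S_base = 50.0, 4.0, 300.0          # Hz, s (inertia constant), MVA (aggregate equivalent)
dt, steps = 0.01, 3000                    # 10 ms resolution, 30 s of simulation
deficit = 12.0                            # MW lost (e.g., a sudden drop in PV output)

# Hypothetical fast-charging stations: (load in MW, electrical distance to affected bus).
stations = [(4.0, 0.2), (3.0, 0.5), (5.0, 0.9), (2.0, 1.4)]
stations.sort(key=lambda s: s[1])         # shed the nearest stations first
threshold, online = 49.2, list(stations)  # Hz trigger; stages fire on consecutive scans here

f, history = f0, []
for _ in range(steps):
    # Aggregated swing equation: df/dt = f0 * dP / (2 * H * S_base), dP < 0 for a deficit.
    f += dt * f0 * (-deficit) / (2.0 * H * S_base)
    if f < threshold and online and deficit > 0:      # adaptive stage: drop closest station
        shed_mw, _ = online.pop(0)
        deficit -= shed_mw                            # shedding load reduces the imbalance
    history.append(f)

print(f"frequency nadir: {min(history):.2f} Hz, stations shed: {len(stations) - len(online)}")
```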
Procedia PDF Downloads 81
22213 Hybrid Knowledge and Data-Driven Neural Networks for Diffuse Optical Tomography Reconstruction in Medical Imaging
Authors: Paola Causin, Andrea Aspri, Alessandro Benfenati
Abstract:
Diffuse Optical Tomography (DOT) is an emergent medical imaging technique which employs NIR light to estimate the spatial distribution of optical coefficients in biological tissues for diagnostic purposes, in a noninvasive and non-ionizing manner. DOT reconstruction is a severely ill-conditioned problem due to the prevalent scattering of light in the tissue. In this contribution, we present our research on hybrid knowledge-driven/data-driven approaches which exploit well-assessed physical models and build neural networks upon them, integrating the availability of data. Namely, since in this context regularization procedures are mandatory to obtain a reasonable reconstruction [1], we explore the use of neural networks as tools to include prior information on the solution. Materials and methods: the idea underlying our approach is to leverage neural networks to solve PDE-constrained inverse problems of the form q* = argmin_q D(y, ỹ) (1), where D is a loss function which typically contains a discrepancy (data fidelity) term plus other possible ad hoc terms enforcing specific constraints. In the context of inverse problems like (1), one seeks the optimal set of physical parameters q, given the set of observations y. Moreover, ỹ is the computable approximation of y, which may be obtained from a neural network but also in the classic way via the resolution of a PDE with given input coefficients (the forward problem). Due to the severe ill-conditioning of the reconstruction problem, we adopt a two-fold approach: (i) we restrict the solutions (optical coefficients) to lie in a lower-dimensional subspace generated by auto-decoder type networks, a procedure that forms priors on the solution; (ii) we use regularization procedures of the type q̂* = argmin_q D(y, ỹ) + R(q), where R(q) is a regularization functional depending on regularization parameters which can be fixed a priori or learned via a neural network in a data-driven modality. To further improve the generalizability of the proposed framework, we also infuse physics knowledge via soft penalty constraints in the overall optimization procedure. Discussion and conclusion: DOT reconstruction is severely hindered by ill-conditioning. The combined use of data-driven and knowledge-driven elements is beneficial and allows improved results to be obtained, especially with a restricted dataset and in the presence of variable sources of noise.
Keywords: inverse problem in tomography, deep learning, diffuse optical tomography, regularization
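As a purely illustrative companion to the formulation above, the sketch below solves a toy version of q* = argmin_q D(y, ỹ) + R(q): a random linear operator stands in for the PDE forward model, a truncated-SVD subspace stands in for the auto-decoder prior, and Tikhonov regularization stands in for the learned regularization functional. None of these choices reproduce the authors' networks; the sketch only shows how restricting the solution to a low-dimensional subspace and regularizing the latent code stabilizes an ill-conditioned reconstruction.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_meas, n_latent = 400, 120, 10     # unknown coefficients, measurements, subspace size

# Stand-in linear forward operator A (the real forward map is a PDE / network solve).
A = rng.normal(size=(n_meas, n_pix)) / np.sqrt(n_meas)

# "Auto-decoder" prior replaced by a linear subspace learned from training images.
train = rng.normal(size=(200, n_pix)) @ rng.normal(size=(n_pix, n_pix)) * 0.05
basis = np.linalg.svd(train - train.mean(0), full_matrices=False)[2][:n_latent]   # (k, n_pix)

q_true = basis.T @ rng.normal(size=n_latent)                # ground truth lives in the subspace
y = A @ q_true + 0.01 * rng.normal(size=n_meas)             # noisy measurements

# Reconstruction restricted to the subspace, Tikhonov regularization on the latent code z:
#   z* = argmin ||A B^T z - y||^2 + alpha ||z||^2,  q* = B^T z*
alpha = 1e-2
M = A @ basis.T
z_hat = np.linalg.solve(M.T @ M + alpha * np.eye(n_latent), M.T @ y)
q_hat = basis.T @ z_hat
print("relative reconstruction error:",
      np.linalg.norm(q_hat - q_true) / np.linalg.norm(q_true))
```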
Procedia PDF Downloads 74
22212 Multivariate Control Chart to Determine Efficiency Measurements in Industrial Processes
Authors: J. J. Vargas, N. Prieto, L. A. Toro
Abstract:
Control charts are commonly used to monitor processes involving either variable or attribute quality characteristics, and determining the control limits is a critical task for quality engineers seeking to improve processes. Nonetheless, in some applications it is necessary to include an estimation of efficiency. In this paper, the ability to assess the efficiency of an industrial process was added to a control chart by incorporating a data envelopment analysis (DEA) approach. Specifically, a Bayesian estimation was performed to calculate the posterior probability distribution of the parameters, namely the means and the variance-covariance matrix. This technique allows the data set to be analysed without relying on the hypothetical large sample implied in the problem, and it can be treated as an approximation to the finite-sample distribution. A rejection simulation method was carried out to generate random variables from the parameter functions. Each resulting vector was used by the stochastic DEA model over several cycles to establish the distribution of the efficiency measures for each DMU (decision-making unit). A control limit was calculated with the resulting model, and if a DMU presents a low level of efficiency, the system efficiency is declared out of control. A global optimum was reached in the efficiency calculation, which ensures the reliability of the model.
Keywords: data envelopment analysis, DEA, multivariate control chart, rejection simulation method
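For readers unfamiliar with the DEA building block, the sketch below solves the deterministic input-oriented CCR model for each decision-making unit with a linear program; in the stochastic scheme described above, this step would be repeated for every posterior draw of the process parameters to obtain an efficiency distribution per DMU. The input/output data are invented, and the formulation is the textbook envelopment form rather than the authors' Bayesian/stochastic variant.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data: 5 DMUs, 2 inputs, 1 output (replace with monitored process data).
X = np.array([[4.0, 140], [7.0, 90], [8.0, 160], [4.0, 120], [2.0, 100]])   # inputs
Y = np.array([[2.0], [3.0], [5.0], [1.0], [2.0]])                           # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def ccr_efficiency(k):
    """Input-oriented CCR (envelopment form): min theta s.t. X'lam <= theta*x_k, Y'lam >= y_k."""
    c = np.concatenate([[1.0], np.zeros(n)])              # variables: [theta, lambda_1..n]
    A_inputs = np.hstack([-X[k].reshape(-1, 1), X.T])     # sum_j lam_j x_ij - theta x_ik <= 0
    A_outputs = np.hstack([np.zeros((s, 1)), -Y.T])       # -sum_j lam_j y_rj <= -y_rk
    A_ub = np.vstack([A_inputs, A_outputs])
    b_ub = np.concatenate([np.zeros(m), -Y[k]])
    bounds = [(0, None)] * (1 + n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

effs = np.array([ccr_efficiency(k) for k in range(n)])
print("efficiency scores:", np.round(effs, 3))
# Repeating this over posterior draws of the process parameters yields an efficiency
# distribution per DMU, from which a lower control limit can be set for the chart.
```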
Procedia PDF Downloads 374
22211 Exploring the Challenges to Usage of Building Construction Cost Indices in Ghana
Authors: Jerry Gyimah, Ernest Kissi, Safowaa Osei-Tutu, Charles Dela Adobor, Theophilus Adjei-Kumi, Ernest Osei-Tutu
Abstract:
A price fluctuation clause is imperative and of paramount importance in the construction industry, as it provides adequate relief and cushioning for changes in the prices of input resources during construction. As a result, several methods have been devised to help arrive at fair recompense in the event of price changes. However, stakeholders often appear not to be satisfied with the existing methods of fluctuation evaluation, ostensibly because of the challenges associated with them. The aim of this study was to identify the challenges to the usage of building construction cost indices in Ghana. Data were gathered from contractors and quantity surveying firms. The study utilized a survey questionnaire to elicit responses from the contractors and consultants. The data gathered were analyzed using the relative importance index (RII) to rank the problems associated with the existing methods. The findings revealed, among others, the late release of data, inadequate recovery of costs, and work items of interest not being included in the published indices as the main challenges of the existing methods. The findings provide useful lessons for policymakers and practitioners in decision-making towards the usage and improvement of the available indices.
Keywords: building construction cost indices, challenges, usage, Ghana
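The relative importance index used above is commonly computed as RII = sum(W) / (A * N), where W are the Likert ratings given by respondents, A is the highest possible rating and N is the number of respondents. The sketch below applies that formula to a small, entirely hypothetical response matrix; the challenge labels and scores are illustrative and are not the study's survey data.

```python
import numpy as np

# Hypothetical 5-point Likert responses (rows: respondents, columns: challenges).
responses = np.array([
    [5, 4, 3, 2],
    [4, 4, 3, 3],
    [5, 5, 2, 3],
    [4, 3, 4, 2],
    [5, 4, 3, 3],
])
challenges = ["late release of data", "inadequate cost recovery",
              "work items not in indices", "complex computation"]

A, N = 5, responses.shape[0]                 # highest rating and number of respondents
rii = responses.sum(axis=0) / (A * N)        # RII = sum(W) / (A * N), ranges from 0 to 1

for name, score in sorted(zip(challenges, rii), key=lambda p: -p[1]):
    print(f"{name:30s} RII = {score:.3f}")
```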
Procedia PDF Downloads 152
22210 The Effects of SMS on the Formal Writings of the Students: A Comparative Study among the Students of Different Departments of IUB
Authors: Sumaira Saleem
Abstract:
This study reveals that the use of SMS affects the formal writing of students. SMS has been in vogue since the last decade, but its detrimental effects touch not only established norms; deviant forms of expression have also entered the community, with which not everyone is acquainted, and this creates a hurdle for effective communication. The study also examines the reasons behind the appearance of SMS practices in formal writing, such as assignments and examinations. For this study, a questionnaire was designed for faculty and students; the data were collected from The Islamia University Bahawalpur, and formal work produced by the students was also collected to check for manifestations of SMS practices in their writing. The data were analysed in Excel, and tables and graphs are used to explain the ratios and percentages of SMS usage. The results show that the usage of SMS has a very strong effect upon the students' writing.
Keywords: technology, writing, effects, SMS
Procedia PDF Downloads 380
22209 Investigation of Glacier Activity Using Optical and Radar Data in Zardkooh
Authors: Mehrnoosh Ghadimi, Golnoush Ghadimi
Abstract:
Precise monitoring of glacier velocity is critical in determining glacier-related hazards. Zardkooh Mountain, in the Zagros mountainous region of Iran, was studied in terms of its rate of glacial activity. In this study, we assessed the ability of optical and radar imagery to derive glacier-surface velocities in mountainous terrain. We processed Landsat 8 for optical data and Sentinel-1A for radar data. We used methods that are commonly applied to measure glacier surface movement, such as cross-correlation of optical and radar satellite images, SAR tracking techniques, and multiple aperture InSAR (MAI). We also assessed the time series of glacier surface displacement using our modified method, Enhanced Small Baseline Subset (ESBAS). ESBAS has been implemented in the StaMPS software, with several aspects of the processing chain modified, including filtering prior to phase unwrapping, topographic correction within three-dimensional phase unwrapping, atmospheric noise reduction, and removal of the ramp caused by ionospheric turbulence and/or orbit errors. Our findings indicate an average surface velocity rate of 32 mm/yr in the Zardkooh mountainous areas.
Keywords: active rock glaciers, Landsat 8, Sentinel-1A, Zagros mountainous region
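Cross-correlation (offset) tracking, one of the techniques listed above, estimates surface displacement from the shift that best aligns image patches acquired at two dates. The sketch below demonstrates the core operation with an FFT-based cross-correlation on a synthetic patch pair; the pixel size, time separation and noise level are assumptions, and real processing would add subpixel refinement, co-registration and outlier filtering.

```python
import numpy as np

rng = np.random.default_rng(3)
pixel_size_m, dt_years = 15.0, 1.0          # assumed ground sampling and acquisition separation

# Synthetic "image" patch and a copy shifted by a known displacement (in pixels).
ref = rng.normal(size=(64, 64))
true_shift = (2, 3)                          # rows, cols
sec = np.roll(ref, true_shift, axis=(0, 1)) + 0.05 * rng.normal(size=ref.shape)

# FFT-based cross-correlation: the peak location gives the integer pixel offset.
corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(sec))).real
peak = np.unravel_index(np.argmax(corr), corr.shape)
shift = [-(p if p < s // 2 else p - s) for p, s in zip(peak, corr.shape)]

velocity = np.hypot(*shift) * pixel_size_m / dt_years
print(f"estimated offset (rows, cols): {shift}, surface velocity ~ {velocity:.1f} m/yr")
```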
Procedia PDF Downloads 77
22208 Towards Long-Range Pixels Connection for Context-Aware Semantic Segmentation
Authors: Muhammad Zubair Khan, Yugyung Lee
Abstract:
Deep learning has recently achieved an enormous response in semantic image segmentation. The previously developed U-Net-inspired architectures operate with continuous stride and pooling operations, leading to spatial data loss. These methods also lack a means of establishing long-term pixel connections to preserve context knowledge and reduce spatial loss in prediction. This article develops an encoder-decoder architecture with bi-directional LSTMs embedded in long skip-connections and densely connected convolution blocks. The network non-linearly combines the feature maps across the encoder-decoder paths to find dependency and correlation between image pixels. Additionally, the densely connected convolutional blocks are kept in the final encoding layer to reuse features and prevent redundant data sharing. The method applies batch normalization to reduce internal covariate shift in the data distributions. The empirical evidence shows a promising response of our method compared with other semantic segmentation techniques.
Keywords: deep learning, semantic segmentation, image analysis, pixels connection, convolution neural network
Procedia PDF Downloads 103
22207 Crop Leaf Area Index (LAI) Inversion and Scale Effect Analysis from Unmanned Aerial Vehicle (UAV)-Based Hyperspectral Data
Authors: Xiaohua Zhu, Lingling Ma, Yongguang Zhao
Abstract:
Leaf Area Index (LAI) is a key structural characteristic of crops and plays a significant role in precision agricultural management and farmland ecosystem modeling. However, LAI retrieved from data of different resolutions contains a scaling bias due to spatial heterogeneity and model non-linearity; that is, a scale effect arises in multi-scale LAI estimation. In this article, a typical farmland in the semi-arid region of Chinese Inner Mongolia is taken as the study area. Based on the combination of the PROSPECT and SAIL models, a multi-dimensional look-up table (LUT) is generated for the LAI estimation of multiple crops from unmanned aerial vehicle (UAV) hyperspectral data. Based on the Taylor expansion method and a computational geometry model, a scale transfer model considering both inter-class and intra-class differences is constructed for the scale effect analysis of LAI inversion over an inhomogeneous surface. The results indicate that (1) the LUT method based on classification and parameter sensitivity analysis is useful for the LAI retrieval of corn, potato, sunflower and melon on the typical farmland, with a correlation coefficient R2 of 0.82 and a root mean square error (RMSE) of 0.43 m2/m2. (2) The scale effect on LAI becomes more pronounced as image resolution decreases, and the maximum scale bias exceeds 45%. (3) The inter-class scale effect is larger than the intra-class one and can be corrected efficiently by the scale transfer model established from the Taylor expansion and computational geometry; after correction, the maximum scale bias is reduced to 1.2%.
Keywords: leaf area index (LAI), scale effect, UAV-based hyperspectral data, look-up-table (LUT), remote sensing
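LUT inversion of the kind described above matches an observed spectrum against a table of pre-simulated (LAI, reflectance) pairs and takes the best-matching entries as the retrieval. In the sketch below, a toy reflectance function stands in for the PROSPECT+SAIL simulations, and the best-k candidates are averaged; the band set, LAI range and cost function are illustrative assumptions rather than the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(7)
wl = np.linspace(450, 950, 50)                      # nm, hypothetical band centres

def toy_canopy_reflectance(lai):
    """Stand-in for a PROSPECT+SAIL run: NIR plateau grows with LAI, red trough deepens."""
    nir = 0.45 * (1 - np.exp(-0.6 * lai))
    red = 0.08 * np.exp(-0.4 * lai) + 0.02
    return np.where(wl > 700, nir, red)

# Build the look-up table over the expected LAI range.
lai_grid = np.linspace(0.1, 7.0, 500)
lut = np.array([toy_canopy_reflectance(l) for l in lai_grid])        # (500, 50)

# "Observed" UAV spectrum for an unknown pixel (true LAI = 3.2, plus sensor noise).
obs = toy_canopy_reflectance(3.2) + rng.normal(0, 0.01, wl.size)

# Invert: rank LUT entries by spectral RMSE and average the best-k solutions.
rmse = np.sqrt(np.mean((lut - obs) ** 2, axis=1))
best_k = np.argsort(rmse)[:20]
print("retrieved LAI:", lai_grid[best_k].mean().round(2))
```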
Procedia PDF Downloads 440
22206 Road Traffic Noise Mapping for Riyadh City Using GIS and Lima
Authors: Khalid A. Alsaif, Mosaad A. Foda
Abstract:
The primary objective of this study is to develop the first round of road traffic noise maps for Riyadh City using Geographical Information Systems (GIS) and the LimA 7810 predictor software. The road traffic data were measured or estimated as accurately as possible in order to obtain reliable noise maps, while the attributes of the roads and buildings were exported automatically from GIS. The simulation results at selected locations were validated against actual field measurements obtained with a system consisting of a sound level meter, a GPS receiver and a database to manage the measured data. The results show that the average error between the predicted and measured noise levels is below 3.0 dB.
Keywords: noise pollution, road traffic noise, LimA predictor, GIS
Procedia PDF Downloads 406
22205 Dynamics of Marital Status and Information Search through Consumer Generated Media: An Exploratory Study
Authors: Shivkumar Krishnamurti, Ruchi Agarwal
Abstract:
The study examines the influence of marital status on consumers of products and services who use blogs as a source of information. A pre-designed questionnaire was used to collect primary data on the respondents' experiences. Data were collected from one hundred and eighty-seven respondents residing in and around the Emirates of Sharjah and Dubai in the United Arab Emirates. The collected data were analyzed with the help of statistical tools such as averages, percentages, factor analysis, Student's t-test and structural equation modeling. The objectives of the study are to learn how married and unmarried or single consumers of products and services are motivated to use blogs as a source of information, whether consumers of products and services share their views and experiences with other bloggers irrespective of their marital status, and what the respondents' future intentions towards blogging are. The study revealed that the majority of respondents are motivated to blog because they want to receive comments on what they post about services, because of the convenience of blogs for searching for information about services and products, and because blogging lets them share information, for example on the symptoms of a disease or disorder that someone may experience or on ready-to-cook mix products; they are also keen to spend more time blogging in the future.
Keywords: blog, consumer, information, marital status
Procedia PDF Downloads 385
22204 Design of a Real Time Heart Sounds Recognition System
Authors: Omer Abdalla Ishag, Magdi Baker Amien
Abstract:
Physicians use the stethoscope to listen to a patient's heart sounds in order to make a diagnosis. However, determining heart conditions with an acoustic stethoscope is a difficult task that requires special training of medical staff. This study developed an accurate model for analyzing the phonocardiograph signal based on a PC and a DSP processor. The system has been realized in two phases: offline and real time. In the offline phase, 30 heart sound files were collected from medical students and the Doctor's World website. For the experimental (real-time) phase, an electronic stethoscope was designed and implemented, and signals were recorded from 30 volunteers, of whom 17 were normal cases and 13 presented various pathologies; the 30 acquired signals were preprocessed using an adaptive filter to remove lung sounds. The background noise was removed from both the offline and the real data using the wavelet transform, then graphical and statistical feature-vector elements were extracted, and finally a look-up table was used to classify the heart sound cases. The results obtained with the implemented system showed an accuracy of 90% and 80% and a sensitivity of 87.5% and 82.4% for the offline and real data, respectively. The whole system was designed on a TMS320VC5509a DSP platform.
Keywords: code composer studio, heart sounds, phonocardiograph, wavelet transform
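The wavelet-based background noise removal mentioned above can be sketched as a multilevel decomposition followed by soft thresholding of the detail coefficients. The example below runs on a crude synthetic phonocardiogram; the sampling rate, wavelet family, decomposition level and universal-threshold rule are assumptions, not the settings used on the TMS320VC5509a system.

```python
import numpy as np
import pywt

fs = 4000                                           # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1 / fs)
# Crude synthetic phonocardiogram: periodic "lub-dub" bursts plus broadband noise.
pcg = np.sin(2 * np.pi * 40 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.95)
noisy = pcg + 0.2 * np.random.default_rng(0).normal(size=t.size)

# Multilevel decomposition, soft-threshold the detail coefficients, then reconstruct.
wavelet, level = "db4", 5
coeffs = pywt.wavedec(noisy, wavelet, level=level)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate from the finest details
thr = sigma * np.sqrt(2 * np.log(noisy.size))       # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, wavelet)[: noisy.size]

print("residual noise power:", np.mean((denoised - pcg) ** 2).round(4))
```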
Procedia PDF Downloads 446
22203 Tagging a corpus of Media Interviews with Diplomats: Challenges and Solutions
Authors: Roberta Facchinetti, Sara Corrizzato, Silvia Cavalieri
Abstract:
The increasing interconnection between data digitalization and linguistic investigation has given rise to unprecedented potentialities and challenges for corpus linguists, who need to master IT tools for data analysis and text processing, as well as to develop techniques for efficient and reliable annotation in specific mark-up languages that encode documents in a format that is both human- and machine-readable. In the present paper, the challenges emerging from the compilation of a linguistic corpus will be taken into consideration, focusing on the English language in particular. To do so, the case study of the InterDiplo corpus will be illustrated. The corpus, currently under development at the University of Verona (Italy), represents a novelty in terms both of the data included and of the tag set used for its annotation. The corpus covers media interviews and debates with diplomats and international operators conversing in English with journalists who do not share the same lingua-cultural background as their interviewees. To date, this appears to be the first tagged corpus of international institutional spoken discourse, and it will be an important database not only for linguists interested in corpus analysis but also for experts operating in international relations. In the present paper, special attention will be dedicated to the structural mark-up, the part-of-speech annotation, and the tagging of discursive traits, which are the innovative parts of the project, resulting from a thorough study to find the solution best suited to the analytical needs of the data. Several aspects will be addressed, with special attention to the tagging of the speakers' identity, the communicative events, and anthropophagic. Prominence will be given to the annotation of question/answer exchanges to investigate the interlocutors' choices and how such choices impact communication. Indeed, the automated identification of questions, in relation to the expected answers, is functional to understanding how interviewers elicit information as well as how interviewees provide their answers to fulfil their respective communicative aims. A detailed description of the aforementioned elements will be given using the InterDiplo-Covid19 pilot corpus. The data yielded by our preliminary analysis will highlight the viable solutions found in the construction of the corpus in terms of XML conversion, metadata definition, tagging system, and discursive-pragmatic annotation to be included via Oxygen.
Keywords: spoken corpus, diplomats' interviews, tagging system, discursive-pragmatic annotation, English linguistics
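To illustrate what structural and question/answer mark-up of an interview turn can look like in XML, the sketch below builds a small annotated exchange with Python's standard library. The element and attribute names (turn, question, qtype, and so on) are hypothetical placeholders and do not reproduce the actual InterDiplo tag set.

```python
import xml.etree.ElementTree as ET

# Illustrative annotation of one question/answer exchange; the element and attribute
# names are hypothetical, not the InterDiplo scheme itself.
interview = ET.Element("interview", id="sample-001", lang="en")
meta = ET.SubElement(interview, "metadata")
ET.SubElement(meta, "event", type="press-briefing", date="2020-04-15")

q_turn = ET.SubElement(interview, "turn", speaker="JRN1", role="journalist")
q = ET.SubElement(q_turn, "question", qtype="wh", expects="information")
q.text = "What measures is the organisation taking in response to the pandemic?"

a_turn = ET.SubElement(interview, "turn", speaker="DPL1", role="diplomat")
a = ET.SubElement(a_turn, "answer", answers="JRN1", strategy="direct")
a.text = "We have coordinated with member states on a joint response plan."

ET.indent(interview)                      # pretty-print (available in Python 3.9+)
print(ET.tostring(interview, encoding="unicode"))
```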
Procedia PDF Downloads 185
22202 The Importance of including All Data in a Linear Model for the Analysis of RNAseq Data
Authors: Roxane A. Legaie, Kjiana E. Schwab, Caroline E. Gargett
Abstract:
Studies looking at changes in gene expression from RNAseq data often make use of linear models. It is also common practice to focus on a subset of the data for a comparison of interest, leaving aside the samples not involved in that particular comparison. This work shows the importance of including all observations in the modeling process to better estimate the variance parameters, even when the samples included are not directly used in the comparison under test. The human endometrium is a dynamic tissue, which undergoes cycles of growth and regression with each menstrual cycle. The mesenchymal stem cells (MSCs) present in the endometrium are likely responsible for this remarkable regenerative capacity. However, recent studies suggest that MSCs also play a role in the pathogenesis of endometriosis, one of the most common medical conditions affecting the lower abdomen in women, in which endometrial tissue grows outside the womb. In this study, we compared gene expression profiles from RNAseq between MSCs and non-stem-cell counterparts ('non-MSC') obtained from women with ('E') or without ('noE') endometriosis. Raw read counts were used for differential expression analysis using a linear model with the limma-voom R package, including either all samples in the study or only the samples belonging to the subset of interest (e.g., for the comparison 'E vs noE in MSC cells', including only the MSC samples from E and noE patients but not the non-MSC ones). Using the full dataset, we identified about 100 differentially expressed (DE) genes between E and noE samples among the MSC samples (adj. p-val < 0.05 and |logFC| > 1), while only 9 DE genes were identified when using only the subset of the data (MSC samples only). Important genes known to be involved in endometriosis, such as KLF9 and RND3, were missed in the latter case. When looking at the MSC vs non-MSC comparison, the linear model including all samples identified 260 genes for noE samples (including the stem cell marker SUSD2), while the subset analysis did not identify any DE genes. When looking at E samples, 12 genes were identified with the first approach and only 1 with the subset approach. Although the stem cell marker RGS5 was found in both cases, the subset test missed important genes involved in stem cell differentiation, such as NOTCH3, and other potentially related genes to be used for further investigation and pathway analysis.
Keywords: differential expression, endometriosis, linear model, RNAseq
Procedia PDF Downloads 432
22201 Thermodynamic Behaviour of Binary Mixtures of 1, 2-Dichloroethane with Some Cyclic Ethers: Experimental Results and Modelling
Authors: Fouzia Amireche-Ziar, Ilham Mokbel, Jacques Jose
Abstract:
The vapour pressures of the three binary mixtures 1,2-dichloroethane + 1,3-dioxolane, + 1,4-dioxane, and + tetrahydropyran were measured at ten temperatures ranging from 273 to 353.15 K. An accurate static device was employed for these measurements. The VLE data were reduced using the Redlich-Kister equation, taking into consideration the vapour-phase non-ideality in terms of the second molar virial coefficient. The experimental data were compared with the results predicted by the DISQUAC and Dortmund UNIFAC group-contribution models for the total pressures P and the excess molar Gibbs energies GE.
Keywords: DISQUAC model, Dortmund UNIFAC model, excess molar Gibbs energies GE, VLE
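Data reduction with the Redlich-Kister equation represents the excess Gibbs energy as GE = x1 x2 sum_k A_k (x1 - x2)^k, with the coefficients A_k obtained by least squares from the reduced VLE data. The sketch below fits such an expansion to synthetic excess Gibbs energy values; the data and polynomial order are illustrative assumptions, not the measured results for the three mixtures.

```python
import numpy as np

x1 = np.linspace(0.05, 0.95, 19)
x2 = 1.0 - x1

# Hypothetical "experimental" excess Gibbs energies (J/mol) standing in for reduced VLE data.
ge_exp = x1 * x2 * (950.0 - 150.0 * (x1 - x2) + 80.0 * (x1 - x2) ** 2)
ge_exp = ge_exp + np.random.default_rng(5).normal(0, 5.0, x1.size)

# Redlich-Kister expansion: GE = x1*x2 * sum_k A_k * (x1 - x2)^k  (here k = 0..2)
order = 2
design = np.column_stack([x1 * x2 * (x1 - x2) ** k for k in range(order + 1)])
A, *_ = np.linalg.lstsq(design, ge_exp, rcond=None)

ge_fit = design @ A
print("fitted Redlich-Kister coefficients A_k (J/mol):", np.round(A, 1))
print("max deviation (J/mol):", np.max(np.abs(ge_fit - ge_exp)).round(1))
```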
Procedia PDF Downloads 258
22200 Chemometric-Based Voltammetric Method for Analysis of Vitamins and Heavy Metals in Honey Samples
Authors: Marwa A. A. Ragab, Amira F. El-Yazbi, Amr El-Hawiet
Abstract:
The analysis of heavy metals in honey samples is crucial; when found in honey, they denote environmental pollution. Some of these heavy metals, such as lead, are considered toxic whether present at low or high concentrations. Other heavy metals, for example copper and zinc, are considered safe, even vital, minerals if present at low concentrations; on the contrary, if present at high concentrations, they are toxic. Their voltammetric determination in honey represents a challenge due to the presence of other electro-active components, such as vitamins, whose peaks may overlap with those of the metals, hindering their accurate and precise determination. The simultaneous analysis of some vitamins, nicotinic acid (B3) and riboflavin (B2), and heavy metals, lead, cadmium, and zinc, in honey samples was addressed. The analysis was done in 0.1 M potassium chloride (KCl) using a hanging mercury drop electrode (HMDE), followed by chemometric manipulation of the voltammetric data using the derivative method; the derivative data were then convoluted using discrete Fourier functions. The proposed method allowed the simultaneous analysis of vitamins and metals despite their varied responses and sensitivities. Although their peaks were overlapped, the proposed chemometric method allowed their accurate and precise analysis. After the chemometric treatment of the data, the metals were successfully quantified at low levels in the presence of vitamins (1:2000). The limit of detection (LOD) values of the heavy metals after the chemometric treatment decreased by more than 60% compared with those obtained from the direct voltammetric method. The applicability of the method was tested by analyzing the selected metals and vitamins in real honey samples obtained from different botanical origins.
Keywords: chemometrics, overlapped voltammetric peaks, derivative and convoluted derivative methods, metals and vitamins
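The derivative-plus-Fourier treatment described above can be pictured as follows: differentiating the voltammogram sharpens a narrow analyte peak relative to a broad overlapping interferent, and convoluting the derivative with discrete Fourier (low-pass) functions suppresses the noise that differentiation amplifies. The sketch below applies this to a synthetic trace of two overlapping Gaussian peaks; the peak positions, widths and Fourier cutoff are assumptions, not the experimental HMDE parameters.

```python
import numpy as np

# Synthetic voltammogram: a small metal peak riding on a larger, overlapping vitamin peak.
E = np.linspace(-1.2, -0.2, 1000)                        # potential axis, V
gauss = lambda e0, w, h: h * np.exp(-((E - e0) / w) ** 2)
signal = gauss(-0.60, 0.12, 1.0) + gauss(-0.48, 0.05, 0.15)     # interferent + analyte
current = signal + np.random.default_rng(2).normal(0, 0.004, E.size)

# First derivative sharpens the narrow analyte feature relative to the broad interferent.
deriv = np.gradient(current, E)

# Convolution with a discrete Fourier low-pass window removes the amplified noise.
spectrum = np.fft.rfft(deriv)
cutoff = 40                                              # keep only the lowest Fourier terms
spectrum[cutoff:] = 0.0
deriv_smooth = np.fft.irfft(spectrum, n=E.size)

# The peak-to-peak amplitude of the smoothed derivative near the analyte potential
# can now be used for calibration instead of the overlapped raw peak height.
window = (E > -0.56) & (E < -0.40)
print("analyte derivative amplitude:",
      (deriv_smooth[window].max() - deriv_smooth[window].min()).round(3))
```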
Procedia PDF Downloads 150
22199 Differences in Production of Knowledge between Internationally Mobile versus Nationally Mobile and Non-Mobile Scientists
Authors: Valeria Aman
Abstract:
The presented study examines the impact of international mobility on knowledge production among mobile scientists and within the sending and receiving research groups. Scientists are relevant to the dynamics of knowledge production because scientific knowledge is mainly characterized by embeddedness and tacitness. International mobility enables the dissemination of scientific knowledge to other places and encourages new combinations of knowledge. It can also increase the interdisciplinarity of research by forming synergetic combinations of knowledge. Particularly innovative ideas can have their roots in related research domains and are sometimes transferred only through the physical mobility of scientists. Diversity among scientists with respect to their knowledge base can act as an engine for the creation of knowledge. It is therefore relevant to study how knowledge acquired through international mobility affects the knowledge production process. The knowledge production process is examined contingent on the type of international mobility and the epistemic culture of the research field. The production of scientific knowledge is a multi-faceted process, the output of which is mainly published in scholarly journals. Therefore, the study builds upon publication and citation data covered in Elsevier's Scopus database for the period 1996 to 2015. To analyse these data, bibliometric and social network analysis techniques are used. A basic analysis of scientific output using publication data, citation data and data on co-authored publications is combined with a content map analysis. The abstracts of publications indicate whether a research stay abroad makes an original contribution methodologically, theoretically or empirically. Moreover, co-citations are analysed to map linkages among scientists and emerging research domains. Finally, acknowledgements, which can function as channels of formal and informal communication between the actors involved in the process of knowledge production, are studied. The results provide a better understanding of how the international mobility of scientists contributes to the production of knowledge by contrasting the knowledge production dynamics of internationally mobile scientists with those of nationally mobile or immobile scientists. The findings also indicate whether international mobility accelerates the production of knowledge and the emergence of new research fields.
Keywords: bibliometrics, diversity, interdisciplinarity, international mobility, knowledge production
Procedia PDF Downloads 293