Search results for: social media data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32025

22905 Secret Security Smart Lock Using Artificial Intelligence Hybrid Algorithm

Authors: Vahid Bayrami Rad

Abstract:

Ever since humans adopted a collective way of life and through the development of urbanization, security has been regarded as one of the most important challenges of daily life. Locks have always been a practical tool for protecting property, and with the advancement of technology their form has changed from mechanical to electric. Surveillance and security systems are among the most widely used applications of artificial intelligence, and the technologies used in smart anti-theft door handles are currently one of the most promising fields for its use. By analyzing data with the help of algorithms and mathematical models, artificial intelligence can learn, calculate, interpret, and process information in order to make smart decisions. In this work, an Arduino board is used to process the data.

Keywords: Arduino board, artificial intelligence, image processing, solenoid lock

Procedia PDF Downloads 65
22904 A Machine Learning Approach for Efficient Resource Management in Construction Projects

Authors: Soheila Sadeghi

Abstract:

Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.

Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management

Procedia PDF Downloads 30
22903 Exploring Public Trust in Democracy

Authors: Yaron Katz

Abstract:

The investigation of immigrants' electoral choices has remained relatively uncharted territory despite the fact that numerous nations extend political rights to their expatriates. This paper centers its attention on the matter of public trust in democracy, with a focus on the intricacies of Israeli politics as a divided system. It delves into the potential implications of political and social transformations stemming from the involvement of expatriate voters in elections taking place in their country of origin. In doing so, the article endeavors to explore a pathway for resolving a persistent challenge facing the stability of the Israeli political landscape over the past decade: the difficulty in forming a resilient government that genuinely represents the majority of voters. An examination is conducted into the role played by a demographic with the capacity to exert significant influence on election outcomes, namely, individuals residing outside of Israel. The objective of this research is to delve into this subject, dissecting social developments and political prospects that may shape the country's trajectory in the coming decades. This inquiry is especially pertinent given the extensive engagement of migrants in Israeli politics and the link between Israelis living abroad and their home country. Nevertheless, the study's findings reveal that while former citizens exhibit extensive involvement in Israeli politics and are cognizant of the potential consequences of permitting them to participate in elections, they maintain steadfastly unfavorable views regarding the inclusion of Israelis living overseas in their home country's electoral processes.

Keywords: trust, globalization, policy, democracy

Procedia PDF Downloads 43
22902 Hydrogen: Contention-Aware Hybrid Memory Management for Heterogeneous CPU-GPU Architectures

Authors: Yiwei Li, Mingyu Gao

Abstract:

Integrating hybrid memories with heterogeneous processors could leverage heterogeneity in both compute and memory domains for better system efficiency. To ensure performance isolation, we introduce Hydrogen, a hardware architecture to optimize the allocation of hybrid memory resources to heterogeneous CPU-GPU systems. Hydrogen supports efficient capacity and bandwidth partitioning between CPUs and GPUs in both memory tiers. We propose decoupled memory channel mapping and token-based data migration throttling to enable flexible partitioning. We also support epoch-based online search for optimized configurations and lightweight reconfiguration with reduced data movements. Hydrogen significantly outperforms existing designs by 1.21x on average and up to 1.31x.

Keywords: hybrid memory, heterogeneous systems, DRAM cache, graphics processing units

Procedia PDF Downloads 75
22901 Computer-Based versus Paper-Based Tests: A Comparative Study of Two Types of Indonesian National Examination for Senior High School Students

Authors: Faizal Mansyur

Abstract:

The objective of this research is to find out whether there is a significant difference in the English language scores of senior high school students in the Indonesian National Examination between students tested with computer-based and paper-based tests. The population of this research is senior high school students in South Sulawesi Province who sat the Indonesian National Examination in the 2015/2016 academic year. The sample consists of 800 students' scores from 8 schools, taken using the multistage random sampling technique. The data are secondary data, obtained from the education office of South Sulawesi. In analyzing the collected data, the researcher employed the independent-samples t-test with the help of the SPSS v.24 program. The findings reveal that there is a significant difference in the English language scores of senior high school students in the Indonesian National Examination between students tested with computer-based and paper-based tests (p < .05). Moreover, students tested with the PBT (mean = 63.13, SD = 13.63) achieved higher scores than those tested with the CBT (mean = 46.33, SD = 14.68).
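
For illustration, the comparison described above can be reproduced with an independent-samples t-test in Python; the sketch below uses hypothetical score samples drawn to match the reported group means and standard deviations and is not the authors' SPSS analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pbt_scores = rng.normal(63.13, 13.63, 400)   # paper-based group (illustrative draw)
cbt_scores = rng.normal(46.33, 14.68, 400)   # computer-based group (illustrative draw)

t_stat, p_value = stats.ttest_ind(pbt_scores, cbt_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < .05 -> significant difference
```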

Keywords: computer-based test, paper-based test, Indonesian national examination, testing

Procedia PDF Downloads 163
22900 Close-Range Remote Sensing Techniques for Analyzing Rock Discontinuity Properties

Authors: Sina Fatolahzadeh, Sergio A. Sepúlveda

Abstract:

This paper presents advanced developments in close-range, terrestrial remote sensing techniques to enhance the characterization of rock masses. The study integrates two state-of-the-art laser-scanning technologies, the HandySCAN and GeoSLAM laser scanners, to extract high-resolution geospatial data for rock mass analysis. These instruments offer high accuracy, precision, low acquisition time, and high efficiency in capturing intricate geological features in small- to medium-sized outcrops and slope cuts. Using the HandySCAN and GeoSLAM laser scanners facilitates real-time, three-dimensional mapping of rock surfaces, enabling comprehensive assessments of rock mass characteristics. The collected data provide valuable insights into structural complexities, surface roughness, and discontinuity patterns, which are essential for geological and geotechnical analyses. The synergy of these advanced remote sensing technologies contributes to a more precise and straightforward understanding of rock mass behavior. In this case, the main parameters of RQD, joint spacing, persistence, aperture, roughness, infill, weathering, water condition, and joint orientation in a slope cut along the Sea-to-Sky Highway, BC, were remotely analyzed to calculate and evaluate the Rock Mass Rating (RMR) and Geological Strength Index (GSI) classification systems. Automatic and manual analyses of the acquired data are then compared with field measurements. The results show the usefulness of the proposed remote sensing methods and their good agreement with the actual field data.

Keywords: remote sensing, rock mechanics, rock engineering, slope stability, discontinuity properties

Procedia PDF Downloads 60
22899 Application of Data Mining for Aquifer Environmental Assessment

Authors: Saman Javadi, Mehdi Hashemy, Mohahammad Mahmoodi

Abstract:

Vulnerability maps are an important tool for managing the entry of pollution into aquifers. The most common way to produce a vulnerability map is the DRASTIC method. However, the method is not easy to apply to every aquifer, because appropriate constant values must be chosen for its weights and ranks. In this study, a new approach using k-means clustering is applied to produce vulnerability maps. Four features, namely depth to groundwater, hydraulic conductivity, recharge value, and the vadose zone, were used simultaneously as clustering features. Five regions, representing zones with different levels of vulnerability, were identified in the case study area. The results show that clustering provides a realistic vulnerability map; the Pearson correlation coefficient between nitrate concentrations and the clustering-based vulnerability is 0.61 (61%).
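
A minimal sketch of the clustering step described above is shown below; it assumes a hypothetical table of per-cell aquifer attributes (the file and column names are illustrative) and is not the authors' code.

```python
import pandas as pd
from scipy.stats import pearsonr
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-cell aquifer table: the four clustering features plus observed
# nitrate for validation (file and column names are assumptions).
df = pd.read_csv("aquifer_cells.csv")
features = df[["depth", "conductivity", "recharge", "vadose"]]

X = StandardScaler().fit_transform(features)   # put features on a common scale
df["zone"] = KMeans(n_clusters=5, random_state=0, n_init=10).fit_predict(X)  # five zones

# Rank zones by a simple vulnerability proxy (shallower water table = more vulnerable)
# and check agreement with observed nitrate, mirroring the validation step above.
rank_by_depth = df.groupby("zone")["depth"].mean().rank(ascending=False)
df["vulnerability_rank"] = df["zone"].map(rank_by_depth)
r, p = pearsonr(df["vulnerability_rank"], df["nitrate"])
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```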

Keywords: clustering, data mining, groundwater, vulnerability assessment

Procedia PDF Downloads 600
22898 Digital Literacy, Assessment and Higher Education

Authors: James Moir

Abstract:

Recent evidence suggests that academic staff face difficulties in applying new technologies as a means of assessing higher order assessment outcomes such as critical thinking, problem solving and creativity. Although higher education institutional mission statements and course unit outlines purport the value of these higher order skills, there is still some question about how well academics are equipped to design curricula and, in particular, assessment strategies accordingly. Despite a rhetoric avowing the benefits of these higher order skills, it has been suggested that academics set assessment tasks up in such a way as to inadvertently lead students on the path towards lower order outcomes. This is a controversial claim, and one that this paper seeks to explore and critique in terms of challenging the conceptual basis of assessing higher order skills through new technologies. It is argued that the use of digital media in higher education is leading to a focus on students’ ability to use and manipulate these products as an index of their flexibility and adaptability to the demands of the knowledge economy. This focus mirrors market flexibility and encourages programmes and courses of study to be rhetorically packaged as such. Curricular content has become a means to procure more or less elaborate aggregates of attributes. Higher education is now charged with producing graduates who are entrepreneurial and creative in order to drive forward economic sustainability. It is argued that critical independent learning can take place through the democratisation afforded by cultural and knowledge digitization and that assessment needs to acknowledge the changing relations between audience and author, expert and amateur, creator and consumer.

Keywords: higher education, curriculum, new technologies, assessment, higher order skills

Procedia PDF Downloads 371
22897 The Moment of Departure: Redefining Self and Space in Literacy Activism

Authors: Sofie Dewayani, Pratiwi Retnaningdyah

Abstract:

Literacy practice is situated within identity enactment in a particular time and space. Literacy practices in public places, ranging from city parks and urban slums to city roads, are meeting places of discursive practices produced by dynamic interactions, and sometimes contestations, of social powers and capitals. The present paper examines the ways literacy activists construct their sense of space in attempts to develop possibilities for literacy programs as they are sent to work with marginalized communities far away from their hometowns in Indonesia. In particular, this paper analyzes the activists’ reflections on identity enactment - othering, familiarity, and a sense of comfort - as they try to make meaning of the communities’ literacy capitals and practices in the process of adapting to the communities. Data collected for this paper were travel diaries - serving as literacy narratives - obtained from a literacy residency program sponsored by the Indonesian Ministry of Education and Culture. The residency program itself involved 30 youths (18 to 30 years old) working with marginalized communities in literacy activism programs. This paper analyzes the written narratives of four focal participants using Bakhtin’s chronotopes - the configurations of time and space - that figure into the youths’ meaning-making of literacy as well as their exercise of power and identity. Follow-up interviews were added to enrich the analysis. The analysis considers the youths’ ‘moment of departure’ a critical point in their reconstructions of self and space. This paper expands the discussions of literacy discourse and spatiality while lending its support to literacy activism in highly diverse multicultural settings.

Keywords: chronotopes, discourse, identity, literacy activism

Procedia PDF Downloads 178
22896 The Effect of the Cultural Constraint on the Reform of Corporate Governance: The Observation of Taiwan's Efforts to Transform Its Corporate Governance

Authors: Yuanyi (Richard) Fang

Abstract:

Under the theory of La Porta, Lopez-de-Silanes, Shleifer, and Vishny (LLS&V), if a country can increase its legal protections for minority shareholders, it can develop an ideal securities market, which only arises under dispersed-ownership corporate governance. However, path-dependence scholars such as Lucian Arye Bebchuk and Mark J. Roe presented a different view from LLS&V. They pointed out that the initial framework of the ownership structure and traditional culture will prevent a change of the corporate governance structure through legal reform. This paper treats traditional culture as an important factor in the formation of the corporate governance structure. However, it is not impossible for the government to change its traditional corporate governance structure and traditional culture, because culture does not remain intact: it evolves with time. The occurrence of important events affects people's psychological processes, and these psychological processes in turn shape the evolution of culture. New cultural norms can help defeat the force of traditional culture and the resistance stemming from the initial corporate ownership structure. Using Taiwan as an example, and by analyzing the historical background, related corporate rules, and media reactions to the adoption of new rules, this paper tries to show that Taiwan's cultural norms have not remained intact and have changed with time. It further argues that culture is not always a hurdle to the adoption of the dispersed-ownership corporate governance structure, because culture can change; a new culture can provide strong support for the adoption of the new corporate governance structure.

Keywords: LLS&V theory, corporate governance, culture, path–dependent theory

Procedia PDF Downloads 473
22895 Further Investigation of α+12C and α+16O Elastic Scattering

Authors: Sh. Hamada

Abstract:

The current work aims to study the rainbow-like structure observed in the elastic scattering of alpha particles on both 12C and 16O nuclei. We reanalyzed the experimental elastic scattering angular distribution data for the α+12C and α+16O nuclear systems at different energies using both the optical model and double folding potentials based on different interaction models: CDM3Y1, DDM3Y1, CDM3Y6 and BDM3Y1. The potential created by the BDM3Y1 interaction model has the shallowest depth, which reflects the need to use a higher renormalization factor (Nr). Both the optical model and the double folding potentials of the different interaction models fairly reproduce the experimental data.

Keywords: density distribution, double folding, elastic scattering, nuclear rainbow, optical model

Procedia PDF Downloads 232
22894 Energy Consumption and Economic Growth: Testimony of Selected Sub-Saharan African Countries

Authors: Alfred Quarcoo

Abstract:

The main purpose of this paper is to examine the causal relationship between energy consumption and economic growth in Sub-Saharan Africa using panel data techniques. Annual data on energy consumption and economic growth (proxied by real gross domestic product per capita), spanning 1990 to 2016 and taken from the World Bank indicators database, were used. The results of the Augmented Dickey–Fuller unit root test show that the series for all countries are not stationary at levels. The log of economic growth in Benin and Congo becomes stationary after first differencing, the log of energy consumption becomes stationary for all countries, and the log of economic growth in Kenya and Zimbabwe was found to be stationary after taking second differences of the panel series. The findings of the Johansen cointegration test demonstrate that the log of energy consumption and the log of economic growth are not cointegrated for the cases of Kenya and Zimbabwe, so no long-run relationship between the variables was established in any country. The Granger causality test indicates that there is a unidirectional causality running from energy use to economic growth in Kenya and no causal linkage between energy consumption and economic growth in Benin, Congo and Zimbabwe.
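
The unit-root and causality steps described above could be run along the following lines; this is a minimal sketch with hypothetical file and column names, not the authors' script.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

df = pd.read_csv("kenya_energy_gdp.csv")        # hypothetical file: annual data 1990-2016
log_energy = np.log(df["energy_use"])           # log of energy consumption per capita
log_gdp = np.log(df["gdp_per_capita"])          # log of real GDP per capita

# Augmented Dickey-Fuller test on first differences of each series
for name, series in [("log energy", log_energy), ("log GDP", log_gdp)]:
    stat, pval, *_ = adfuller(series.diff().dropna())
    print(f"ADF on d({name}): p = {pval:.3f}")   # p < 0.05 -> stationary after differencing

# Granger causality: does energy use help predict GDP growth?
data = pd.concat([log_gdp.diff(), log_energy.diff()], axis=1).dropna()
grangercausalitytests(data, maxlag=2)            # tests the second column as a cause of the first
```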

Keywords: cointegration, Granger causality, Sub-Saharan Africa, World Bank development indicators

Procedia PDF Downloads 47
22893 Time Travel Testing: A Mechanism for Improving Renewal Experience

Authors: Aritra Majumdar

Abstract:

While organizations strive to expand their new customer base, retaining existing relationships is a key aspect of improving overall profitability and also of showcasing how successful an organization is in holding on to its customers. It is an experimentally proven fact that the lion’s share of profit always comes from existing customers. Hence, seamless management of renewal journeys across different channels goes a long way in improving trust in the brand. From a quality assurance standpoint, time travel testing provides an approach for both business and technology teams to enhance the customer experience when customers look to extend their partnership with the organization for a defined phase of time. This whitepaper will focus on the key pillars of time travel testing: time travel planning, time travel data preparation, and enterprise automation. Along with that, it will call out some of the best practices and common accelerator implementation ideas which are generic across verticals like healthcare, insurance, etc. In this abstract document, a high-level snapshot of these pillars is provided. Time Travel Planning: The first step in setting up a time travel testing roadmap is appropriate planning. Planning includes identifying the impacted systems that need to be time traveled backward or forward depending on the business requirement, aligning time travel with other releases, deciding the frequency of time travel testing, preparing to handle renewal issues in production after time travel testing is done and, most importantly, planning for test automation during time travel testing. Time Travel Data Preparation: One of the most complex areas in time travel testing is test data coverage. Aligning test data to cover required customer segments and narrowing it down to multiple offer sequencing based on defined parameters are keys to successful time travel testing. Another aspect is the availability of sufficient data for similar combinations to support activities like defect retesting, regression testing, post-production testing (if required), etc. This section will talk about the necessary steps for suitable data coverage and sufficient data availability from a time travel testing perspective. Enterprise Automation: Time travel testing is never restricted to a single application. The workflow needs to be validated in the downstream applications to ensure consistency across the board. Along with that, the correctness of offers across different digital channels needs to be checked in order to ensure a smooth customer experience. This section will talk about the focus areas of enterprise automation and how automation testing can be leveraged to improve the overall quality without compromising on the project schedule. Along with the above-mentioned items, the white paper will elaborate on the best practices that need to be followed during time travel testing and some ideas pertaining to accelerator implementation. To sum it up, this paper is based on the author's real-world experience with time travel testing. While actual customer names and program-related details will not be disclosed, the paper will highlight the key learnings which will help other teams to implement time travel testing successfully.

Keywords: time travel planning, time travel data preparation, enterprise automation, best practices, accelerator implementation ideas

Procedia PDF Downloads 153
22892 Electronic Data Interchange (EDI) in the Supply Chain: Impact on Customer Satisfaction

Authors: Hicham Amine, Abdelouahab Mesnaoui

Abstract:

Electronic data interchange (EDI) is the computer-to-computer exchange of structured business information. This information typically takes the form of standardized electronic business documents, such as invoices, purchase orders, bills of lading, and so on. The purpose of this study is to identify the impact EDI might have on the supply chain and, in particular, on customer satisfaction, keeping in mind the constraints the organization might face. This study included 139 subject matter experts (SMEs) who participated by responding to a distributed survey. 85% of respondents were strongly in favor of the implementation, while 10% were neutral and 5% were against it. In the quality assurance department, 75% of the clients agreed to move forward with the change, whereas 10% stayed neutral and 15% were against it. In the legal department, 80% of the answers were for the implementation, 10% of the participants stayed neutral, and the remaining 10% were against it. The participants were 40% male and 60% female (sex ratio F/M = 1.5). The survey also distinguished three categories of technical background: 80% of respondents had a technical background, 15% had a nontechnical background, and 5% had an average technical background. This study examines the impact of EDI on customer satisfaction, which is the primary hypothesis, and justifies the importance of an implementation that enhances customer satisfaction.

Keywords: electronic data interchange, supply chain, subject matter experts, customer satisfaction

Procedia PDF Downloads 333
22891 Accelerating Side Channel Analysis with Distributed and Parallelized Processing

Authors: Kyunghee Oh, Dooho Choi

Abstract:

Even when there is no theoretical weakness in a cryptographic algorithm, Side Channel Analysis can extract secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing information, power consumption, electromagnetic leaks or even sound, which can be exploited to break the system. Differential Power Analysis is one of the most popular analyses; it computes statistical correlations between secret-key hypotheses and power consumption. It usually requires processing huge amounts of data and takes a long time; it may take several weeks for some devices with countermeasures. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
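
As an illustration of the parallelization idea, and not the authors' implementation, the sketch below spreads the correlation step of a toy DPA across worker processes, one key guess per task; the leakage model, trace sizes, and data are all illustrative assumptions, and the same task split could also be distributed across machines.

```python
import numpy as np
from multiprocessing import Pool

def score_key_guess(args):
    """Correlate a hypothetical power model for one key guess against all traces."""
    guess, plaintexts, traces = args
    # Toy leakage model: Hamming weight of plaintext XOR key guess (S-box omitted).
    model = np.array([bin(int(p) ^ guess).count("1") for p in plaintexts], dtype=float)
    corr = np.corrcoef(model, traces.T)[0, 1:]      # correlation at every sample point
    return guess, float(np.max(np.abs(corr)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    plaintexts = rng.integers(0, 256, 1000)          # illustrative inputs
    traces = rng.normal(size=(1000, 200))            # 1000 traces x 200 sample points
    tasks = [(g, plaintexts, traces) for g in range(256)]
    with Pool() as pool:                              # parallelized processing of key guesses
        scores = pool.map(score_key_guess, tasks)
    best_guess, best_corr = max(scores, key=lambda s: s[1])
    print(f"best key guess = {best_guess}, |corr| = {best_corr:.3f}")
```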

Keywords: DPA, distributed computing, parallelized processing, side channel analysis

Procedia PDF Downloads 423
22890 Monitoring of Hydrological Parameters in the Alexandra Jukskei Catchment in South Africa

Authors: Vhuhwavho Gadisi, Rebecca Alowo, German Nkhonjera

Abstract:

It has been noted that technical systems for managing groundwater resources are not readily accessible. The lack of such systems hinders the groundwater management processes necessary for decision-making through monitoring and evaluation in the Jukskei River catchment of the Crocodile River (West) Basin in Johannesburg, South Africa. Several challenges have been identified in South Africa's Jukskei Catchment concerning groundwater management, including the following: there are gaps in data records; there is a need for training and equipping of monitoring staff and for formal accreditation of monitoring capacities and equipment; and there is no access to regulation terms (e.g., meters). Taking into consideration the necessities and human requirements implied by typical densities in various regions of South Africa, several groundwater level monitoring stations need to be constructed in a given segment; the available raw groundwater level data should be converted into consumable products, for example, short reports on sensitive areas (e.g., dolomite compartments, wetlands, aquifers, and sole sources); and, given increasing civil unrest, there has been vandalism and theft of groundwater monitoring infrastructure. GIS was employed at the catchment level to map the relationship between the identified groundwater parameters in the catchment area and the identified borehole. GIS-based maps were designed for groundwater monitoring and pretested on one borehole in the Jukskei catchment. These data will be used to establish changes in the borehole compared with changes in the catchment area according to the identified parameters.

Keywords: GIS, monitoring, Jukskei, catchment

Procedia PDF Downloads 89
22889 Transportation Mode Classification Using GPS Coordinates and Recurrent Neural Networks

Authors: Taylor Kolody, Farkhund Iqbal, Rabia Batool, Benjamin Fung, Mohammed Hussaeni, Saiqa Aleem

Abstract:

The rising threat of climate change has led to an increase in public awareness of and care about our collective and individual environmental impact. A key component of this impact is our use of cars and other polluting forms of transportation, but it is often difficult for an individual to know how severe this impact is. While there are applications that offer this feedback, they require manual entry of the transportation mode used for a given trip, which can be burdensome. In order to alleviate this shortcoming, data from the 2016 TRIPlab datasets have been used to train a variety of machine learning models to automatically recognize the mode of transportation. An accuracy of 89.6% is achieved using a single deep neural network model with a Gated Recurrent Unit (GRU) architecture applied directly to trip data points, over four primary classes, namely walking, public transit, car, and bike. These results are comparable in accuracy to results achieved by others using ensemble methods and require far less computation when classifying new trips. The lack of trip context data, e.g., bus routes, bike paths, etc., and the need for only a single set of weights make this an appropriate methodology for applications hoping to reach a broad demographic and provide responsive feedback.
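
A minimal sketch of a GRU classifier of the kind described above is given below; the sequence length, feature set, and random stand-in data are assumptions, and it is not the TRIPlab pipeline.

```python
import numpy as np
import tensorflow as tf

NUM_CLASSES = 4                 # walking, public transit, car, bike
SEQ_LEN, NUM_FEATURES = 200, 3  # assumed trip length and per-point features (e.g., speed)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN, NUM_FEATURES)),
    tf.keras.layers.Masking(mask_value=0.0),        # shorter trips padded with zeros
    tf.keras.layers.GRU(64),                        # gated recurrent unit layer
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Illustrative random data standing in for preprocessed trip segments.
X = np.random.rand(512, SEQ_LEN, NUM_FEATURES).astype("float32")
y = np.random.randint(0, NUM_CLASSES, 512)
model.fit(X, y, epochs=2, batch_size=32, validation_split=0.2)
```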

Keywords: classification, gated recurrent unit, recurrent neural network, transportation

Procedia PDF Downloads 132
22888 Data Mining to Capture User-Experience: A Case Study in Notebook Product Appearance Design

Authors: Rhoann Kerh, Chen-Fu Chien, Kuo-Yi Lin

Abstract:

In an era of a rapidly growing notebook market, consumer electronics manufacturers are facing a highly dynamic and competitive environment. In particular, product appearance is the first cue by which users distinguish a product from those of other brands. A notebook product should therefore differ in its appearance to engage users and contribute to the user experience (UX). UX evaluation compares various product concepts to find designs that meet user needs and, in addition, helps the designer to further understand the product appearance preferences of different market segments. However, few studies have explored the relationship between consumer background and reactions to product appearance. This study aims to propose a data mining framework to capture user information and the important relations among product appearance factors. The proposed framework consists of problem definition and structuring, data preparation, rule generation, and results evaluation and interpretation. An empirical study was conducted in Taiwan that recruited 168 subjects from different backgrounds to evaluate the appearance of 11 different portable computers. The results assist designers in developing product strategies based on the characteristics of consumers and on the product concepts related to the UX, which helps to launch products to the right customers and increase market share. The results have shown the practical feasibility of the proposed framework.

Keywords: consumers decision making, product design, rough set theory, user experience

Procedia PDF Downloads 304
22887 Audit of TPS Photon Beam Dataset for Small Field Output Factors Using OSLDs against RPC Standard Dataset

Authors: Asad Yousuf

Abstract:

Purpose: The aim of the present study was to audit a treatment planning system beam dataset for small field output factors against the standard dataset produced by the Radiological Physics Center (RPC) from a multicenter study. Such data are crucial for the validity of special techniques, i.e., IMRT or stereotactic radiosurgery. Materials/Method: In this study, multiple small field output factor datasets were measured and calculated for 6 to 18 MV x-ray beams using the RPC-recommended methods. These beam datasets were measured at 10 cm depth for 10 × 10 cm² to 2 × 2 cm² field sizes, defined by the collimator jaws at 100 cm. The measurements were made with Landauer nanoDot OSLDs, whose volume is small enough to gather a full ionization reading even for a 1 × 1 cm² field size. At our institute, the beam data, including output factors, were commissioned at 5 cm depth with an SAD setup. For comparison with the RPC data, the output factors were converted to an SSD setup using tissue phantom ratios. The SSD setup also enables full coverage of the ion chamber in the 2 × 2 cm² field size. The measured output factors were also compared with those calculated by the Eclipse™ treatment planning software. Result: The measured and calculated output factors are in agreement with the RPC dataset within 1% and 4%, respectively. The larger discrepancies in the TPS reflect the increased challenge of converting measured data into a commissioned beam model for very small fields. Conclusion: OSLDs are a simple, durable, and accurate tool to verify doses delivered using small photon beam fields down to a 1 × 1 cm² field size. The study emphasizes that the treatment planning system should always be evaluated for small field output factors to ensure accurate dose delivery in the clinical setting.

Keywords: small field dosimetry, optically stimulated luminescence, audit treatment, radiological physics center

Procedia PDF Downloads 320
22886 Nonlinear Multivariable Analysis of CO2 Emissions in China

Authors: Hsiao-Tien Pao, Yi-Ying Li, Hsin-Chia Fu

Abstract:

This paper addresses the impacts of energy consumption, economic growth, financial development, and population size on environmental degradation using grey relational analysis (GRA) for China, where foreign direct investment (FDI) inflow is the proxy variable for financial development. The more recent historical data for the period 2004–2011 are used, because the use of very old data may not be suitable for rapidly developing countries. The results of the GRA indicate that the linkage effects of energy consumption–emissions and GDP–emissions are ranked first and second, respectively. This reveals that energy consumption and economic growth are strongly correlated with emissions. Higher economic growth requires more energy consumption and increases environmental pollution; likewise, more efficient energy use needs a higher level of economic development. Therefore, policies to improve energy efficiency and create a low-carbon economy can reduce emissions without hurting economic growth. The FDI–emissions linkage is ranked third, which indicates that China does not apply weak environmental regulations to attract inward FDI. Furthermore, China's government should strengthen environmental policy when attracting inward FDI. The population–emissions linkage effect is ranked fourth, implying that population size does not directly affect CO2 emissions, even though China has the world's largest population and Chinese people use energy-related products very economically. Overall, energy conservation, efficiency improvement, demand management, and financial development, which aim at curtailing the waste of energy and reducing both energy consumption and emissions without loss of the country's competitiveness, can be adopted by developing economies. GRA is one of the best ways to build a dynamic analysis model from a small amount of data.
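
For readers unfamiliar with the method, the sketch below shows a generic grey relational analysis computation with placeholder annual series; the numbers are illustrative and are not the study's data.

```python
import numpy as np

def grey_relational_grades(reference, factors, rho=0.5):
    """Grey relational grade of each factor series with respect to the reference."""
    def norm(x):
        return (x - x.min()) / (x.max() - x.min())
    ref = norm(reference)
    deltas = np.array([np.abs(ref - norm(f)) for f in factors])   # deviation sequences
    dmin, dmax = deltas.min(), deltas.max()                       # global extrema
    coeffs = (dmin + rho * dmax) / (deltas + rho * dmax)          # relational coefficients
    return coeffs.mean(axis=1)                                    # one grade per factor

# Illustrative annual series for 2004-2011 (8 points); values are placeholders.
emissions  = np.array([5.3, 5.7, 6.1, 6.6, 6.9, 7.3, 7.7, 8.0])
energy     = np.array([1.8, 2.0, 2.1, 2.2, 2.3, 2.4, 2.5, 2.6])
gdp        = np.array([2.0, 2.3, 2.7, 3.6, 4.6, 5.1, 6.1, 7.6])
fdi        = np.array([0.6, 0.7, 0.7, 0.8, 1.1, 0.9, 1.1, 1.2])
population = np.array([1.30, 1.31, 1.31, 1.32, 1.33, 1.33, 1.34, 1.34])

grades = grey_relational_grades(emissions, [energy, gdp, fdi, population])
for name, g in zip(["energy", "GDP", "FDI", "population"], grades):
    print(f"{name}: grade = {g:.3f}")   # a higher grade means a stronger linkage
```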

Keywords: China, CO₂ emissions, foreign direct investment, grey relational analysis

Procedia PDF Downloads 401
22885 Retrospective Insight on the Changing Status of the Romanian Language Spoken in the Republic of Moldova

Authors: Gina Aurora Necula

Abstract:

From its transformation into a taboo and its concealment under the so-called “Moldovan language” or under the euphemistic expression “state language” to the recognition it has regained as an official language, the Romanian language spoken in the Republic of Moldova has undergone impressive reforms in the last 60 years. Meant to erase citizens' awareness of their ethnic identity and to turn a majority language into a minority one, all the laws and regulations issued in this field succeeded in setting numerous barriers for speakers of Romanian. Whether manifested as social constraints or materialized as an assumed rejection of mother tongue usage, these laws demonstrated their effectiveness and had a major impact on the Romanian-speaking population. This article is the result of our research carried out over 10 years with the support of students and Moldovan citizens from the master's degree program "Romanian language - identity and cultural awareness." We present here a retrospective insight into the reforms, laws, and regulations that contributed to the shift in the status of the Romanian language from an official language, seen as the language of common use in both the public and private spheres, to a minority language that surrendered its privileged place to the Russian language, first in the public sphere and then, slowly but surely, in the private sphere. Our main goal here is to identify, and to help speakers understand, what the barriers to learning Romanian are nowadays, when the social pressure to use Russian no longer exists.

Keywords: linguistic barriers, lingua franca, private sphere, public sphere, reformation

Procedia PDF Downloads 110
22884 Estimation of Longitudinal Dispersion Coefficient Using Tracer Data

Authors: K. Ebrahimi, Sh. Shahid, M. Mohammadi Ghaleni, M. H. Omid

Abstract:

The longitudinal dispersion coefficient is a crucial parameter for 1-D water quality analysis of riverine flows. So far, different types of empirical equations for estimating the coefficient have been developed, based on various case studies. The main objective of this paper is to develop an empirical equation for estimating the coefficient for a riverine flow. For this purpose, a set of tracer experiments using a salt tracer was conducted at three sections located downstream in a long canal. Tracer data were measured at three mixing lengths along the canal: 45, 75 and 100 m. According to the results, the coefficients obtained from the newly developed empirical equation show an encouraging level of agreement with the theoretical values.
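
For context, dispersion coefficients are commonly estimated from paired concentration-time curves by the classical method of moments; the sketch below illustrates that generic approach with hypothetical tracer curves and is not the authors' equation or data.

```python
import numpy as np

def moments(t, c):
    """Centroid time and temporal variance of a concentration-time curve."""
    m0 = np.trapz(c, t)
    tbar = np.trapz(c * t, t) / m0
    var = np.trapz(c * (t - tbar) ** 2, t) / m0
    return tbar, var

# Hypothetical salt-tracer curves at two downstream sections 30 m apart.
t = np.linspace(0, 600, 601)                            # seconds
c_up = np.exp(-((t - 120) ** 2) / (2 * 30 ** 2))        # upstream section
c_dn = 0.7 * np.exp(-((t - 220) ** 2) / (2 * 45 ** 2))  # downstream section

t1, var1 = moments(t, c_up)
t2, var2 = moments(t, c_dn)
U = 30.0 / (t2 - t1)                               # mean velocity from travel time (dx = 30 m)
Ex = 0.5 * U ** 2 * (var2 - var1) / (t2 - t1)      # longitudinal dispersion coefficient (m^2/s)
print(f"U = {U:.3f} m/s, Ex = {Ex:.3f} m^2/s")
```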

Keywords: coefficients, dispersion, river, tracer, water quality

Procedia PDF Downloads 386
22883 Art, Nature, and City in the Construction of Contemporary Public Space

Authors: Rodrigo Coelho

Abstract:

We believe that in the majority of the recent production of public space, the overvaluation of the "image", of the "ephemeral" and of the "objectual" has come to determine the configuration of banal and (more or less) arbitrary "public spaces", mostly linked to a problem of "outdoor decoration" and reflecting a clear sign of uncertainty and arbitrariness about the meaning, role and shape of public space and public art. This "inconsistency", which is essentially linked to the loss of the urban, but also social, cultural and political, vocation of the disciplines that shape urban space, and also to the lack of urban and technical culture among technicians and policy makers, has converted a significant portion of recently built "public space" and "urban art" into diffuse and multi-referenced pieces, which generally share an inability to confer civic, aesthetic, social and symbolic meanings on urban space. In this sense, we consider it essential to undertake a theoretical reflection on the values, meaning(s) and shape(s) that open space and urban art may (or must) take in the current urban and cultural context, in order to restore to public space its status as a significant physical reference, able to embody a spatial and urban identity and simultaneously to enable the collective accession to and appropriation of public space. Taking as reference public space interventions built in the last decade in the European context, we will seek to explore and defend the need to consider public space as a true place of exception, an exceptional support where the emphasis is placed on the quality of the experience, especially through the relations that public space and urban art can establish with the city, with nature and with geography in a broad sense, referring us back to a close, inseparable and timeless relationship between nature and culture.

Keywords: art, city, nature, public space

Procedia PDF Downloads 445
22882 Game-Based Learning in a Higher Education Course: A Case Study with Minecraft Education Edition

Authors: Salvador Antelmo Casanova Valencia

Abstract:

This study documents the use of the Minecraft Education Edition application to explore immersive game-based learning environments. We analyze the contributions of fourth-year university students who are pursuing a degree in Administrative Computing at the Universidad Michoacana de San Nicolas de Hidalgo. Descriptive data and statistical inference are reported from a quasi-experimental design using the Wilcoxon test, and the instruments provide data validation. Game-based learning in immersive environments necessarily implies greater student participation and commitment, resulting in greater study motivation and significant improvements, and promoting cooperation and autonomous learning.

Keywords: game-based learning, gamification, higher education, Minecraft

Procedia PDF Downloads 159
22881 Determining the Direction of Causality between Creating Innovation and Technology Market

Authors: Liubov Evstigneeva

Abstract:

In this paper an attempt is made to establish causal nexuses between innovation and international trade in Russia. The topicality of this issue is determined by the necessity of choosing policy instruments for economic modernization and the transition to innovative development. A vector autoregression (VAR) model and the Granger causality test are applied to Russian monthly data from 2005 until the second quarter of 2015. Both lagged imports and exports at the national level cause innovation, while innovation starts to stimulate foreign trade only at more distant lags. In comparison with the aggregate data, the results by patent category are more diverse. Importing technologies from foreign countries stimulates patent activity, while innovations created in Russia are a Granger cause only of imports to the Commonwealth of Independent States.
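
A minimal sketch of the VAR and Granger-causality steps described above follows; the file and column names are hypothetical and it is not the authors' estimation code.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

df = pd.read_csv("russia_monthly.csv", parse_dates=["date"], index_col="date")
data = np.log(df[["patents", "imports", "exports"]]).diff().dropna()  # stationarise the series

model = VAR(data)
results = model.fit(maxlags=12, ic="aic")        # lag order chosen by AIC
print(results.summary())

# Does lagged import activity Granger-cause patenting activity?
print(results.test_causality("patents", ["imports"], kind="f").summary())
```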

Keywords: export, import, innovation, patents

Procedia PDF Downloads 318
22880 Using Inverted 4-D Seismic and Well Data to Characterise Reservoirs from Central Swamp Oil Field, Niger Delta

Authors: Emmanuel O. Ezim, Idowu A. Olayinka, Michael Oladunjoye, Izuchukwu I. Obiadi

Abstract:

Monitoring of reservoir properties prior to well placement and production is a requirement for the optimisation and efficient production of oil and gas. This is usually done using well log analyses and 3-D seismic, which are often prone to errors. However, 4-D (time-lapse) seismic, which incorporates numerous 3-D seismic surveys of the same field with the same acquisition parameters and portrays the transient changes in the reservoir due to production effects over time, could be utilised because it generates better resolution. There is, however, a dearth of information on the applicability of this approach in the Niger Delta. This study was therefore designed to apply 4-D seismic, well-log and geologic data to the monitoring of reservoirs in the EK field of the Niger Delta. It aimed at locating bypassed accumulations and ensuring effective reservoir management. The field (EK) covers an area of about 1200 km² belonging to the early Miocene (18 Ma). Data covering two 4-D vintages acquired over a fifteen-year interval were obtained from oil companies operating in the field. The data were analysed to determine the seismic structures, horizons, well-to-seismic tie (WST), and wavelets. Well logs and production history data from fifteen selected wells were also collected from the oil companies. Formation evaluation, petrophysical analysis and inversion, alongside geological data, were undertaken using Petrel, Shell-nDi, Techlog and Jason software. Well-to-seismic ties, formation evaluation and saturation monitoring using petrophysical and geological data and software were used to find bypassed hydrocarbon prospects. The seismic vintages were interpreted, and the amounts of change in the reservoir were defined by the differences in acoustic impedance (AI) inversions of the base and the monitor seismic. AI rock properties were estimated from all the seismic amplitudes using controlled sparse-spike inversion, and the estimated rock properties were used to produce AI maps. The structural analysis showed the dominance of NW-SE trending rollover collapsed-crest anticlines in EK, with hydrocarbons trapped northwards. There were good ties in wells EK 27 and EK 39. The analysed wavelets revealed consistent amplitude and phase for the WST, hence a good match between the inverted impedance and the well data. Evidence of large pay thickness, ranging from 2875 ms (11420 ft TVDSS) to about 2965 ms, was found around well EK 39, with good yield properties. The comparison between the base AI and the current monitor AI, together with the generated AI maps, revealed zones of untapped hydrocarbons and assisted in determining fluid movement. The inverted sections through EK 27 and EK 39 (within 3101 m to 3695 m) indicated depletion in the reservoirs. The extent of the present non-uniform gas-oil contact and oil-water contact movements was from 3554 to 3575 m. The 4-D seismic approach led to better reservoir characterization, well development and the location of deeper and bypassed hydrocarbon reservoirs.

Keywords: reservoir monitoring, 4-D seismic, well placements, petrophysical analysis, Niger delta basin

Procedia PDF Downloads 113
22879 Analysis of Cardiovascular Diseases Using Artificial Neural Network

Authors: Jyotismita Talukdar

Abstract:

In this paper, a study has been made of the possibility and accuracy of early prediction of several heart diseases using an Artificial Neural Network (ANN). The study has been made in both a noise-free environment and a noisy environment. The data collected for this analysis come from five hospitals: around 1,500 heart patients' records have been collected and studied. The data were analysed and the results compared with the doctors' diagnoses. It is found that, in the noise-free environment, the accuracy varies from 74% to 92%, and in the noisy environment (2 dB) the accuracy varies from 62% to 82%. In the present study, the four basic attributes considered are blood pressure (BP), fasting blood sugar (FBS), Thalach (THAL) and cholesterol (CHOL). The highest accuracy (93%) has been achieved in the case of PPI (post permanent pacemaker implantation), around 79% in the case of CAD (coronary artery disease), 87% in DCM (dilated cardiomyopathy), 89% in the case of RHD & MS (rheumatic heart disease with mitral stenosis), 75% in the case of RBBB + LAFB (right bundle branch block + left anterior fascicular block), 72% for CHB (complete heart block), etc. The lowest accuracies have been obtained in the case of ICMP (ischemic cardiomyopathy), about 38%, and AF (atrial fibrillation), about 60 to 62%.
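
A minimal sketch of a small feed-forward ANN over the four attributes named above is given below; the random stand-in data, network size, and noise level are assumptions, not the authors' setup.

```python
import numpy as np
import tensorflow as tf

# Illustrative random data standing in for the hospital records (scaled features).
X = np.random.rand(1500, 4).astype("float32")    # BP, FBS, Thalach, Cholesterol
y = np.random.randint(0, 2, 1500)                # 1 = positive diagnosis for one condition

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)

# Robustness to measurement noise can be probed by evaluating on perturbed inputs.
X_noisy = X + np.random.normal(0.0, 0.1, X.shape).astype("float32")
print(model.evaluate(X_noisy, y, verbose=0))
```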

Keywords: coronary heart disease, chronic stable angina, sick sinus syndrome, cardiovascular disease, cholesterol, Thalach

Procedia PDF Downloads 171
22878 Damage Detection in a Cantilever Beam under Different Excitation and Temperature Conditions

Authors: A. Kyprianou, A. Tjirkallis

Abstract:

Condition monitoring of structures in service is very important, as it provides information about the risk of damage development. One of the essential constituents of structural condition monitoring is the damage detection methodology. In the context of condition monitoring of in-service structures, a damage detection methodology analyses data obtained from the structure while it is in operation. Usually, this means that the data could be affected by operational and environmental conditions in a way that could mask the effects of possible damage on the data. Depending on the damage detection methodology, this could lead either to false alarms or to missed existing damage. In this article, a damage detection methodology based on spatio-temporal continuous wavelet transform (SPT-CWT) analysis of a sequence of experimental time responses of a cantilever beam is proposed. The cantilever is subjected to white and pink noise excitation to simulate different operating conditions. In addition, in order to simulate changing environmental conditions, the cantilever is heated with a heat gun. The response of the cantilever beam is measured by a high-speed camera. Edges are extracted from the series of images of the beam response captured by the camera, and subsequent processing of the edges gives a series of time responses at 439 points on the beam. This sequence is then analyzed using the SPT-CWT to identify damage. The proposed algorithm was able to clearly identify damage under any condition when the structure was excited by a white noise force. In addition, in the case of white noise excitation, the analysis could also reveal the position of the heat gun when it was used to heat the structure. The analysis could identify the different operating conditions, i.e., distinguish between responses due to white noise excitation and responses due to pink noise excitation. During pink noise excitation, although damage and changing temperature were identified, it was not possible to clearly separate the effect of damage from that of temperature. The methodology proposed in this article for damage detection enables the separation of the damage effect from the effects of temperature and excitation in data obtained from measurements of a cantilever beam. This methodology does not require information about the a priori state of the structure.

Keywords: spatiotemporal continuous wavelet transform, damage detection, data normalization, varying temperature

Procedia PDF Downloads 272
22877 The Mediating Role of Resilience in the Association Between Stigma and Psychosocial Adjustment: A Cross-sectional Study Among Young and Middle-Aged Patients With Lung Cancer

Authors: Ziyun Li, Jiudi Zhong, June Zhang

Abstract:

Background: The diagnosis and treatment of lung cancer lead to varying degrees of psychological and social maladjustment among patients with lung cancer. Understanding psychosocial adjustment (PA) and its influencing factors in young and middle-aged lung cancer patients is essential to help them return to society and lead a normal life. Objectives: This study aims to examine the mediating role of resilience in the association between stigma and psychosocial adjustment among young and middle-aged patients with lung cancer. Methods: A total of 235 patients with lung cancer were recruited from a tertiary grade A cancer center in southern China and investigated using a self-designed general information questionnaire, the Psychosocial Adjustment to Illness Scale Self-Report, the Social Impact Scale, and the Connor-Davidson Resilience Scale. Results: The mean score of PA was 32.61 ± 14.75, and its influencing factors included treatment modality, stigma, and resilience. The total effect of stigma on PA was significant (total effect = 0.418, SE = 0.045, 95% CI [0.310, 0.497]), and a positive indirect effect was identified for stigma on PA via resilience (indirect effect = 0.143, SE = 0.041, 95% CI [0.075, 0.236]). Conclusion: Stigma and resilience are significantly associated with PA, and resilience is a mediating variable between stigma and PA. This study suggests that individualized interventions can improve PA in young and middle-aged lung cancer patients by alleviating their stigma or enhancing their resilience.
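
For illustration, an indirect (mediated) effect of the kind reported above is commonly tested with a percentile bootstrap of the product of paths; the sketch below simulates hypothetical standardized scores (signs chosen so the indirect effect is positive, as reported) and is not the authors' analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 235                                                   # sample size from the abstract
stigma = rng.normal(size=n)                               # hypothetical standardized scores
resilience = -0.5 * stigma + rng.normal(size=n)           # path a (illustrative)
pa = -0.4 * resilience + 0.3 * stigma + rng.normal(size=n)  # paths b and c'

def indirect_effect(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                        # x -> m
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]  # m -> y given x
    return a * b

boot = []
for _ in range(2000):                                     # percentile bootstrap of a*b
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(stigma[idx], resilience[idx], pa[idx]))
print("95% CI for indirect effect:", np.percentile(boot, [2.5, 97.5]))
```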

Keywords: psychosocial adjustment, lung cancer, cancer caring, nursing, young and middle-aged

Procedia PDF Downloads 85
22876 Investigating Predisposing Factors for Domestic Violence against Women

Authors: Mozhgan Sigarchian, Shiva Alizadeh, Seyedeh Akram Nazarkardeh

Abstract:

Introduction: One of the most common forms of violence against women is domestic violence, and it is one of the most acute social problems, affecting individual physical and mental health and, in turn, the health of the family and the community. Throughout the world, and especially in developing countries, women suffer violence during their lifetimes. Violence against women and girls is a serious threat to health and human rights, and several factors, such as low literacy, low income and poverty, affect it. The purpose of this study was to determine the factors conducive to domestic violence against women in Rasht, Iran, so that, based on the findings, preventive measures can be taken to reduce violence and increase support for women. Methods: This is a descriptive-analytic study performed on 300 eligible women referred to clinics and physicians' offices in Rasht, Iran, in 2017, selected by the convenience sampling method. The instruments included a demographic questionnaire and a domestic violence questionnaire with three domains: physical, psychological, and sexual violence. Data were analyzed in SPSS using the independent t-test, chi-square and Mann-Whitney tests. Results: The mean age in the groups with and without domestic violence was 28.31 ± 6.097 and 32.52 ± 9.8 years, respectively. 168 women (56%) reported experiencing violence. The results indicate that there is a significant relationship between violence and age, husband's age, number of family members, and women's educational level. However, there was no significant relationship between violence and duration of marriage, husbands' education, the occupation of women and their husbands, housing situation, or smoking. Conclusion: The results showed that factors such as education, age, and the number of family members can affect the level of violence. Given these results, as well as the high prevalence of domestic violence among the women in this study, it is suggested that training be given to families to increase women's empowerment and prevent violence against women.

Keywords: domestic violence, predisposing factors, violence, women

Procedia PDF Downloads 197