Search results for: features of federalism
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3814

2134 The Restoration of the Old District in the Urbanization: The Case Study of Samsen Riverside Community, Dusit District, Bangkok

Authors: Tikhanporn Punluekdej, Saowapa Phaithayawat

Abstract:

The objectives of this research are: 1) to discover the mechanism of the restoration process of the old district, and 2) to study people participation in the community together with related units. This research utilizes a qualitative research method together with tools from historical and anthropological disciplines. The research revealed that the restoration process of the old district started with the needs of the local people in the community. These people are considered a young generation in the community. The leading group of the community played a vital role in the restoration process by igniting the whole idea, followed by help from those who have lived in the area for more than fifty years. The restoration process is the genuine desire of the local people, without the intervention of local politics. The core group coordinated with related units, for instance academic institutions, to identify the most prominent historical features of the community, including its settlement. The Crown Property Bureau, as the sole owner of the land, joined the restoration in the physical development dimension. The restoration was possible due to the cooperation between the local people and related units under designated plans, budgets, and social activities.

Keywords: restoration, urban area, old district, people participation

Procedia PDF Downloads 406
2133 The Convolution Recurrent Network of Using Residual LSTM to Process the Output of the Downsampling for Monaural Speech Enhancement

Authors: Shibo Wei, Ting Jiang

Abstract:

Convolutional-recurrent neural networks (CRN) have achieved much success recently in the speech enhancement field. The common processing method is to use convolution layers to compress the feature space through multiple downsampling steps and then model the compressed features with LSTM layers. At last, the enhanced speech is obtained by deconvolution operations that integrate the global information of the speech sequence. However, the feature space compression process may cause a loss of information, so we propose to model the downsampling result of each step with a residual LSTM layer, then join it with the output of the corresponding deconvolution layer and feed them to the next deconvolution layer; in this way, we aim to integrate the global information of the speech sequence better. The experimental results show that the network model we introduce (RES-CRN) achieves better performance than the original CRN, whether using LSTM without residual connections or simply stacking LSTM layers, in terms of scale-invariant signal-to-noise ratio (SI-SNR), perceptual evaluation of speech quality (PESQ), and short-time objective intelligibility (STOI).
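
A minimal sketch of this kind of architecture (my illustration under assumed layer sizes, not the authors' RES-CRN): a one-dimensional convolutional encoder downsamples the noisy input, a residual LSTM models the downsampled features, and a transposed convolution reconstructs the enhanced speech.

```python
# Minimal sketch (assumed architecture, not the authors' exact RES-CRN):
# conv encoder -> residual LSTM on the downsampled features -> deconv decoder.
import torch
import torch.nn as nn

class ResidualLSTMBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.lstm = nn.LSTM(channels, channels, batch_first=True)

    def forward(self, x):                     # x: (batch, channels, time)
        y, _ = self.lstm(x.transpose(1, 2))   # LSTM over the time axis
        return x + y.transpose(1, 2)          # residual connection

class TinyCRN(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Conv1d(1, 16, kernel_size=4, stride=2, padding=1)   # downsample
        self.res = ResidualLSTMBlock(16)
        self.dec = nn.ConvTranspose1d(16, 1, kernel_size=4, stride=2, padding=1)

    def forward(self, noisy):                 # noisy: (batch, 1, samples)
        z = torch.relu(self.enc(noisy))
        z = self.res(z)                       # model the downsampled features
        return self.dec(z)                    # reconstruct enhanced speech

enhanced = TinyCRN()(torch.randn(2, 1, 256))  # example: a batch of 2 short clips
```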

Keywords: convolutional-recurrent neural networks, speech enhancement, residual LSTM, SI-SNR

Procedia PDF Downloads 196
2132 Tidal Current Behaviors and Remarkable Bathymetric Change in the South-Western Part of Khor Abdullah, Kuwait

Authors: Ahmed M. Al-Hasem

Abstract:

A study of tidal current behavior and bathymetric changes was undertaken in order to establish an information base for future coastal management. The average tidal current velocity was 0.46 m/s, and the maximum velocity was 1.08 m/s during ebb tide. During spring tides, maximum velocities range from 0.90 m/s to 1.08 m/s, whereas during neap tides maximum velocities vary from 0.40 m/s to 0.60 m/s. Despite greater current velocities during flood tide, the bathymetric features enhance the dominance of the ebb tide. This can be related to the abundance of fine sediments from the ebb current approaching the study area and the relatively coarser sediment from the approaching flood current. Significant bathymetric changes were found for the period from 1985 to 1998, with erosion processes dominant. Approximately 96.5% of depth changes occurred within the depth change classes of -5 m to 5 m. The high erosion within the study area will subsequently result in high accretion, particularly in the north, the location of the proposed Boubyan Port and its navigation channel.

Keywords: bathymetric change, Boubyan island, GIS, Khor Abdullah, tidal current behavior

Procedia PDF Downloads 285
2131 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing

Authors: Arjun Kumar Rath, Titus Dhanasingh

Abstract:

Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing in larger quantities. As the components get integrated, the devices are tested for their full functionality using advanced software tools, and benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, they are custom built for every product and remain unusable for other variants, and a majority of the tests go undocumented, are not updated, and become unusable once the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all of these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks and tools (Fuego, LAVA, Autotest, KernelCI, etc.) designed for testing embedded system devices, each with several unique and useful features, but no single tool or framework satisfies all of the testing needs of embedded systems; hence the need for an extensible framework that integrates a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods involve developing test libraries and support components for every new hardware platform, even when it belongs to the same domain and has an identical hardware architecture. This approach has drawbacks: non-reusability, since platform-specific libraries cannot be reused; the need to maintain source infrastructure for individual hardware platforms; and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in test environment setup, scalability, and maintenance. A desirable strategy is one focused on maximizing reusability and continuous integration and on leveraging artifacts across the complete development cycle, during all phases of testing and across a family of products. To overcome the challenges of the conventional method and deliver the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, has been designed that can be deployed in embedded-system-related products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing of different hardware, including microprocessor- and microcontroller-based boards. It offers benefits such as (1) time to market: it accelerates board bring-up with prepackaged test suites supporting all necessary peripherals, speeding up the design and development stages (board bring-up, manufacturing, and device drivers); (2) reusability: framework components are isolated from platform-specific hardware initialization and configuration, making the adaptation of test cases across various platforms quick and simple; (3) an effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; and (4) continuous integration: pre-integration with Jenkins enables continuous testing and an automated software update feature. Applying the embedded test framework accelerator throughout the design and development phase enables the development of well-tested systems before functional verification and improves time to market to a large extent.
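
A minimal sketch of the reusability idea described above (illustrative only, not the ETF code; class and test names are hypothetical): platform-specific bring-up lives behind a small board interface, so the same test cases run unchanged on any board that implements it.

```python
# Minimal sketch (illustrative, not the ETF implementation): reusable test
# cases are kept separate from the platform-specific initialisation.
from abc import ABC, abstractmethod

class Board(ABC):
    """Platform-specific bring-up lives here, one subclass per hardware family."""

    @abstractmethod
    def init_uart(self): ...

    @abstractmethod
    def read_temperature(self) -> float: ...

class DummyBoard(Board):
    def init_uart(self):
        print("uart ready")

    def read_temperature(self) -> float:
        return 42.0

def test_uart_bringup(board: Board) -> bool:      # reusable across boards
    board.init_uart()
    return True

def test_temp_sensor(board: Board) -> bool:
    return 0.0 < board.read_temperature() < 100.0

suite = [test_uart_bringup, test_temp_sensor]
board = DummyBoard()
print({t.__name__: t(board) for t in suite})      # simple pass/fail report
```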

Keywords: board diagnostics software, embedded system, hardware testing, test frameworks

Procedia PDF Downloads 135
2130 Reconsidering Taylor’s Law with Chaotic Population Dynamical Systems

Authors: Yuzuru Mitsui, Takashi Ikegami

Abstract:

The exponents of Taylor’s law in deterministic chaotic systems are computed, and their meanings are intensively discussed. Taylor’s law is the scaling relationship between the mean and variance (in both space and time) of population abundance, and this law is known to hold in a variety of ecological time series. The exponents found in the temporal Taylor’s law are different from those of the spatial Taylor’s law. The temporal Taylor’s law is calculated on time series from the same locations (or the same initial states) across different temporal phases, whereas for the spatial Taylor’s law, the mean and variance are calculated at the same temporal phase across different places. Most previous studies were done with stochastic models, but we computed the temporal and spatial Taylor’s law in deterministic systems: the temporal Taylor’s law was evaluated using the same initial state, and the spatial Taylor’s law was evaluated using the ensemble average and variance. There were two main discoveries from this work. First, it is often stated that deterministic systems tend to have the value two for Taylor’s exponent; however, most of the exponents calculated here were not two. Second, we investigated the relationships of Taylor’s exponents with chaotic features measured by the Lyapunov exponent, the correlation dimension, and other indexes. No strong correlations were found; however, there is some relationship within the same model under different parameter values, and we discuss the meaning of those results at the end of this paper.
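
A worked sketch of how such an exponent can be estimated (my illustration with a logistic map; the model and parameter choices are assumptions, not the authors' setup): the exponent b in variance = a·mean^b is the slope of a log-log regression of each population series' variance on its mean.

```python
# Minimal sketch (illustrative only): estimate a Taylor exponent from chaotic
# logistic-map population series, one series per assumed parameter value.
import numpy as np

def logistic_series(r, x0=0.3, steps=2000, burn=500):
    x, values = x0, []
    for t in range(steps + burn):
        x = r * x * (1.0 - x)
        if t >= burn:
            values.append(x)
    return np.array(values)

# one "population" per parameter value; each contributes a (mean, variance) pair
series = [logistic_series(r) for r in np.linspace(3.60, 3.99, 60)]
means = np.array([s.mean() for s in series])
variances = np.array([s.var() for s in series])

# Taylor's law: variance = a * mean**b, so b is the slope in log-log space
b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
print(f"estimated Taylor exponent b = {b:.2f}")
```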

Keywords: chaos, density effect, population dynamics, Taylor’s law

Procedia PDF Downloads 170
2129 CompleX-Machine: An Automated Testing Tool Using X-Machine Theory

Authors: E. K. A. Ogunshile

Abstract:

This paper is aimed at creating an automatic Java X-Machine testing tool for software development. The nature of software development is changing; thus, the type of software testing tools required is also changing. Software is growing increasingly complex and, in part due to the commercial impetus for faster software releases with new features and value, increasingly in danger of containing faults. These faults can incur huge costs for software development organisations and users; research by Cambridge Judge Business School estimated the cost of software bugs to the global economy at $312 billion. Beyond the cost, faster software development methodologies and increasing expectations on developers to become testers are driving demand for faster, automated, and effective tools to prevent potential faults as early as possible in the software development lifecycle. Using X-Machine theory, this paper explores a new tool to address software complexity, changing expectations on developers, and faster development pressures and methodologies, with a view to reducing the huge cost of fixing software bugs.
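
For readers unfamiliar with the formalism, a minimal sketch of a stream X-machine (illustrative only, not the CompleX-Machine tool): a finite state machine whose transitions are processing functions acting on an input symbol and a memory, so driving input sequences through it yields the expected outputs that generated tests can be compared against.

```python
# Minimal sketch (illustrative): a stream X-machine with states, a memory,
# and processing functions attached to the transitions.
class XMachine:
    def __init__(self, start_state, memory):
        self.state, self.memory = start_state, memory
        self.transitions = {}        # (state, input_symbol) -> (function, next_state)

    def add(self, state, symbol, func, next_state):
        self.transitions[(state, symbol)] = (func, next_state)

    def run(self, inputs):
        outputs = []
        for symbol in inputs:
            func, next_state = self.transitions[(self.state, symbol)]
            out, self.memory = func(symbol, self.memory)
            outputs.append(out)
            self.state = next_state
        return outputs

# toy example: a counter whose only transition processes "inc" events
def inc(_, mem):
    return mem + 1, mem + 1

m = XMachine("idle", 0)
m.add("idle", "inc", inc, "idle")
print(m.run(["inc", "inc", "inc"]))   # [1, 2, 3] -- expected outputs for a test
```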

Keywords: conformance testing, finite state machine, software testing, x-machine

Procedia PDF Downloads 264
2128 Operating System Support for Mobile Device Thermal Management and Performance Optimization in Augmented Reality Applications

Authors: Yasith Mindula Saipath Wickramasinghe

Abstract:

Augmented reality applications require high processing power to load, render, and live-stream high-definition AR models and virtual scenes; they also require device sensors to work intensively to coordinate with the internal hardware and OS and deliver the expected outcome in advanced features like object detection, real-time tracking, and voice and text recognition. Excessive heat generation due to these advanced functionalities has become a major research problem, as it is difficult for smaller mobile devices to manage such heat increase and battery drainage, which causes physical harm to the devices in the long term. Therefore, effective thermal management is one of the major requirements in augmented reality application development. This paper discusses the major causes of this issue and provides possible solutions in the form of operating system adaptations, as well as further research on best coding practices to optimize application performance and reduce excessive thermal generation.

Keywords: augmented reality, device thermal management, GPU, operating systems, device I/O, overheating

Procedia PDF Downloads 113
2127 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) on a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict the future, or otherwise unknown events or outcomes, for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate the hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed from causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use our predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big-data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently. For a demonstration, a total of 13,254 metabolic syndrome training observations were plugged into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors associated, for example, with sociodemographics, habits, and activities. Some are intentionally included to gain predictive analytics insights on variable selection, such as cancer examination, house type, and vaccination. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset of 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation; a set of rules (many estimated equations from a statistical perspective), on the other hand, may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, i.e., theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize a part of the observations, such as bootstrap resampling with an appropriate sample size.
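
A minimal sketch of the co-occurrence idea (my illustration, not the platform's implementation; the item names are made up): count how often pairs of items appear together across observations and rank them by "surprise", the observed co-occurrence frequency relative to what independence would predict.

```python
# Minimal sketch (illustrative only): build pairwise co-occurrence counts from
# transactions and rank arcs by observed vs. expected frequency ("surprise").
from collections import Counter
from itertools import combinations

transactions = [
    {"high_bp", "smoker", "no_exercise"},
    {"high_bp", "no_exercise"},
    {"smoker", "vaccinated"},
    {"high_bp", "smoker", "no_exercise"},
]

n = len(transactions)
item_freq = Counter(i for t in transactions for i in t)
pair_freq = Counter(frozenset(p) for t in transactions
                    for p in combinations(sorted(t), 2))

def surprise(pair):
    a, b = tuple(pair)
    observed = pair_freq[pair] / n
    expected = (item_freq[a] / n) * (item_freq[b] / n)
    return observed / expected

ranked = sorted(pair_freq, key=surprise, reverse=True)
for pair in ranked[:3]:
    print(set(pair), round(surprise(pair), 2))   # candidate hypotheses (rules)
```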

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 271
2126 Delineating Subsurface Linear Features and Faults Under Sedimentary Cover in the Bahira Basin Using Integrated Gravity and Magnetic Data

Authors: M. Lghoul, N. El Goumi, M. Guernouche

Abstract:

In order to predict the structural and tectonic framework of the Bahira basin and to build a 3D geological model of the basin, an integrated multidisciplinary study has been conducted using gravity, magnetic, and geological data. The objective of the current study is to delineate the subsurface features, faults, and geological limits using airborne magnetic and gravity data analysis of the Bahira basin. To achieve our goal, we applied different enhancement techniques to the magnetic and gravity data: power spectral analysis, reduction to the pole (RTP), upward continuation, the analytical signal, the tilt derivative, the total horizontal derivative, 3D Euler deconvolution, and source parameter imaging. The major lineament/fault trends are NE–SW, NW–SE, ENE–WSW, and WNW–ESE. The 3D Euler deconvolution analysis highlighted a number of fault trends, mainly in the ENE–WSW and WNW–ESE directions. The depth to the top of the basement sources in the study area ranges from 200 m, in the southern and northern parts of the Bahira basin, to 5000 m in the eastern part of the basin.
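
As one concrete example of these enhancement filters, the tilt derivative of a gridded anomaly T is TDR = arctan((dT/dz) / sqrt((dT/dx)^2 + (dT/dy)^2)). The sketch below is my illustration, not the authors' processing chain; the grid spacing and data are placeholders, and the vertical derivative is approximated in the wavenumber domain.

```python
# Minimal sketch (standard tilt-derivative definition, illustrative data).
import numpy as np

def tilt_derivative(grid, dx=100.0, dy=100.0):
    dTdy, dTdx = np.gradient(grid, dy, dx)               # horizontal derivatives
    ky = np.fft.fftfreq(grid.shape[0], dy) * 2 * np.pi
    kx = np.fft.fftfreq(grid.shape[1], dx) * 2 * np.pi
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)     # radial wavenumber
    dTdz = np.real(np.fft.ifft2(np.fft.fft2(grid) * k))  # vertical derivative
    return np.arctan2(dTdz, np.sqrt(dTdx ** 2 + dTdy ** 2))

anomaly = np.random.default_rng(1).normal(size=(64, 64))  # placeholder grid
print(tilt_derivative(anomaly).shape)
```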

Keywords: magnetic, gravity, structural trend, depth to basement

Procedia PDF Downloads 129
2125 Face Recognition Using Eigen Faces Algorithm

Authors: Shweta Pinjarkar, Shrutika Yawale, Mayuri Patil, Reshma Adagale

Abstract:

Face recognition is a technique that can be applied to a wide variety of problems such as image and film processing, human-computer interaction, and criminal identification. This has motivated researchers to develop computational models for identifying faces that are easy and simple to implement. This work demonstrates a face recognition system on an Android device using eigenfaces. The system can be used as the base for the development of human identity recognition. Test images and training images are taken directly with the camera of the Android device, and the test results showed that the system produces high accuracy. The goal is to implement a model for a particular face and distinguish it from a large number of stored faces. The face recognition system detects the faces in a picture taken by a web camera or digital camera, and these images are then checked against a training image dataset based on descriptive features. The algorithm can further be extended to recognize a person's facial expressions, and recognition could be carried out under widely varying conditions, such as frontal view, scaled frontal view, and subjects with spectacles. The algorithm models real-time varying lighting conditions. The implemented system is able to perform real-time face detection and face recognition and can give feedback by displaying a window with the subject's information from the database and sending an e-mail notification to interested institutions using the Android application.
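
A minimal sketch of the eigenfaces idea (the textbook PCA formulation, not the authors' Android code; the image sizes and data are placeholders): project faces onto the principal components of the training set and recognise a test face by nearest neighbour in that reduced space.

```python
# Minimal sketch (illustrative): eigenfaces via PCA/SVD plus nearest neighbour.
import numpy as np

def train_eigenfaces(train_imgs, n_components=20):
    X = train_imgs.reshape(len(train_imgs), -1).astype(float)
    mean_face = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean_face, full_matrices=False)
    eigenfaces = Vt[:n_components]                 # one eigenface per row
    weights = (X - mean_face) @ eigenfaces.T       # training projections
    return mean_face, eigenfaces, weights

def recognise(img, mean_face, eigenfaces, weights, labels):
    w = (img.reshape(-1).astype(float) - mean_face) @ eigenfaces.T
    return labels[np.argmin(np.linalg.norm(weights - w, axis=1))]

# toy usage with random 32x32 "faces"
rng = np.random.default_rng(0)
faces = rng.random((10, 32, 32))
labels = np.arange(10)
model = train_eigenfaces(faces, n_components=5)
print(recognise(faces[3], *model, labels))        # -> 3
```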

Keywords: face detection, face recognition, eigen faces, algorithm

Procedia PDF Downloads 353
2124 Magnetohydrodynamic (MHD) Flow of Cu-Water Nanofluid Due to a Rotating Disk with Partial Slip

Authors: Tasawar Hayat, Madiha Rashid, Maria Imtiaz, Ahmed Alsaedi

Abstract:

This problem concerns the flow of a viscous fluid due to a rotating disk in a nanofluid. The effects of a magnetic field, slip boundary conditions, and thermal radiation are considered, and an incompressible fluid saturates the porous medium. In this model, Cu nanoparticles are considered with water as the base fluid. For the copper-water nanofluid, graphical results are presented to describe the influence of the nanoparticle volume fraction (φ) on the velocity and temperature fields under slip boundary conditions. The governing differential equations are transformed into a system of nonlinear ordinary differential equations by suitable transformations, and a convergent solution of the nonlinear system is developed. The obtained results are analyzed through graphical illustrations for different parameters, and the features of the flow and heat transfer characteristics are examined. It is found that the skin friction coefficient and the heat transfer rate at the surface are highest for the copper-water nanofluid.

Keywords: MHD nanofluid, porous medium, rotating disk, slip effect

Procedia PDF Downloads 254
2123 Smart Lean Manufacturing in the Context of Industry 4.0: A Case Study

Authors: M. Ramadan, B. Salah

Abstract:

This paper introduces a framework to digitalize lean manufacturing tools to enhance smart lean-based manufacturing environments or Lean 4.0 manufacturing systems. The paper discusses the integration between lean tools and the powerful features of recent real-time data capturing systems with the help of Information and Communication Technologies (ICT) to develop an intelligent real-time monitoring and controlling system of production operations concerning lean targets. This integration is represented in the Lean 4.0 system called Dynamic Value Stream Mapping (DVSM). Moreover, the paper introduces the practice of Radio Frequency Identification (RFID) and ICT to smartly support lean tools and practices during daily production runs to keep the lean system alive and effective. This work introduces a practical description of how the lean method tools 5S, standardized work, and poka-yoke can be digitalized and smartly monitored and controlled through DVSM. A framework of the three tools has been discussed and put into practice in a German switchgear manufacturer.

Keywords: lean manufacturing, Industry 4.0, radio frequency identification, value stream mapping

Procedia PDF Downloads 219
2122 Data Security: An Enhancement of E-mail Security Algorithm to Secure Data Across State Owned Agencies

Authors: Lindelwa Mngomezulu, Tonderai Muchenje

Abstract:

Over the decades, e-mail has provided easy, fast, and timely communication, enabling businesses and state-owned agencies to communicate with their stakeholders and their own employees in real time. Moreover, since the launch of Microsoft Office 365 and many other cloud-based e-mail services, many businesses have been migrating from on-premises e-mail services to the cloud, and, particularly since the beginning of the COVID-19 pandemic, there has been a significant increase in e-mail utilization, which in turn leads to an increase in cyber-attacks. In that regard, e-mail security has become very important in e-mail transportation to ensure that the e-mail reaches the recipient without the data integrity being compromised. Classifying the features that enhance e-mail security is needed to keep ahead of increasingly sophisticated cyber-attacks, since as technology advances, so do the attacks. Therefore, in order to maximize data integrity, we also need to maximize the security of e-mails, for example through enhanced e-mail authentication. The successful enhancement of e-mail security may in future lessen the frequency of information theft via e-mail, so that the data of South African state-owned agencies are not compromised.

Keywords: e-mail security, cyber-attacks, data integrity, authentication

Procedia PDF Downloads 127
2121 Fabrication of Hollow Germanium Spheres by Dropping Method

Authors: Kunal D. Bhagat, Truong V. Vu, John C. Wells, Hideyuki Takakura, Yu Kawano, Fumio Ogawa

Abstract:

Hollow germanium alloy quasi-spheres of diameters 1 to 2 mm with a relatively smooth inner and outer surface have been produced. The germanium was first melted at around 1273 K and then exuded from a coaxial nozzle into an inert atmosphere by argon gas supplied to the inner nozzle. The falling spheres were cooled by water spray and collected in a bucket. The spheres had a horn type of structure on the outer surface, which might be caused by volume expansion induced by the density difference between solid and gas phase. The frequency of the sphere formation was determined from the videos to be about 133 Hz. The outer diameter varied in the range of 1.3 to 1.8 mm with a wall thickness in the range of 0.2 to 0.5 mm. Solid silicon spheres are used for spherical silicon solar cells (S₃CS), which have various attractive features. Hollow S₃CS promise substantially higher energy conversion efficiency if their wall thickness can be kept to 0.1–0.2 mm and the inner surface can be passivated. Our production of hollow germanium spheres is a significant step towards the production of hollow S₃CS with, we hope, higher efficiency and lower material cost than solid S₃CS.

Keywords: hollow spheres, semiconductor, compound jet, dropping method

Procedia PDF Downloads 199
2120 Concept, Design and Implementation of Power System Component Simulator Based on Thyristor Controlled Transformer and Power Converter

Authors: B. Kędra, R. Małkowski

Abstract:

This paper presents information on the Power System Component Simulator, a device designed for the LINTE^2 laboratory owned by Gdansk University of Technology in Poland. We first provide introductory information on the Power System Component Simulator and its capabilities. Then, the concept of the unit is presented: requirements for the unit are described, and the proposed and introduced functions are listed. Implementation details are given, and the hardware structure is presented and described. Information about the communication interface used, the data maintenance and storage solution, and the Simulink Real-Time features used is presented, along with a list and description of all measurements. The potential for modifications of the laboratory setup is evaluated. Lastly, the results of experiments performed using the Power System Component Simulator are presented, including simulation of under-frequency load shedding, frequency- and voltage-dependent characteristics of groups of load units, and time characteristics of groups of different load units in a chosen area.

Keywords: power converter, Simulink Real-Time, Matlab, load, tap controller

Procedia PDF Downloads 234
2119 Usage and Benefits of Handheld Devices as Educational Tools in Higher Institutions of Learning in Lagos State, Nigeria

Authors: Abiola A. Sokoya

Abstract:

Handheld devices are now in use as educational tools for learning in most higher institutions because of their features and functions, which can be used in an academic environment. This study examined the usage and the benefits of handheld devices as learning tools. A structured questionnaire was used to collect data, and the data collected were analyzed using simple percentages. It was observed that handheld devices offer numerous functions and applications for learning that could improve the academic performance of students. Students are now highly interested in using handheld devices for mobile learning, apart from making and receiving calls. The researchers recommend that seminars be organized for students on the functions of some common handheld devices that can aid learning for academic purposes. It is also recommended that the management of each higher institution make appropriate policies in line with the usage of handheld technologies to enhance mobile learning. Government should ensure that appropriate policies and regulations are put in place for the importation of high-quality handheld devices into the country, Nigeria being a marketplace for these technologies. By this, the use of handheld devices for mobile learning will be enhanced.

Keywords: handheld devices, educational tools, mobile e- learning, usage, benefits

Procedia PDF Downloads 224
2118 Nanoscale Metal-Organic Framework Coated Carbon Nitride Nanosheet for Combination Cancer Therapy

Authors: Rui Chen, Jinfeng Zhang, Chun-Sing Lee

Abstract:

In the past couple of decades, nanoscale metal-organic frameworks (NMOFs) have been highlighted as promising delivery platforms for biomedical applications, combining many potent features such as high loading capacity, progressive biodegradability, and low cytotoxicity. While NMOFs have been extensively used as carriers for drugs of different modalities, so far there is no report on exploiting the advantages of NMOFs for combination therapy. Herein, we prepared core-shell nanoparticles in which each nanoparticle contains a single graphitic-phase carbon nitride (g-C3N4) nanosheet encapsulated by a zeolitic imidazolate framework-8 (ZIF-8) shell. The g-C3N4 nanosheets are effective visible-light photosensitizers for photodynamic therapy (PDT). When hosting DOX (doxorubicin), the as-synthesized core-shell nanoparticles can realize combinational photo-chemo therapy and provide dual-color fluorescence imaging. We therefore expect NMOF-based core-shell nanoparticles to provide a new way to achieve much-enhanced cancer therapy.

Keywords: carbon nitride, combination therapy, drug delivery, nanoscale metal-organic frameworks

Procedia PDF Downloads 418
2117 Modelling and Detecting the Demagnetization Fault in the Permanent Magnet Synchronous Machine Using the Current Signature Analysis

Authors: Yassa Nacera, Badji Abderrezak, Saidoune Abdelmalek, Houassine Hamza

Abstract:

Several kinds of faults can occur in permanent magnet synchronous machine (PMSM) systems: bearing faults, electrical short/open faults, eccentricity faults, and demagnetization faults. A demagnetization fault means that the strength of the permanent magnets (PM) in the PMSM decreases; it causes low output torque, which is undesirable for EVs. The fault is caused by physical damage, high-temperature stress, inverse magnetic fields, and aging. Motor current signature analysis (MCSA) is a conventional motor fault detection method based on the extraction of signal features from the stator current. A simulation model of the PMSM under partial demagnetization and uniform demagnetization faults was established, and different degrees of demagnetization fault were simulated. The harmonic analyses using the Fast Fourier Transform (FFT) show that the fault diagnosis method based on harmonic analysis is only suitable for partial demagnetization faults of the PMSM and does not apply to uniform demagnetization faults.
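
A minimal sketch of the MCSA step (my illustration, not the authors' simulation model): inspect the stator current spectrum at the fault-characteristic frequencies f_e(1 ± k/p) commonly associated with partial demagnetization. The electrical frequency, pole-pair count, and the injected sideband amplitude below are assumptions.

```python
# Minimal sketch (illustrative): look for demagnetization sidebands in the
# stator current spectrum at f_e * (1 + k/p); f_e = 50 Hz, p = 4 are assumed.
import numpy as np

fs, f_e, p = 10_000, 50.0, 4
t = np.arange(0, 2.0, 1 / fs)

current = np.sin(2 * np.pi * f_e * t)                        # fundamental
current += 0.05 * np.sin(2 * np.pi * f_e * (1 + 1 / p) * t)  # fault sideband

spectrum = np.abs(np.fft.rfft(current * np.hanning(len(current))))
freqs = np.fft.rfftfreq(len(current), 1 / fs)

for k in (1, 2, 3):
    f_fault = f_e * (1 + k / p)
    idx = np.argmin(np.abs(freqs - f_fault))
    print(f"amplitude near {f_fault:.1f} Hz: {spectrum[idx]:.2f}")
```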

Keywords: permanent magnet, diagnosis, demagnetization, modelling

Procedia PDF Downloads 58
2116 Sustainability in the Purchase of Airline Tickets: Analysis of Digital Communication from the Perspective of Neuroscience

Authors: Rodríguez Sánchez Carla, Sancho-Esper Franco, Guillen-Davo Marina

Abstract:

Tourism is one of the most important sectors worldwide, since it is an important economic engine for today's society. Because of this expansion, it is also one of the sectors that most negatively affects the environment in terms of CO₂ emissions. In light of this, airlines are developing Voluntary Carbon Offset (VCO) programs. There is important evidence focused on analyzing the features of these VCO programs and their efficacy in reducing CO₂ emissions, but findings are mixed, without a clear consensus. Different research approaches have centered on analyzing the factors and consequences of VCO programs, such as economic modelling based on panel data, survey research based on traveler responses, or experimental research analyzing customer decisions in a simulated context. This study belongs to the latter group because it tries to understand how different characteristics of an online ticket purchase website affect the willingness of a traveler to choose the sustainable option. The proposed behavioral model is based on several theories, such as nudge theory, the dual-processing ELM, and cognitive dissonance theory. This randomized experiment aims at overcoming previous studies based on self-reported measures that mainly study sustainable behavioral intention rather than actual decision-making. It also complements traditional self-reported independent variables by gathering objective information from an eye-tracking device. The experiment analyzes the influence of two characteristics of the online purchase website: i) the type of information regarding flight CO₂ emissions (quantitative vs. qualitative), and ii) the comparison framework related to the sustainable purchase decision (negative: an alternative with more emissions than the average flight on the route vs. positive: an alternative with fewer emissions than the average flight on the route); it is therefore a 2×2 experiment with four alternative scenarios. A pretest was run before the actual experiment to refine the experiment features and to check the manipulations. Afterward, a different sample of students answered the pre-test questionnaire aimed at recruiting the cases and measuring several pre-stimulus variables. One week later, the students came to the university neurolab to take part in the experiment, made their decision regarding the online purchase, and answered the post-test survey. A final sample of 21 students was gathered, and the institution's ethics committee approved the experiment. The results show that qualitative information generates more sustainable decisions (the less contaminating alternative) than quantitative information. Moreover, the evidence shows that subjects are more willing to choose the sustainable option to be more ecological (comparison of the average with the less contaminating alternative) rather than to be less contaminating (comparison of the average with the more contaminating alternative). There are also interesting differences in the information-processing variables from the eye tracker: both the total time to make the choice and the specific times by area of interest (AOI) differ depending on the assigned scenario. These results allow for a better understanding of the factors that condition a traveler's decision to take part in a VCO program and provide useful information for airline managers to promote these programs and reduce environmental impact.

Keywords: voluntary carbon offset, airline, online purchase, carbon emission, sustainability, randomized experiment

Procedia PDF Downloads 66
2115 The Change of Urban Land Use/Cover Using Object Based Approach for Southern Bali

Authors: I. Gusti A. A. Rai Asmiwyati, Robert J. Corner, Ashraf M. Dewan

Abstract:

Change in land use/cover (LULC) has a dominant effect on spatial structure and function. It can have such impacts by disrupting social and cultural practices and disturbing physical elements. Thus, it has become essential to understand the dynamics of LULC in time and space, as this can be used as a critical input for developing sustainable LULC. This study was an attempt to map and monitor LULC change in Bali, Indonesia, from 2003 to 2013. Using object-based classification to improve the accuracy, together with change detection, multi-temporal land use/cover data were extracted from a set of ASTER satellite images. The overall accuracies of the classification maps of 2003 and 2013 were 86.99% and 80.36%, respectively. Built-up area and paddy field were the dominant land use/cover types in both years. The dominant increase in patches in 2003 illustrated rapid paddy field fragmentation and the large transformation taking place. This approach is new for the diverse urban features of Bali, which has been growing fast, and it increased the classification accuracy compared with manual pixel-based classification.

Keywords: land use/cover, urban, Bali, ASTER

Procedia PDF Downloads 538
2114 Acute Superior Mesenteric Artery Thrombosis Leading to Pneumatosis Intestinalis and Portal Venous Gas in a Young Adult after COVID-19 Vaccination

Authors: Prakash Dhakal

Abstract:

Hepatic portal venous gas (HPVG) is diagnosed via computed tomography owing to its unusual imaging features. HPVG, when linked with pneumatosis intestinalis, has a high mortality rate and requires urgent intervention. We present the case of a 26-year-old young adult with superior mesenteric artery thrombosis who presented with severe abdominal pain. He had received a COVID-19 vaccination (first dose of COVISHIELD) 15 days earlier. On imaging, HPVG and pneumatosis intestinalis were seen, prompting urgent intervention. The reliable interpretation of the imaging findings, along with quick intervention, led to a favorable outcome in our case. Herein we present a thorough review of a patient with a history of COVID-19 vaccination and superior mesenteric artery thrombosis leading to bowel ischemia and hepatic portal venous gas. The patient underwent subtotal small bowel resection.

Keywords: COVID-19 vaccination, SMA thrombosis, portal venous gas, pneumatosis intestinalis

Procedia PDF Downloads 84
2113 PostureCheck with the Kinect and Proficio: Posture Modeling for Exercise Assessment

Authors: Elham Saraee, Saurabh Singh, Margrit Betke

Abstract:

Evaluation of a person’s posture while exercising is important in physical therapy. During a therapy session, a physical therapist or a monitoring system must assure that the person is performing an exercise correctly to achieve the desired therapeutic effect. In this work, we introduce a system called POSTURECHECK for exercise assessment in physical therapy. POSTURECHECK assesses the posture of a person who is exercising with the Proficio robotic arm while being recorded by the Microsoft Kinect interface. POSTURECHECK extracts unique features from the person’s upper body during the exercise, and classifies the sequence of postures as correct or incorrect using Bayesian estimation and majority voting. If POSTURECHECK recognizes an incorrect posture, it specifies what the user can do to correct it. The result of our experiment shows that POSTURECHECK is capable of recognizing the incorrect postures in real time while the user is performing an exercise.
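
A minimal sketch of the classification step (assumed logic, not the POSTURECHECK source; the likelihood values are placeholders): each frame is labelled with the maximum-posterior class computed from its feature likelihoods, and the repetition's final label is the majority vote over frames.

```python
# Minimal sketch (illustrative): per-frame Bayesian (MAP) labels, then a
# majority vote over the frames of one exercise repetition.
from collections import Counter

def map_label(likelihoods, priors):
    # posterior is proportional to likelihood * prior; pick the largest
    posterior = {c: likelihoods[c] * priors[c] for c in priors}
    return max(posterior, key=posterior.get)

priors = {"correct": 0.5, "incorrect": 0.5}
frames = [  # per-frame likelihoods of the observed joint angles under each class
    {"correct": 0.8, "incorrect": 0.3},
    {"correct": 0.6, "incorrect": 0.7},
    {"correct": 0.9, "incorrect": 0.2},
]

per_frame = [map_label(f, priors) for f in frames]
verdict = Counter(per_frame).most_common(1)[0][0]
print(per_frame, "->", verdict)   # ['correct', 'incorrect', 'correct'] -> correct
```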

Keywords: Bayesian estimation, majority voting, Microsoft Kinect, PostureCheck, Proficio robotic arm, upper body physical therapy

Procedia PDF Downloads 273
2112 A Phishing Email Detection Approach Using Machine Learning Techniques

Authors: Kenneth Fon Mbah, Arash Habibi Lashkari, Ali A. Ghorbani

Abstract:

Phishing e-mails are a security issue that not only annoys online users but has also resulted in significant financial losses for businesses. Phishing advertisements and pornographic e-mails are difficult to detect, as attackers have become increasingly intelligent and professional. Attackers track users and adjust their attacks based on users' interests and hot topics that can be extracted from community news and journals. This research focuses on deceptive phishing attacks and their variants, such as attacks through advertisements and pornographic e-mails. We propose a framework called Phishing Alerting System (PHAS) to accurately classify e-mails as phishing, advertisements, or pornographic. PHAS has the ability to detect and alert users to all types of deceptive e-mails to help them in decision making. A well-known e-mail dataset has been used for these experiments, and, based on previously extracted features, 93.11% detection accuracy is obtainable using the J48 and KNN machine learning techniques. Our proposed framework achieved approximately the same accuracy as the benchmark on this dataset.
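
A minimal sketch of this kind of evaluation (a generic scikit-learn pipeline, not the PHAS implementation; the feature matrix and class labels are random placeholders): train a k-NN classifier on already-extracted e-mail features and measure held-out accuracy.

```python
# Minimal sketch (illustrative): k-NN classification of extracted e-mail features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((600, 5))          # placeholder features (e.g. URL counts, keyword scores)
y = rng.integers(0, 3, 600)       # 0 = legitimate, 1 = phishing, 2 = advertisement

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```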

Keywords: phishing e-mail, phishing detection, anti phishing, alarm system, machine learning

Procedia PDF Downloads 333
2111 Analyzing Success Factors of Canadian Play-Based Intervention Programs for Children with Different Abilities: A Comparative Study

Authors: Shuaa A. Mutawally, Budor H. Saigh, Ebtehal A. Mutawally

Abstract:

This study aims to analyze and compare the success factors of play-based intervention programs for children with different abilities in Canada. Children with disabilities often face limited participation in play and physical activities, leading to increased health risks. Understanding the specific features of these interventions that contribute to positive outcomes is crucial to promoting holistic development in these children. A comparative case study approach was used, selecting three similar successful intervention programs through purposive sampling. Data were collected through interviews and program documents, with 40 participants purposively chosen. Thematic analysis was conducted to identify key themes, including Quality Program, Meeting the Needs of Participants, and Lessons Learned from Experts and Practitioners. These programs play a vital role in addressing the gap in community programming for children with different abilities. The results of this study contribute to the generalization of success factors derived from best practices in play-based intervention programs for children with different abilities.

Keywords: children with different abilities, physical activity, play, play-based intervention programs

Procedia PDF Downloads 67
2110 Age-Dependent Anatomical Abnormalities of the Amygdala in Autism Spectrum Disorder and their Implications for Altered Socio-Emotional Development

Authors: Gabriele Barrocas, Habon Issa

Abstract:

The amygdala is one of various brain regions that tend to be pathological in individuals with autism spectrum disorder (ASD). ASD is a prevalent and heterogeneous developmental disorder affecting all ethnic and socioeconomic groups and consists of a broad range of severity, etiology, and behavioral symptoms. Common features of ASD include but are not limited to repetitive behaviors, obsessive interests, and anxiety. Neuroscientists view the amygdala as the core of the neural system that regulates behavioral responses to anxiogenic and threatening stimuli. Despite this consensus, many previous studies and literature reviews on the amygdala’s alterations in individuals with ASD have reported inconsistent findings. In this review, we will address these conflicts by highlighting recent studies which reveal that anatomical and related socio-emotional differences detected between individuals with and without ASD are highly age-dependent. We will specifically discuss studies using functional magnetic resonance imaging (fMRI), structural MRI, and diffusion tensor imaging (DTI) to provide insights into the neuroanatomical substrates of ASD across development, with a focus on amygdala volumes, cell densities, and connectivity.

Keywords: autism, amygdala, development, abnormalities

Procedia PDF Downloads 123
2109 Generalized Approach to Linear Data Transformation

Authors: Abhijith Asok

Abstract:

This paper presents a generalized approach for the simple linear data transformation, Y=bX, through an integration of multidimensional coordinate geometry, vector space theory and polygonal geometry. The scaling is performed by adding an additional ’Dummy Dimension’ to the n-dimensional data, which helps plot two dimensional component-wise straight lines on pairs of dimensions. The end result is a set of scaled extensions of observations in any of the 2n spatial divisions, where n is the total number of applicable dimensions/dataset variables, created by shifting the n-dimensional plane along the ’Dummy Axis’. The derived scaling factor was found to be dependent on the coordinates of the common point of origin for diverging straight lines and the plane of extension, chosen on and perpendicular to the ’Dummy Axis’, respectively. This result indicates the geometrical interpretation of a linear data transformation and hence, opportunities for a more informed choice of the factor ’b’, based on a better choice of these coordinate values. The paper follows on to identify the effect of this transformation on certain popular distance metrics, wherein for many, the distance metric retained the same scaling factor as that of the features.
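
A small numerical sketch of the construction as I read it (illustrative only; the paper's geometric derivation is richer): appending a constant "dummy" coordinate lets the scaling Y = bX be written as a single matrix acting on the augmented observations, with the dummy coordinate left unchanged.

```python
# Minimal sketch (my reading of the idea, not the paper's formulation).
import numpy as np

X = np.array([[1.0, 2.0], [3.0, 4.0]])        # n-dimensional observations
b = 2.5
X_aug = np.hstack([X, np.ones((len(X), 1))])  # append the dummy dimension (= 1)

# scale the data coordinates while keeping the dummy coordinate fixed
Y_aug = X_aug @ np.diag([b] * X.shape[1] + [1.0])
Y = Y_aug[:, :-1]
print(np.allclose(Y, b * X))                  # True: same result as Y = bX
```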

Keywords: data transformation, dummy dimension, linear transformation, scaling

Procedia PDF Downloads 294
2108 Exploring the Effect of Accounting Information on Systematic Risk: An Empirical Evidence of Tehran Stock Exchange

Authors: Mojtaba Rezaei, Elham Heydari

Abstract:

This paper highlights the empirical results of analyzing the correlation between accounting information and systematic risk. This association is analyzed between financial ratios and systematic risk by considering the financial statements of 39 companies listed on the Tehran Stock Exchange (TSE) over five years (2014-2018). The financial ratios were categorized into four groups, and, as representatives of accounting information describing their specific features, we selected: return on assets (ROA), debt ratio (total debt to total assets), current ratio (current assets to current debt), asset turnover (net sales to total assets), and total assets. The hypotheses were tested through simple and multiple linear regression and Student's t-test. The findings illustrate that there is no significant relationship between accounting information and market risk. This indicates that, in the selected sample, historical accounting information does not fully reflect stock prices.

Keywords: accounting information, market risk, systematic risk, stock return, efficient market hypothesis, EMH, Tehran stock exchange, TSE

Procedia PDF Downloads 127
2107 Feature Weighting Comparison Based on Clustering Centers in the Detection of Diabetic Retinopathy

Authors: Kemal Polat

Abstract:

In this paper, three feature weighting methods have been used to improve the classification performance for diabetic retinopathy (DR). To classify diabetic retinopathy, features extracted from the output of several retinal image processing algorithms, such as image-level, lesion-specific, and anatomical components, have been used and fed into the classifier algorithms. The dataset used in this study has been taken from the University of California, Irvine (UCI) machine learning repository. Feature weighting methods including fuzzy c-means clustering based feature weighting, subtractive clustering based feature weighting, and Gaussian mixture clustering based feature weighting have been used and compared with each other in the classification of DR. After feature weighting, five different classifier algorithms comprising multi-layer perceptron (MLP), k-nearest neighbor (k-NN), decision tree, support vector machine (SVM), and Naïve Bayes have been used. The hybrid method based on the combination of subtractive clustering based feature weighting and the decision tree classifier obtained a classification accuracy of 100% in the screening of DR. These results demonstrate that the proposed hybrid scheme is very promising for medical data set classification.
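
A minimal sketch of cluster-centre-based feature weighting (one common variant, with k-means standing in for the fuzzy c-means and subtractive clustering used in the paper; the data and the exact weighting rule are assumptions): each feature is rescaled by the ratio of its overall mean to the mean of the cluster centres along that feature before classification.

```python
# Minimal sketch (illustrative): weight features from clustering centres, then classify.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.random((300, 8))             # placeholder retinal-image features
y = rng.integers(0, 2, 300)          # 1 = diabetic retinopathy, 0 = healthy

centers = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X).cluster_centers_
weights = X.mean(axis=0) / centers.mean(axis=0)   # per-feature weights
X_weighted = X * weights

clf = KNeighborsClassifier(n_neighbors=5).fit(X_weighted, y)
print("train accuracy:", clf.score(X_weighted, y))
```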

Keywords: machine learning, data weighting, classification, data mining

Procedia PDF Downloads 320
2106 Keypoints Extraction for Markerless Tracking in Augmented Reality Applications: A Case Study in Dar As-Saraya Museum

Authors: Jafar W. Al-Badarneh, Abdalkareem R. Al-Hawary, Abdulmalik M. Morghem, Mostafa Z. Ali, Rami S. Al-Gharaibeh

Abstract:

Archeological heritage is at the heart of each country's national glory; moreover, it can develop into a source of national income. Heritage management requires socially responsible marketing that achieves high visitor satisfaction while maintaining a high level of site conservation. We have developed an Augmented Reality (AR) experience for heritage and cultural preservation at the Dar As-Saraya museum in Jordan. Our application of this notion relied on a markerless tracking approach. This approach uses a keypoint extraction technique in which features of the environment are identified and defined in the system as keypoints. A set of these keypoints forms a tracker for an augmented object to be displayed and overlaid on a real scene at the Dar As-Saraya museum. We tested and compared several techniques for markerless tracking and then applied the best technique to complete a mosaic artifact with AR content. The successful results from our application open the door to applications in open archeological sites where markerless tracking is most needed.
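
A minimal sketch of keypoint-based markerless tracking (a common OpenCV approach, not necessarily the authors' exact pipeline; the file names are placeholders): ORB keypoints from a reference photo of the artifact are matched against the live camera frame, and enough good matches mean the tracker has found the surface on which to overlay AR content.

```python
# Minimal sketch (illustrative): ORB keypoint matching between a reference
# image of the tracked surface and the current camera frame.
import cv2

orb = cv2.ORB_create(nfeatures=1000)
reference = cv2.imread("mosaic_reference.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder file
frame = cv2.imread("camera_frame.jpg", cv2.IMREAD_GRAYSCALE)          # placeholder file

kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_frm, des_frm = orb.detectAndCompute(frame, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_ref, des_frm), key=lambda m: m.distance)

# enough good matches -> the tracker "sees" the surface, so render the overlay
print("good matches:", sum(m.distance < 40 for m in matches))
```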

Keywords: augmented reality, cultural heritage, keypoints extraction, virtual recreation

Procedia PDF Downloads 331
2105 KSVD-SVM Approach for Spontaneous Facial Expression Recognition

Authors: Dawood Al Chanti, Alice Caplier

Abstract:

Sparse representations of signals have received a great deal of attention in recent years. In this paper, the interest of using sparse representation as a means of performing sparse discriminative analysis between spontaneous facial expressions is demonstrated. An automatic facial expression recognition system is presented. It uses a KSVD-SVM approach made of three main stages: a pre-processing and feature extraction stage, which solves the problem of shared subspace distribution based on random projection theory to obtain low-dimensional discriminative and reconstructive features; a dictionary learning and sparse coding stage, which uses the KSVD model to learn discriminative under- or over-complete dictionaries for sparse coding; and finally a classification stage, which uses an SVM classifier for facial expression recognition. Our main concern is to be able to recognize non-basic affective states and non-acted expressions. Extensive experiments on the JAFFE static acted facial expression database, but also on the DynEmo dynamic spontaneous facial expression database, exhibit very good recognition rates.
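
A minimal sketch of the sparse-coding-then-classify idea (using scikit-learn's dictionary learning in place of K-SVD; the data, dimensions, and parameters are placeholders): learn a dictionary from the extracted features, sparse-code them, and train an SVM on the codes.

```python
# Minimal sketch (illustrative; dictionary learning stands in for K-SVD).
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((200, 64))            # placeholder facial-expression features
y = rng.integers(0, 6, 200)          # six expression classes

dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, random_state=0)
codes = dico.fit_transform(X)        # sparse codes of the training features

clf = SVC(kernel="linear").fit(codes, y)
print("train accuracy:", clf.score(codes, y))
```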

Keywords: dictionary learning, random projection, pose and spontaneous facial expression, sparse representation

Procedia PDF Downloads 299