Search results for: Spatial Data Analyses
23816 Dimension Free Rigid Point Set Registration in Linear Time
Authors: Jianqin Qu
Abstract:
This paper proposes a rigid point set matching algorithm in arbitrary dimensions based on the idea of symmetric covariant functions. A group of functions of the points in the set is formulated using rigid invariants. Each of these functions computes a pair of corresponding points from the given point set. The computed correspondences are then used to recover the unknown rigid transform parameters. Each computed point can be geometrically interpreted as a weighted mean center of the point set. The algorithm is compact, fast, and dimension free, without any optimization process. It either computes the desired transform for noiseless data in linear time or fails quickly in exceptional cases. Experimental results for synthetic data and 2D/3D real data are provided, which demonstrate potential applications of the algorithm to a wide range of problems.
Keywords: covariant point, point matching, dimension free, rigid registration
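A minimal sketch of the transform-recovery step, assuming corresponding point pairs are already available (the covariant-function construction itself is not reproduced here): the closed-form Kabsch/Procrustes solution recovers the rotation and translation in any dimension.

```python
import numpy as np

def recover_rigid_transform(P, Q):
    """Recover R, t such that Q ~ P @ R.T + t.
    P, Q: (n, d) arrays of corresponding points in d dimensions."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)        # centroids
    H = (P - cP).T @ (Q - cQ)                      # d x d cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                       # enforce a proper rotation
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cQ - R @ cP
    return R, t

# Toy check in 3D: a known rigid motion applied to noiseless points.
rng = np.random.default_rng(0)
P = rng.normal(size=(10, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = recover_rigid_transform(P, Q)
print(np.allclose(R, R_true), t)
```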
Procedia PDF Downloads 168
23815 The Applicability of International Humanitarian Law to Non-State Actors
Authors: Yin Cheung Lam
Abstract:
In 1949, the ratification of the Geneva Conventions heralded the international community’s adoption of a new universal and non-discriminatory approach to human rights in situations of conflict. However, with the proliferation of international terrorism after the 9/11 attacks on the United States (U.S.), the international community’s uneven and contradictory implementation of international humanitarian law (IHL) has called its agenda of universal human rights into question. Specifically, the derogation from IHL has never been so pronounced as in the U.S.-led ‘War on Terror’. While an extensive literature has ‘assessed the impact’ of the implementation of the Geneva Conventions, limited attention has been paid to interrogating the ways in which the Geneva Conventions and their resulting implementation have functioned to discursively reproduce certain understandings of human rights between states and non-state actors. Through a discursive analysis of the Geneva Conventions and the conceptualization of human rights in relation to terrorism, this thesis problematises the way in which the U.S. has understood and reproduced understandings of human rights. Using the U.S. ‘War on Terror’ as an example, it seeks to extend previous analyses of the U.S.’ practice of IHL through a qualitative discursive analysis of the human rights content that appears in the Geneva Conventions, in addition to the speeches and policy documents on the ‘War on Terror’.
Keywords: discursive analysis, human rights, non-state actors, war on terror
Procedia PDF Downloads 606
23814 Influence of Extractives Leaching from Larch Wood on Durability of Semi-Transparent Oil-Based Coating during Accelerated Weathering
Authors: O. Dvorak, M. Panek, E. Oberhofnerova, I. Sterbova
Abstract:
Extractives contained in larch wood (Larix decidua, Mill.) reduce the service life of exterior coating systems, especially transparent and semi-transparent ones. The aim of this work was to find out whether an initial several-week leaching of extractives from untreated wood in the exterior would positively affect selected characteristics and the overall service life of a semi-transparent oil-based coating. Samples exposed to exterior leaching for 10 or 20 weeks, and reference samples without leaching, were then treated with the coating system. Testing was performed by artificial accelerated weathering in a UV chamber combined with thermal cycling over 6 weeks. Changes of colour, gloss, and surface wetting, microscopic analyses of the surfaces, and visual damage of the paint were evaluated. Only the 20-week initial leaching had a positive effect, both increasing the colour stability during ageing and slightly increasing the overall service life of the tested semi-transparent coating system on larch wood.
Keywords: larch wood, coating, durability, extractives
Procedia PDF Downloads 134
23813 Explanation of Sustainable Architecture Models in Tabriz Residential Fabric Monuments: Case Study of Sharbatoglu House and Ghadaki House
Authors: Fereshteh Pashaei Kamali, Elham Kazemi, Shokooh Neshani Fam
Abstract:
The subject of sustainable development is a reformist revision of modernism and tradition, proposing reconciliatory strategies between the two. Sustainability in architecture cannot be interpreted only as the construction’s physical stability; it also means preserving the continuous totality of the earth and its energy resources, whose available resources and materials should be employed more efficiently. In other words, by referring to building ecology and emphasizing the capacity of the building to combine with environmental factors (the existing context), the aim of sustainability is to achieve spatial quality and comfort, as well as proper design in the architectural composition. To achieve these objectives of traditional Iranian architecture, it is essential to plan for protecting the environment, maintaining aesthetic measures, and responding to the needs of each climatic region. This study was conducted based on the descriptive-analytical method and aimed to express design patterns compatible with the climate of the Tabriz residential fabric. The present article attempts to express the techniques and patterns used in traditional Iranian architecture, especially in the Sharbatoglu and Ghadaki houses in Tabriz, which are considered to accord with modern concepts of sustainable architecture.
Keywords: sustainable architecture, climate, Tabriz, Sharbatoglu house, Ghadaki house
Procedia PDF Downloads 375
23812 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things
Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker
Abstract:
Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect change in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracy and sensitivity ranges, and they decay over time. As a result, the big time-series data collected by the sensors will contain uncertainties, and sometimes they are conflicting. In this study, we present a framework that takes advantage of the capabilities of Evidence Theory (a.k.a. Dempster-Shafer and Dezert-Smarandache Theories) for representing and managing uncertainty and conflict, to achieve fast change detection and effectively deal with complementary hypotheses. Specifically, Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and related combination rules are applied to combine the mass values among all sensors. Furthermore, we applied the method to estimate the minimum number of sensors needed to combine, so that computational efficiency could be improved. A cumulative sum (CUSUM) test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data
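A simplified, single-sensor sketch of the detection step, assuming Gaussian pre- and post-change models and an illustrative threshold; the evidence-theory combination across sensors is omitted.

```python
import numpy as np

def kl_gauss(mu0, var0, mu1, var1):
    """KL divergence between univariate Gaussians: N(mu0, var0) || N(mu1, var1)."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def cusum(samples, mu_pre, mu_post, sigma):
    """Classical CUSUM on the log-likelihood ratio of post- vs pre-change Gaussians."""
    g, stats = 0.0, []
    for x in samples:
        llr = np.log(norm_pdf(x, mu_post, sigma)) - np.log(norm_pdf(x, mu_pre, sigma))
        g = max(0.0, g + llr)
        stats.append(g)
    return np.array(stats)

# Synthetic stream: change from N(0,1) to N(1,1) at sample 200.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(1, 1, 100)])
g = cusum(x, mu_pre=0.0, mu_post=1.0, sigma=1.0)
threshold = 5.0              # assumed value, tuned to the tolerable false-alarm rate
alarm = int(np.argmax(g > threshold)) if (g > threshold).any() else -1
print("alarm raised at sample", alarm)
print("KL(pre || post) =", kl_gauss(0.0, 1.0, 1.0, 1.0))
```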
Procedia PDF Downloads 334
23811 When the Poor Do Not Matter: Environmental Justice and Solid Waste Management in Kinshasa, the Democratic Republic of Congo
Authors: N. S. Kubanza, D. Simatele, D. K. Das
Abstract:
The purpose of this paper is to understand the urban environmental problems in Kinshasa and their consequences for the poor. The paper particularly examines the concept of environmental injustice in solid waste management in Kinshasa, the capital of the Democratic Republic of Congo (DRC). Urban low-income communities in Kinshasa face multiple consequences of poor solid waste management associated with unhealthy living conditions. These situations stem from overcrowding, poor sanitation, and the accumulation of solid waste, resulting in the prevalence of water- and air-borne diseases. Using a mix of reviewed archival records, scholarly literature, semi-structured interviews conducted with local community members, and qualitative surveys among stakeholders, it was found that the solid waste management challenge in Kinshasa is not only an environmental and health risk issue but also a problem that generates socio-spatial disparities in the distribution of the solid waste burden. It is argued in the paper that poor urban areas in Kinshasa are often hardest affected by irregularities of waste collection; they lack sanitary storage capacities, and their organizational capacity for collective action within solid waste management is undermined. In view of these observations, this paper explores the mechanisms and stakeholder engagement necessary to lessen environmental injustice in solid waste management (SWM) in Kinshasa.
Keywords: environmental justice, solid waste management, urban environmental problems, urban poor
Procedia PDF Downloads 264
23810 Application of KL Divergence for Estimation of Each Metabolic Pathway Genes
Authors: Shohei Maruyama, Yasuo Matsuyama, Sachiyo Aburatani
Abstract:
The development of methods to annotate unknown gene functions is an important task in bioinformatics. One approach to such annotation is the identification of the metabolic pathway that genes are involved in. Gene expression data have been utilized for this identification, since gene expression data reflect various intracellular phenomena. However, it has been difficult to estimate gene function with high accuracy. The low accuracy of the estimation is considered to be caused by the difficulty of accurately measuring gene expression: even when measured under the same condition, gene expressions usually vary. In this study, we propose a feature extraction method focusing on the variability of gene expressions to estimate genes' metabolic pathways accurately. First, we estimated the distribution of each gene expression from replicate data. Next, we calculated the similarity between all gene pairs by KL divergence, a method for calculating the similarity between distributions. Finally, we utilized the similarity vectors as feature vectors and trained a multiclass SVM for identifying the genes' metabolic pathways. To evaluate the developed method, we applied it to budding yeast and trained a multiclass SVM to identify the seven metabolic pathways. As a result, the accuracy obtained with the developed method was higher than that obtained from the raw gene expression data. Thus, the developed method combined with KL divergence is useful for identifying genes' metabolic pathways.
Keywords: metabolic pathways, gene expression data, microarray, Kullback–Leibler divergence, KL divergence, support vector machines, SVM, machine learning
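A minimal sketch of the feature construction, assuming each gene's replicate expressions are summarized by a Gaussian; the expression values, pathway labels, and SVM settings below are synthetic placeholders rather than the budding yeast data.

```python
import numpy as np
from sklearn.svm import SVC

def kl_gauss(m0, v0, m1, v1):
    return 0.5 * (np.log(v1 / v0) + (v0 + (m0 - m1) ** 2) / v1 - 1.0)

# Synthetic stand-in: 60 genes x 5 replicate expression measurements.
rng = np.random.default_rng(0)
expr = rng.normal(loc=rng.normal(size=(60, 1)), scale=0.5, size=(60, 5))
labels = rng.integers(0, 3, size=60)              # placeholder pathway labels

mu, var = expr.mean(axis=1), expr.var(axis=1) + 1e-6

# Feature vector of gene i = its symmetrized KL divergence to every other gene.
n = len(mu)
features = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        features[i, j] = 0.5 * (kl_gauss(mu[i], var[i], mu[j], var[j])
                                + kl_gauss(mu[j], var[j], mu[i], var[i]))

# SVC handles the multiclass case (one-vs-one) internally.
clf = SVC(kernel="rbf").fit(features[:40], labels[:40])
print("held-out accuracy:", clf.score(features[40:], labels[40:]))
```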
Procedia PDF Downloads 403
23809 Impact of Instagram Food Bloggers on Consumer (Generation Z) Decision Making Process in Islamabad, Pakistan
Authors: Tabinda Sadiq, Tehmina Ashfaq Qazi, Hoor Shumail
Abstract:
Recently, the advent of emerging technology has created a new generation of restaurant marketing. This study explores the aspects that influence customers’ decision-making process in selecting a restaurant after reading food bloggers' reviews online. The motivation behind this research is to investigate the correlation between the credibility of the source and consumers' attitudes toward restaurant visits. The researcher collected the data by distributing a survey questionnaire through Google Forms, employing source credibility theory. A non-probability purposive sampling technique was used to collect data. The questionnaire used a pre-developed and validated scale by Ohanian to measure the relationship. The researcher collected data from 250 respondents in order to investigate the influence of food bloggers on Gen Z's decision-making process. SPSS statistical version 26 was used for statistical testing and analyzing the data. The findings of the survey revealed that there is a moderate positive correlation between the variables, indicating that food bloggers do have an impact on Generation Z's decision-making process.
Keywords: credibility, decision making, food bloggers, generation z, e-wom
Procedia PDF Downloads 73
23808 Performance Measurement of Logistics Systems for Thailand's Wholesales and Retails Industries by Data Envelopment Analysis
Authors: Pornpimol Chaiwuttisak
Abstract:
The study aims to compare the performance of logistics for Thailand’s wholesale and retail trade industries (except motor vehicles, motorcycles, and stalls) by using data envelopment analysis (DEA). The Thailand Standard Industrial Classification of 2009 (TSIC-2009) categorizes these industries into sub-group no. 45: wholesale and retail trade (except for the repair of motor vehicles and motorcycles), sub-group no. 46: wholesale trade (except motor vehicles and motorcycles), and sub-group no. 47: retail trade (except motor vehicles and motorcycles). Data used in the study were collected by the National Statistical Office, Thailand. The study used four input factors: the number of companies, the number of personnel in logistics, the training cost in logistics, and outsourced logistics management. The output factor is the percentage of enterprises having inventory management. The results showed that the average relative efficiency equals 27.87 percent for small-sized enterprises and 49.68 percent for medium-sized enterprises.
Keywords: DEA, wholesales and retails, logistics, Thailand
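A minimal sketch of an input-oriented CCR DEA model solved as a linear program; the six DMUs and their input/output values below are invented for illustration and are not the National Statistical Office data.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k.
    X: (m, n) inputs and Y: (s, n) outputs for n DMUs; variables are [theta, lambdas]."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimize theta
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, k]                      # sum_j lambda_j x_j <= theta * x_k
    A_ub[:m, 1:] = X
    A_ub[m:, 1:] = -Y                           # sum_j lambda_j y_j >= y_k
    b_ub[m:] = -Y[:, k]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

# Hypothetical data: 4 inputs (firms, logistics staff, training cost, outsourcing) and
# 1 output (share of enterprises with inventory management) for 6 DMUs.
X = np.array([[20, 35, 50, 18, 60, 42],
              [5, 9, 14, 4, 20, 11],
              [1.2, 2.5, 3.1, 0.9, 4.8, 2.2],
              [2, 3, 6, 1, 8, 4]], dtype=float)
Y = np.array([[0.30, 0.45, 0.52, 0.28, 0.70, 0.50]])
for k in range(X.shape[1]):
    print(f"DMU {k}: efficiency = {ccr_efficiency(X, Y, k):.3f}")
```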
Procedia PDF Downloads 416
23807 Assess Changes in Groundwater Dynamics Caused by Mini Dam Construction in Arid Zone of District Killa Abdullah, Pakistan
Authors: Akhtar Malik Muhammad, Agha Mirwais
Abstract:
Dams are considered to recharge aquifers by raising the water table, especially near wells. The present study investigates the impact of dams on groundwater recharge in Jilga, Pakistan. A comparative analysis of changes in the groundwater table between 2012 and 2019 was carried out using ArcGIS 10.5 through the kriging method and remote sensing techniques to evaluate the mini dams' impact on the upstream area. The Spatial Analyst extension was used to produce static water level maps for the two years. The water table ranged from a minimum of 67.08 feet to a maximum of 130.09 feet in 2012, whereas in 2019 it ranged from a minimum of 49.89 feet to a maximum of 115.85 feet. Groundwater recharge at different ratios was noted; the most significant was at Rabbani dam, with 26 ft, due to supportive lithology conditions, and the lowest recharge was found at Garang dam, with 14 ft. The overall positive trend indicates the rehabilitation of dead karez and agricultural activities, with the vegetation area increasing by 36% in 2019. An over 6% increase in human settlement indicates socioeconomic development. Thus, the study highlights the need for a preferential focus on the construction of dams so that the water level can be sustained to cater to the agricultural and domestic needs of the local population around the year.
Keywords: water table, GIS, land cover, mini dams, agriculture
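The ArcGIS workflow itself cannot be reproduced in a listing; as a rough open-source analogue of ordinary kriging, a Gaussian-process interpolation of hypothetical well readings illustrates how the 2012 and 2019 static water level surfaces can be compared. All coordinates and depths below are invented.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical observation wells: (x, y) in km and static water level (depth) in feet.
wells = np.array([[0.0, 0.0], [1.5, 0.3], [2.8, 1.1], [0.7, 2.4], [2.2, 2.9], [3.5, 0.8]])
depth_2012 = np.array([130.1, 118.4, 95.2, 88.7, 70.3, 67.1])
depth_2019 = np.array([115.9, 104.2, 80.5, 73.0, 55.8, 49.9])

def interpolate(xy, z, grid):
    # Gaussian-process regression behaves like simple kriging with an RBF variogram.
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-2),
                                  normalize_y=True)
    gp.fit(xy, z)
    return gp.predict(grid)

# Regular grid over the study area and the 2012-2019 change in depth to water.
gx, gy = np.meshgrid(np.linspace(0, 3.5, 20), np.linspace(0, 3, 20))
grid = np.c_[gx.ravel(), gy.ravel()]
change = interpolate(wells, depth_2012, grid) - interpolate(wells, depth_2019, grid)
print("mean rise in water table (ft):", change.mean().round(2))
```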
Procedia PDF Downloads 85
23806 Event Data Representation Based on Time Stamp for Pedestrian Detection
Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita
Abstract:
In association with the wave of electric vehicles (EVs), low energy consumption systems have become more and more important. One of the key technologies for realizing low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several features, such as high temporal resolution, which can achieve 1 Mframe/s, and a high dynamic range (120 dB). However, the property that can contribute most to low energy consumption is its sparsity: the sensor only captures the pixels that have an intensity change. In other words, there is no signal in areas without any intensity change, which makes this sensor more energy efficient than conventional sensors such as RGB cameras because redundant data can be removed. On the other hand, the data are difficult to handle because the data format is completely different from an RGB image: acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, a polarity (two values: +1 or -1), and a time stamp; it does not include intensity such as RGB values. Therefore, as existing algorithms cannot be used straightforwardly, a new processing algorithm has to be designed to cope with DVS data. In order to solve the difficulties caused by the data format differences, most prior art builds frame data and feeds them to deep learning models such as convolutional neural networks (CNNs) for object detection and recognition purposes. However, even though the data can be fed this way, it is still difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as intensity instead of an RGB pixel value, polarity information is clearly not rich enough. Considering this context, we propose to use the timestamp information as the data representation that is fed to deep learning. Concretely, we first build frame data divided by a certain time period, then assign an intensity value according to the timestamp in each frame; for example, a high value is given to a recent signal. We expect that this data representation can capture features, especially of moving objects, because the timestamp represents the movement direction and speed. Using this proposed method, we made our own dataset with a DVS fixed on a parked car to develop an application for a surveillance system that can detect persons around the car. We consider the DVS one of the ideal sensors for surveillance purposes because it can run for a long time with low energy consumption in a static scene. For comparison purposes, we reproduced a state-of-the-art method as a benchmark, which builds frames in the same way as ours and feeds polarity information to a CNN. We then measured the object detection performance of the benchmark and our method on the same dataset. As a result, our method achieved an F1 score up to 7 points higher than the benchmark.
Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption
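A minimal sketch of the timestamp-based representation, assuming events arrive as (x, y, polarity, timestamp) tuples: each frame covers a fixed time window, and each pixel stores a value that grows with the recency of its latest event.

```python
import numpy as np

def events_to_time_surface(events, width, height, t_start, t_end):
    """Build one frame from events in [t_start, t_end).

    events: iterable of (x, y, polarity, t) tuples.
    Pixel value encodes recency: the most recent events get values close to 1,
    old or absent events stay near 0, so motion direction and speed become visible."""
    frame = np.zeros((height, width), dtype=np.float32)
    span = max(t_end - t_start, 1e-9)
    for x, y, pol, t in events:
        if t_start <= t < t_end:
            # Later timestamps overwrite earlier ones with a larger value.
            frame[int(y), int(x)] = (t - t_start) / span
    return frame

# Tiny synthetic stream: a point moving rightward across an 8x8 sensor.
events = [(c, 3, +1, 0.01 * c) for c in range(8)]
frame = events_to_time_surface(events, width=8, height=8, t_start=0.0, t_end=0.08)
print(np.round(frame[3], 2))   # values increase to the right -> rightward motion
```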
Procedia PDF Downloads 97
23805 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea
Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi
Abstract:
Synoptic patterns from the surface up to the tropopause are very important for forecasting the weather and atmospheric conditions, and there are many tools to prepare and analyze these maps. Reanalysis data, the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in world forecasting centers to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is a major issue due to the complex topography, and there are different types of climate in these areas. In this research, we used two reanalysis datasets, the ECMWF Reanalysis 5th Generation (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest ECMWF reanalysis; its temporal resolution is hourly, while NCEP/NCAR is available every six hours. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature were selected and analyzed, and different types of precipitation (rain and snow) were considered. The results showed that NCEP/NCAR is better able to represent the intensity of the atmospheric systems, while ERA5 is suitable for extracting parameter values at a specific point. ERA5 is also appropriate for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it; the sea surface temperature of the NCEP/NCAR product has low resolution near the coast. Both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS; however, due to the time lag, they are not suitable for forecast centers, and their application is for research and verification of meteorological models. Finally, ERA5 has a better resolution with respect to the NCEP/NCAR reanalysis data, but NCEP/NCAR data are available from 1948 and are appropriate for long-term research.
Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow
Procedia PDF Downloads 123
23804 Application of Observational Medical Outcomes Partnership-Common Data Model (OMOP-CDM) Database in Nursing Health Problems with Prostate Cancer-a Pilot Study
Authors: Hung Lin-Zin, Lai Mei-Yen
Abstract:
Prostate cancer is the most commonly diagnosed male cancer in the U.S., with a prevalence of around 1 in 8. The etiology of prostate cancer is still unknown, but some predisposing factors, such as age, black race, family history, and obesity, may increase the risk of the disease. In 2020, a total of 7,178 Taiwanese people were newly diagnosed with prostate cancer, accounting for 5.88% of all cancer cases, and the incidence rate ranked fifth among men. In that year, the total number of deaths from prostate cancer was 1,730, accounting for 3.45% of all cancer deaths; the death rate ranked 6th among men and accounted for 94.34% of deaths from cancers of the male reproductive organs. A review of domestic and foreign literature on OMOP (Observational Medical Outcomes Partnership) database analysis shows that, although nearly a hundred related papers have been published, studies of nursing-related health problems and nursing measures built on the OMOP common data model databases of medical institutions are extremely rare. The OMOP common data model analysis platform is a system developed by the FDA in 2007 that uses a common data model (CDM) to analyze and monitor healthcare data. It is important to build up relevant nursing information from the OMOP-CDM database to assist daily practice. Therefore, we chose prostate cancer patients, who form a large group in our care, and used the OMOP-CDM database to explore their common associated health problems. With the assistance of OMOP-CDM database analysis, we can expect early diagnosis and prevention of prostate cancer patients' comorbidities to improve patient care.
Keywords: OMOP, nursing diagnosis, health problem, prostate cancer
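A minimal sketch of the kind of query such a study runs against OMOP-CDM tables (condition_occurrence, concept); the connection string and the prostate-cancer concept_id below are placeholders to be replaced with local values.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection; point this at the institution's OMOP-CDM database.
engine = create_engine("postgresql://user:password@localhost:5432/omop_cdm")

PROSTATE_CANCER_CONCEPT_ID = 4163261   # hypothetical standard concept_id, verify locally

# Most frequent co-occurring conditions among patients with a prostate cancer record.
sql = f"""
SELECT c.concept_name, COUNT(DISTINCT co.person_id) AS n_patients
FROM condition_occurrence co
JOIN concept c ON c.concept_id = co.condition_concept_id
WHERE co.person_id IN (
        SELECT person_id FROM condition_occurrence
        WHERE condition_concept_id = {PROSTATE_CANCER_CONCEPT_ID})
  AND co.condition_concept_id <> {PROSTATE_CANCER_CONCEPT_ID}
GROUP BY c.concept_name
ORDER BY n_patients DESC
LIMIT 20;
"""
comorbidities = pd.read_sql(sql, engine)
print(comorbidities)
```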
Procedia PDF Downloads 69
23803 Investigation of Learning Challenges in Building Measurement Unit
Authors: Argaw T. Gurmu, Muhammad N. Mahmood
Abstract:
The objective of this research is to identify architecture and construction management students’ learning challenges in building measurement. The research used survey data collected from students who had completed the building measurement unit. NVivo qualitative data analysis software was used to identify relevant themes. The analysis of the qualitative data revealed major learning difficulties such as the inadequacy of practice questions for the examination, inability to work as a team, lack of detailed understanding of the prerequisite units, insufficiency of the time allocated for tutorials, and incompatibility of lecture and tutorial schedules. The output of this research can be used as a basis for improving the teaching and learning activities in construction measurement units.
Keywords: building measurement, construction management, learning challenges, evaluate survey
Procedia PDF Downloads 138
23802 Using Data-Driven Model on Online Customer Journey
Authors: Ing-Jen Hung, Tzu-Chien Wang
Abstract:
Nowadays, customers can easily interact with firms through miscellaneous online ads on different channels. In other words, customers now have innumerable options and limitless time to accomplish their commercial activities with firms, individualizing their own online customer journeys. This kind of convenience emphasizes the importance of online advertisement allocation across channels. Therefore, a profound understanding of customer behavior can yield considerable benefit by optimizing fund allocation across diverse ad channels. To achieve this objective, many firms utilize numerical methodologies to create data-driven advertising policies. In our research, we aim to exploit online customer click data to discover the correlations between channels and their sequential relations. We use an LSTM to deal with the sequential property of our data and compare its accuracy with non-sequential methods such as CART decision trees, logistic regression, etc. Besides, we also classify our customers into several groups by their behavioral characteristics to perceive the differences between the groups as customer portraits. As a result, we discover distinct customer journeys under each customer portrait. Our article provides some insights into marketing research and can help firms formulate online advertising criteria.
Keywords: LSTM, customer journey, marketing, channel ads
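A minimal sketch of an LSTM classifier over padded channel-click sequences; the channel vocabulary, journey data, and layer sizes are illustrative placeholders rather than the study's configuration.

```python
import numpy as np
import tensorflow as tf

NUM_CHANNELS = 6      # e.g. search ad, display ad, social, email, video, referral
MAX_LEN = 12          # clicks per journey; shorter journeys would be padded with 0

# Synthetic stand-in for customer click data: channel ids per journey + conversion label.
rng = np.random.default_rng(0)
X = rng.integers(1, NUM_CHANNELS + 1, size=(500, MAX_LEN))
y = (X[:, -3:] == 1).any(axis=1).astype("float32")  # toy rule: converted if a recent search-ad click

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(NUM_CHANNELS + 1, 8, mask_zero=True),  # 0 reserved for padding
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, validation_split=0.2, verbose=0)
print(model.evaluate(X, y, verbose=0))
```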
Procedia PDF Downloads 121
23801 Content Based Face Sketch Images Retrieval in WHT, DCT, and DWT Transform Domain
Authors: W. S. Besbas, M. A. Artemi, R. M. Salman
Abstract:
Content-based face sketch retrieval can be used to find images of criminals from their sketches for crime prevention. This paper investigates the problem of content-based image retrieval (CBIR) of face sketch images in the transform domain. Face sketch images that are similar to the query image are retrieved from a face sketch database. Features of the face sketch image are extracted in the spectral domain of selected transforms: the Discrete Cosine Transform (DCT), the Discrete Wavelet Transform (DWT), and the Walsh Hadamard Transform (WHT). For the performance analysis of the feature selection methods, three face image databases are used: the 'Sheffield face database', the 'Olivetti Research Laboratory (ORL) face database', and the 'Indian face database'. The city block distance measure is used to evaluate the performance of the retrieval process. The investigation concludes that the retrieval rate is database dependent, but in general the DCT performs best, while the WHT is best with respect to the speed of retrieving images.
Keywords: Content Based Image Retrieval (CBIR), face sketch image retrieval, features selection for CBIR, image retrieval in transform domain
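A minimal sketch of transform-domain retrieval with the DCT and the city block distance, on placeholder images: a low-frequency block of DCT coefficients serves as the feature vector, and database sketches are ranked by their distance to the query.

```python
import numpy as np
from scipy.fftpack import dct
from scipy.spatial.distance import cityblock

def dct_features(img, block=8):
    """2-D DCT of the image, keeping the top-left (low-frequency) block as features."""
    coeffs = dct(dct(img, axis=0, norm="ortho"), axis=1, norm="ortho")
    return coeffs[:block, :block].ravel()

# Placeholder "face sketch" database of 64x64 grayscale images.
rng = np.random.default_rng(0)
database = rng.random((20, 64, 64))
query = database[7] + 0.05 * rng.random((64, 64))    # noisy copy of image 7

feats = np.array([dct_features(im) for im in database])
qf = dct_features(query)
ranking = np.argsort([cityblock(qf, f) for f in feats])
print("best matches:", ranking[:3])    # image 7 should rank first
```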
Procedia PDF Downloads 493
23800 Development of Al Foam by a Low-Cost Salt Replication Method for Industrial Applications
Abstract:
Metal foams of Al find diverse applications in several industrial sectors: in the automotive and sports equipment industries as impact, acoustic, and vibration absorbers; in the aerospace industry as structural components in turbines and spatial cones; in the naval industry as low-frequency vibration absorbers; in the construction industry as sound barriers inside tunnels, fire-proof materials, and structure protection systems against explosions; and even in heat exchangers, orthopedic components, and decorative items. Here, we report on the development of Al foams by a low-cost and convenient salt replication method with efficient control over the size, geometry, and distribution of the pores. Sodium bicarbonate was used as the foaming agent to form the porous refractory salt pattern. The mixed refractory salt slurry was microwave dried, followed by sintering for selected time periods. Molten Al was infiltrated into the salt pattern in an inert atmosphere at a pressure of 2 bars. The final products were obtained by leaching out the refractory salt pattern. Mechanical properties of the derived samples were studied with a universal testing machine, and the results were analyzed in correlation with their microstructural features evaluated with a scanning electron microscope (SEM).
Keywords: metal foam, Al, salt replication method, mechanical properties, SEM
Procedia PDF Downloads 354
23799 Socio-Demographic Predictors of Divorce Adjustment in Pakistani Women
Authors: Rukhsana Kausar, Nida Zafar
Abstract:
The present research investigated socio-demographic predictors of divorce adjustment in Pakistani women. The sample comprised 80 divorced women from different areas of Lahore. A self-developed socio-demographic predictor scale and the Divorce Adjustment Scale (Fisher, 2001) were used for assessment. Analyses showed that working divorced women living in a joint family system are better adjusted than non-working divorced women living in a joint family system, and that women having one child are better adjusted than women having more than one child. The findings also highlight the importance of the presence of the father for the healthy development of adolescents. Adjustment of divorced women was positively associated with income, social support from the family, having favorable attitudes toward marital dissolution prior to divorce, and being the partner who initiated the divorce. In addition, older women showed some evidence of poorer adjustment than younger women. The findings highlight the importance of support for divorce adjustment.
Keywords: socio-demographic, adjustment, women, divorce
Procedia PDF Downloads 468
23798 General Time-Dependent Sequenced Route Queries in Road Networks
Authors: Mohammad Hossein Ahmadi, Vahid Haghighatdoost
Abstract:
Spatial databases have been an active area of research over the years. In this paper, we study how to answer general time-dependent sequenced route queries. Given a user's origin and destination over a time-dependent road network graph, an ordered list of categories of interest, and a departure time interval, our goal is to find the minimum travel time path, along with the best departure time, that minimizes the total travel time from the source location to the given destination while passing through a sequence of points of interest belonging to each of the specified categories of interest. The challenge of this problem is the complexity added to optimal sequenced route queries: first, the road network is time dependent, and second, the user defines a departure time interval instead of a single departure time instance. For processing general time-dependent sequenced route queries, we propose two solutions, the Discrete-Time and Continuous-Time Sequenced Route approaches, which find approximate and exact solutions, respectively. Our proposed approaches traverse the road network based on the A*-search paradigm equipped with an efficient heuristic function for shrinking the search space. Extensive experiments are conducted to verify the efficiency of the proposed approaches.
Keywords: trip planning, time dependent, sequenced route query, road networks
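The sequenced-route algorithms themselves are not condensed here; the sketch below only shows the time-dependent ingredient, a Dijkstra-style search in which each edge's travel time depends on the departure time at its tail, evaluated for a few candidate departure times. The toy graph and travel-time functions are invented.

```python
import heapq

# Time-dependent travel times: each edge maps a departure time (hours) to a travel time (minutes).
graph = {
    "A": [("B", lambda t: 10 + 5 * (8 <= t % 24 < 10)),   # slower in the morning peak
          ("C", lambda t: 15)],
    "B": [("D", lambda t: 12 + 8 * (8 <= t % 24 < 10))],
    "C": [("D", lambda t: 20)],
    "D": [],
}

def td_shortest_path(graph, source, target, depart):
    """Dijkstra over arrival times, assuming FIFO edges (waiting never helps)."""
    best = {source: depart}
    heap = [(depart, source)]
    while heap:
        t, u = heapq.heappop(heap)
        if u == target:
            return t - depart
        if t > best.get(u, float("inf")):
            continue
        for v, travel in graph[u]:
            arrival = t + travel(t) / 60.0          # travel() is in minutes, t in hours
            if arrival < best.get(v, float("inf")):
                best[v] = arrival
                heapq.heappush(heap, (arrival, v))
    return float("inf")

for depart in (7.5, 8.5, 11.0):                      # compare departure times in an interval
    print(depart, "->", round(td_shortest_path(graph, "A", "D", depart), 2), "hours")
```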
Procedia PDF Downloads 321
23797 A Secure Proxy Signature Scheme with Fault Tolerance Based on RSA System
Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi
Abstract:
Due to the rapid growth of modern communication systems, fault tolerance and data security are two important issues in a secure transaction. During the transmission of data between the sender and receiver, errors may occur frequently; the sender must then re-transmit the data to the receiver in order to correct these errors, which makes the system very feeble. To improve the scalability of the scheme, we present a secure proxy signature scheme with fault tolerance over an efficient and secure authenticated key agreement protocol based on the RSA system. Authenticated key agreement protocols have an important role in building a secure communications network between the two parties.
Keywords: proxy signature, fault tolerance, RSA, key agreement protocol
Procedia PDF Downloads 286
23796 Estimating the Receiver Operating Characteristic Curve from Clustered Data and Case-Control Studies
Authors: Yalda Zarnegarnia, Shari Messinger
Abstract:
Receiver operating characteristic (ROC) curves have been widely used in medical research to illustrate the performance of a biomarker in correctly distinguishing the diseased and non-diseased groups. Correlated biomarker data arise in study designs that include subjects who share genetic or environmental factors. Information about the correlation might help to identify family members at increased risk of disease development and may lead to initiating treatment to slow or stop the progression to disease. Approaches appropriate to a case-control design matched by family identification must be able to accommodate the correlation inherent in the design when estimating the biomarker's ability to differentiate between cases and controls, as well as handle estimation from a matched case-control design. This talk will review some developed methods for ROC curve estimation in settings with correlated data from case-control designs and will discuss the limitations of current methods for analyzing correlated familial paired data. An alternative approach using conditional ROC curves will be demonstrated to provide appropriate ROC curves for correlated paired data. The proposed approach will use the information about the correlation among biomarker values, producing conditional ROC curves that evaluate the ability of a biomarker to discriminate between diseased and non-diseased subjects in a familial paired design.
Keywords: biomarker, correlation, familial paired design, ROC curve
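The conditional ROC estimator itself is not reproduced here; the sketch below shows only the standard empirical ROC curve and AUC that such methods extend, computed on synthetic case-control biomarker values.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Synthetic biomarker values: cases shifted upward relative to controls.
rng = np.random.default_rng(0)
controls = rng.normal(0.0, 1.0, 200)
cases = rng.normal(1.2, 1.0, 200)

y_true = np.r_[np.zeros_like(controls), np.ones_like(cases)]
score = np.r_[controls, cases]

fpr, tpr, thresholds = roc_curve(y_true, score)
print("AUC =", round(roc_auc_score(y_true, score), 3))

# Sensitivity at roughly a 10% false-positive rate.
idx = np.searchsorted(fpr, 0.10)
print("TPR at FPR ~ 0.10:", round(tpr[idx], 3))
```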
Procedia PDF Downloads 240
23795 Rubber Crumbs in Alkali Activated Clay Roof Tiles at Low Temperature
Authors: Aswin Kumar Krishnan, Yat Choy Wong, Reiza Mukhlis, Zipeng Zhang, Arul Arulrajah
Abstract:
The continuous increase in vehicle uptake escalates the amount of rubber tyre waste, which needs to be managed to avoid landfilling and stockpiling. The present research focused on the sustainable use of rubber crumbs in clay roof tiles. The properties of roof tiles composed of clay, rubber crumbs, NaOH, and Na₂SiO₃ with a 10% alkaline activator were studied. Tile samples were fabricated by heating the compacted mixtures at 50°C for 72 hours, followed by a higher heating temperature of 200°C for 24 hours. The effect of rubber crumb aggregates as a substitute for the raw clay material was investigated by varying their concentration from 0% to 2.5%. X-ray diffraction (XRD) and scanning electron microscopy (SEM) analyses were conducted to study the phases and microstructures of the samples. It was found that the optimum rubber crumb concentrations were 0.5% and 1%, while cracks and larger porosity were found at higher crumb concentrations. Water absorption and compressive strength test results demonstrated that the rubber crumb and clay tiles satisfied the standard requirements for roof tiles.
Keywords: rubber crumbs, clay, roof tiles, alkaline activators
Procedia PDF Downloads 104
23794 Urban Sustainable Development Based on Habitat Quality Evolution: A Case Study in Chongqing, China
Abstract:
Over the last decade or so, China's urbanization has developed rapidly. At the same time, it has had a great negative impact on habitat quality. Therefore, studying the impact of land use change on habitat quality in mountain cities is of great significance for sustainable urban development. This paper analyzed the spatial and temporal land use changes in Chongqing from 2010 to 2020 using ArcGIS 10.6, as well as the evolutionary trend of habitat quality during this period based on InVEST 3.13.0, to obtain the impact of land use changes on habitat quality. The results showed that habitat quality in the western part of Chongqing decreased significantly between 2010 and 2020, while the northeastern and southeastern parts remained stable. The main reason is the continuous expansion of urban construction land in the western area, which leads to serious habitat fragmentation and a continuous decline in habitat quality. In the northeast and southeast areas, by contrast, due to the greater emphasis on ecological priority and urban-rural coordination in the development process, land use change is characterized by a benign transfer, which maintains the urbanization process while preserving the coordinated development of habitat quality. This study can provide theoretical support for the sustainable development of mountain cities.
Keywords: mountain cities, ecological environment, habitat quality, sustainable development
Procedia PDF Downloads 84
23793 Code-Switching among Local UCSI Stem and N-Stem Undergraduates during Knowledge Sharing
Authors: Adeela Abu Bakar, Minder Kaur, Parthaman Singh
Abstract:
In the Malaysian education system, formal English language learning takes place in a content-based classroom (CBC). Until recently, there have been few studies in Malaysia researching the effects of code-switching (CS) behaviour on students’ knowledge sharing (KS) with their peers. The aim of this study is to investigate the frequency, reasons, and effect of CS, from English to Bahasa Melayu, among local STEM and N-STEM undergraduates on KS in a content-based classroom. The study employs a mixed-method research design with a questionnaire and interviews as the instruments. The data are collected through the distribution of questionnaires and interviews with the undergraduates. The quantitative data are analysed using SPSS in simple frequencies and percentages, whereas the qualitative data are organized into themes and then analysed. The findings show that N-STEM undergraduates code-switch more than STEM undergraduates. In addition, both the STEM and N-STEM undergraduates agree that CS acts as a catalyst for KS in a content-based classroom; however, they also acknowledge that excessive use of CS can be a hindrance to KS. The findings of the study can provide STEM and N-STEM undergraduates, education policymakers, language teachers, university educators, and students with significant insights into the role of CS in KS in a content-based classroom. Recommendations for future studies are that the number of participants can be increased and that observation can be included in the data collection.
Keywords: code-switching, content-based classroom, content and language integrated learning, knowledge sharing, STEM and N-STEM undergraduates
Procedia PDF Downloads 135
23792 Synthesis, Characterization and in vitro DNA Binding and Cleavage Studies of Cu(II)/Zn(II) Dipeptide Complexes
Authors: A. Jamsheera, F. Arjmand, D. K. Mohapatra
Abstract:
Small molecules binding to specific sites along the DNA molecule are considered potential chemotherapeutic agents. Their role as mediators of key biological functions and their unique intrinsic properties make them particularly attractive therapeutic agents. Keeping this in view, the novel dipeptide complexes Cu(II)-Val-Pro (1), Zn(II)-Val-Pro (2), Cu(II)-Ala-Pro (3) and Zn(II)-Ala-Pro (4) were synthesized and thoroughly characterized using different spectroscopic techniques, including elemental analysis, IR, NMR, ESI–MS, and molar conductance measurements. A solution stability study carried out by UV–vis absorption titration over a broad range of pH proved the stability of the complexes in solution. In vitro DNA binding studies of complexes 1–4, carried out employing absorption, fluorescence, circular dichroism, and viscometric studies, revealed the binding of the complexes to DNA via groove binding. UV–vis titrations of 1–4 with the mononucleotides of interest, viz. 5´-GMP and 5´-TMP, were also carried out. The DNA cleavage activity of complexes 1 and 2 was ascertained by a gel electrophoresis assay, which revealed that the complexes are good DNA cleavage agents and that the cleavage mechanism involves a hydrolytic pathway. Furthermore, the in vitro antitumor activity of complex 1 was screened against human cancer cell lines of different histological origin.
Keywords: dipeptide Cu(II) and Zn(II) complexes, DNA binding profile, pBR322 DNA cleavage, in vitro anticancer activity
Procedia PDF Downloads 349
23791 Development of Tutorial Courseware on Selected Topics in Mathematics, Science and the English Language
Authors: Alice D. Dioquino, Olivia N. Buzon, Emilio F. Aguinaldo, Ruel Avila, Erwin R. Callo, Cristy Ocampo, Malvin R. Tabajen, Marla C. Papango, Marilou M. Ubina, Josephine Tondo, Cromwell L. Valeriano
Abstract:
The main purpose of this study was to develop, evaluate, and validate courseware on selected topics in mathematics, science, and the English language. Specifically, it aimed to: 1. identify the appropriate instructional systems design (ISD) model for the development of the courseware material; 2. assess the courseware material according to its (a) content characteristics, (b) instructional characteristics, and (c) technical characteristics; and 3. find out whether there is a significant difference in the performance of students before and after using the tutorial CAI. This research is developmental and uses a one-group pretest-posttest design. The study had two phases. Phase I included the needs analysis and the writing of lessons and storyboards by the respective experts in each field. Phase II included the digitization, or actual development of the courseware, by the faculty of the ICT department. In this phase, the ADDIE instructional systems design model was adopted; ADDIE stands for Analysis, Design, Development, Implementation, and Evaluation. Formative evaluation was conducted simultaneously with the different phases to detect and remedy any bugs in the courseware in the areas of content, instructional, and technical characteristics. The expected outputs are the digitized lessons in algebra, biology, chemistry, physics, and communication arts in English. Students and IT experts validated the CAI materials using the evaluation form by Wong & Wong, rating them as Highly Acceptable with an overall mean of 4.527 and a standard deviation of 0, which means the evaluators were unanimous in their ratings. A mean gain was recorded, and a t-test for dependent samples showed that there were significant differences in the mean achievement of the students before and after the treatment (using CAI). The identified ISD model used in the development of the tutorial courseware was the ADDIE model. The quantitative analyses of the data, based on the ratings given by the respondents, show that the tutorial courseware possesses the characteristics and qualities of a very good computer-based courseware. The ratings given by the different evaluators with regard to the content, instructional, and technical aspects of the tutorial courseware consistently approach the excellent level. Students performed better in mathematics, biology, chemistry, physics, and English communication arts after they were exposed to the tutorial courseware.
Keywords: CAI, tutorial courseware, Instructional Systems Design (ISD) Model, education
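A minimal sketch of the t-test for dependent samples used for the pretest/posttest comparison; the score vectors below are invented placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical pretest and posttest scores of the same 15 students (same order).
pretest  = np.array([52, 48, 60, 55, 45, 63, 50, 58, 47, 61, 53, 49, 57, 44, 56])
posttest = np.array([68, 62, 71, 66, 59, 75, 64, 70, 60, 77, 65, 63, 72, 58, 69])

t_stat, p_value = stats.ttest_rel(posttest, pretest)   # paired (dependent samples) t-test
mean_gain = (posttest - pretest).mean()
print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```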
Procedia PDF Downloads 346
23790 Mechanistic Modelling to De-risk Process Scale-up
Authors: Edwin Cartledge, Jack Clark, Mazaher Molaei-Chalchooghi
Abstract:
The mixing in the crystallization step of active pharmaceutical ingredient manufacture was studied via advanced modeling tools to enable a successful scale-up. A virtual representation of the vessel was created, and computational fluid dynamics was used to simulate multiphase flow and, thus, the mixing environment within this vessel. The study identified a significant dead zone in the vessel underneath the impeller and found that increasing the impeller speed and power did not improve the mixing. A series of sensitivity analyses showed that, to improve mixing, the vessel had to be redesigned, and that optimal mixing could be obtained by adding two extra cylindrical baffles. The same two baffles from the simulated environment were then constructed and added to the process vessel. By identifying these potential issues before starting manufacture and modifying the vessel to ensure good mixing, this study mitigated a failed crystallization and potential batch disposal, which could have resulted in a significant loss of high-value material.
Keywords: active pharmaceutical ingredient, baffles, computational fluid dynamics, mixing, modelling
Procedia PDF Downloads 97
23789 Fuzzy Multi-Component DEA with Shared and Undesirable Fuzzy Resources
Authors: Jolly Puri, Shiv Prasad Yadav
Abstract:
Multi-component data envelopment analysis (MC-DEA) is a popular technique for measuring the aggregate performance of decision making units (DMUs) along with their components. However, conventional MC-DEA is limited to crisp input and output data, which may not always be available in exact form; in real life problems, data may be imprecise or fuzzy. Therefore, in this paper, we propose (i) a fuzzy MC-DEA (FMC-DEA) model in which shared and undesirable fuzzy resources are incorporated, (ii) a transformation of the proposed FMC-DEA model into a pair of crisp models using the α-cut approach, (iii) definitions of the fuzzy aggregate performance of a DMU and the fuzzy efficiencies of components as fuzzy numbers, and (iv) a numerical example to validate the proposed approach.
Keywords: multi-component DEA, fuzzy multi-component DEA, fuzzy resources, decision making units (DMUs)
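A minimal sketch of the α-cut step alone, assuming triangular fuzzy data: each fuzzy value is turned into the lower and upper crisp bounds that a resulting pair of crisp models would use; the FMC-DEA formulation itself is not reproduced.

```python
def alpha_cut(tfn, alpha):
    """Lower/upper bounds of a triangular fuzzy number (a, b, c) at level alpha in [0, 1]."""
    a, b, c = tfn
    return (a + alpha * (b - a), c - alpha * (c - b))

# Hypothetical fuzzy input of one DMU and the crisp intervals fed to the two crisp models.
fuzzy_input = (8.0, 10.0, 13.0)        # pessimistic, most likely, optimistic
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(fuzzy_input, alpha)
    print(f"alpha = {alpha}: input interval = [{lo}, {hi}]")
```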
Procedia PDF Downloads 407
23788 A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database
Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami
Abstract:
The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new algorithms that are cost-effective in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means-based algorithm is developed and tested experimentally. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clusterings. The algorithm uses the city block distance and a new stopping criterion to guarantee convergence. Experiments conducted on a real data set show its high performance when compared with the original k-means version.
Keywords: pattern recognition, global terrorism database, Manhattan distance, k-means clustering, terrorism data analysis
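A minimal sketch of the core modification, assuming points are assigned with the city block distance and centers are updated with the component-wise median (which minimizes the within-cluster L1 cost); the paper's specific stopping criterion is not reproduced.

```python
import numpy as np

def kmeans_manhattan(X, k, max_iter=100, tol=1e-4, seed=0):
    """k-means-style clustering with the city block (Manhattan) distance."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(max_iter):
        # Assign each point to the center with the smallest L1 distance.
        d = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)
        labels = d.argmin(axis=1)
        # The component-wise median minimizes the summed L1 distance within a cluster.
        new_centers = np.array([np.median(X[labels == j], axis=0) if (labels == j).any()
                                else centers[j] for j in range(k)])
        if np.abs(new_centers - centers).sum() < tol:   # simple stopping rule
            break
        centers = new_centers
    return labels, centers

# Toy data: three separated blobs in 2-D.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in ([0, 0], [4, 0], [2, 3])])
labels, centers = kmeans_manhattan(X, k=3)
print(np.round(centers, 2))
```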
Procedia PDF Downloads 386
23787 Impact of Rapid Urbanization on Health Sector in India
Authors: Madhvi Bhayani
Abstract:
Introduction: Due to the rapid pace of urbanization, urban health issues have become one of the significant threats to future development in India, with serious repercussions on citizens' health. Urbanization in India is increasing at an unprecedented rate and has generated an urban health crisis among city dwellers, especially the urban poor. A growing proportion of the urban poor and vulnerable have health indicators worse than their rural counterparts; they face social and financial barriers in accessing healthcare services, and these conditions put their health at risk. Local as well as state and national governments alike are tackling the challenges of urbanization, as it has become essential for government to provide the basic necessities and better infrastructure that make life in cities safe and healthy. Thus, the paper argues that if no major realistic steps are taken with immediate effect, citizens will face a huge burden of health hazards. Aim: This paper attempts to analyze the current infrastructure, government planning, and future policy; it also discusses the challenges and outcomes of urbanization, its impact on health, and the future trend of disease burden in urban areas. Methods: The paper analyzes secondary data, taking into consideration the connection between rapid urbanization and public health challenges, the health and health care system, and its service delivery to citizens, especially the urban poor. Extensive analyses of government census reports, health information and policy, government health-related schemes, and urban development were carried out, and, based on past trends, the future status of urban infrastructure and health outcomes is predicted. The socio-economic and political dimensions are also taken into consideration from regional, national, and global perspectives and are incorporated in the paper to make realistic predictions for the future. Findings and Conclusion: The findings of the paper show that India suffers from the double burden of rapidly increasing disease and growing health inequalities and disparities in health outcomes. Existing tools of urban health governance are falling short of providing better health care services; collaboration and communication among state, national, and local governments, and with non-governmental partners, need to be strengthened. Based on the findings, the policy implications are described and areas for future research are defined.
Keywords: health care, urbanization, urban health, service delivery
Procedia PDF Downloads 209