Search results for: Spatial Data Analyses
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28646

24026 A Study on the Establishment of a 4-Joint Based Motion Capture System and Data Acquisition

Authors: Kyeong-Ri Ko, Seong Bong Bae, Jang Sik Choi, Sung Bum Pan

Abstract:

A simple method for testing posture imbalance of the human body is to check for differences in the bilateral shoulder and pelvic height of the target. In this paper, to check for spinal disorders, the authors have studied ways to establish a motion capture system that obtains and expresses the motions of four joints, and to acquire data based on this system. The four sensors are attached to both shoulders and the pelvis. To verify the established system, the normal and abnormal postures of targets listening to a lecture were obtained using the established 4-joint based motion capture system. From the results, it was confirmed that the motions taken by the target were identical to the 3-dimensional simulation.

Keywords: inertial sensor, motion capture, motion data acquisition, posture imbalance

Procedia PDF Downloads 515
24025 Predictive Analytics in Oil and Gas Industry

Authors: Suchitra Chnadrashekhar

Abstract:

Information technology, once viewed as a support function within an organization, has become a critical utility for managing daily operations. Organizations are processing amounts of data that would have been unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. IT has given the oil and gas industry the leverage to store, manage, and process data as efficiently as possible, thus deriving economic value from day-to-day operations. Proper synchronization between operational data systems and information technology systems is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenges of critical equipment performance, life cycle, integrity, and security, and by increasing equipment utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic, systems approach towards asset optimization, making functional information available at all levels of the organization in order to make the right decisions. This paper discusses how the use of predictive analytics in the oil and gas industry is redefining the dynamics of this sector. The paper is supported by real-time data and an evaluation of the data for a given oil production asset in an application tool, SAS. The reason for using SAS for our analysis is that it provides an analytics-based framework to improve the uptime, performance, and availability of crucial assets while reducing the amount of unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions. With state-of-the-art analytics and reporting, maintenance problems can be predicted before they happen, and root causes determined in order to update processes for future prevention.

Keywords: hydrocarbon, information technology, SAS, predictive analytics

Procedia PDF Downloads 360
24024 The Interrelationship between Aggression and Frustration Brought about by Computer Games with Incentives among LPU Male Students

Authors: Dior Grita F. De Torres, Edielyn Gonzalvo, Jovielyn Manibo

Abstract:

The experimental study aims to measure the levels of aggression and frustration brought about by computer games with incentives, and the interrelationship of these variables. With 50 participants in each of four groups, a total of 200 male students who are avid computer game players took part in the study. The results and analyses indicate that incentives differentially affect the players' levels of aggression and frustration, with tobt = 7.18 and 6.521 > tcrit = 2.021 using a t-test for dependent groups, and Fobt = 4.527 and 8.340 > Fcrit = 3.89 using ANOVA at an alpha level of 0.05, two-tailed. A computer game's level of difficulty also affects the players' levels of aggression and frustration, with tobt = 7.53 and 4.783 > tcrit = 2.021 respectively, and Fobt = 6.524 and 10.167 > Fcrit = 3.89. Moreover, there is an interaction between incentive and level of difficulty, with tobt = 9.68 for aggression and tobt = 7.356 > 2.021 for frustration. Computer games with incentives thus have a large effect on the aggression and frustration of male LPU students.
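The abstract reports obtained t values compared against a critical value of 2.021 from a dependent-groups (paired) t-test. As a minimal illustration of how such a statistic is formed, the sketch below computes a paired t value on invented ratings; the data, sample size, and variable names are hypothetical, not the study's.

```python
from statistics import mean, stdev

def paired_t(x, y):
    """t statistic for dependent (paired) samples: mean of the pairwise
    differences divided by the standard error of those differences."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / n ** 0.5)

# Illustrative ratings (not the study's data): the same players measured
# with and without an incentive condition.
with_incentive = [14, 16, 15, 17, 13, 15, 16, 14]
without_incentive = [10, 12, 11, 12, 10, 11, 12, 10]

t_obt = paired_t(with_incentive, without_incentive)
decision = abs(t_obt) > 2.021  # compare against the study's critical value
```

A two-sided decision at alpha = 0.05 simply checks whether |t| exceeds the tabulated critical value for the relevant degrees of freedom, as the abstract does.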

Keywords: aggression, frustration, computer game, incentive

Procedia PDF Downloads 535
24023 Urban Change Detection and Pattern Analysis Using Satellite Data

Authors: Shivani Jha, Klaus Baier, Rafiq Azzam, Ramakar Jha

Abstract:

In India, people generally migrate from rural to urban areas for better infrastructural facilities, a higher standard of living, good job opportunities, and advanced transport and communication availability. Unplanned urban development caused by this migration does serious damage to land use, water quality, and available water resources. In the present work, an attempt has been made to use satellite data from different years for urban change detection of the Chennai metropolitan area, along with pattern analysis to generate future scenarios of urban development using buffer zoning in a GIS environment. In the analysis, SRTM (30 m) elevation data and IRS-1C satellite data for the years 1990, 2000, and 2014 are used. The flow accumulation, aspect, flow direction, and slope maps developed from the SRTM 30 m data are very useful for finding suitable locations for industrial setup and urban settlements. The Normalized Difference Vegetation Index (NDVI) and Principal Component Analysis (PCA) have been used in ERDAS Imagine software for detecting changes in the land use of the Chennai metropolitan area. It has been observed that the urban area of Chennai has grown exponentially, with a significant decrease in agricultural and barren lands. However, the water bodies located in the study region are protected and used as freshwater for drinking purposes. Buffer zone analysis in the GIS environment shows that development has taken place predominantly in the southwest direction and will continue to do so in the future.
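The NDVI differencing mentioned in the abstract has a simple per-pixel form, NDVI = (NIR - Red) / (NIR + Red). The sketch below applies it to two acquisition dates using made-up reflectance values; the pixel data and the -0.2 flagging threshold are illustrative, not taken from the paper.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

def ndvi_change(pixels_t1, pixels_t2):
    """Per-pixel NDVI difference between two dates (negative = vegetation loss)."""
    return [ndvi(n2, r2) - ndvi(n1, r1)
            for (n1, r1), (n2, r2) in zip(pixels_t1, pixels_t2)]

# Illustrative (NIR, Red) reflectances, not the IRS-1C data:
scene_1990 = [(0.50, 0.10), (0.45, 0.12)]  # vegetated: NIR well above Red
scene_2014 = [(0.30, 0.28), (0.25, 0.24)]  # built-up: NIR and Red nearly equal

change = ndvi_change(scene_1990, scene_2014)
urbanized = [d < -0.2 for d in change]  # a large NDVI drop flags likely urbanization
```

A real workflow would run this over co-registered raster bands rather than pixel lists, which is essentially what ERDAS Imagine automates.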

Keywords: urban change, satellite data, the Chennai metropolis, change detection

Procedia PDF Downloads 408
24022 HelpMeBreathe: A Web-Based System for Asthma Management

Authors: Alia Al Rayssi, Mahra Al Marar, Alyazia Alkhaili, Reem Al Dhaheri, Shayma Alkobaisi, Hoda Amer

Abstract:

We present in this paper a web-based system called “HelpMeBreathe” for managing asthma. The proposed system provides analytical tools which allow a better understanding of the environmental triggers of asthma, and hence better support for data-driven decision making. The developed system sends warning messages to a specific asthma patient if the weather in his/her area might cause difficulty in breathing or could trigger an asthma attack. HelpMeBreathe collects, stores, and analyzes individuals’ moving trajectories and health conditions as well as environmental data. It then processes and displays the patients’ data through an analytical tool that supports effective decision making by physicians and other decision makers.
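The core of such a warning feature is a comparison of local environmental readings against per-patient thresholds. The sketch below shows one way this rule could look; the field names, threshold values, and message format are assumptions for illustration, not HelpMeBreathe's actual schema.

```python
def exceeded_triggers(weather, thresholds):
    """Return the environmental variables that exceed the patient's thresholds."""
    return [k for k, v in weather.items() if k in thresholds and v > thresholds[k]]

def warning_message(patient_id, weather, thresholds):
    """Build a warning string, or return None when no trigger is elevated."""
    triggers = exceeded_triggers(weather, thresholds)
    if not triggers:
        return None
    return f"Warning for {patient_id}: elevated {', '.join(triggers)} in your area."

# Hypothetical patient profile and readings (illustrative units and names):
thresholds = {"pm2_5": 35.0, "pollen": 60.0, "humidity": 85.0}
weather = {"pm2_5": 48.2, "pollen": 20.0, "humidity": 90.0, "temp_c": 31.0}

msg = warning_message("patient_001", weather, thresholds)
```

In a deployed system the thresholds would come from each patient's health record and the readings from a weather or air-quality feed for the patient's current location.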

Keywords: asthma, environmental triggers, map interface, web-based systems

Procedia PDF Downloads 294
24021 The Labyrinth - Circular Choral Chant of Dithyramb in the 7th Century BC, Mirroring the Conjunction of the Planets and the Milky Way Circle

Authors: Kleopatra Chatzigiosi

Abstract:

The paper delves into the spatial and mythological examination of the choral chant of Dithyramb in the 7th century BC, its connections to Dionysus, and its role in the origin of drama, exploring harmonious and symbolic aspects of early Greek culture. The primary aim is to analyze the development of Dithyramb in relation to harmonic systems and early musical scales, linking them to circular time and celestial movements. Additionally, the study seeks to unveil the mythological ties of Dithyramb to ancient rituals worshipping Mother Earth Cybele. The methodology involves researching the etymology and mythology related to Dithyramb based on Pindar's works and proposing a harmonious design for the performance space of Dithyramb through harmonic spirals inspired by ancient practices. A comparative study with similar choral traditions from other ancient cultures is also included, providing a broader context for the findings of the work. The research uncovers the symbolic significance of Dithyramb as a dramatized representation of harmonic phenomena, leading to human deification within a context of Sacred Architecture, and highlights the intricate connections between music, rituals, and divine worship in ancient Greek culture. The study enriches understanding of the harmonic and symbolic underpinnings of ancient Greek choral traditions, shedding light on the complex interplay between music, mythology, and ritual practices in the development of early theatrical performances. Data was collected through an in-depth analysis of ancient texts, specifically Pindar's Dithyrambs, to trace the etymology and mythological origins of Dithyramb and its associated symbolism. The analysis involved scrutinizing ancient sources to draw connections between Dithyramb, harmonic systems, celestial movements, and mythological narratives, culminating in a comprehensive exploration of the cultural and symbolic significance of this choral tradition.
The study addresses how the choral chant of Dithyramb evolved harmoniously within the ancient Greek cultural framework, its connections to celestial phenomena and ritual practices, and the symbolic implications of its mythological associations within a sacred architectural context. The research illuminates the profound cultural, symbolic, and harmonic dimensions of the choral chant of Dithyramb, offering valuable insights into the intersections between music, mythology, and ritual in ancient Greece, enriching scholarly understanding of early theatrical traditions.

Keywords: circular choral chant of dithyramb, “exarchon” (leader), great “eniautos” (year), harmony labyrinth

Procedia PDF Downloads 22
24020 Development of Geo-computational Model for Analysis of Lassa Fever Dynamics and Lassa Fever Outbreak Prediction

Authors: Adekunle Taiwo Adenike, I. K. Ogundoyin

Abstract:

Lassa fever is a neglected tropical disease that has become a significant public health issue in Nigeria, the country with the greatest burden in Africa. This paper presents a geo-computational model for the analysis and prediction of Lassa fever dynamics and outbreaks in Nigeria. The model investigates the dynamics of the virus with respect to environmental factors and human populations. It confirms the role of the rodent host in virus transmission and identifies how climate and human population affect that transmission. The proposed methodology is carried out on a Linux operating system using the OSGeoLive virtual machine for geographical computing, which serves as a base for spatial ecology computing. The model design uses the Unified Modeling Language (UML), and the performance evaluation uses machine learning algorithms such as random forests, fuzzy logic, and neural networks. The study aims to contribute to the control of Lassa fever, which is achievable through the combined efforts of public health professionals and geocomputational and machine learning tools. The research findings will potentially be more readily accepted and utilized by decision-makers working towards the elimination of Lassa fever.

Keywords: geo-computational model, lassa fever dynamics, lassa fever, outbreak prediction, nigeria

Procedia PDF Downloads 93
24019 An Activatable Theranostic for Targeted Cancer Therapy and Imaging

Authors: Sankarprasad Bhuniya, Sukhendu Maiti, Eun-Joong Kim, Hyunseung Lee, Jonathan L. Sessler, Kwan Soo Hong, Jong Seung Kim

Abstract:

A new theranostic strategy is described. It is based on the use of an “all in one” prodrug, namely the biotinylated piperazine-rhodol conjugate 4a. This conjugate, which incorporates the anticancer drug SN-38, undergoes self-immolative cleavage when exposed to biological thiols. This leads to the tumor-targeted release of the active SN-38 payload along with fluorophore 1a. This release is made selective as the result of the biotin functionality. Fluorophore 1a is 32-fold more fluorescent than prodrug 4a. It permits the delivery and release of the SN-38 payload to be monitored easily in vitro and in vivo, as inferred from cell studies and ex vivo analyses of murine xenografts derived from HeLa cells, respectively. Prodrug 4a also displays anticancer activity in the HeLa cell murine xenograft tumor model. On the basis of these findings, we suggest that the present strategy, which combines within a single agent the key functions of targeting, release, imaging, and treatment, may have a role to play in cancer diagnosis and therapy.

Keywords: theranostic, prodrug, cancer therapy, fluorescence

Procedia PDF Downloads 537
24018 Long Term Changes of Water Quality in Latvia

Authors: Maris Klavins, Valery Rodinov

Abstract:

The aim of this study was to analyze long-term changes in surface water quality in Latvia, the spatial variability of water chemical composition, and the possible impacts of different pollution sources, as well as to analyze measures to protect national water resources through river basin management. Within this study, the concentrations of major water ingredients and microelements in the major rivers and lakes of Latvia have been determined. Metal concentrations in river and lake waters were compared with overall water chemical composition. The mean concentrations of trace metals in the inland waters of Latvia are appreciably lower than the estimated world averages for river waters and close to or lower than background values, except where regional impacts are determined by local geochemistry. This may be explained by a comparatively low level of anthropogenic load. At the same time, direct anthropogenic impacts are evident in several places, reflecting the influence of both point sources and transboundary transport. Different processes related to the pollution of surface waters in Latvia have also been analyzed. First, the changes and composition of pollutant emissions in Latvia were analyzed, and the results were compared with the actual composition of atmospheric precipitation and its changes over time.
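The paper's keywords include trend analysis; one standard non-parametric choice for hydrochemical time series is the Mann-Kendall test. The sketch below computes only its S statistic on invented annual concentrations, as an illustration of the idea rather than the paper's actual method.

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic: for every later-vs-earlier pair of values,
    add +1 for an increase and -1 for a decrease. Large positive S suggests
    an upward monotonic trend (significance requires comparing S to its variance)."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    return s

# Illustrative annual mean concentrations (mg/L), not the Latvian monitoring data:
concentration = [2.1, 2.3, 2.2, 2.6, 2.8, 2.7, 3.0, 3.1]
s = mann_kendall_s(concentration)
trend = "increasing" if s > 0 else "decreasing" if s < 0 else "none"
```

A full test would also compute the variance of S (with tie corrections) to obtain a Z score and p-value.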

Keywords: water quality, trend analysis, pollution, human impact

Procedia PDF Downloads 268
24017 Geographic Information Systems and Remotely Sensed Data for the Hydrological Modelling of Mazowe Dam

Authors: Ellen Nhedzi Gozo

Abstract:

Unavailability of adequate hydro-meteorological data has always limited the analysis and understanding of the hydrological behaviour of several dam catchments, including Mazowe Dam in Zimbabwe. The problem of insufficient data for the Mazowe Dam catchment was addressed by extracting catchment characteristics and areal hydro-meteorological data from ASTER, LANDSAT and Shuttle Radar Topography Mission (SRTM) remote sensing (RS) images using the ILWIS, ArcGIS and ERDAS Imagine geographic information systems (GIS) software. Available observed hydrological and meteorological data complemented the remotely sensed information. Ground truth land cover was mapped using a Garmin eTrex global positioning system (GPS) receiver. This information was then used to validate the land cover classification obtained from the remote sensing images. A bathymetry survey was conducted using a SONAR system connected to a GPS. Hydrological modelling using the HBV model was then performed to simulate the hydrological processes of the catchment in an effort to verify the reliability of the derived parameters. The model output shows a Nash-Sutcliffe coefficient close to 1, indicating that the parameters derived from remote sensing and GIS can be applied with confidence in the analysis of the Mazowe Dam catchment.
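The Nash-Sutcliffe coefficient used to judge the HBV simulation has a compact definition: one minus the ratio of the squared simulation error to the variance of the observations about their mean. The sketch below computes it on made-up discharge values, not the Mazowe Dam record.

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / (sum of squared deviations of the
    observations from their mean). 1.0 is a perfect fit; 0.0 means the model
    is no better than predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_obs = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_obs

# Illustrative monthly discharge values (m^3/s):
observed = [10.0, 14.0, 22.0, 18.0, 12.0, 9.0]
simulated = [11.0, 13.5, 21.0, 18.5, 12.5, 9.5]

nse = nash_sutcliffe(observed, simulated)  # close to 1 for a good simulation
```

A value near 1, as reported in the abstract, indicates the simulated hydrograph tracks the observed one closely.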

Keywords: geographic information systems, hydrological modelling, remote sensing, water resources management

Procedia PDF Downloads 336
24016 A Bayesian Model with Improved Prior in Extreme Value Problems

Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro

Abstract:

In Extreme Value Theory, inference on the parameters of the distribution uses only a small part of the observed values. When block maxima are taken, many data are discarded. We developed a new Bayesian inference model to use all the information provided by the data, introducing informative priors and exploiting the relations between baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: Exponential, Normal and Gumbel. Secondly, we considered mixtures of Normal variables, to simulate practical situations when data do not conform to pure distributions because of perturbations (noise).
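The data loss the authors address comes from the block-maxima step itself: only one value per block enters the likelihood. A minimal sketch of that extraction, with an invented series and block size:

```python
def block_maxima(data, block_size):
    """Split a series into consecutive fixed-size blocks and keep only each
    block's maximum; all other observations are discarded by classical
    block-maxima inference (the waste the Bayesian model tries to avoid)."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    return [max(b) for b in blocks if len(b) == block_size]

# Illustrative daily measurements grouped into 5-day blocks (not the paper's data):
series = [3.1, 4.7, 2.9, 5.2, 3.8,
          6.1, 2.4, 3.3, 4.0, 5.5,
          2.8, 3.9, 7.2, 4.1, 3.0]

maxima = block_maxima(series, 5)
discarded = len(series) - len(maxima)  # 12 of 15 values never enter the likelihood
```

Under broad conditions these maxima follow a Gumbel (or, more generally, GEV) distribution, which is why the paper studies baselines whose maxima converge to the Gumbel case.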

Keywords: bayesian inference, extreme value theory, Gumbel distribution, highly informative prior

Procedia PDF Downloads 198
24015 Single-Molecule Analysis of Structure and Dynamics in Polymer Materials by Super-Resolution Technique

Authors: Hiroyuki Aoki

Abstract:

The physical properties of polymer materials depend on the conformation and molecular motion of the polymer chains. Therefore, the structure and dynamic behavior of the single polymer chain have been among the most important concerns in the field of polymer physics. However, it has been impossible to directly observe the conformation of a single polymer chain in a bulk medium. In the current work, novel techniques to study the conformation and dynamics of a single polymer chain are proposed. Since fluorescence methods are extremely sensitive, fluorescence microscopy enables the direct detection of a single molecule. However, the structure of a polymer chain as large as 100 nm cannot be resolved by conventional fluorescence methods because of the diffraction limit of light. In order to observe single chains, we developed a method for labeling polymer materials with a photo-switchable dye, together with super-resolution microscopy. Real-space conformational analysis of single polymer chains with a spatial resolution of 15-20 nm was achieved. Super-resolution microscopy allows us to obtain three-dimensional coordinates; therefore, we achieved conformational analysis in three dimensions. Direct observation by nanometric optical microscopy reveals detailed information on the molecular processes in various polymer systems.

Keywords: polymer materials, single molecule, super-resolution techniques, conformation

Procedia PDF Downloads 306
24014 Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing

Authors: Rowan P. Martnishn

Abstract:

During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables: historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours' worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to both clean and format all the data. First, a basic spelling and grammar check was applied, along with a Python script for normalized formatting, which used an open-source grammatical formatter to make the data as coherent as possible. Ten documents were randomly selected for manual review, in which words commonly mis-transcribed during transcription were recorded and then replaced throughout all other documents. Then, to remove banter and side comments, the transcripts were spliced into paragraphs (separated by change in speaker) and all paragraphs with fewer than 300 characters were removed. Secondly, a keyword extractor (a form of natural language processing in which significant words in a document are selected) was run on each paragraph of every interview. Every proper noun was placed into a data structure corresponding to its respective interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was applied to each paragraph that included any of the proper nouns selected from the interview. At this stage the information to review had been cut from about 60 hours' worth of data to 20. The data was further processed through light, manual observation: any summaries which fit the criteria of the proposed deliverable were selected, as well as their locations within the document. This narrowed the data down to about 5 hours' worth of processing.
The qualitative researchers were then able to find 8 more connections in addition to the previous 4, exceeding the minimum quota of 3 required to satisfy the grant. Major findings of the study, and the subsequent curation of this methodology, raised a conceptual point crucial to working with qualitative data of this magnitude. In the use of artificial intelligence there is a general trade-off in a model between breadth of knowledge and specificity. If the model has too much knowledge, the user risks leaving out important data (too general). If the tool is too specific, it has not seen enough data to be useful. This methodology proposes a solution to that trade-off. The data is never altered beyond grammatical and spelling checks. Instead, the important information is marked, creating an indicator of where the significant data is without compromising its purity. Secondly, the data is chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords (allowing generalization over the whole document). This way, no data is harmed, and qualitative experts can review the raw data instead of highly manipulated results. Given the success in deliverable creation as well as the circumvention of this trade-off, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
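Two of the pipeline steps above, splicing transcripts into speaker turns and dropping short banter, then pulling candidate proper nouns, can be sketched with the standard library. The splitting convention (blank lines between turns) and the crude capitalization heuristic are stand-ins for the paper's unnamed tooling, not its actual implementation.

```python
def split_paragraphs(transcript):
    """Splice a transcript into turns separated by blank lines and drop short
    ones, mirroring the paper's < 300 character banter filter."""
    turns = [t.strip() for t in transcript.split("\n\n")]
    return [t for t in turns if len(t) >= 300]

def proper_nouns(paragraph):
    """Crude proper-noun pick: capitalized words that are not sentence-initial.
    A stand-in for the keyword extractor, which the paper does not name."""
    words = paragraph.split()
    nouns = set()
    for prev, word in zip(words, words[1:]):
        clean = word.strip(".,;:!?\"'")
        if clean.istitle() and not prev.endswith((".", "!", "?")):
            nouns.add(clean)
    return sorted(nouns)

# Illustrative sentence, not from the actual interviews:
demo = proper_nouns("We spoke with Alan Kay about Xerox and Smalltalk.")
```

In the full pipeline, each interview's extracted nouns would then gate which paragraphs are sent to the B.A.R.T. summarizer.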

Keywords: B.A.R.T. model, keyword extractor, natural language processing, qualitative coding

Procedia PDF Downloads 29
24013 Culture and Commodification: A Study of William Gibson's the Bridge Trilogy

Authors: Aruna Bhat

Abstract:

Culture can be placed within the social structure that embodies both the creation of social groups and the manner in which they interact with each other. As many critics have pointed out, culture in the postmodern context has often been considered a commodity, and indeed it shares many attributes with commercial products. Popular culture follows many patterns of behavior derived from economics, from the simple principle of supply and demand to the creation of marketable demographics which fit certain criteria. This trend is clearly visible in contemporary fiction, especially contemporary science fiction, and cyberpunk fiction in particular, an offshoot of pure science fiction. William Gibson is one such author who portrays such a scenario in his works, and in his Bridge trilogy he adds another level of interpretation to this state of affairs by describing a world centered on industrialization of a new kind, one that revolves around data in cyberspace. In this new world, data has become the most important commodity, and man has become nothing but a nodal point in a vast ocean of raw data, resulting in the commodification of everything, including culture. This paper will attempt to study the presence of the above-mentioned elements in William Gibson's The Bridge Trilogy. The theories applied will be postmodernism and cultural studies.

Keywords: culture, commodity, cyberpunk, data, postmodern

Procedia PDF Downloads 504
24012 Effective Urban Design on Environmental Quality Improvement of Historical Textures: A Case Study on Khajeh Khezr Neighborhood in Kerman City

Authors: Saman Sobhani

Abstract:

Historical neighborhoods carry special values and, in addition to evoking collective memories, they must meet certain criteria of environmental quality in order to remain livable for citizens. From the perspective of urban planners and designers, a neighborhood, as an urban space, has to satisfy the various functional needs of citizens as well as their spiritual requirements. In this research, based on the components of environmental quality derived from the theoretical model presented (functional-structural, physical-spatial, and substantive), an integrated analysis has been performed on the Khajeh Khezr neighborhood in Kerman, a neighborhood of historical value. After studying its weaknesses and strengths using the AIDA model, mechanisms to promote environmental quality through neighborhood organization are presented, and related urban design projects are defined accordingly. Analysis of the findings shows that the inhabitants of the Khajeh Khezr neighborhood are not very satisfied with the quality of its urban environment. The research uses the descriptive-analytical method and a review of texts in the form of library studies, and applies a case study with observation and a questionnaire in the form of field studies.

Keywords: environmental quality, Kerman, Khajeh Khezr, neighborhood

Procedia PDF Downloads 85
24011 A Predictive Model for Turbulence Evolution and Mixing Using Machine Learning

Authors: Yuhang Wang, Jorg Schluter, Sergiy Shelyag

Abstract:

The high cost associated with high-resolution computational fluid dynamics (CFD) is one of the main challenges that inhibit the design, development, and optimisation of new combustion systems adapted for renewable fuels. In this study, we propose a physics-guided CNN-based model to predict turbulence evolution and mixing without requiring a traditional CFD solver. The model architecture is built upon U-Net and the inception module, while a physics-guided loss function is designed by introducing two additional physical constraints to enforce the conservation of both mass and pressure over the entire predicted flow field. The model is trained on Large Eddy Simulation (LES) results of a natural turbulent mixing layer for two Reynolds number cases (Re = 3000 and 30000). The model prediction shows excellent agreement with the corresponding CFD solutions in terms of both the spatial distribution and the temporal evolution of turbulent mixing. Such promising predictive performance opens up the possibility of performing accurate high-resolution manifold-based combustion simulations at low computational cost, accelerating the iterative design process of new combustion systems.
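The idea of a physics-guided loss is to add soft penalty terms to the ordinary data loss. The sketch below shows the shape of such a loss on a tiny 1-D field: an MSE term plus a single conservation penalty on total "mass". The penalty weight, the one-constraint form, and the toy fields are illustrative assumptions; the paper's actual loss adds both a mass and a pressure constraint inside a CNN training loop.

```python
def physics_guided_loss(pred, target, lam=0.1):
    """Data loss (MSE) plus a soft conservation penalty: the predicted field's
    total mass (sum over cells) should match the target's. lam weights the
    physics term against the data term."""
    n = len(pred)
    mse = sum((p - t) ** 2 for p, t in zip(pred, target)) / n
    mass_gap = abs(sum(pred) - sum(target)) / n
    return mse + lam * mass_gap

# Two candidate predictions for a tiny 1-D density field:
target = [1.0, 2.0, 3.0, 2.0]
conserving = [1.1, 1.9, 3.0, 2.0]  # same total mass, small pointwise error
leaky = [1.1, 1.9, 3.0, 1.0]       # loses mass: penalized beyond its MSE

loss_ok = physics_guided_loss(conserving, target)
loss_leaky = physics_guided_loss(leaky, target)
```

During training, gradients of the penalty term push the network toward predictions that respect the conservation law even where the data term alone would tolerate a violation.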

Keywords: computational fluid dynamics, turbulence, machine learning, combustion modelling

Procedia PDF Downloads 91
24010 The Women-In-Mining Discourse: A Study Combining Corpus Linguistics and Discourse Analysis

Authors: Ylva Fältholm, Cathrine Norberg

Abstract:

One of the major threats identified to successful future mining is that women do not find the industry attractive. Many attempts have been made, for example in Sweden and Australia, to create organizational structures and mining communities attractive to both genders. Despite such initiatives, many mining areas are developing into gender-segregated fly-in/fly-out communities dominated by men, with both social and economic consequences. One of the challenges facing many mining companies is thus to break traditional gender patterns and structures. To do this, increased knowledge about gender in the context of mining is needed. Since language both constitutes and reproduces knowledge, such knowledge can be gained through an exploration and description of the mining discourse from a gender perspective. The aim of this study is to explore what conceptual ideas are activated in connection to the physical/geographical mining area and to work within the mining industry. We use a combination of critical discourse analysis, involving close reading of selected texts such as policy documents, interview materials, applications, and research and innovation agendas, and analyses of linguistic patterns found in large language corpora covering millions of words of contemporary language production. The quantitative corpus data serves as a point of departure for the qualitative analysis of the texts, that is, it suggests which patterns to explore further. The study shows that despite technological and organizational development, one of the most persistent discourses about mining is the conception of mining areas as dangerous and unfriendly places, infused with traditional ideals of masculinity and hard manual work.
Although some of the texts analyzed highlight gender issues, and describe gender-equalizing initiatives, such as wage-mapping systems, female networks and recruitment efforts for women executives, and thereby render the discourse less straightforward, it is shown that these texts are not unambiguous examples of a counter-discourse. They rather illustrate that discourses are not stable but include opposing discourses, in dialogue with each other. For example, many texts highlight why and how women are important to mining, at the same time as they suggest that gender and diversity are all about women: why mining is a problem for them, how they should be, and what they should do to fit in. Drawing on a constitutive view of discourse, knowledge about such conflicting perceptions of women is a prerequisite for succeeding in attracting women to the mining industry and thereby contributing to the development of future mining.

Keywords: discourse, corpus linguistics, gender, mining

Procedia PDF Downloads 264
24009 Impact of Safety and Quality Considerations of Housing Clients on the Construction Firms’ Intention to Adopt Quality Function Deployment: A Case of Construction Sector

Authors: Saif Ul Haq

Abstract:

The current study examines the safety and quality considerations of clients of housing projects and their impact on the adoption of Quality Function Deployment (QFD) by construction firms. A mixed-methods research design was used to collect and analyze the data: a survey collected data from 220 clients of housing projects in Saudi Arabia, and telephone and Skype interviews were then conducted with 15 professionals working in the top ten real estate companies of Saudi Arabia. Data were analyzed using partial least squares (PLS) and thematic analysis techniques. Findings reveal that today's customers prioritize the safety and quality requirements of their houses and that, as a result, construction firms adopt QFD to address those needs. The findings are of great importance for clients of housing projects as well as for construction firms, which could apply QFD in housing projects to address the safety and quality concerns of their clients.

Keywords: construction industry, quality considerations, quality function deployment, safety considerations

Procedia PDF Downloads 125
24008 Laser Beam Micro-Drilling Effect on Ti-6Al-4V Titanium Alloy Sheet Properties

Authors: Petr Homola, Roman Růžek

Abstract:

Laser beam micro-drilling (LBMD) is one of the most important non-contact machining processes for materials that are difficult to machine by means of conventional machining methods used in various industries. The paper is focused on the knock-down effect of LBMD on the properties of Ti-6Al-4V (Grade 5) titanium alloy sheets. Two different process configurations were verified, with a focus on laser damage in the back-structure parts affected by the process. The effects of LBMD on the material properties were assessed by means of tensile and fatigue tests and fracture surface analyses. The fatigue limit of the LBMD configurations reached a significantly lower value, between 15% and 30% of the static strength, compared to the reference raw material's 58%. The farther back-structure configuration gives a two-fold fatigue life compared to the closer LBMD configuration at a given applied stress.

Keywords: fatigue, fracture surface, laser beam micro-drilling, titanium alloy

Procedia PDF Downloads 156
24007 Restrictedly-Regular Map Representation of n-Dimensional Abstract Polytopes

Authors: Antonio Breda d’Azevedo

Abstract:

Regularity has often been present in the form of regular polyhedra or tessellations; classical examples are the nine regular polyhedra consisting of the five Platonic solids (regular convex polyhedra) and the four Kepler-Poinsot polyhedra. These polytopes can be seen as regular maps. Maps are cellular embeddings of graphs (with possibly multiple edges, loops, or dangling edges) on compact connected (closed) surfaces with or without boundary. The n-dimensional abstract polytopes, particularly the regular ones, have gained popularity over recent years, and the main focus of research has been their symmetries and regularity. Planification of polyhedra helps their spatial construction, yet it destroys their symmetries. To our knowledge, there is no “planification” for n-dimensional polytopes. However, we show that it is possible to make a “surfacification” of the n-dimensional polytope, that is, to construct a restrictedly-marked map representation of the abstract polytope on some surface that describes its combinatorial structure as well as all of its symmetries. We also show that there are infinitely many ways to do this; yet there is one, more natural, that describes reflections on the sides ((n−1)-faces) of n-simplices with reflections on the sides of n-polygons. We illustrate this construction with the 4-tetrahedron (a regular 4-polytope with automorphism group of size 120) and the 4-cube (a regular 4-polytope with automorphism group of size 384).

Keywords: abstract polytope, automorphism group, n-simplices, symmetry

Procedia PDF Downloads 165
24006 GIS-Based Automatic Flight Planning of Camera-Equipped UAVs for Fire Emergency Response

Authors: Mohammed Sulaiman, Hexu Liu, Mohamed Binalhaj, William W. Liou, Osama Abudayyeh

Abstract:

Emerging technologies such as camera-equipped unmanned aerial vehicles (UAVs) are increasingly being applied in building fire rescue to provide real-time visualization and 3D reconstruction of the entire fireground. However, flight planning of camera-equipped UAVs is usually a manual process, which is not sufficient to fulfill the needs of emergency management. This research proposes a Geographic Information System (GIS)-based approach to automatic flight planning of camera-equipped UAVs for building fire emergency response. In this research, the Haversine formula and lawn-mowing patterns are employed to automate flight planning based on geometrical and spatial information from GIS. The resulting flight mission satisfies the requirements of 3D reconstruction of the fireground, in consideration of flight execution safety and visibility of camera frames. The proposed approach is implemented within a GIS environment through an application programming interface. A case study is used to demonstrate the effectiveness of the proposed approach. The result shows that a flight mission can be generated in a timely manner for application to fire emergency response.
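
The two building blocks named in the abstract, great-circle distance via the Haversine formula and a serpentine lawn-mowing sweep, can be sketched as below. This is a minimal illustration, not the authors' implementation; the bounding box, row count, and function names are assumptions.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def lawnmower_waypoints(lat_min, lat_max, lon_min, lon_max, rows):
    """Serpentine (lawn-mowing) sweep over a bounding box: alternating
    west-to-east and east-to-west passes at evenly spaced latitudes.
    rows must be >= 2."""
    waypoints = []
    for i in range(rows):
        lat = lat_min + (lat_max - lat_min) * i / (rows - 1)
        leg = [(lat, lon_min), (lat, lon_max)]
        if i % 2:          # reverse every other pass to avoid dead transits
            leg.reverse()
        waypoints.extend(leg)
    return waypoints
```

For example, one degree of latitude at the equator comes out near 111 km, and a 4-row sweep of a small box yields 8 waypoints in serpentine order.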

Keywords: GIS, camera-equipped UAVs, automatic flight planning, fire emergency response

Procedia PDF Downloads 125
24005 Customers’ Acceptability of Islamic Banking: Employees’ Perspective in Peshawar

Authors: Tahira Imtiaz, Karim Ullah

Abstract:

This paper incorporates bank employees’ perspective on the acceptability of Islamic banking by the customers of Peshawar. A qualitative approach is adopted, for which six in-depth interviews with employees of Islamic banks were conducted. The employees were asked to share their experience regarding customers’ attitudes towards the acceptability of Islamic banking. Collected data were analyzed through the thematic analysis technique and its synthesis with the current literature. Through data analysis, a theoretical framework is developed which highlights the factors that drive customers towards Islamic banking, as witnessed by the employees. The practical implication of the analyzed data is that a new model could be developed on the basis of four determinants of human preference, namely: inner satisfaction, time, faith, and market forces.

Keywords: customers’ attraction, employees’ perspective, Islamic banking, Riba

Procedia PDF Downloads 333
24004 Customized Design of Amorphous Solids by Generative Deep Learning

Authors: Yinghui Shang, Ziqing Zhou, Rong Han, Hang Wang, Xiaodi Liu, Yong Yang

Abstract:

The design of advanced amorphous solids, such as metallic glasses, with targeted properties through artificial intelligence signifies a paradigmatic shift in physical metallurgy and materials technology. Here, we developed a machine-learning architecture that facilitates the generation of metallic glasses with targeted multifunctional properties. Our architecture integrates a state-of-the-art unsupervised generative adversarial network model with supervised models, allowing the incorporation of general prior knowledge, derived from thousands of data points across a vast range of alloy compositions, into the creation of data points for a specific type of composition; this overcomes the common issue of data scarcity typically encountered in the design of a given type of metallic glass. Using our generative model, we have successfully designed copper-based metallic glasses which display exceptionally high hardness or a remarkably low modulus. Notably, our architecture can not only explore uncharted regions in the targeted compositional space but also permits self-improvement after experimentally validated data points are added to the initial dataset for subsequent cycles of data generation, hence paving the way for the customized design of amorphous solids without human intervention.

Keywords: metallic glass, artificial intelligence, mechanical property, automated generation

Procedia PDF Downloads 56
24003 Thermodynamic Analyses of Information Dissipation along the Passive Dendritic Trees and Active Action Potential

Authors: Bahar Hazal Yalçınkaya, Bayram Yılmaz, Mustafa Özilgen

Abstract:

Brain information transmission in the neuronal network occurs in the form of electrical signals. The neural network transmits information between neurons, or between neurons and target cells, by moving charged particles in a voltage field; a fraction of the energy utilized in this process is dissipated via entropy generation. Exergy loss and entropy generation models demonstrate the inefficiencies of communication along the dendritic trees. In this study, neurons of four different animals were analyzed with a one-dimensional cable model with N=6 identical dendritic trees and M=3 orders of symmetrical branching. Each branch bifurcates symmetrically in accordance with the 3/2 power law in an infinitely long cylinder with the usual core conductor assumptions, where membrane potential is conserved in the core conductor at all branching points. In the model, exergy loss and entropy generation rates are calculated for each branch of equivalent cylinders of electrotonic length (L) ranging from 0.1 to 1.5 for four different dendritic branches: the input branch (BI), the sister branch (BS), and two cousin branches (BC-1 and BC-2). Thermodynamic analysis with the data coming from two different cat motoneuron studies shows that in both experiments nearly the same amount of exergy is lost while generating nearly the same amount of entropy. The guinea pig vagal motoneuron loses twofold more exergy compared to the cat models, and the squid’s exergy loss and entropy generation were nearly tenfold those of the guinea pig vagal motoneuron model. Thermodynamic analysis shows that the energy dissipated in the dendritic trees is directly proportional to the electrotonic length, exergy loss, and entropy generation. Entropy generation and exergy loss show variability not only between vertebrates and invertebrates but also within the same class.
Concurrently, the Na+ ion load of a single action potential, the metabolic energy utilization, and their thermodynamic aspects were calculated for the squid giant axon and a mammalian motoneuron model. Energy demand is supplied to the neurons in the form of adenosine triphosphate (ATP). Exergy destruction and entropy generation upon ATP hydrolysis are calculated. ATP utilization, exergy destruction, and entropy generation showed differences in each model, depending on the variations in ion transport along the channels.
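
For context, the 3/2 power law referenced above is Rall's equivalent-cylinder branching condition, and the electrotonic length L normalizes a cylinder's physical length by its space constant. A standard statement of both (from cable theory generally, not taken from the paper itself) is:

```latex
d_{\text{parent}}^{3/2} = \sum_{i} d_{\text{child},i}^{3/2},
\qquad
L = \frac{\ell}{\lambda},
\qquad
\lambda = \sqrt{\frac{R_m\, d}{4\, R_i}},
```

where the d are branch diameters, ℓ is the cylinder length, R_m is the specific membrane resistance, and R_i is the intracellular resistivity. When the branching condition holds, a dendritic tree can be collapsed into a single equivalent cylinder, which is what makes the per-branch exergy and entropy calculations above tractable.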

Keywords: ATP utilization, entropy generation, exergy loss, neuronal information transmittance

Procedia PDF Downloads 393
24002 Complex Decision Rules in Quality Assurance Processes for Quick Service Restaurant Industry: Human Factors Determining Acceptability

Authors: Brandon Takahashi, Marielle Hanley, Gerry Hanley

Abstract:

The large-scale quick-service restaurant industry is a complex business to manage optimally. With over 40 suppliers providing different ingredients for food preparation and thousands of restaurants serving over 50 unique food offerings across a wide range of regions, the company must implement a quality assurance process. Businesses want to deliver quality food efficiently, reliably, and successfully at a low cost that the public wants to buy. They also want to make sure that their food offerings are never unsafe to eat or of poor quality. A good reputation (and profitable business) developed over the years can be gone in an instant if customers fall ill eating the company’s food. Poor quality also results in food waste, and the cost of corrective actions is compounded by the reduction in revenue. Product compliance evaluation assesses whether the supplier’s ingredients are within compliance with the specifications of several attributes (physical, chemical, organoleptic) that a company will test to ensure that quality, safe-to-eat food is given to the consumer and delivers the same eating experience in all parts of the country. The technical component of the evaluation includes the chemical and physical tests that produce numerical results relating to shelf-life, food safety, and organoleptic qualities. The psychological component of the evaluation includes the organoleptic, that is, acting on or involving the use of the sense organs. The rubric for product compliance evaluation has four levels: (1) Ideal: meeting or exceeding all technical (physical and chemical), organoleptic, and psychological specifications. (2) Deviation from ideal but no impact on quality: not meeting or exceeding some technical and organoleptic/psychological specifications, without impact on consumer quality, and meeting all food safety requirements. (3) Acceptable: not meeting or exceeding some technical and organoleptic/psychological specifications, resulting in a reduction of consumer quality but not enough to lessen demand, and meeting all food safety requirements. (4) Unacceptable: not meeting food safety requirements, independent of meeting technical and organoleptic specifications; or meeting all food safety requirements but with product quality resulting in consumer rejection of the food offering. Sampling of products and consumer tastings within the distribution network is a second critical element of the quality assurance process and provides the data sources for the statistical analyses. Each finding is not independently assessed with the rubric; for example, the chemical data will be used to support any inferences on the sensory profiles of the ingredients. Certain flavor profiles may not be as apparent when mixed with other ingredients, which leads to weighing specifications differentially in the acceptability decision. Quality assurance processes are essential to achieve the balance of quality and profitability by making sure the food is safe and tastes good while identifying and remediating product quality issues before they hit the stores. Comprehensive quality assurance procedures implement human factors methodologies, and this report provides recommendations for systemic application of quality assurance processes for quick-service restaurant services. This case study reviews the complex decision rubric and evaluates processes to ensure the right balance of cost, quality, and safety is achieved.

Keywords: decision making, food safety, organoleptics, product compliance, quality assurance

Procedia PDF Downloads 188
24001 R Data Science for Technology Management

Authors: Sunghae Jun

Abstract:

Technology management (TM) is an important issue for a company improving its competitiveness. Among the many activities of TM, technology analysis (TA) is an important factor, because most decisions for the management of technology are made based on the results of TA. TA analyzes the developed results of a target technology using statistics or the Delphi method. TA based on Delphi depends on the experts’ domain knowledge; in comparison, TA by statistics and machine learning algorithms uses objective data, such as patents or papers, instead of the experts’ knowledge. Many quantitative TA methods based on statistics and machine learning have been studied, and these have been used for technology forecasting, technological innovation, and management of technology. They applied diverse computing tools and many analytical methods case by case, and it is not easy to select the suitable software and statistical method for a given TA task. So, in this paper, we propose a methodology for quantitative TA using the statistical computing software called R and data science to construct a general framework of TA. From the results of a case study, we also show how our methodology is applied in a real field. This research contributes to R&D planning and technology valuation in TM areas.

Keywords: technology management, R system, R data science, statistics, machine learning

Procedia PDF Downloads 458
24000 Mixture Statistical Modeling for Predicting Mortality in Human Immunodeficiency Virus (HIV) and Tuberculosis (TB) Infection Patients

Authors: Mohd Asrul Affendi Bi Abdullah, Nyi Nyi Naing

Abstract:

The purpose of this study was to compare the negative binomial death rate (NBDR) and zero-inflated negative binomial death rate (ZINBDR) models for deceased patients with (HIV+TB+) and (HIV+TB−). HIV and TB are serious worldwide problems in developing countries. Data were analyzed by applying NBDR and ZINBDR to determine which model is more favorable. The ZINBDR model is able to account for the disproportionately large number of zeros within the data and is shown to be a consistently better fit than the NBDR model. Hence, the ZINBDR model is a superior fit to the data and provides additional information regarding the death mechanisms of HIV+TB. The ZINBDR model is shown to be a useful tool for analyzing the death rate according to age category.
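
The zero-inflation mechanism that lets ZINBDR absorb the excess zeros can be sketched as a mixture: with probability pi the count is a structural zero, otherwise it is drawn from a negative binomial. This is an illustrative sketch with hypothetical parameters, not the authors' fitted model.

```python
from math import comb

def nb_pmf(k, r, p):
    """Negative binomial pmf: P(K = k) = C(k+r-1, k) * p^r * (1-p)^k."""
    return comb(k + r - 1, k) * p**r * (1 - p)**k

def zinb_pmf(k, pi, r, p):
    """Zero-inflated NB: mix a point mass pi at zero with a NB(r, p)."""
    base = (1 - pi) * nb_pmf(k, r, p)
    return pi + base if k == 0 else base

# Hypothetical parameters: 30% structural zeros on top of NB(5, 0.5).
pi, r, p = 0.3, 5, 0.5
print(zinb_pmf(0, pi, r, p))  # zero probability inflated above the plain NB's
print(nb_pmf(0, r, p))
```

In practice, the mixture weight pi and the NB parameters are estimated jointly by maximum likelihood, and the two fits are then compared via AIC and BIC, as in the abstract's keywords.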

Keywords: zero inflated negative binomial death rate, HIV and TB, AIC and BIC, death rate

Procedia PDF Downloads 432
23999 Identifying the Structural Components of Old Buildings from Floor Plans

Authors: Shi-Yu Xu

Abstract:

The top three risk factors that have contributed to building collapses during past earthquake events in Taiwan are: "irregular floor plans or elevations," "insufficient columns in single-bay buildings," and the "weak-story problem." Fortunately, these unsound structural characteristics can be directly identified from the floor plans. However, due to the vast number of old buildings, conducting manual inspections to identify these compromised structural features in all existing structures would be time-consuming and prone to human errors. This study aims to develop an algorithm that utilizes artificial intelligence techniques to automatically pinpoint the structural components within a building's floor plans. The obtained spatial information will be utilized to construct a digital structural model of the building. This information, particularly regarding the distribution of columns in the floor plan, can then be used to conduct preliminary seismic assessments of the building. The study employs various image processing and pattern recognition techniques to enhance detection efficiency and accuracy. The study enables a large-scale evaluation of structural vulnerability for numerous old buildings, providing ample time to arrange for structural retrofitting in those buildings that are at risk of significant damage or collapse during earthquakes.

Keywords: structural vulnerability detection, object recognition, seismic capacity assessment, old buildings, artificial intelligence

Procedia PDF Downloads 89
23998 Impact of Firm Location and Organizational Structure on Receipt and Effectiveness of Social Assistance

Authors: Nalanda Matia, Julia Zhao, Amber Jaycocks, Divya Sinha

Abstract:

Social assistance programs for businesses are intended to improve their survival and growth in the face of catastrophic events like the COVID-19 pandemic. However, that goal remains unfulfilled when the businesses most in need fail to participate in such programs. Reasons for non-participation can include lack of information, inability to cope with applications and program compliance, as well as some programs’ non-entitlement status. Some of these factors may be associated with the organizational and locational characteristics of these businesses. This research investigates the organizational and locational factors that determine receipt and effectiveness of social assistance among the firms that receive it. A sample of firms from the universe of three rounds of Small Business Administration-backed Paycheck Protection Program recipients, along with similarly profiled non-recipient businesses, is used to analyze this question. Initial results show that firm organizational factors, like size, and spatial factors, like broadband coverage at the firm’s location, impact application for and subsequent receipt of assistance for digitally administered programs. Further, the line of business and wage structure of recipients impact the effectiveness of the assistance dollars.

Keywords: public economics, economics of social assistance, firm organizational structure, survival analysis

Procedia PDF Downloads 168
23997 Building Safety Through Real-time Design Fire Protection Systems

Authors: Mohsin Ali Shaikh, Song Weiguo, Muhammad Kashan Surahio, Usman Shahid, Rehmat Karim

Abstract:

When a disaster threatens an area of a structure and personal safety is affected, the effectiveness of disaster prevention, evacuation, and rescue operations can be summarized by three assessment indicators: personal safety, property preservation, and attribution of responsibility. These indicators are applicable regardless of the disaster that affects the building. People need to get out of the hazardous area and reach a safe place as soon as possible, because there is no other way to respond. The outcome of a tragedy is thus closely related to how quickly people are advised to evacuate and how quickly they are rescued. This study considers present fire prevention systems to address catastrophes and improve building safety. It proposes the methods of Prevention Level for Deployment in Advance and Spatial Transformation by Human-Machine Collaboration. We present and prototype a real-time fire protection system architecture for building disaster prevention, evacuation, and rescue operations. The design encourages the use of simulations to check the efficacy of evacuation, rescue, and disaster prevention procedures throughout the planning and design phase of the structure.

Keywords: prevention level, building information modeling, quality management system, simulated reality

Procedia PDF Downloads 69