Search results for: scripting
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24

24 A Goal-Driven Crime Scripting Framework

Authors: Hashem Dehghanniri

Abstract:

Crime scripting is a simple and effective crime modeling technique that aims to improve security analysts' understanding of security and crime incidents. Low-quality scripts provide a wrong, incomplete, or overly complicated understanding of the crime commission process, which undermines the purpose of their application, e.g., identifying effective and cost-efficient situational crime prevention (SCP) measures. One important and overlooked factor in generating quality scripts is the crime scripting method. This study investigates the problems within existing crime scripting practices and proposes a crime scripting approach that contributes to generating quality crime scripts. The approach was validated by experienced crime scripters. This framework helps analysts develop better crime scripts and contributes to their effective application, e.g., in identifying SCP measures or in policy-making.

Keywords: attack modelling, crime commission process, crime script, situational crime prevention

Procedia PDF Downloads 100
23 Cross Site Scripting (XSS) Attack and Automatic Detection Technology Research

Authors: Tao Feng, Wei-Wei Zhang, Chang-Ming Ding

Abstract:

Cross-site scripting (XSS) is one of the most common web attack methods in use today, and also one of the most dangerous. With the ubiquity of JavaScript, the attack surface for cross-site scripting has gradually expanded. However, because web application developers tend to focus only on functional testing and lack awareness of XSS, many live web projects contain XSS vulnerabilities. In this paper, various XSS attack techniques are analyzed, and a method to detect them automatically is proposed. When run as a plug-in, it makes the results of vulnerability detection easy to check.
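
The abstract does not reproduce the detection plug-in, but the core of an automated reflected-XSS probe can be sketched as follows; the payload, the parameter names, and the use of the requests library are illustrative assumptions, not the authors' implementation:

```python
# Minimal sketch of a reflected-XSS probe (illustrative only, not the authors' tool).
# Assumes the third-party "requests" package; payload and logic are simplified.
import requests

PROBE = '<script>alert("xss-probe-7f3a")</script>'  # unique marker payload

def probe_reflected_xss(url, param_names):
    """Inject a marker payload into each parameter and report parameters
    whose value is reflected unescaped in the response body."""
    findings = []
    for name in param_names:
        resp = requests.get(url, params={name: PROBE}, timeout=10)
        if PROBE in resp.text:  # reflected verbatim -> likely unescaped output
            findings.append(name)
    return findings

if __name__ == "__main__":
    # Hypothetical target; only probe applications you are authorized to test.
    vulnerable = probe_reflected_xss("http://testsite.example/search", ["q", "page"])
    print("Potentially vulnerable parameters:", vulnerable)
```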

Keywords: XSS, no target attack platform, automatic detection, XSS detection

Procedia PDF Downloads 374
22 The Integration and Automation of EDA Tools in an Integrated Circuit Design Environment

Authors: Rohaya Abdul Wahab, Raja Mohd Fuad Tengku Aziz, Nazaliza Othman, Sharifah Saleh, Nabihah Razali, Rozaimah Baharim, M. Hanif M. Nasir

Abstract:

This paper discusses how EDA tools are integrated and automated in an integrated circuit design environment. One of the problems in our current environment is that users must manually configure library paths, start-up files, and project directories. Certain manual processes between users and applications can be automated, provided they remain transparent to the users. For example, users should be able to run the applications directly after login without knowing the locations of library paths and start-up files. The solution to these problems is to automate the processes using standard configuration files, which benefits both the users and EDA support. This paper discusses how this automation is implemented using scripting languages such as Perl, Tcl, Scheme, and shell script. These scripting tools are great assets for design engineers building a robust and powerful design flow, and this technique is widely used to integrate all the tools together.
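
As an illustration of the configuration-file approach, a minimal Python sketch is given below; the INI layout, section keys, environment variables, and tool flags are hypothetical examples, not the production setup described in the paper:

```python
# Sketch of automating EDA tool setup from a standard configuration file.
# The INI layout, key names, and paths are hypothetical examples.
import configparser
import os
import subprocess

def launch_tool(config_path, tool, project):
    cfg = configparser.ConfigParser()
    cfg.read(config_path)

    # Resolve license, library paths, and start-up files transparently for the user.
    env = os.environ.copy()
    env["LM_LICENSE_FILE"] = cfg[tool]["license_server"]
    env["PDK_LIB_PATH"] = cfg[tool]["library_path"]

    # Enter the project directory so the user never configures it manually.
    os.makedirs(cfg["projects"][project], exist_ok=True)
    os.chdir(cfg["projects"][project])

    # Start the application with its start-up file already wired in.
    subprocess.run([cfg[tool]["executable"], "-init", cfg[tool]["startup_file"]],
                   env=env, check=True)
```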

Keywords: EDA tools, Integrated Circuits, scripting, integration, automation

Procedia PDF Downloads 292
21 CVOIP-FRU: Comprehensive VoIP Forensics Report Utility

Authors: Alejandro Villegas, Cihan Varol

Abstract:

Voice over Internet Protocol (VoIP) products are an emerging technology that can contain forensically important information about criminal activity. Even without usernames and passwords, this information can still be gathered by investigators. Although a few VoIP forensic investigative applications are available in the literature, most are designed specifically to collect evidence from the Skype product. Therefore, to assist law enforcement in collecting forensically important information from a variety of Betamax VoIP tools, the CVOIP-FRU framework was developed. CVOIP-FRU provides a data-gathering solution that retrieves usernames, contact lists, and call and SMS logs from Betamax VoIP products. It is a scripting utility that searches for data within the registry, logs, and user roaming profiles on Windows and Mac OS X operating systems, then parses the output into readable text and HTML formats. One advantage of CVOIP-FRU over other applications is that, owing to its intelligent data filtering capabilities and cross-platform scripting back end, it can be expanded to cover other VoIP solutions as well. Overall, this paper describes the exploratory analysis performed to find the key data paths and locations, the development stages of the framework, and the empirical testing and quality assurance of CVOIP-FRU.
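
A minimal sketch of the artifact-collection idea, with hypothetical product folder names and file patterns rather than the actual CVOIP-FRU search logic, might look like this:

```python
# Sketch of profile artifact collection in the spirit of CVOIP-FRU.
# Product folder names and file patterns are hypothetical placeholders.
import os
import glob
import html

BETAMAX_DIRS = ["VoipBuster", "VoipWise"]  # example Betamax-branded products

def collect_artifacts(profile_root):
    """Search a user roaming profile for VoIP log files, per product."""
    hits = []
    for product in BETAMAX_DIRS:
        base = os.path.join(profile_root, product)
        for path in glob.glob(os.path.join(base, "**", "*.log"), recursive=True):
            hits.append((product, path))
    return hits

def write_html_report(hits, out_path):
    """Render the findings as a simple HTML table, as in the utility's report step."""
    rows = "".join(f"<tr><td>{html.escape(p)}</td><td>{html.escape(f)}</td></tr>"
                   for p, f in hits)
    with open(out_path, "w", encoding="utf-8") as fh:
        fh.write(f"<table><tr><th>Product</th><th>Artifact</th></tr>{rows}</table>")
```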

Keywords: betamax, digital forensics, report utility, VoIP, VoIPBuster, VoIPWise

Procedia PDF Downloads 264
20 Revolutionizing Gaming Setup Design: Utilizing Generative and Iterative Methods for Prop and Environment Design, Transforming the Landscape of Game Development Through Automation and Innovation

Authors: Rashmi Malik, Videep Mishra

Abstract:

Generative design has become a transformative approach for efficiently producing multiple iterations of any design project. The conventional way of modeling game elements is time-consuming and requires skilled artists: a 3D modeling tool such as 3ds Max or Blender is traditionally used to create the game library, which takes considerable time to model. This study focuses on using generative design tools to increase efficiency in game development at the prop and environment generation stage, involving procedural levels and customized, regulated, or randomized asset generation. The paper presents a system design approach that uses generative tools such as Grasshopper (visual scripting) and other scripting tools to automate game library modeling. A single script can generate multiple products, creating a system that lets designers and artists customize props and environments. The main goal is to measure the efficacy of the automated system in creating a wide variety of game elements, reducing the need for manual content creation and integrating it into the workflows of AAA and indie games.
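
The single-script idea can be illustrated with a small sketch; the prop type, parameter ranges, and the "regulated" rule below are invented for illustration, not taken from the paper:

```python
# Sketch of regulated/randomized prop generation from one script, mirroring
# the visual-scripting logic described; parameter ranges are invented.
import random

def generate_crate(seed, regulated=True):
    rng = random.Random(seed)       # seeded -> reproducible variants
    width = rng.uniform(0.8, 1.6)
    return {
        "width": width,
        "depth": width if regulated else rng.uniform(0.8, 1.6),  # regulated = square base
        "height": rng.uniform(0.6, 1.2),
        "plank_count": rng.randint(4, 8),
        "wear": rng.uniform(0.0, 1.0),   # drives texture/damage variation
    }

# One script, many products: a whole library of crate variants.
library = [generate_crate(seed) for seed in range(100)]
```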

Keywords: iterative game design, generative design, gaming asset automation, generative game design

Procedia PDF Downloads 41
19 Improving the Run Times of Existing and Historical Demand Models Using Simple Python Scripting

Authors: Abhijeet Ostawal, Parmjit Lall

Abstract:

The run times of a large strategic model that we were managing had become too long, leading to delays in project delivery, increased costs, and lost productivity. Software developers continuously work towards more efficient tools by changing their algorithms and processes. The issue our team faced was how to apply the latest technologies to validated existing models that are based on much older versions of the software and lack the latest capabilities. The multi-modal transport model we had could only be run in sequential assignment order. Recent upgrades to the software now allow the assignment to be run in parallel, a concept called parallelization. Parallelization is implemented as a Python script that works only within the latest version of the software. A full model transfer to the latest version was not possible due to time, budget, and potential changes in trip assignment. This article shows how to adapt and update the Python script so that it can be used with older software versions, by calling the latest version and then recalling the old version for the assignment model without affecting the results. Through a process of trial and error, run time savings of up to 30-40% have been achieved. Assignment results were maintained within the older version, and through this learning process we have applied the methodology to other, even older versions of the software, resulting in huge time savings and greater productivity and efficiency for both client and consultant.
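
The paper's wrapper script is not reproduced in the abstract; a minimal sketch of the cross-version idea, with entirely hypothetical executable names, stage flags, and version numbers, might look like this:

```python
# Sketch of the cross-version approach described: call the newest software
# release only for the parallelized step, then return to the validated old
# version. Executable names and command-line flags are hypothetical.
import subprocess

NEW_EXE = r"C:\Software\v23\modelrun.exe"   # latest version: supports parallel runs
OLD_EXE = r"C:\Software\v18\modelrun.exe"   # validated version: everything else

def run_model(project):
    # Demand and other sequential steps stay in the validated old version.
    subprocess.run([OLD_EXE, project, "--stage", "demand"], check=True)
    # Only the parallelizable work is delegated to the new version's engine.
    subprocess.run([NEW_EXE, project, "--stage", "assignment", "--parallel", "4"],
                   check=True)
    # Results are handed back to the old version so outputs stay comparable.
    subprocess.run([OLD_EXE, project, "--stage", "post"], check=True)
```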

Keywords: model run time, demand model, parallelisation, python scripting

Procedia PDF Downloads 92
18 Creating Futures: Using Fictive Scripting Methods for Institutional Strategic Planning

Authors: Christine Winberg, James Garraway

Abstract:

Many key university documents, such as vision and mission statements and strategic plans, are aspirational and future-oriented. There is a wide range of future-oriented methods used in planning applications, ranging from mathematical modelling to expert opinion. Many of these methods have limitations, and planners using these tools might, for example, make the technical-rational assumption that their plans will unfold in a logical and inevitable fashion, thus underestimating the many complex forces at play in planning for an unknown future. This is the issue that this study addresses. The overall project aim was to assist a new university of technology in developing appropriate responses to its social responsibility, graduate employability, and research missions in its strategic plan. The specific research question guiding the research activities and approach was: how might the use of innovative future-oriented planning tools enable or constrain a strategic planning process? The research objective was to engage collaborating groups in the use of an innovative tool to develop and assess future scenarios, for the purpose of developing deeper understandings of possible futures and their challenges. The scenario planning tool chosen was 'fictive scripting', an analytical technique derived from technology forecasting and innovation studies. Fictive scripts are future projections that also take into account the present shape of the world and current developments. The process thus began with a critical diagnosis of the present, highlighting its tensions and frictions. The collaborative groups then developed fictive scripts, each group producing a future scenario that foregrounded different institutional missions, their implications, and possible consequences. The scripts were analyzed with a view to identifying their potential contribution to the university's strategic planning exercise. The unfolding fictive scripts revealed a number of insights in terms of unexpected benefits, unexpected challenges, and unexpected consequences. These insights were not evident in previous strategic planning exercises. The contribution that this study offers is to show how better choices can be made and potential pitfalls avoided through a systematic foresight exercise. When universities develop strategic planning documents, they are looking into the future. In this paper, it is argued that the use of appropriate tools for future-oriented exercises can help planners understand more fully what achieving desired outcomes might entail, what challenges might be encountered, and what unexpected consequences might ensue.

Keywords: fictive scripts, scenarios, strategic planning, technological forecasting

Procedia PDF Downloads 98
17 Development of a Wind Resource Assessment Framework Using Weather Research and Forecasting (WRF) Model, Python Scripting and Geographic Information Systems

Authors: Jerome T. Tolentino, Ma. Victoria Rejuso, Jara Kaye Villanueva, Loureal Camille Inocencio, Ma. Rosario Concepcion O. Ang

Abstract:

Wind energy is rapidly emerging as a significant source of electricity in the Philippines, although developing an accurate wind resource model is difficult. In this study, the Weather Research and Forecasting (WRF) Model, an open-source mesoscale Numerical Weather Prediction (NWP) model, was used to produce a 1-year atmospheric simulation at 4 km resolution over the Ilocos Region of the Philippines. Annual mean wind speed data were extracted from the WRF output (netCDF) using a Python-based graphical user interface, and the wind resource assessment was then produced using GIS software. Results of the study showed that Python scripts are more flexible than other post-processing tools for dealing with netCDF files. Using the WRF Model, Python, and geographic information systems, a reliable wind resource map was produced.
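
As an illustration of the extraction step, a minimal sketch using the netCDF4 and numpy Python packages is shown below; the file name is a placeholder, while U10 and V10 are the standard 10 m wind components in WRF output:

```python
# Sketch of extracting annual mean 10 m wind speed from a WRF output file,
# assuming the netCDF4 and numpy packages and standard U10/V10 variables.
import numpy as np
from netCDF4 import Dataset

with Dataset("wrfout_d01_2015.nc") as nc:   # hypothetical file name
    u10 = nc.variables["U10"][:]            # (time, south_north, west_east)
    v10 = nc.variables["V10"][:]

speed = np.sqrt(u10**2 + v10**2)            # wind speed per time step and cell
annual_mean = speed.mean(axis=0)            # average over the time dimension

np.savetxt("annual_mean_wind.csv", annual_mean, delimiter=",")  # for GIS import
```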

Keywords: wind resource assessment, weather research and forecasting (WRF) model, python, GIS software

Procedia PDF Downloads 417
16 An Integrated Web-Based Workflow System for Design of Computational Pipelines in the Cloud

Authors: Shuen-Tai Wang, Yu-Ching Lin

Abstract:

As more and more workflow systems adopt the cloud as their execution environment, various challenges arise that must be addressed for the cloud to be utilized efficiently. This paper introduces a method for resource provisioning based on our previous research on dynamic allocation and its pipeline processes. We present an abstraction for workload scheduling in which independent tasks are scheduled among the available processors of a distributed computing system for optimization. We also propose an integrated web-based workflow designer that takes advantage of HTML5 technology and chains multiple tools together. To allow combinations of multiple pipelines to execute on the cloud in parallel, we developed a script translator and an execution engine for workflow management in the cloud. All information is known in advance by the workflow engine, and tasks are allocated according to prior knowledge held in the repository. This effort has the potential to support process definition, workflow enactment, and the monitoring of workflow processes. Users benefit from a web-based system that allows the creation and execution of pipelines without scripting knowledge.
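
The scheduling abstraction can be illustrated with the classic longest-processing-time (LPT) greedy heuristic, sketched below; the task names and cost estimates are invented, not the paper's scheduler:

```python
# Sketch of scheduling independent tasks over available processors using the
# longest-processing-time (LPT) greedy heuristic to balance total load.
import heapq

def schedule_lpt(task_costs, n_processors):
    """task_costs: dict task_name -> estimated cost (known in advance,
    as in the repository described). Returns processor -> task list."""
    heap = [(0, p) for p in range(n_processors)]        # (current_load, processor)
    assignment = {p: [] for p in range(n_processors)}
    for task, cost in sorted(task_costs.items(), key=lambda kv: -kv[1]):
        load, proc = heapq.heappop(heap)                # least-loaded processor
        assignment[proc].append(task)
        heapq.heappush(heap, (load + cost, proc))
    return assignment

# Example with invented pipeline stages and runtime estimates.
print(schedule_lpt({"align": 30, "sort": 12, "call": 25, "annotate": 8}, 2))
```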

Keywords: workflow systems, resource provisioning, workload scheduling, web-based, workflow engine

Procedia PDF Downloads 128
15 Ontology for Cross-Site-Scripting (XSS) Attack in Cybersecurity

Authors: Jean Rosemond Dora, Karol Nemoga

Abstract:

In this work, we tackle a problem that frequently occurs in the cybersecurity field: the exploitation of websites by XSS attacks, which are nowadays considered complicated attacks. These attacks aim to execute malicious scripts in the client's web browser by including code in a legitimate web page. The matter becomes serious when a website accepts user input. Attackers can exploit the web application (if it is vulnerable) and then steal sensitive data (session cookies, passwords, credit cards, etc.) from the server and/or the client, although the difficulty of exploitation varies from website to website. Our focus is on the use of ontology in cybersecurity against XSS attacks, on the importance of ontology, and on its core meaning for cybersecurity. We explain how a vulnerable website can be exploited and how different JavaScript payloads can be used to detect vulnerabilities. We also enumerate some tools for efficient analysis. We present detailed reasoning on what can be done to improve the security of a website in order to resist attacks, with supporting examples. We then apply an ontology model against XSS attacks to strengthen the protection of a web application. We note, however, that the existence of an ontology does not by itself improve security: it has to be properly used, and a maximum of security layers should be taken into account.
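
As an illustration of how such an ontology's core concepts might be encoded, a small sketch using the rdflib Python package is shown below; the class and property names are invented, not the authors' ontology:

```python
# Sketch of encoding XSS ontology concepts as triples, assuming the
# third-party rdflib package; the vocabulary below is illustrative only.
from rdflib import Graph, Namespace, Literal, RDFS

SEC = Namespace("http://example.org/xss-ontology#")
g = Graph()

g.add((SEC.ReflectedXSS, RDFS.subClassOf, SEC.XSSAttack))
g.add((SEC.StoredXSS, RDFS.subClassOf, SEC.XSSAttack))
g.add((SEC.XSSAttack, SEC.exploits, SEC.UnsanitizedUserInput))
g.add((SEC.XSSAttack, SEC.mitigatedBy, SEC.OutputEncoding))
g.add((SEC.XSSAttack, SEC.mitigatedBy, SEC.ContentSecurityPolicy))
g.add((SEC.OutputEncoding, RDFS.comment,
       Literal("Encode user-controlled data before rendering it in HTML.")))

# Query the model: which security layers mitigate XSS?
for mitigation in g.objects(SEC.XSSAttack, SEC.mitigatedBy):
    print(mitigation)
```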

Keywords: cybersecurity, web application vulnerabilities, cyber threats, ontology model

Procedia PDF Downloads 139
14 Analysis of Spamming Threats and Some Possible Solutions for Online Social Networking Sites (OSNS)

Authors: Dilip Singh Sisodia, Shrish Verma

Abstract:

Spamming is one of the most common problems on the Internet today, especially on online social networking sites such as Facebook, Twitter, and Google+. Spam messages waste Internet bandwidth and server storage space. On social networking sites, spammers often disguise themselves by creating fake accounts and hijacking users' accounts for personal gain; they behave like normal users and continually change their spamming strategies. To counter this, most modern spam-filtering solutions are deployed on the receiver side, where they are good at filtering spam for end users. In this paper, we present several spamming techniques, their behaviour, and possible solutions. We analyze how spammers enter online social networking sites (OSNSs), how they target them, and the techniques they use. The five spamming techniques discussed are clickjacking, socially engineered attacks, cross-site scripting, URL shortening, and drive-by download. We use the Elgg framework to demonstrate some of these spamming threats and the implementation of corresponding solutions.
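
One countermeasure implied by the URL-shortening discussion can be sketched briefly: expand a shortened link to its final destination before rendering it, so it can be checked against a blocklist. The example URL is hypothetical:

```python
# Sketch of expanding shortened URLs before a user follows them, assuming
# the third-party "requests" package; the short link below is hypothetical.
import requests

def expand_url(short_url):
    """Follow redirects with a HEAD request and return the final destination."""
    resp = requests.head(short_url, allow_redirects=True, timeout=10)
    return resp.url

final = expand_url("https://tinyurl.example/abc123")
print("Resolves to:", final)  # compare against a domain blocklist before rendering
```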

Keywords: online social networking sites, spam, attacks, internet, clickjacking / likejacking, drive-by-download, URL shortening, networking, socially engineered attacks, elgg framework

Procedia PDF Downloads 306
13 NextCovps: Design and Stress Analysis of Dome Composite Overwrapped Pressure Vessels Using the Geodesic Trajectory Approach

Authors: Ammar Maziz, Prateek Gupta, Thiago Vasconcellos Birro, Benoit Gely

Abstract:

Hydrogen, as a sustainable fuel, has the highest energy density per unit mass compared to conventional non-renewable sources. As the world moves towards sustainability, especially in the aviation and automotive sectors, it becomes important to address the storage of hydrogen as compressed gas in high-pressure tanks. To improve designs for the efficient storage and transportation of hydrogen, this paper presents the design and stress analysis of the dome of Composite Overwrapped Pressure Vessels (COPVs) using the geodesic trajectory approach, which optimizes the dome design and results in a lightweight, efficient structure. Python scripting is employed to implement the mathematical modeling of the COPV, and after validating the model against a published reference, stress analysis is conducted using the Abaqus commercial code. The results demonstrate the effectiveness of the geodesic trajectory approach in achieving a lightweight and structurally sound dome design, as well as the accuracy and reliability of the stress analysis in Abaqus. This study provides insights into the design and analysis of COPVs for aerospace applications, with potential for further optimization and application in other industries.
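
The geodesic relation underlying the approach can be illustrated with a short sketch. By Clairaut's theorem for geodesics on a surface of revolution, r·sin(α) is constant along the fibre path, so the winding angle follows directly from the local parallel radius and the polar opening radius; the radii used below are illustrative:

```python
# Sketch of the geodesic trajectory relation for dome winding: Clairaut's
# theorem gives r * sin(alpha) = r0 (the polar opening radius) along a geodesic.
import numpy as np

def geodesic_winding_angle(r, r0):
    """Winding angle (deg) at parallel radius r for polar opening radius r0."""
    return np.degrees(np.arcsin(np.clip(r0 / r, -1.0, 1.0)))

r_cyl, r0 = 0.20, 0.05              # illustrative cylinder and opening radii (m)
radii = np.linspace(r0, r_cyl, 50)  # meridian of the dome
angles = geodesic_winding_angle(radii, r0)
print(f"angle at cylinder: {angles[-1]:.1f} deg, at polar opening: {angles[0]:.1f} deg")
# The fibre approaches 90 degrees at the polar opening; its helix angle on the
# cylinder follows arcsin(r0 / r_cyl).
```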

Keywords: composite overwrapped pressure vessels, carbon fiber, geodesic trajectory approach, dome design, stress analysis, Python plugin

Procedia PDF Downloads 60
12 ParkedGuard: An Efficient and Accurate Parked Domain Detection System Using Graphical Locality Analysis and Coarse-To-Fine Strategy

Authors: Chia-Min Lai, Wan-Ching Lin, Hahn-Ming Lee, Ching-Hao Mao

Abstract:

As the Internet develops non-stop worldwide, making a profit by lending out registered domain names has emerged as a new business in recent years. Unfortunately, the larger the market for domain lending services becomes, the greater the risk that malicious behaviors or malware hide behind parked domains. Previous work on differentiating parked domains suffers from two main defects: 1) too much data-collection effort and CPU latency are needed for feature engineering, and 2) ineffectiveness when detecting parked domains containing external links, which are often abused by hackers, e.g., in drive-by download attacks. Aiming to alleviate these defects without sacrificing practical usability, this paper proposes ParkedGuard, an efficient and accurate parked-domain detector. Several scripting behavioral features were analyzed, and those with special statistical significance were adopted in ParkedGuard to make feature engineering much more cost-efficient. On the other hand, finding memberships between external links and parked domains was modeled as a graph mining problem, and a coarse-to-fine strategy was designed to leverage graphical locality, such that ParkedGuard outperforms the state of the art in terms of both recall and precision.
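
A sketch of the coarse-to-fine idea is given below; the feature names, thresholds, and link-graph scores are invented for illustration and are not ParkedGuard's actual features:

```python
# Sketch of a coarse-to-fine strategy in the spirit of ParkedGuard:
# a cheap scripting-feature filter first, then a finer link-graph check.
# All feature names and thresholds are invented for illustration.

def coarse_filter(page):
    """Cheap behavioural features: flag pages that even *might* be parked."""
    score = 0
    score += page["ad_script_count"] > 3        # heavy ad-network scripting
    score += page["redirect_count"] > 1         # chained redirects on load
    score += page["unique_text_words"] < 50     # thin, template-like content
    return score >= 2

def fine_check(page, parked_link_graph):
    """Finer stage, exploiting graphical locality: do the page's external
    links point into neighbourhoods dominated by known parked domains?"""
    scores = [parked_link_graph.get(link, 0.0) for link in page["external_links"]]
    return bool(scores) and sum(scores) / len(scores) > 0.5

def is_parked(page, parked_link_graph):
    # Fine (expensive) analysis only runs on pages the coarse stage flags.
    return coarse_filter(page) and fine_check(page, parked_link_graph)
```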

Keywords: coarse-to-fine strategy, domain parking service, graphical locality analysis, parked domain

Procedia PDF Downloads 387
11 Finite Element Modelling of a 3D Woven Composite for Automotive Applications

Authors: Ahmad R. Zamani, Luigi Sanguigno, Angelo R. Maligno

Abstract:

A 3D woven composite, designed for automotive applications, is studied using the Abaqus Finite Element (FE) software suite. Python scripts were developed to build FE models of the woven composite in the Complete Abaqus Environment (CAE). They can read TexGen or WiseTex files and automatically generate consistent meshes of the fabric and the matrix. A user menu is provided to help define parameters for the FE models, such as the type and size of the elements in the fabric and matrix, as well as the type of matrix-fabric interaction. Node-to-node constraints were imposed to guarantee periodicity of the deformed shapes at the boundaries of the representative volume element of the composite. Tensile loads along the three axes and biaxial loads in the x-y directions were applied at different fibre volume fractions (FVFs). A simple damage model was implemented via an Abaqus user material (UMAT) subroutine. Existing tools for homogenisation were also used, including voxel mesh generation from TexGen and the Abaqus Micromechanics plugin. Linear relations between the homogenised elastic properties and the FVFs are given. The FE models of the composite exhibited balanced behaviour with respect to the warp and weft directions in terms of both stiffness and strength.
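
The node-to-node periodicity step can be illustrated independently of Abaqus: pair nodes on opposite faces of the unit cell by their remaining coordinates, then emit a constraint equation for each pair. The sketch below is generic Python, not the authors' script:

```python
# Sketch of pairing nodes on opposite faces of a representative volume
# element so that periodic node-to-node constraints can be generated.
import numpy as np

def pair_periodic_nodes(nodes, axis=0, tol=1e-6):
    """nodes: dict node_id -> (x, y, z). Returns (master, slave) pairs
    between the min and max faces along the given axis."""
    ids = list(nodes.keys())
    coords = np.array([nodes[i] for i in ids])
    lo, hi = coords[:, axis].min(), coords[:, axis].max()
    lo_face = [i for i, c in zip(ids, coords) if abs(c[axis] - lo) < tol]
    hi_face = [i for i, c in zip(ids, coords) if abs(c[axis] - hi) < tol]

    pairs = []
    for a in lo_face:   # match by the two in-plane coordinates
        pa = np.delete(np.array(nodes[a]), axis)
        b = min(hi_face,
                key=lambda j: np.linalg.norm(np.delete(np.array(nodes[j]), axis) - pa))
        pairs.append((a, b))
    return pairs
```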

Keywords: 3D woven composite (3DWC), meso-scale finite element model, homogenisation of elastic material properties, Abaqus Python scripting

Procedia PDF Downloads 113
10 Actresses as Eunuchs: The Versatility of Cross-Gendered Roles in Eighteenth-Century Orientalist Theatre

Authors: Anne Greenfield

Abstract:

Introductory Statement: During the eighteenth century in London, there were over two dozen theatrical productions that featured eunuchoid characters, most of which were set in 'Eastern' locales, including the Ottoman Empire, Persia, India, and China. These characters have gone largely overlooked by recent scholars, and more analysis is needed to illustrate the contemporary values and anxieties reflected in these popular and recurring figures. Methodology: This paper adopts a New Historical and Cultural Studies approach to theatrical depictions of eunuchs, drawing insights from seventeenth- and eighteenth-century literary works, travel narratives, medical treatises, and histories of the age. Major Findings: As this paper demonstrates, there was a high degree of complexity, variety, and, at times, respect underlying orientalist theatrical depictions of eunuchs. Not only were eunuchoid characters represented in strikingly diverse ways in scripts, but these roles were also played by a heterogeneous group of actors and even actresses. More specifically, this paper looks closely at three actresses who took roles as eunuchs in tragedies: Mrs. Verbruggen (aka Mrs. Mountfort), Mrs. Rogers, and Mrs. Bicknell, all of whom were otherwise best known as comediennes. These casting choices provided an entertaining twist on the breeches roles these actresses often played. In fact, the staging and scripting of these roles, when analyzed through the lens of cross-gendered casting, become ironic and comical in several scenes that are usually assumed (by recent scholars) to be thoroughly tragic. Conclusion: Ultimately, a careful look at the staging of eunuchoid characters sheds light not only on how these productions were performed and understood, but also on how writers and theatre managers navigated the Other, whether in gender identity or culture, during this era.

Keywords: eunuch, actress, literature, drama

Procedia PDF Downloads 104
9 Application of a Lighting Design Method Using Mean Room Surface Exitance

Authors: Antonello Durante, James Duff, Kevin Kelly

Abstract:

The visual needs of people in modern work-based buildings are changing. The self-illuminated screens of computers, televisions, tablets, and smartphones have changed the relationship between people and the lit environment. In the past, lighting design practice was primarily based on providing uniform horizontal illuminance on the working plane, but this has failed to ensure good-quality lit environments. Today's lighting standards continue to be set based on a 100-year-old approach that, at its core, considers task illuminance of the utmost importance, with the task typically located on a horizontal plane. An alternative method focused on appearance has been proposed, as opposed to the traditional performance-based approach. Mean Room Surface Exitance (MRSE) and Target-Ambient Illuminance Ratio (TAIR) are two new metrics proposed to assess illumination adequacy in interiors. The hypothesis is that these metrics will be superior to the existing, horizontal-illuminance-led ones. For the past six years, research within the Dublin Institute of Technology has examined this, with a view to determining the suitability of the approach for general lighting practice. Since the start of this research, a number of key findings have been produced, centered on how occupants react to various levels of MRSE. This paper provides a broad update on how this research has progressed. More specifically, this paper will: i) demonstrate how MRSE can be measured using HDR imaging technology, ii) illustrate how MRSE can be calculated using scripting and an open-source lighting computation engine, and iii) describe experimental results that show how occupants have reacted to various levels of MRSE within experimental office environments.
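
As an illustration of point ii), a sketch of an MRSE calculation is given below. It assumes the commonly cited formulation in which MRSE is the first reflected flux divided by the total room absorption; the surface data are invented, and the paper's own computation pipeline may differ:

```python
# Sketch of an MRSE calculation, assuming the formulation
#   MRSE = sum(E_i * rho_i * A_i) / sum(A_i * (1 - rho_i))
# where E_i is the direct illuminance, rho_i the reflectance, and A_i the
# area of each room surface. The room data below are invented.

def mrse(surfaces):
    """surfaces: list of (direct_illuminance_lux, reflectance, area_m2)."""
    first_reflected_flux = sum(e * rho * a for e, rho, a in surfaces)
    room_absorption = sum(a * (1.0 - rho) for _, rho, a in surfaces)
    return first_reflected_flux / room_absorption

room = [
    (120.0, 0.70, 30.0),   # ceiling
    (200.0, 0.50, 48.0),   # walls
    (350.0, 0.20, 30.0),   # floor
]
print(f"MRSE = {mrse(room):.0f} lm/m^2")
```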

Keywords: illumination hierarchy (IH), mean room surface exitance (MRSE), perceived adequacy of illumination (PAI), target-ambient illumination ratio (TAIR)

Procedia PDF Downloads 161
8 Profiling Risky Code Using Machine Learning

Authors: Zunaira Zaman, David Bohannon

Abstract:

This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, tuning of false positives and negatives, and automated feedback. The initial approach, which used natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) of Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite was used to predict specific vulnerabilities such as OS command injection, cryptographic, and cross-site scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS command injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated, intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues. However, predicting vulnerabilities in source code using machine learning poses challenges, such as the high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
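
A simplified illustration of path-context extraction is sketched below, using Python's ast module as a stand-in for the Java/C++ tooling; real Code2Vec path-contexts run leaf-to-leaf through the lowest common ancestor, which this sketch only approximates by concatenating root paths:

```python
# Crude sketch of path-context extraction from a Python AST, standing in for
# the Java/C++ AST pipeline described in the study.
import ast
import itertools

def terminals_with_paths(tree):
    """Yield (terminal_token, node-type path from the root) pairs."""
    def walk(node, path):
        label = type(node).__name__
        if isinstance(node, ast.Name):
            yield node.id, path + (label,)
        elif isinstance(node, ast.arg):
            yield node.arg, path + (label,)
        elif isinstance(node, ast.Constant):
            yield repr(node.value), path + (label,)
        for child in ast.iter_child_nodes(node):
            yield from walk(child, path + (label,))
    yield from walk(tree, ())

def path_contexts(source):
    """Pair terminals through a concatenated path (Code2Vec-style, simplified)."""
    terms = list(terminals_with_paths(ast.parse(source)))
    for (t1, p1), (t2, p2) in itertools.combinations(terms, 2):
        yield t1, p1 + tuple(reversed(p2)), t2

for context in path_contexts("def f(x):\n    return x + 1\n"):
    print(context)
```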

Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties

Procedia PDF Downloads 76
7 Scalable UI Test Automation for Large-scale Web Applications

Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani

Abstract:

This research concerns optimizing UI test automation for large-scale web applications. The test target is the HHAexchange homecare management web application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionality. This study focuses on user interface automation testing for that application. The quality assurance team must execute many manual user interface test cases during the development process to confirm the absence of regression bugs. The team automated 346 test cases, but the UI automation test execution time exceeded 17 hours. The business requirement was to reduce the execution time so that high-quality products could be released quickly, and the quality assurance automation team modernized the test automation framework to optimize it. The base of the web UI automation test environment is Selenium, and the test code is written in Python. Adopting a compiled language for test code leads to an inefficient flow when introducing scalability into a traditional test automation environment, so a scripting language was adopted. The scalability mechanism is implemented mainly with AWS serverless technology, specifically the Elastic Container Service. Scalability here means the ability to automatically provision computers for test automation and to increase or decrease the number of computers running the tests. This scalable mechanism lets test cases run in parallel, so test execution time is dramatically decreased. Introducing scalable test automation offers more than reduced execution time: because test cases can be executed at the same time, challenging bugs such as race conditions may also be detected. If API and unit tests are implemented, test strategies can exploit this scalability even more efficiently; in practice, however, API and unit testing cannot cover 100% of the functional testing of a web application, since they do not reach front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed both the optimization of test case execution time and the detection of a challenging bug. The paper first describes the detailed architecture of the scalable test automation environment, then reports the actual reduction in execution time and an example of a challenging issue that was detected.
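
The fan-out mechanism can be sketched with boto3 as below; the cluster, task definition, subnet, and container names are hypothetical placeholders, not the team's actual configuration:

```python
# Sketch of fanning test shards out as parallel containers on AWS Elastic
# Container Service. All resource names are hypothetical placeholders.
import boto3

ecs = boto3.client("ecs")

def launch_shard(shard_id, total_shards):
    """Start one Fargate task that runs a slice of the UI test suite."""
    ecs.run_task(
        cluster="ui-test-cluster",
        taskDefinition="selenium-ui-tests",
        launchType="FARGATE",
        networkConfiguration={"awsvpcConfiguration": {"subnets": ["subnet-0abc"]}},
        overrides={"containerOverrides": [{
            "name": "test-runner",
            "environment": [
                {"name": "SHARD_ID", "value": str(shard_id)},
                {"name": "TOTAL_SHARDS", "value": str(total_shards)},
            ],
        }]},
    )

for shard in range(8):   # 8 parallel containers -> roughly 8x less wall-clock time
    launch_shard(shard, 8)
```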

Keywords: aws, elastic container service, scalability, serverless, ui automation test

Procedia PDF Downloads 66
6 Wildlife Habitat Corridor Mapping in Urban Environments: A GIS-Based Approach Using Preliminary Category Weightings

Authors: Stefan Peters, Phillip Roetman

Abstract:

The global loss of biodiversity threatens the benefits nature provides to human populations; it has become a more pressing issue than climate change and requires immediate attention. While there have been successful global agreements for environmental protection, such as the Montreal Protocol, these are rare, and we cannot rely on them alone. It is therefore crucial to take national and local action to support biodiversity. Australia is one of the world's 17 megadiverse countries, and its cities are vital habitats for endangered species, more of which are found in urban areas than in non-urban ones. However, the protection of biodiversity in metropolitan Adelaide has been inadequate, with over 130 species disappearing since European colonization in 1836. In this research project, we conceptualized, developed, and implemented a framework for wildlife habitat hotspot and habitat corridor modelling in an urban context, using geographic data and GIS modelling and analysis. We used detailed topographic and other geographic data provided by a local council, including the spatial and attributive properties of trees, parcels, water features, vegetated areas, roads, verges, traffic, and census data. Weighted factors considered in our raster-based habitat hotspot model include parcel size, parcel shape, population density, canopy cover, habitat quality, and proximity to habitats and water features. Weighted factors considered in our raster-based habitat corridor model include habitat potential (from the hotspot model), verge size, road hierarchy, road widths, human density, and the presence of remnant indigenous vegetation species. We developed a GIS model, using Python scripting and the ArcGIS Pro ModelBuilder, to establish an automated, reproducible, and adjustable geoprocessing workflow adaptable to any study area of interest. Our habitat hotspot and corridor modelling framework makes it possible to determine and map existing habitat hotspots and wildlife habitat corridors. The research was applied to the case study of Burnside, a local council in Adelaide, Australia, which encompasses an area of 30 km2. We applied end-user, expertise-based category weightings to refine our models and optimize the use of our habitat map outputs for informing local strategic decision-making.
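
The weighted-overlay core of such a model can be sketched in a few lines of numpy; the factor names and preliminary weights below are illustrative, not the study's calibrated values:

```python
# Sketch of a weighted raster overlay: each factor raster is normalised to
# 0-1 and combined with preliminary category weights into a hotspot surface.
import numpy as np

def normalise(raster):
    rmin, rmax = np.nanmin(raster), np.nanmax(raster)
    return (raster - rmin) / (rmax - rmin)

def habitat_hotspot(factors, weights):
    """factors: dict name -> 2D array; weights: dict name -> float (sum to 1)."""
    out = np.zeros_like(next(iter(factors.values())), dtype=float)
    for name, raster in factors.items():
        out += weights[name] * normalise(raster)
    return out

# Hypothetical 100x100 factor grids standing in for real council rasters.
rng = np.random.default_rng(0)
factors = {k: rng.random((100, 100)) for k in ("canopy", "parcel_size", "proximity")}
weights = {"canopy": 0.4, "parcel_size": 0.25, "proximity": 0.35}
hotspots = habitat_hotspot(factors, weights)
```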

Keywords: biodiversity, GIS modeling, habitat hotspot, wildlife corridor

Procedia PDF Downloads 83
5 Sexual Consent: Exploring the Perceptions of Heterosexual, Gay, and Bisexual Men

Authors: Shulamit Sternin, Raymond M. McKie, Carter Winberg, Robb N. Travers, Terry P. Humphreys, Elke D. Reissing

Abstract:

Issues surrounding sexual consent negotiation have become a major topic of societal concern. The majority of current research focuses on the complexities of sexual consent negotiation and the multitude of nuanced issues surrounding the consent obtainment of heterosexual adults in post-secondary educational institutions. To date, the only study that has addressed sexual consent negotiation behaviour in same-sex relationships focused on the extent to which individuals used a variety of verbal and nonverbal consent behaviours to initiate or respond to sexual activity. The results were consistent with trends found among heterosexual individuals, suggesting that the current understanding of sexual consent negotiation, which is grounded in heterosexual research, can serve as a strong foundation for further exploration of consent negotiation within same-sex relationships. The current study quantitatively investigated the differences between heterosexual men and gay and bisexual men (GBM) in their understanding of sexual consent negotiation. Exploring how the perceptions of GBM differ from those of heterosexual males provides insight into some of the unique challenges faced by GBM. Data were collected from a sample of 252 heterosexual men and 314 GBM from Canada, the United States, and Western Europe. Participants responded to the question, 'Do you think sexual consent and sex negotiation is different for heterosexual men compared to gay men? If so, how?' by completing an online survey. Responses were analysed following Braun and Clarke's (2006) six-phase thematic analysis guidelines. Inter-rater coding was validated using Cohen's kappa, calculated at ϰ = 0.84, indicating a very strong level of agreement between raters. The final thematic structure yielded four major themes: understanding of sexual interaction, unique challenges, scripted role, and universal consent. Respondents described their understanding of sexual interaction, believing GBM sexual consent negotiation to be faster and more immediate. This was linked to perceptions of emotional attachment and the idea that sexual interaction and emotional involvement were distinct and separate processes in GBM consent negotiation, not believed to be the case in heterosexual interactions. Unique challenges, such as different protection concerns, role declaration, and the sexualization of spaces, were understood to carry differing weight for heterosexual men and GBM. The perception of a clearly defined sexual script for GBM was suggested as a factor that may create ambiguity in sexual consent negotiation, which in turn has significant implications for unwanted sexual experiences among GBM. By broadening the scope of the current understanding of sexual consent negotiation to heterosexual and GBM populations, the current study reveals variations in the perception of consent negotiation between these two populations. These differences may be understood within the context of sexual scripting theory and masculinity gender role theory. We suggest that sexual consent negotiation is a health risk factor for GBM that has not yet been adequately understood and addressed. Awareness of the perceptions surrounding the consent negotiation of both GBM and heterosexual men has implications for public knowledge, which in turn can better inform policy-making, education, future research, and clinical treatment.

Keywords: sexual consent, negotiation, heterosexual men, GBM, sexual script

Procedia PDF Downloads 170
4 Automation of Finite Element Simulations for the Design Space Exploration and Optimization of Type IV Pressure Vessel

Authors: Weili Jiang, Simon Cadavid Lopera, Klaus Drechsler

Abstract:

Fuel cell vehicles have become the most competitive solution for the transportation sector in the hydrogen economy. The Type IV pressure vessel is currently the most popular and widely developed technology for on-board storage, owing to its high reliability and relatively low cost. Due to the stringent requirements on mechanical performance, the pressure vessel requires a large amount of composite material, a major cost driver for hydrogen tanks. Evidently, the optimization of the composite layup design shows great potential for reducing overall material usage, yet it requires a comprehensive understanding of the underlying mechanisms as well as the influence of different design parameters on mechanical performance. Given the materials and manufacturing processes by which Type IV pressure vessels are made, their design and optimization are a nuanced subject. The manifold possibilities for stacking sequence and fiber orientation variation have an outstanding effect on vessel strength due to the anisotropy of carbon fiber composites, which makes the design space high-dimensional, and each variation of design parameters requires computational resources. Using finite element analysis to evaluate different designs is the most common method; however, the modeling, setup, and simulation process can be very time-consuming and result in high computational cost. For this reason, it is necessary to build a reliable automation scheme to set up and analyze the diverse composite layups. In this research, the simulation process for different tank designs with various parameters is conducted and automated in the commercial finite element framework Abaqus. Notably, the modeling of the composite overwrap is generated automatically using the Abaqus-Python scripting interface. The prediction of the winding angle of each layer and the corresponding thickness variation in the dome region is the most crucial step of the modeling; it is calculated and implemented using analytical methods. Subsequently, the different composite layups are simulated as axisymmetric models to limit computational complexity and reduce calculation time. Finally, the results are evaluated and compared with respect to ultimate tank strength. By automatically modeling, evaluating, and comparing various composite layups, this system is applicable to the optimization of tank structures. As mentioned above, the mechanical properties of the pressure vessel are highly dependent on the composite layup, which requires a large number of simulations; automating the simulation process therefore provides a rapid way to compare designs and indicate the optimum one. Moreover, this automation process can also be used to create a data bank of layups and corresponding mechanical properties, with few preliminary configuration steps, for further case analysis; machine learning could then, for example, be used to obtain the optimum directly from the data pool without running further simulations.
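
A minimal sketch of such an automation loop is shown below; the parameter-file format and the model-building script name are assumptions, while `abaqus cae noGUI=...` is the standard batch invocation:

```python
# Sketch of a design-space sweep: write each candidate layup to a parameter
# file, then run an Abaqus/CAE model-building script in batch mode.
import itertools
import json
import subprocess

angles = [15, 30, 45, 60]        # candidate helical winding angles (deg)
layer_counts = [8, 12, 16]       # candidate numbers of composite layers

for angle, layers in itertools.product(angles, layer_counts):
    design = {"helix_angle": angle, "n_layers": layers}
    with open("design.json", "w") as fh:    # read by the Abaqus-Python script
        json.dump(design, fh)
    # build_tank.py (hypothetical) would create the axisymmetric model,
    # compute dome winding angles/thicknesses analytically, run the job,
    # and record the predicted tank strength for later comparison.
    subprocess.run(["abaqus", "cae", "noGUI=build_tank.py"], check=True)
```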

Keywords: type IV pressure vessels, carbon composites, finite element analysis, automation of simulation process

Procedia PDF Downloads 93
3 Generative Design of Acoustical Diffuser and Absorber Elements Using Large-Scale Additive Manufacturing

Authors: Saqib Aziz, Brad Alexander, Christoph Gengnagel, Stefan Weinzierl

Abstract:

This paper explores a generative design, simulation, and optimization workflow for integrating acoustical diffuser and/or absorber geometry with embedded coupled Helmholtz resonators into full-scale 3D-printed building components. Large-scale additive manufacturing, in conjunction with algorithmic CAD design tools, enables a vast amount of control when creating geometry. This is advantageous given the increasing comfort standards for indoor spaces and the demand for more resourceful and sustainable construction methods and materials. The presented methodology highlights these new technological advancements and offers a multimodal and integrative design solution with the potential for immediate application in the AEC industry. In principle, the methodology can be applied to a wide range of structural elements that can be manufactured by additive manufacturing processes. The current paper focuses on a case study of an application to a biaxial load-bearing beam grillage made of reinforced concrete, which allows for a variety of applications through the combination of additively prefabricated semi-finished parts and in-situ concrete supplementation. The semi-prefabricated parts, or formwork bodies, form the basic framework of the supporting structure and at the same time have acoustic absorption and diffusion properties that are precisely acoustically programmed for the space underneath the structure. To this end, a hybrid validation strategy is explored using a digital, cross-platform simulation environment verified with physical prototyping. The iterative workflow starts with the generation of a parametric design model for the acoustical geometry using the algorithmic visual scripting editor Grasshopper3D inside the building information modeling (BIM) software Revit. Various geometric attributes of the resonator (i.e., bottleneck and cavity dimensions) are parameterized and fed to a numerical optimization algorithm, which can modify the geometry with the goal of increasing absorption at resonance and increasing the bandwidth of the effective absorption range. Using Rhino.Inside and LiveLink for Revit, the generative model was imported directly into the multiphysics simulation environment COMSOL, where the geometry was further modified and prepared for simulation in a semi-automated process. The incident and scattered pressure fields were simulated, from which the surface-normal absorption coefficients were calculated. This reciprocal process was repeated to further optimize the geometric parameters. Subsequently, the numerical models were compared with a set of 3D-printed concrete physical twin models, which were tested in a 0.25 m × 0.25 m impedance tube. The empirical results served to improve the starting parameter settings of the initial numerical model. The geometry resulting from the numerical optimization was finally returned to Grasshopper for further implementation in an interdisciplinary study.
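
The physics being optimized can be illustrated with the classical Helmholtz resonance formula; the end-correction factor and the dimensions below are illustrative approximations, since the exact correction depends on the neck geometry:

```python
# Sketch of the resonator physics behind the optimisation: the classical
# Helmholtz resonance frequency with an approximate neck end correction.
import math

def helmholtz_frequency(neck_area, neck_length, cavity_volume, c=343.0):
    """f0 = (c / 2*pi) * sqrt(S / (V * L_eff)) for a single resonator."""
    neck_radius = math.sqrt(neck_area / math.pi)
    l_eff = neck_length + 1.7 * neck_radius   # common end-correction approximation
    return (c / (2 * math.pi)) * math.sqrt(neck_area / (cavity_volume * l_eff))

# Example bottleneck/cavity dimensions of the kind the optimiser varies (SI units):
f0 = helmholtz_frequency(neck_area=2e-4, neck_length=0.02, cavity_volume=5e-4)
print(f"resonance at {f0:.0f} Hz")
```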

Keywords: acoustical design, additive manufacturing, computational design, multimodal optimization

Procedia PDF Downloads 135
2 Solymorph: Design and Fabrication of AI-Driven Kinetic Facades with Soft Robotics for Optimized Building Energy Performance

Authors: Mohammadreza Kashizadeh, Mohammadamin Hashemi

Abstract:

This paper explores Solymorph, a kinetic building facade designed for optimal energy capture and architectural expression. The system integrates photovoltaic panels with soft robotic actuators for precise solar tracking, resulting in enhanced electricity generation compared to static facades. The growing interest in dynamic building envelopes necessitates the exploration of novel facade systems: integrating photovoltaic (PV) panels as kinetic elements offers potential benefits in increased energy generation and in regulating energy flow within buildings. However, incorporating these technologies into mainstream architecture presents challenges due to the complexity of coordinating multiple systems. To address this, Solymorph leverages soft robotic actuators, known for their compliance, resilience, and ease of integration. Additionally, the project investigates the potential of Large Language Models (LLMs) to streamline the design process. The research methodology involved design development, material selection, component fabrication, and system assembly. Grasshopper (GH) was employed within the digital design environment for parametric modeling and scripting logic, and an LLM was used experimentally to generate Python code for creating a random surface with user-defined parameters. Various techniques, including casting, 3D printing, and laser cutting, were utilized to fabricate the physical components. Finally, a modular assembly approach was adopted to facilitate installation and maintenance. A case study is presented focusing on the application of Solymorph to an existing library building at Politecnico di Milano. The facade system is divided into sub-frames to optimize solar exposure while maintaining a visually appealing aesthetic. Preliminary structural analyses were conducted using Karamba3D to assess deflection behavior and axial loads within the cable-net structure. Additionally, Finite Element (FE) simulations were performed in Abaqus to evaluate the mechanical response of the soft robotic actuators under pneumatic pressure. To validate the design, a physical prototype was created using a mold adapted to a 3D printer's limitations. Silicone Rubber Sil 15 was cast for its flexibility and durability. The 3D-printed mold components were assembled, filled with the silicone mixture, and cured. After demolding, nodes and cables were 3D-printed and connected to form the structure, demonstrating the feasibility of the design. Solymorph demonstrates the potential of soft robotics and Artificial Intelligence (AI) for advancing sustainable building design and construction. The project successfully integrates these technologies to create a dynamic facade system that optimizes energy generation and architectural expression. While limitations exist, Solymorph paves the way for future advancements in energy-efficient facade design. Continued research will focus on cost reduction, improved system performance, and broader applicability.
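
A sketch of the kind of generator the LLM was prompted to produce is shown below; the parameter names are illustrative, and inside Grasshopper the points would feed a surface component rather than being returned as tuples:

```python
# Sketch of a parametric random-surface generator with user-defined
# parameters: a grid of control points with smoothed random heights.
import random

def random_surface(nu=10, nv=10, width=20.0, depth=20.0, amplitude=1.5, seed=42):
    rng = random.Random(seed)   # seeded so the surface is reproducible
    pts = []
    for i in range(nu):
        row = []
        for j in range(nv):
            x = width * i / (nu - 1)
            y = depth * j / (nv - 1)
            z = amplitude * rng.uniform(-1.0, 1.0)
            row.append((x, y, z))
        pts.append(row)
    # Simple smoothing pass so neighbouring control points vary gradually.
    for i in range(1, nu - 1):
        for j in range(1, nv - 1):
            z = sum(pts[a][b][2] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))) / 4
            pts[i][j] = (pts[i][j][0], pts[i][j][1], z)
    return pts

control_points = random_surface()
```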

Keywords: artificial intelligence, energy efficiency, kinetic photovoltaics, pneumatic control, soft robotics, sustainable building

Procedia PDF Downloads 24
1 Optimized Electron Diffraction Detection and Data Acquisition in Diffraction Tomography: A Complete Solution by Gatan

Authors: Saleh Gorji, Sahil Gulati, Ana Pakzad

Abstract:

Continuous electron diffraction tomography, also known as microcrystal electron diffraction (MicroED) or three-dimensional electron diffraction (3DED), is a powerful technique which, in combination with cryo-electron microscopy (cryo-EM), can provide atomic-scale 3D information about the crystal structure and composition of different classes of crystalline materials, such as proteins, peptides, and small molecules. Unlike the well-established X-ray crystallography method, 3DED does not require large single crystals and can collect accurate electron diffraction data from crystals as small as 50 - 100 nm. This is a critical advantage, as growing the larger crystals required by X-ray crystallography is often very difficult, time-consuming, and expensive. In most cases, specimens studied via the 3DED method are electron-beam sensitive, which means there is a limit on the maximum electron dose one can use to collect the data required for a high-resolution structure determination. Collecting data with a conventional scintillator-based, fiber-coupled camera therefore brings additional challenges: the noise inherent in the electron-to-photon conversion in the scintillator and in the transfer of light through the fibers to the sensor results in a poor signal-to-noise ratio and demands relatively high, often specimen-damaging electron dose rates, especially for protein crystals. As in other cryo-EM techniques, damage to the specimen can be mitigated by using a direct detection camera, which provides a high signal-to-noise ratio at low electron doses. In this work, we have used two classes of such detectors from Gatan, namely the K3® camera (a monolithic active pixel sensor) and Stela™ (which utilizes DECTRIS hybrid-pixel technology), to address this problem. The K3 is an electron-counting detector optimized for low-dose applications (such as structural biology cryo-EM), and Stela is also an electron-counting detector, but optimized for diffraction applications with high speed and high dynamic range. Lastly, data collection workflows, including crystal screening, microscope optics setup (for imaging and diffraction), stage height adjustment at each crystal position, and tomogram acquisition, are another challenge of the 3DED technique. Traditionally, this has all been done manually, or in a partly automated fashion using open-source software and scripting, requiring long hours on the microscope (extra cost) and extensive user interaction with the system. We have recently introduced Latitude® D in DigitalMicrograph® software, which is compatible with all pre- and post-energy-filter Gatan cameras and enables automated, optimized 3DED data acquisition. Higher-quality 3DED data enable structure determination with higher confidence, while automated workflows allow acquisitions to be completed considerably faster than before. Using multiple examples, this work demonstrates how direct detection electron-counting cameras enhance 3DED results (from 3 Å to better than 1 Å) for protein and small-molecule structure determination. We also show how the Latitude D software facilitates collecting such data in an integrated and fully automated user interface.

Keywords: continuous electron diffraction tomography, direct detection, diffraction, Latitude D, DigitalMicrograph, proteins, small molecules

Procedia PDF Downloads 59