Open Theses

21 Entries found


CSMA/CD for Wi-Fi

Master's Thesis

Carrier Sense Multiple Access with Collision Detection (CSMA/CD) is a technique used in wired networks such as Ethernet (IEEE 802.3) to improve network performance through efficient medium access. When a collision is detected, the colliding nodes terminate their transmissions to keep the collision time as short as possible. This effectively improves the utilization of the transmission medium, since less time is spent in collisions and the time between transmission attempts is reduced.
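The resulting access procedure can be sketched as follows. This is a minimal, simplified Python sketch of the CSMA/CD loop with truncated binary exponential backoff; `collision_on_attempt` is a hypothetical stand-in for the collision-detect circuitry, not a real driver interface:

```python
import random

def backoff_slots(n_collisions, rng):
    # Truncated binary exponential backoff (IEEE 802.3): after the
    # n-th collision, wait r slot times, r uniform in [0, 2^min(n,10) - 1].
    return rng.randrange(2 ** min(n_collisions, 10))

def csma_cd_transmit(collision_on_attempt, max_attempts=16, seed=0):
    # collision_on_attempt(i) -> bool models the collision-detect
    # circuitry for the i-th transmission attempt.
    rng = random.Random(seed)
    total_backoff = 0
    for attempt in range(1, max_attempts + 1):
        if not collision_on_attempt(attempt):
            return attempt, total_backoff  # frame delivered
        # Collision detected: abort the frame immediately (this early
        # abort is the gain over plain CSMA), then back off and retry.
        total_backoff += backoff_slots(attempt, rng)
    raise RuntimeError("excessive collisions: frame dropped")
```

For example, `csma_cd_transmit(lambda i: i < 3)` models a frame that collides twice and succeeds on the third attempt, accumulating at most 1 + 3 backoff slots along the way.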

In wireless networks, however, CSMA/CD is generally considered impractical due to the physical characteristics of the wireless channel. The power of a signal degrades by orders of magnitude on its way from transmitter to receiver due to free-space path loss and propagation effects such as attenuation and reflections. Therefore, even if a transmitter were equipped with a separate receive antenna, its own transmission would typically drown out the weak signals from other transmitters, rendering their detection impossible. Nevertheless, recent research has demonstrated that self-interference cancellation has become feasible, which allows the design of full-duplex radios [1]. This may be the key to designing CSMA/CD for IEEE 802.11-based networks, enabling enhanced network performance under high-load conditions [2].
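As a toy illustration of the digital part of self-interference cancellation (analog cancellation and frequency-selective channels are omitted here, and the flat channel gain is an assumption for the sketch), a node that knows its own transmit signal can estimate the self-interference channel by least squares and subtract its own contribution, leaving the weak remote signal:

```python
import random

rng = random.Random(0)
n = 1000

own_tx = [rng.gauss(0, 1) for _ in range(n)]          # known own transmission
remote = [0.001 * rng.gauss(0, 1) for _ in range(n)]  # weak remote signal, ~60 dB below
h_self = 0.8                                          # assumed flat self-interference channel
rx = [h_self * t + r for t, r in zip(own_tx, remote)] # what the receive chain observes

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Least-squares estimate of the self-interference channel, then
# subtraction of the known transmit signal scaled by that estimate.
h_est = dot(own_tx, rx) / dot(own_tx, own_tx)
residual = [y - h_est * t for y, t in zip(rx, own_tx)]
```

After cancellation, the residual is dominated by the remote signal rather than the node's own transmission, which is exactly the property a full-duplex receiver would exploit to detect collisions while transmitting.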

[1] Mayank Jain, Jung Il Choi, Taemin Kim, Dinesh Bharadia, Siddharth Seth, Kannan Srinivasan, Philip Levis, Sachin Katti, and Prasun Sinha. "Practical, Real-Time, Full Duplex Wireless", 17th Annual International Conference on Mobile Computing and Networking (ACM MobiCom '11), Las Vegas, Nevada, USA, 2011, pp. 301-312.
[2] Konstantinos Voulgaris, Athanasios Gkelias, Imran Ashraf, Mischa Dohler, and A. H. Aghvami. "Throughput Analysis of Wireless CSMA/CD for a Finite User Population", IEEE Vehicular Technology Conference, Montreal, Quebec, Canada, 2006, pp. 1-5.

Knowledge of global network state is crucial for several innovative network optimization techniques. However, these techniques are often evaluated in simulation environments that grant individual nodes omniscient knowledge of the network, which is not realistic in practice. In fact, an individual node's view of the network is limited, since it can overhear the wireless channel only locally, and explicit dissemination of global network state would incur large overhead.

In this thesis project, you are going to engineer features and learning algorithms that allow nodes to gain knowledge about distant parts of a network just by overhearing the wireless channel. The difficulty is to identify features that carry valuable information about distant nodes, which we believe may be feasible since multi-hop packet transfers implicitly reveal how distant nodes interact with the network.

This topic is for you if you are interested in machine learning, wireless networks, and practical experimenting. The project might be co-supervised by another researcher from the collaborative research center MAKI who specializes in topology control, autonomous agents, or machine learning techniques.

Bluetooth allows direct device-to-device communication, for instance, between smartphones. Bluetooth Low Energy in particular was conceived to be very energy-efficient. This is why vendors allow Bluetooth background operation, which is crucial for Disruption-Tolerant Networks (DTNs): smartphones act as data mules and therefore have to accept new "bundles" as they pass other nodes, without user interaction.

In this thesis, you will first explore whether Bluetooth (Low Energy) is a suitable candidate link layer for ad hoc and disruption-tolerant networks. This includes energy efficiency, transmission speed, disruption tolerance, and cross-vendor compatibility.

Finally, you will implement a Bluetooth convergence layer in IBR-DTN [1] to enable DTN communication between (Android) smartphones without infrastructure.

  • BT/BT-LE performance/practicality analysis
    • Energy consumption (scanning, data transmission, …)
    • Transmission speed (1-1, 1-n, …, depending on distance, …)
    • Disruption tolerance (how long does it take for BT to realize that a connection is broken?)
    • Cross-vendor (Android, iOS, …)
  • Implement a BT convergence layer for IBR-DTN [1]
    • Neighbor discovery (energy efficient, …)
    • Data transmission (based on TCPCL?)



Millimeter-wave (mm-wave) communication systems such as IEEE 802.11ad use directional beams that need to be trained before a high-throughput connection can be established. Such beam training protocols, the backbone of mm-wave communications, have a high impact on both security and performance. Jamming or manipulating the frames involved in beam steering might prevent a connection from being established or steer the beam to an adversary's benefit. We have already obtained firmware-level access to the Wi-Fi chips of state-of-the-art routers.

Directional transmission as used in millimeter-wave communication raises many challenges. However, extreme spatial sharing of the millimeter-wave spectrum can boost the throughput per area significantly. How to realize this increase in per-area throughput is still an open research question!

  • Survey the state of the art of channel access sharing in millimeter-wave and non-millimeter-wave communications.
  • Identify the challenges that must be addressed for optimal sharing of the medium.
  • Develop a simulation tool or a simple testbed to analyze the results of the proposed technique.

With industrial collaboration option: This project may require you to spend part of your time in a company and collaborate with them to achieve a practical solution. 

The channel characteristics of millimeter-wave communication systems at 60 GHz differ from those in lower frequency bands and require a fundamental rethinking of network design. To investigate such aspects of network performance, we developed a raytracing-based simulation framework that predicts the signal quality in arbitrary environments. However, the simulation internals are based on theoretical considerations and models; so far, the simulation results have not been compared against real-world measurements.
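To give a sense of the numbers involved, the first-order building block such a raytracer applies along each ray path is the Friis free-space path loss (a simplified sketch only; the actual framework additionally models reflections and material attenuation):

```python
from math import log10, pi

C = 299_792_458.0  # speed of light in m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (Friis transmission formula)."""
    return 20 * log10(4 * pi * distance_m * freq_hz / C)

# At 60 GHz, a single metre of free space already costs about 68 dB,
# roughly 22 dB more than in the 5 GHz Wi-Fi band at the same
# distance, which is why 60 GHz links depend on directional antenna
# gain to close the link budget.
```

Such closed-form values are exactly what realistic measurements would need to confirm or correct, frequency band by frequency band and environment by environment.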

Prof. Dr.-Ing. Matthias Hollick

Technische Universität Darmstadt
Department of Computer Science
Secure Mobile Networking Lab 

Mornewegstr. 32 (S4/14)
64293 Darmstadt, Germany

Phone: +49 6151 16-25472
Fax: +49 6151 16-25471
