Some Ideas of mine


Taxing 0.01-ISKing specifically

CCP already tried to discourage order-update spam with relist fees. However, this had the opposite effect: it made small-order spam and constant babysitting more effective, and both of those are exactly what bots do best.

The idea is to make sure players lose money when they undercut an order by a small amount. The difficult part is making it abuse-resistant: for example, if a player can spam many small orders at varying prices, they leave no room for competition.

Implementation (assuming a 10% threshold, and considering only sell orders; buy orders work almost the same, but region-wide):
When a player places or updates an order that undercuts an existing one within the threshold, they pay an extra broker fee large enough that the undercut is not worth it.

Example: there is a 1000 ISK SO and I place a 999 ISK SO. I then pay an additional broker fee equal to the remaining distance to the threshold: since I cut by only 1 ISK against a 100 ISK threshold, 99 ISK per unit (basically the full 10% of the order) is added to the broker fee.

Example 2: there is a 100 ISK, 2-volume SO, as well as a 105 ISK, 2-volume SO; I place a SO of 5 volume at price 99.

  • two units of my volume are matched against the 100 ISK order: they are 1 ISK below, with a threshold of 10, so a fee of (10 − 1) × 2 = 18 ISK is added.
  • two units are matched against the 105 ISK order: they are 6 ISK below, with a threshold of 10.5, so (10.5 − 6) × 2 = 9 ISK is added to the broker fee.
  • the remaining unit of my volume is not matched against any existing order, so it incurs no fee.
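The fee computation in the two examples above can be sketched as follows (a rough sketch; the function name and signature are mine, not from the post):

```python
def undercut_fee(existing_price: float, new_price: float,
                 matched_volume: int, ct: float = 10.0) -> float:
    """Extra broker fee for undercutting an existing sell order.

    ct is the threshold in percent (10% here). The fee per unit is the
    distance left to the threshold: the closer the cut, the higher the fee.
    """
    threshold = existing_price * ct / 100.0     # e.g. 10 ISK for a 100 ISK order
    cut = existing_price - new_price            # how far below we undercut
    if cut <= 0 or cut >= threshold:
        return 0.0                              # not an undercut, or outside the threshold
    return (threshold - cut) * matched_volume

# Example 2 above: (10 - 1) * 2 = 18 and (10.5 - 6) * 2 = 9
print(undercut_fee(100, 99, 2), undercut_fee(105, 99, 2))
```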

This should keep small orders from littering the market: the smaller your order and the closer it sits to an existing one, the higher your fees will be as a percentage.

Price randomisation

This addresses the same problem as above with a different approach. Instead of adding a tax, we let the order's price be randomly changed depending on other nearby orders.


The main goal is that if you try to undercut, you may fail.

The goals are :

  • The more other close orders there are, the higher chance of variation
    so that placing a second order to cut has more chance to fail.
  • The closer we are to other orders, the higher the chance of variation
    This means that if you are on the edge of “being close” you have a lower chance of being changed.
  • The bigger the other orders, the more likely yours is changed
    This avoids small orders spamming.


We start with a few definitions.

  • the close threshold ct is a game variable that defines the price band (as a % of an order's price) within which another order counts as close. It defaults to 20%, with a non-alpha skill that can reduce it to 10%.
  • the dispersion d is a game variable that defines how strongly closeness matters. It is typically 1, and is used as the exponent of the proximity.
  • the proximity p(a,b) of a price b to a price a indicates how close b is to a, on a 0–1 scale. It is ( a×ct/100 − abs(b−a) ) / (a×ct/100), raised to the power of d, and 0 when b is outside the threshold.
  • the weight w(a,b) of an order b on an order a is how heavily b impacts the probability that a's price is changed. It is p(a.price, b.price) × b.volume_remain. Since the proximity is fractional, the weight is a float even for single-item volumes.
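These definitions translate directly into code; a minimal sketch (function names are mine):

```python
def proximity(a_price: float, b_price: float, ct: float = 10.0, d: float = 1.0) -> float:
    """Proximity of price b to price a, between 0 and 1 (0 outside the threshold)."""
    band = a_price * ct / 100.0          # the "close" band around price a
    gap = abs(b_price - a_price)
    if gap >= band:
        return 0.0
    return ((band - gap) / band) ** d

def weight(a_price: float, b_price: float, b_volume: float,
           ct: float = 10.0, d: float = 1.0) -> float:
    """How heavily an order at b_price with b_volume units impacts order a."""
    return proximity(a_price, b_price, ct, d) * b_volume
```

With ct=10 and d=1, an order of 10 units at 4.00 ISK has a weight of about 9.75 against a new 3.99 ISK order, as in the worked example further down.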

Then, when an order a is created or updated, we find its actual price:

  1. list all the orders of the same type and same is_buy_order in the region
  2. for each order b that is close to a (and is not a), compute its cumulated weight cumul(b), i.e. the running sum of weights starting from 0
  3. then add a's own weight (which is a.volume_remain) to the last order's cumulated weight to get cumul(a)
  4. roll a dice uniformly between 0 and cumul(a). The order whose price replaces a's is the first one whose cumul is ≥ the dice; if the dice lands above every other order's cumul (i.e. in a's own segment), the price is not changed.
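The four steps above could look like this (an in-memory sketch under the definitions above; in practice the candidate list would come from the SQL query below, and all names are illustrative):

```python
import random

def resolve_price(new_price: float, new_volume: float,
                  others: list[tuple[float, float]],
                  ct: float = 10.0, d: float = 1.0, rng=random) -> float:
    """Steps 1-4: `others` holds (price, volume_remain) of same-side orders
    of the same type in the region. Returns the price the order ends up with."""
    band = new_price * ct / 100.0
    cumul = 0.0
    segments = []                               # (cumulated weight, candidate price)
    for price, volume in others:
        gap = abs(price - new_price)
        if gap >= band:
            continue                            # not close: no influence
        cumul += ((band - gap) / band) ** d * volume
        segments.append((cumul, price))
    cumul += new_volume                         # step 3: the order's own segment
    dice = rng.uniform(0.0, cumul)              # step 4
    for bound, price in segments:
        if dice < bound:
            return price                        # the dice fell in this order's segment
    return new_price                            # fell in our own segment: unchanged
```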

In terms of SQL, the weight computation should be something like this (price, ct, d, is_buy_order and order_id refer to the new order and the game variables):

  select o.order_id, o.price,
         pow( 1 - abs(o.price - price) / (price*ct/100), d ) * o.volume_remain as weight
  from orders o
  where o.type_id = type_id
    and o.region_id = region_id
    and o.is_buy_order = is_buy_order
    and o.order_id <> order_id
    and abs(o.price - price) < price*ct/100
  order by weight desc

(wrap the weight in a SUM if you want the total directly)
“order by weight desc” makes it cheaper to find the replacement price if there is a change.


Here we assume a ct=10% threshold and d=1 dispersion.

I want to place or update an order of 10 trit at 3.99 ISK while there is an existing order of 10 trit at 4.0 ISK. I then have roughly a 50% chance of having my price replaced by 4.0 ISK: the other order has a proximity of [ (0.399 − 0.01)/0.399 ]^1 ≈ 0.975, so a weight of 9.75, while my own order has a weight of 10, giving a 9.75/19.75 ≈ 49.37% chance of being changed.

Same new/updated order, but there is also another order of 10 trit at 4.05 ISK. That order has a weight of 10 × [ (0.399 − 0.06)/0.399 ]^1 ≈ 8.50. The total weight of all orders is 28.25, so my order has a 10/28.25 ≈ 35% chance of not being changed.
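The percentages in both examples can be checked numerically (same formulas as above; this sketch computes the change probability directly instead of rolling a dice):

```python
def change_probability(new_price: float, new_volume: float,
                       others: list[tuple[float, float]],
                       ct: float = 10.0, d: float = 1.0) -> float:
    """Chance that the new order's price gets replaced by a nearby order's."""
    band = new_price * ct / 100.0
    w_others = sum(((band - abs(p - new_price)) / band) ** d * v
                   for p, v in others if abs(p - new_price) < band)
    return w_others / (w_others + new_volume)

print(change_probability(3.99, 10, [(4.0, 10)]))              # first example, ~0.4937
print(change_probability(3.99, 10, [(4.0, 10), (4.05, 10)]))  # ~0.646, i.e. ~35.4% to keep the price
```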

Constant evolution

This model has two variables (close threshold and dispersion) that can be modified dynamically, without restarting the server, to keep players (and bots) from adapting to known values. The delay between updates should also be random. Typically a 2–20% threshold range and a 0.6–1.5 dispersion range seem good, with an update delay between 15 and 300 minutes (5 h). A watchdog with a 10-hour period should restart the job if it did not reschedule itself correctly.
Also, the two variables should be updated by two separate jobs, so they change at two different times.

If this is not enough, we can add other variables to evolve, e.g. a self_weight sw by which the new order's own weight is multiplied. Its range should be 1–1.5.
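One scheduling step of these update jobs can be sketched as below (ranges are the illustrative ones from the paragraphs above; the actual timer and watchdog machinery is left out, and all names are mine):

```python
import random

# Illustrative ranges from the text; real values would live in server config.
CT_RANGE = (2.0, 20.0)       # close threshold, in percent
D_RANGE = (0.6, 1.5)         # dispersion exponent
SW_RANGE = (1.0, 1.5)        # optional self_weight multiplier
DELAY_RANGE = (15.0, 300.0)  # minutes until a variable's next update

def next_update(var_range: tuple[float, float], rng=random) -> tuple[float, float]:
    """Pick a new value for one variable and a random delay before its next
    update. ct and d each run in their own job, so they never change at the
    same moment."""
    lo, hi = var_range
    return rng.uniform(lo, hi), rng.uniform(*DELAY_RANGE)

new_ct, ct_delay = next_update(CT_RANGE)
new_d, d_delay = next_update(D_RANGE)
```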

Repeatable orders

Orders get a new “repeat_qtty” attribute. When the order is exhausted, a new order is automatically created with the same attributes. For example, I can create one SO of 5000 tritanium at 5 ISK, or one SO of 1000 tritanium with a repeat_qtty of 4000.
In the latter case, if someone purchases 900 units, the order has 100 remaining; if the next purchase is for 500, only 100 come from the existing order (the next 400 come from other matching orders), and THEN a new order is created: qtty 1000, price 5, broker fees deducted, repeat_qtty 3000.
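A sketch of the fill logic described above (the dataclass and names are mine; broker fees and matching against other players' orders are left out):

```python
from dataclasses import dataclass

@dataclass
class SellOrder:
    price: float
    batch: int          # size of each listed order (1000 in the example)
    qtty: int           # volume remaining on the current live order
    repeat_qtty: int    # volume waiting to be relisted automatically

def consume(order: SellOrder, amount: int) -> int:
    """Take up to `amount` units from this order and return how many were
    actually taken; when the live order empties, the repeat mechanism
    relists a fresh batch (deducting broker fees, omitted here)."""
    taken = min(amount, order.qtty)
    order.qtty -= taken
    if order.qtty == 0 and order.repeat_qtty > 0:
        relist = min(order.batch, order.repeat_qtty)
        order.qtty = relist
        order.repeat_qtty -= relist
    return taken

# The walkthrough above: 1000 listed with repeat_qtty 4000, purchases of 900 then 500
o = SellOrder(price=5.0, batch=1000, qtty=1000, repeat_qtty=4000)
consume(o, 900)            # 100 left on the live order
consume(o, 500)            # takes the last 100, relists: qtty 1000, repeat_qtty 3000
```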

This lets you hide how much you are actually selling.
