Month: April 2019

Reddit

What functionality do we need to support?
1. Post messages on a topic.
2. Show all messages under a topic.
3. Search for a topic.
There can be other features, but let us restrict ourselves to these core features. We can always extend later.
What are our Entities?
1. Topics
2. Messages
3. Users
OK, is this a write-heavy or a read-heavy system? I would say read-heavy: reads should be around 10-100 times more frequent than writes, since new posts are far less frequent than post reads. From a scalability perspective, how do we scale our DB? We can use a MySQL DB with a single write master and multiple read slaves for scaling. This hurts consistency (some users may briefly not see certain posts), but we should be fine with that at the cost of high availability. The problem here is that a single master may not be enough, since our writes are also reasonably high, so a sharded architecture would be preferable. Should we shard by user_id or by post_id? post_id should be preferred, so that all messages of a post land on the same shard. We would use consistent hashing to distribute the posts across shards, so that adding or removing shards does not cause large-scale redistribution of keys.
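The shard-selection step above can be sketched with a small consistent-hash ring. This is a minimal illustration, not production code: the class name `ShardRing`, the virtual-node count, and the MD5-based hash are all my own illustrative choices.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.SortedMap;
import java.util.TreeMap;

// Minimal sketch of a consistent-hash ring for picking a shard by post_id.
public class ShardRing {
    // Virtual nodes per shard smooth out the key distribution.
    private static final int VIRTUAL_NODES = 100;
    private final TreeMap<Long, String> ring = new TreeMap<>();

    public void addShard(String shard) {
        for (int i = 0; i < VIRTUAL_NODES; i++) {
            ring.put(hash(shard + "#" + i), shard);
        }
    }

    public void removeShard(String shard) {
        for (int i = 0; i < VIRTUAL_NODES; i++) {
            ring.remove(hash(shard + "#" + i));
        }
    }

    // Walk clockwise to the first virtual node at or after the key's hash;
    // wrap around to the first entry if we fall off the end of the ring.
    public String shardFor(String postId) {
        SortedMap<Long, String> tail = ring.tailMap(hash(postId));
        return tail.isEmpty() ? ring.firstEntry().getValue() : tail.get(tail.firstKey());
    }

    private static long hash(String s) {
        try {
            byte[] d = MessageDigest.getInstance("MD5").digest(s.getBytes(StandardCharsets.UTF_8));
            return ((long) (d[3] & 0xFF) << 24) | ((d[2] & 0xFF) << 16)
                 | ((d[1] & 0xFF) << 8) | (d[0] & 0xFF);
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }
}
```

The property we care about: when a shard is added or removed, only the keys that mapped to that shard's ring positions move; all other posts keep their shard.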
Caching – We should cache the recent messages of each post and the recent posts, so that they can be fetched directly from the cache without hitting the DB. We should use a separate Memcached/Redis cluster for the cache. This would be a write-through cache: recent posts are written to the cache, and the cache writes to the DB. In practice the DB write should be asynchronous (effectively write-behind): when the user posts a message, we push it to a message queue like SQS/Kafka, and a consumer later writes it to the cache and then the DB.
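The asynchronous write path can be sketched in-process, using a `BlockingQueue` as a stand-in for SQS/Kafka. The class and method names (`AsyncWritePath`, `drainOnce`) and the list-backed cache/DB stand-ins are illustrative assumptions, not part of the design above.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of the async write path: the API handler enqueues and returns
// immediately; a consumer drains the queue and writes cache first, then DB.
public class AsyncWritePath {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();
    private final List<String> cache = new CopyOnWriteArrayList<>();
    private final List<String> db = new CopyOnWriteArrayList<>();

    // API handler: enqueue the message and return without touching the DB.
    public void postMessage(String msg) {
        queue.offer(msg);
    }

    // Consumer: drain whatever is queued, writing cache before DB.
    public void drainOnce() {
        String msg;
        while ((msg = queue.poll()) != null) {
            cache.add(msg);  // readers see the post quickly
            db.add(msg);     // durable write lags slightly behind
        }
    }

    public boolean inCache(String m) { return cache.contains(m); }
    public boolean inDb(String m)    { return db.contains(m); }
}
```

The trade-off this models: between `postMessage` and the consumer run, the post exists only in the queue, which is exactly the eventual-consistency window the design accepts.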
Microservices – We would ideally like to break the system into multiple services, like a ReadService and a WriteService, so that they can scale separately. Distributed transactions would be a challenge here; we should aim for eventual consistency.

https://www.careercup.com/question?id=5758206748917760

Multithreaded Cache

package cache;

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class LRUCache<K, V> {

	// Queue tracks recency: head = least recently used, tail = most recent.
	private final ConcurrentLinkedQueue<K> concurrentLinkedQueue = new ConcurrentLinkedQueue<K>();

	private final ConcurrentHashMap<K, V> concurrentHashMap = new ConcurrentHashMap<K, V>();

	// The lock makes each queue+map pair of operations atomic per call.
	private final ReadWriteLock readWriteLock = new ReentrantReadWriteLock();

	private final Lock readLock = readWriteLock.readLock();

	private final Lock writeLock = readWriteLock.writeLock();

	private final int maxSize;

	public LRUCache(final int maxSize) {
		this.maxSize = maxSize;
	}
	
	public V getElement(K key) {
		// get() also updates recency. The queue is itself thread-safe, but
		// the remove/add pair is not atomic under the shared read lock, so
		// LRU ordering is only approximate when concurrent reads race.
		readLock.lock();
		try {
			V v = null;
			// containsKey, not contains: ConcurrentHashMap.contains checks values.
			if (concurrentHashMap.containsKey(key)) {
				concurrentLinkedQueue.remove(key);
				v = concurrentHashMap.get(key);
				concurrentLinkedQueue.add(key); // move to tail: most recently used
			}
			return v;
		} finally {
			readLock.unlock();
		}
	}
	
	public V removeElement(K key) {
		writeLock.lock();
		try {
			V v = null;
			if (concurrentHashMap.containsKey(key)) {
				v = concurrentHashMap.remove(key);
				concurrentLinkedQueue.remove(key);
			}
			return v;
		} finally {
			writeLock.unlock();
		}
	}
	
	public V addElement(K key, V value) {
		writeLock.lock();
		try {
			if (concurrentHashMap.containsKey(key)) {
				// Drop the stale queue entry; re-adding below refreshes recency.
				concurrentLinkedQueue.remove(key);
			}
			// Evict least recently used entries (queue head) once full.
			while (concurrentLinkedQueue.size() >= maxSize) {
				K queueKey = concurrentLinkedQueue.poll();
				concurrentHashMap.remove(queueKey);
			}
			concurrentLinkedQueue.add(key);
			concurrentHashMap.put(key, value);
			return value;
		} finally {
			writeLock.unlock();
		}
	}
}
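For comparison, the JDK already offers a compact building block for the same LRU behavior: a `LinkedHashMap` constructed in access order with `removeEldestEntry` overridden. This is a single-threaded sketch (the class name `SimpleLru` is mine); for concurrent use it would need external locking or `Collections.synchronizedMap`.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LinkedHashMap in access order: get() moves an entry to the tail,
// and removeEldestEntry evicts from the head once capacity is exceeded.
public class SimpleLru<K, V> extends LinkedHashMap<K, V> {
    private final int maxSize;

    public SimpleLru(int maxSize) {
        super(16, 0.75f, true);  // accessOrder = true
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxSize;  // evict the least recently used entry
    }
}
```

This avoids the O(n) `ConcurrentLinkedQueue.remove(key)` scan the hand-rolled version pays on every access.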

https://www.careercup.com/question?id=17230678

import java.util.HashMap;
import java.util.Map;

// Doubly linked list node: head = most recently used, end = least recently used.
class Node {
    Node prev, next;
    int key;
    int val;

    public Node(int key, int val) {
        this.key = key;
        this.val = val;
    }
}
public class Solution {
   
    private Map<Integer,Node> map;
    int capacity;
    Node head,end;
    
    public Solution(int capacity) {
        this.capacity=capacity;
        head=null;end=null;
        map=new HashMap<>();
    }
    
    public int get(int key) {
        if(map.containsKey(key)){
            Node n= map.get(key);
            remove(n);
            setHead(n);
            return n.val;
        }
        return -1;
        
    }
    
    public void set(int key, int value) {
        Node n = map.get(key);
        if (n != null) {
            // Key exists: update the value and move the node to the head.
            n.val = value;
            remove(n);
        } else {
            if (map.size() >= capacity) {
                // Evict the least recently used node (the tail).
                map.remove(end.key);
                remove(end);
            }
            n = new Node(key, value);
        }
        map.put(key, n);
        setHead(n);
    }
    
    private void setHead(Node n)
    {
        if(head==null)
        {
            head=n;
            end=head;
        }
        else
        {
            n.prev=null;
            n.next=head;
            head.prev=n;
            head=n;
        }
    }
    private void remove(Node n) {
        // Unlink n, fixing head/end whenever n sits at either boundary;
        // both checks must run so a single-node list clears head and end.
        if (n.prev != null) {
            n.prev.next = n.next;
        } else {
            head = n.next;
        }
        if (n.next != null) {
            n.next.prev = n.prev;
        } else {
            end = n.prev;
        }
    }
}
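To make the eviction behavior easy to exercise end to end, here is a condensed, self-contained version of the same map-plus-doubly-linked-list design. The class name `LruMap` is mine; the logic mirrors `Solution` above.

```java
import java.util.HashMap;
import java.util.Map;

// Self-contained LRU map: HashMap for O(1) lookup, doubly linked list
// for O(1) recency updates. head = most recent, end = least recent.
public class LruMap {
    private static class Node {
        Node prev, next;
        int key, val;
        Node(int key, int val) { this.key = key; this.val = val; }
    }

    private final Map<Integer, Node> map = new HashMap<>();
    private final int capacity;
    private Node head, end;

    public LruMap(int capacity) { this.capacity = capacity; }

    public int get(int key) {
        Node n = map.get(key);
        if (n == null) return -1;
        remove(n);     // detach, then reattach at the head
        setHead(n);
        return n.val;
    }

    public void set(int key, int value) {
        Node n = map.get(key);
        if (n != null) {
            n.val = value;
            remove(n);
        } else {
            if (map.size() >= capacity) {
                map.remove(end.key);  // evict the least recently used
                remove(end);
            }
            n = new Node(key, value);
        }
        map.put(key, n);
        setHead(n);
    }

    private void setHead(Node n) {
        n.prev = null;
        n.next = head;
        if (head != null) head.prev = n;
        head = n;
        if (end == null) end = n;
    }

    private void remove(Node n) {
        if (n.prev != null) n.prev.next = n.next; else head = n.next;
        if (n.next != null) n.next.prev = n.prev; else end = n.prev;
    }
}
```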